Description
Short description of current behavior
When attempting to create a model backed by deepseek-r1:1.5b through MindsDB's Ollama engine, the creation process fails with the following error:
[ollama_engine]: ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff978bf5b0>: Failed to establish a new connection: [Errno 111] Connection refused'))
This occurs even when:
- The Ollama service is confirmed to be running (ollama serve)
- The model is available locally (ollama list shows deepseek-r1:1.5b)
- Direct API access works (curl http://localhost:11434/api/tags returns a valid response; a Python equivalent is sketched below)
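To rule out my own environment, the snippet below is a minimal sketch of the same request the traceback shows failing (a GET to /api/tags on the configured base URL). Run from the host shell it succeeds, matching the curl result above. The use of the requests library here is my own illustration, not MindsDB's internal code.

# Minimal sketch: issue the same request the traceback shows failing.
# Assumes the `requests` package is installed; base_url matches my engine config.
import requests

base_url = "http://localhost:11434"
resp = requests.get(f"{base_url}/api/tags", timeout=5)
resp.raise_for_status()
print(resp.json())  # lists locally pulled models, including deepseek-r1:1.5b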
Below is the syntax I am using:
CREATE MODEL ollama_model
PREDICT target
USING
engine = 'ollama_engine',
model_name = 'deepseek-r1:1.5b';
Video or screenshots
Expected behavior
The Ollama model should:
- Successfully connect to the local Ollama service
- Validate that the specified model exists (a manual check is sketched after this list)
- Create the model in MindsDB's model registry
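For the second point, the sketch below shows how that existence check can be reproduced manually against the Ollama API; it only approximates whatever validation the handler performs internally. The "models"/"name" fields are what /api/tags returns in my setup.

# Manual sketch of the model-existence check against the local Ollama API.
# This approximates, but is not, the ollama_engine handler's internal logic.
import requests

base_url = "http://localhost:11434"
wanted = "deepseek-r1:1.5b"

tags = requests.get(f"{base_url}/api/tags", timeout=5).json()
names = [m.get("name", "") for m in tags.get("models", [])]
print(wanted in names)  # prints True here, consistent with `ollama list`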
How to reproduce the error
- First, I start the Ollama service:
  ollama serve
- In another terminal, I verify that the service is accessible:
  curl http://localhost:11434/api/tags
- In MindsDB, I create the ML engine (this step completes successfully):
  CREATE ML_ENGINE ollama_engine
  FROM ollama
  USING
  base_url = 'http://localhost:11434';
- I then attempt to create the model:
  CREATE MODEL ollama_model
  PREDICT target
  USING
  engine = 'ollama_engine',
  model_name = 'deepseek-r1:1.5b';
- The connection refused error is raised despite Ollama running.
Anything else?
Service status: ollama serve shows active connections on port 11434
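Because the failure is a plain [Errno 111] connection refused, a raw TCP probe might help isolate whether port 11434 is reachable from the environment where MindsDB itself runs; if MindsDB runs in a container, "localhost" there may not be the host running ollama serve (an assumption on my part, not something confirmed above).

# Raw TCP probe for the Ollama port; intended to be run from wherever MindsDB executes.
# "localhost" mirrors the base_url passed to the engine.
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(5)
    result = sock.connect_ex(("localhost", 11434))

print("reachable" if result == 0 else f"connection failed (errno {result})")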