Hi @karthink, I was using this package with ollama just fine until I had to upgrade my ollama binary to the newest version (v0.1.30, to fix some error). Now I'm getting `Response Error: nil, Ollama error (nil): Malformed JSON in response.` Running ollama in my terminal works fine.
Here's my config (note the trailing paren for the `setq`, which was missing before):

(setq gptel-model "mistral"
      gptel-log-level :debug
      gptel-backend (gptel-make-ollama "Ollama"
                      ;; Resolved via tailscale
                      :host "desktop:11434"
                      :stream t
                      :models '("mistral"
                                "mixtral"
                                "phind-codellama"
                                "codellama")))
My `gptel--known-backends` is:
(("Ollama" . #s(gptel-ollama "Ollama" "desktop:11434" nil "http" t "/api/generate" nil
("mistral" "mixtral" "phind-codellama" "codellama")
"http://desktop:11434/api/generate" nil))
("ChatGPT" . #s(gptel-openai "ChatGPT" "api.openai.com" #<subr F616e6f6e796d6f75732d6c616d626461_anonymous_lambda_11> "https" t "/v1/chat/completions" gptel-api-key
("gpt-3.5-turbo" "gpt-3.5-turbo-16k" "gpt-4" "gpt-4-turbo-preview" "gpt-4-32k" "gpt-4-1106-preview" "gpt-4-0125-preview")
"https://api.openai.com/v1/chat/completions" nil)))
Here's all I see in the `*gptel-log*` buffer:
{
"gptel": "request body",
"timestamp": "2024-03-30 17:51:51"
}
{
"model": "mistral",
"system": "You are a large language model living in Emacs and a helpful assistant. Respond concisely.",
"prompt": "Hi",
"stream": true
}
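For context on the error message: with `:stream t`, ollama's `/api/generate` endpoint replies with newline-delimited JSON objects, and the client raises a "Malformed JSON" error when a line fails to parse. A minimal sketch of that parsing loop (the function name and sample chunks are hypothetical, for illustration only):

```python
import json

def parse_ollama_stream(lines):
    """Accumulate the "response" fields from newline-delimited JSON
    chunks, as ollama's streaming /api/generate endpoint emits them.
    Raises ValueError on any line that is not valid JSON."""
    out = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        try:
            chunk = json.loads(line)
        except json.JSONDecodeError as e:
            raise ValueError(f"Malformed JSON in response: {line!r}") from e
        out.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(out)

# Hypothetical sample of a streamed reply
sample = [
    '{"model":"mistral","response":"Hel","done":false}',
    '{"model":"mistral","response":"lo!","done":true}',
]
print(parse_ollama_stream(sample))  # Hello!
```

If the server's reply format changed between ollama versions (or an error page is returned instead of JSON), a loop like this would fail in exactly the way the error message describes.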
Let me know if I can provide more helpful info. Thanks!