bisbic/ollama-poc

POC using ollama, chatbot-ollama and different models

Starting the ollama server and chatbot-ollama

sudo docker compose up -d
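
The repository's compose file is not reproduced on this page. A minimal sketch of what `docker compose up -d` would bring up, assuming the official `ollama/ollama` image, the `ghcr.io/ivanfioravanti/chatbot-ollama` image, and the default ports used later in this README (the actual file in the repo may differ):

```yaml
# Hypothetical docker-compose.yml; image names, env vars, and volume
# layout are assumptions, not the repo's actual file.
services:
  ollama:
    image: ollama/ollama            # ollama server, API on 11434
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama        # persist pulled models across restarts
  chatbot-ollama:
    image: ghcr.io/ivanfioravanti/chatbot-ollama:main
    ports:
      - "3000:3000"                 # web UI
    environment:
      - OLLAMA_HOST=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama:
```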

Pulling foundational models

Search models here

# Get a shell into ollama container
docker exec -it ollama-poc-ollama-1 bash
# List models
ollama list
# Pull or update models
ollama pull llama3.1:8b
ollama pull codellama:7b

Running chatbot-ollama in the browser

localhost:3000

Querying the ollama server

curl http://localhost:11434/api/generate -d '{
"model": "llama3.1:8b",
"prompt": "Which programming languages do you know?",
"stream": false
}'
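
With `"stream": false` the server returns a single JSON object whose `response` field holds the generated text. A small sketch of pulling that field out; it uses a canned reply so it runs without the server, whereas a live call would pipe the `curl` output instead (the example text is illustrative, not real model output):

```shell
# Canned example of a non-streaming /api/generate reply (illustrative values).
reply='{"model":"llama3.1:8b","response":"I know Python, C and Go.","done":true}'

# Extract just the generated text with Python's stdlib json module.
echo "$reply" | python3 -c 'import sys, json; print(json.load(sys.stdin)["response"])'
```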

Using codex cli with ollama models

  1. [Install codex cli from here](https://github.com/openai/codex-cli)
  2. Set configuration file ~/.codex/config.json to:
{
  "model": "qwen2.5-coder:7b",
  "provider": "ollama",
  "providers": {
    "ollama": {
      "name": "Ollama",
      "baseURL": "http://localhost:11434/v1",
      "envKey": "OLLAMA_API_KEY"
    }
  },
  "history": {
    "maxSize": 1000,
    "saveHistory": true,
    "sensitivePatterns": []
  }
}
  3. Run codex cli
codex --model qwen2.5-coder:7b --provider ollama
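
The `envKey` entry in the config above presumably tells codex which environment variable to read an API key from. Ollama itself does not validate keys, so a placeholder value should be enough before running codex (this is an assumption based on the config, not documented behavior of this repo):

```shell
# Ollama does not check API keys; a placeholder just satisfies the
# requirement that the variable named in "envKey" is set (assumption).
export OLLAMA_API_KEY="ollama"
```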
