# soloLlama

A tiny self-hosted web UI for chatting with local Llama / Mistral / Gemma models managed by Ollama. Written in pure Go; it builds to a single executable that embeds the entire HTML/JS front-end.


## Features

- Streaming answers token by token
- Conversation context (the full messages array is passed to Ollama)
- Upload a `.gguf` file plus an optional Modelfile and create an Ollama tag from the UI
- One-click model switcher
- Runs on Windows, macOS, and Linux; cross-compiles from any host

## Prerequisites

| Requirement | Notes |
|-------------|-------|
| Go 1.21.6+ | Download from https://go.dev/dl |
| Ollama | Must be reachable on `localhost:11434`. Native install on macOS/Linux; on Windows, run it in WSL 2 or Docker: `docker run -d -p 11434:11434 ollama/ollama` |
| Linux | Install `xdg-utils` if it is not already present |
| soloLlama | Serves the UI at http://localhost:8081 |
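You can verify the Ollama requirement with a quick probe of its API before launching soloLlama — a sketch using `/api/tags`, Ollama's model-listing endpoint:

```shell
# Quick probe: is Ollama answering on localhost:11434?
# curl -sf exits non-zero when nothing responds at the endpoint.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
    status="reachable"
else
    status="not reachable"
fi
echo "Ollama is $status on localhost:11434"
```

If Ollama is not reachable, start it (or the Docker container above) before opening the UI.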

## Build

```sh
# clone
git clone https://github.com/dev-null321/soloLlama.git
cd soloLlama

# build a native binary (Linux/macOS host)
go build -o soloLlama   # builds for your host OS

# cross-compile a Windows 64-bit binary
GOOS=windows GOARCH=amd64 go build -o soloLlama.exe
```
Demo: `soloLlama.mov` (screen recording in the repository).
