Serge - LLaMA made easy 🦙


Serge is a chat interface crafted with llama.cpp for running GGUF models. No API keys, entirely self-hosted!

  • 🌐 SvelteKit frontend
  • 💾 Redis for storing chat history & parameters
  • ⚙️ FastAPI + LangChain for the API, wrapping calls to llama.cpp using the python bindings
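
To give a sense of the last bullet, here is a minimal, illustrative sketch of calling llama.cpp through the llama-cpp-python bindings. This is not Serge's actual API code, and the model path is a placeholder:

from llama_cpp import Llama

# Placeholder model path; inside the Serge container, weights live under /usr/src/app/weights.
llm = Llama(model_path="weights/mistral-7b-instruct.Q4_K_M.gguf")

# A plain completion call; llama-cpp-python returns an OpenAI-style response dict.
result = llm("Explain GGUF in one sentence.", max_tokens=64)
print(result["choices"][0]["text"])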

🎥 Demo:

demo.webm

⚡️ Quick start

🐳 Docker:

docker run -d \
    --name serge \
    -v weights:/usr/src/app/weights \
    -v datadb:/data/db/ \
    -p 8008:8008 \
    ghcr.io/serge-chat/serge:latest

🐙 Docker Compose:

services:
  serge:
    image: ghcr.io/serge-chat/serge:latest
    container_name: serge
    restart: unless-stopped
    ports:
      - 8008:8008
    volumes:
      - weights:/usr/src/app/weights
      - datadb:/data/db/

volumes:
  weights:
  datadb:
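
Save this as docker-compose.yml and start the stack with docker compose up -d from the directory containing the file.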

Then just visit http://localhost:8008/. The API documentation is available at http://localhost:8008/api/docs
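
As a rough illustration of programmatic access, the sketch below queries the API from Python. The route used is a placeholder, not a confirmed Serge endpoint; check the OpenAPI page at /api/docs for the real routes:

import requests

# Hypothetical route for illustration only; the actual endpoints are listed at
# http://localhost:8008/api/docs (Serge's FastAPI OpenAPI documentation).
resp = requests.get("http://localhost:8008/api/models")
resp.raise_for_status()
print(resp.json())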

🖥️ Windows

Ensure you have Docker Desktop installed, WSL2 configured, and enough free RAM to run models.

☁️ Kubernetes

Instructions for setting up Serge on Kubernetes can be found in the wiki.

🧠 Supported Models

Category  | Models
----------|--------------------------
CodeLLaMA | 7B, 13B
LLaMA     | 7B, 13B, 70B
Mistral   | 7B-Instruct, 7B-OpenOrca
Vicuna    | 7B-v1.5, 13B-v1.5
Zephyr    | 7B-Alpha, 7B-Beta

Additional weights can be added to the serge_weights volume using docker cp:

docker cp ./my_weight.bin serge:/usr/src/app/weights/

⚠️ Memory Usage

LLaMA will crash if you don't have enough available memory for the model. Required memory grows with the model's parameter count and quantization level, so make sure your system (or Docker Desktop's WSL2 VM on Windows) has enough free RAM before loading larger models.
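
As a rough rule of thumb (an approximation, not an official Serge requirement), a quantized model needs about parameter-count × bits-per-weight / 8 bytes of RAM for the weights, plus some overhead for the context and runtime:

def approx_ram_gb(params_billion: float, bits_per_weight: int = 4, overhead_gb: float = 2.0) -> float:
    # Weights alone take params * bits / 8 bytes; overhead covers the KV cache and runtime.
    return params_billion * bits_per_weight / 8 + overhead_gb

print(f"7B at 4-bit: ~{approx_ram_gb(7):.1f} GB")    # roughly 5-6 GB
print(f"13B at 4-bit: ~{approx_ram_gb(13):.1f} GB")  # roughly 8-9 GB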

💬 Support

Need help? Join our Discord

⭐️ Stargazers

Stargazers over time

🧾 License

Copyright Nathan Sarrazin and contributors. Serge is free and open-source software licensed under the MIT and Apache-2.0 licenses.

🤝 Contributing

If you discover a bug or have a feature idea, feel free to open an issue or PR.

To run Serge in development mode:

git clone https://github.com/serge-chat/serge.git
cd serge
docker compose -f docker-compose.dev.yml up -d --build
