Pyma

Lets users have conversations with different AI models, built on Ollama.

Features

  • Choose from different Ollama models, such as Llama 2, Mistral, etc.
  • Have a natural conversation with the selected AI assistant
  • Use built-in commands like search, run code, etc.
  • Save chat history to a file

Usage

  1. Clone the repository
    git clone https://github.com/centopw/Pyma.git
  2. Install the requirements
    pip install -r requirements.txt
  3. Run the main.py file
    python main.py
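Under the hood, a client like this typically talks to Ollama's local HTTP API. Below is a minimal sketch of the kind of request payload such a program might send to Ollama's /api/chat endpoint; the function name, model name, and message content are illustrative examples, not taken from Pyma's source.

```python
import json

def build_chat_payload(model: str, history: list) -> str:
    # Assemble a chat request body for Ollama's /api/chat endpoint.
    # The field names here follow Ollama's documented chat API.
    return json.dumps({
        "model": model,       # e.g. "llama2" or "mistral"
        "messages": history,  # [{"role": "user", "content": "..."}]
        "stream": False,      # request a single JSON response, not a stream
    })

payload = build_chat_payload("llama2", [{"role": "user", "content": "Hello"}])
```

The payload would then be POSTed to the endpoint configured in constants.py (see Configuration below).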

Commands

  • /exit - Exit the program
  • /search - Search DuckDuckGo and return the top result
  • More commands coming soon!
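A command layer like this usually amounts to dispatching on the input prefix before anything is sent to the model. A hypothetical sketch, where the function name and return tags are illustrative rather than Pyma's actual API:

```python
def handle_input(line: str) -> str:
    # Classify a user input line: either run a command or fall
    # through to a normal chat turn with the model.
    line = line.strip()
    if line == "/exit":
        return "exit"
    if line.startswith("/search "):
        # Everything after the command name is the search query.
        return "search:" + line[len("/search "):].strip()
    return "chat"
```

New commands can then be added as extra branches without touching the chat logic.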

Configuration

The main configuration is in constants.py.

Set API_URL to the endpoint for model APIs. Customize other constants if needed.
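As an illustration, constants.py might look something like the sketch below. Every value is an assumption: Ollama serves its local API on http://localhost:11434 by default, but the model name and history file name are placeholders.

```python
# Hypothetical constants.py sketch; all values below are assumptions.
API_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint
DEFAULT_MODEL = "llama2"                     # illustrative default model
HISTORY_FILE = "chat_history.json"           # illustrative history file name
```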

Code Structure

  • main.py - Main file to run
  • constants.py - Constants used in the program
  • chat.py - Chat class to handle chat logic
  • utils.py - Utility functions

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
