Table of Contents
- Overview
- Demos
- Features
- Usage
- Roadmap
- Changelog
- Contributing
- License
- Acknowledgments
About
README-AI is a command-line tool that generates robust README.md files for your software and data projects. Simply provide a remote repository URL or a path to your codebase, and the tool auto-generates documentation for your entire project, leveraging the capabilities of OpenAI's GPT language model APIs.
Motivations
readme-ai simplifies the process of writing and maintaining high-quality project documentation, enhancing developer productivity and workflow. Its ultimate goal is to improve the adoption and usability of open-source software, enabling developers of all skill levels to better understand complex codebases and easily use open-source tools.
This project is currently under development and has an opinionated configuration. While readme-ai provides an excellent starting point for documentation, it's important to review all text generated by the OpenAI API to ensure it accurately represents your codebase.
Command-Line Interface
Run readme-ai in your terminal via PyPI, Docker, and more!
cli-demo.mov
Badges
- A slogan to highlight your project is generated by prompting OpenAI's GPT engine.
- Codebase dependencies and metadata are visualized using Shields.io badges.
- Use the CLI option --badges to select the style of badges for your README. Three styles are currently supported:
  - default: no command needed, as it's the default style for readme-ai
  - shields: --badges shields
  - square: --badges square
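As a quick sketch of how this looks in practice (the repository URL and output filename below are only examples), the badge style is chosen at generation time:

```sh
# Default badge style: no --badges flag needed.
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai

# Shields.io badge style.
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai --badges shields

# Square badge style.
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai --badges square
```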
Features Table
An overview paragraph and features table are generated using detailed prompts, embedded with project metadata.
Dynamic Usage Guides
Generates instructions for installing, running, and testing your project. Instructions are created by identifying the codebase's top language and referring to our language_setup.toml configuration file.
Templates (coming soon)
- Developing a CLI option that lets users select from a variety of README styles.
- Templates for use cases such as data, machine learning, web development, and more!
Example README Files
| | Output File | Repository | Languages |
|---|---|---|---|
| 1 | readme-python.md | readme-ai | Python |
| 2 | readme-typescript.md | chatgpt-app-react-typescript | TypeScript, React |
| 3 | readme-javascript.md | assistant-chat-gpt-javascript | JavaScript, React |
| 4 | readme-kotlin.md | file.io-android-client | Kotlin, Java, Android |
| 5 | readme-rust-c.md | rust-c-app | C, Rust |
| 6 | readme-go.md | go-docker-app | Go |
| 7 | readme-java.md | java-minimal-todo | Java |
| 8 | readme-fastapi-redis.md | async-ml-inference | Python, FastAPI, Redis |
| 9 | readme-mlops.md | mlops-course | Python, Jupyter |
| 10 | readme-pyflink.md | flink-flow | PyFlink |
Dependencies
Please ensure you have the following dependencies installed on your system:
- Python version 3.9 or higher
- A package manager (e.g., pip, conda, poetry) or Docker
- A paid OpenAI API account and API key
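If you are unsure whether these prerequisites are in place, a quick check from your terminal might look like the following (pip is shown as the package manager; substitute conda, poetry, or Docker as appropriate):

```sh
# Confirm the Python interpreter meets the 3.9+ requirement.
python3 --version

# Confirm a package manager is available (pip shown here).
pip --version

# Alternatively, confirm Docker is installed for the containerized route.
docker --version
```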
Repository
A remote repository URL or local directory path to your project is needed to use readme-ai. The following platforms are currently supported:
- GitHub
- GitLab
- Bitbucket
- File System
OpenAI API
An OpenAI API account and API key are needed to use readme-ai. The steps below outline this process:
OpenAI API - Setup Instructions
- Go to the OpenAI website.
- Click the "Sign up for free" button.
- Fill out the registration form with your information and agree to the terms of service.
- Once logged in, click on the "API" tab.
- Follow the instructions to create a new API key.
- Copy the API key and keep it in a secure place.
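Once you have a key, readme-ai can pick it up from the OPENAI_API_KEY environment variable or accept it via the --api-key flag; both approaches appear in the usage examples further below. For example:

```sh
# Make the key available to readme-ai for the current shell session.
export OPENAI_API_KEY="YOUR_API_KEY"

# Alternatively, pass the key explicitly on each run.
readmeai --api-key "YOUR_API_KEY" -o readme-ai.md -r https://github.com/eli64s/readme-ai
```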
OpenAI API - Cautionary Guidelines
- Review Sensitive Information: Before running the application, ensure that all content in your repository is free of sensitive information. Please note that readme-ai does not filter out sensitive data from the README file, and it does not modify any files in your repository.
- API Usage Costs: The OpenAI API is not free, and you will be charged for each request made. Costs can accumulate rapidly, so it's essential to be aware of your usage. You can monitor your API usage and associated costs by visiting the OpenAI API Usage Dashboard.
- Paid Account Recommended: Setting up a paid account with OpenAI is highly recommended to avoid potential issues. Without a payment method on file, your API usage will be restricted to base GPT-3 models. This limitation can result in less accurate README file generation and may lead to API errors due to request limits.
- Runtime Considerations: README file generation typically takes less than a minute. If the process exceeds a few minutes (e.g., 3 minutes), it's advisable to terminate readme-ai to prevent extended processing times.
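If you want to exercise the tool without making any OpenAI API calls, and therefore without incurring any usage costs, the --offline-mode flag documented in the Command-Line Arguments section can be used. A minimal sketch, assuming the flag acts as a simple switch (check the CLI help output for the exact form):

```sh
# Generate a README without calling the OpenAI API (no usage costs).
# Assumes --offline-mode is a boolean switch.
readmeai --offline-mode -o readme-ai.md -r https://github.com/eli64s/readme-ai
```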
Using Pip
Pip is the recommended installation method for most users.
pip install readmeai
Using Docker
Docker is recommended for users wanting to run the application in a containerized environment.
docker pull zeroxeli/readme-ai:latest
Manually
1οΈβ£ Clone the readme-ai repository.
git clone https://github.com/eli64s/readme-ai
2οΈβ£ Navigate to readme-ai directory.
cd readme-ai
3οΈβ£ Install dependencies using a method below.
Using Bash
bash setup/setup.sh
Using Conda
conda create -n readmeai python=3.9 -y && \
conda activate readmeai && \
pip install -r requirements.txt
Using Poetry
poetry shell && \
poetry install
Command-Line Arguments
To generate a README.md file, use the readmeai command in your terminal, along with the arguments below.

| Short Flag | Long Flag | Description | Status |
|---|---|---|---|
| -k | --api-key | Your language model API secret key. | Optional |
| -b | --badges | Select 'shields' or 'square' to change badge style. | Optional |
| -f | --offline-mode | Run offline without calling the OpenAI API. | Optional |
| -m | --model | Large language model engine (gpt-3.5-turbo). | Optional |
| -o | --output | The output path for your README.md file. | Optional |
| -r | --repository | The URL or path to your code repository. | Required |
| -t | --temperature | The temperature (randomness) of the model. | Optional |
| -l | --language | The language in which to write the README. | Coming Soon! |
| -s | --style | The README template style to build. | Coming Soon! |
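For example, to set the model engine and temperature explicitly, the options can be combined in a single command. The flag names come from the table above; the temperature value here is purely illustrative:

```sh
# Generate a README with an explicit model engine and temperature.
readmeai \
  --repository https://github.com/eli64s/readme-ai \
  --output readme-ai.md \
  --model gpt-3.5-turbo \
  --temperature 0.8
```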
Custom Settings
To customize the README file generation process, you can modify the project's configuration file:
- api - OpenAI language model API configuration settings.
- git - Default git repository settings used if no repository is provided.
- paths - Directory paths and files used by the readme-ai application.
- prompts - Large language model prompts used to generate the README file.
- md - Dynamic Markdown section code templates used to build the README file.
Using Pip
# Option 1: Run readmeai command with all required command-line arguments.
readmeai --api-key "YOUR_API_KEY" --output readme-ai.md --repository https://github.com/eli64s/readme-ai
# Option 2: Run readmeai command with OpenAI API key set as environment variable.
export OPENAI_API_KEY="YOUR_API_KEY"
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai -b shields
Using Docker
# Option 1: Run Docker container with all required command-line arguments.
docker run -it \
-e OPENAI_API_KEY="YOUR_API_KEY" \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai
# Option 2: Run Docker container with OpenAI API key set as environment variable.
export OPENAI_API_KEY="YOUR_API_KEY"
docker run -it \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai
Using Conda
conda activate readmeai
export OPENAI_API_KEY="YOUR_API_KEY"
python3 -m readmeai.cli.commands -o readme-ai.md -r https://github.com/eli64s/readme-ai
Using Poetry
poetry shell
export OPENAI_API_KEY="YOUR_API_KEY"
poetry run python3 -m readmeai.cli.commands -o readme-ai.md -r https://github.com/eli64s/readme-ai
Using Streamlit
Use the app directly in your browser via Streamlit Community Cloud.
Execute the test suite using the command below.
bash scripts/test.sh
Roadmap
- Publish project as a Python library via PyPI for easy installation.
- Make project available as a Docker image on Docker Hub.
- Integrate and deploy the app with Streamlit to make the tool more widely accessible.
- Refactor our large language model engine to enable more robust README generation.
- Explore LangChain as an alternative to using the OpenAI API directly.
- Explore the LlamaIndex framework and the retrieval-augmented generation (RAG) paradigm.
- Add support for generating README files in any language (e.g., CN, ES, FR, JA, KO, RU).
- Design README output templates for a variety of use cases (e.g., data, web-dev, minimal).
- Develop a GitHub Actions script to automatically update the README file when new code is pushed.
Looking to contribute to readme-ai? Here is what you can do to help:
- Look for opportunities to make the code more efficient and readable.
- Improve exception handling and fix bugs that occur during README generation.
- Improve documentation and add more examples to the README.
- Add support for generating README files in any language (e.g., CN, ES, FR, JA, KO, RU).
- Create new templates for different use cases (e.g., data, web-dev, minimal).
- The README is constructed in sections, defined in the [config.toml] file.
- Follow the existing format to get started.
Badge Icons