Tags: OpenBMB/RepoAgent

v0.2.0

chore(deps): update all dependencies to latest versions

v0.1.5

feat(settings): remove Config Manager and optimize initialization (#83)

- Introduce lazy loading to optimize performance by delaying resource initialization until it is needed.
- Remove `config_manager` and the config file saving functionality.
- Update default model names and remove context window handling.
- Introduce `llama_index.llms.openai` for LLM communication to enhance integration and compatibility with updated LLM protocols (a usage sketch follows this list).
- Add `ChatPromptTemplate` to manage prompts more efficiently and improve prompt consistency.
- Streamline the implementation within `ChatEngine` for improved readability and maintainability.
- Fix an issue where the logger failed to capture the full error stack trace in certain cases.
- Remove the `tiktoken` dependency and retrieve token usage directly from the API response.
- Add import sorting for improved readability and consistency.
- Remove deprecated test cases.
- Remove the deprecated `config.toml.template` file.
- Remove `sync_func` to prevent a `TypeError` and ensure files are saved.
- Remove the model name restriction.
- Update README to mention GitHub Actions support.
- Migrate to LlamaIndex for enhanced functionality.
- Integrate `llama_index.vector_stores.chroma` and `llama_index.embeddings.openai` for improved vector storage and embeddings (an indexing sketch follows this list).
- Add semantic chunking for better document segmentation.
- Introduce `ChatPromptTemplate` and `PromptTemplate` for customizable chat prompts.
- Add an `initialize_with_params` method for dynamic initialization.
- Change `print_hierarchy` from a subcommand to an option of the `run` command.
- Define `chat-with-repo` as an optional dependency in `pyproject.toml`.
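
As context for the bullets on `llama_index.llms.openai`, `ChatPromptTemplate`, and token usage, here is a minimal sketch of that pattern. It is an illustration only: the model name, prompt text, and variable names are assumptions, not RepoAgent's actual `ChatEngine` code.

```python
# Hypothetical sketch: prompt a model through llama_index.llms.openai using a
# ChatPromptTemplate, then read token usage from the raw API response (no tiktoken).
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-4o-mini")  # expects OPENAI_API_KEY in the environment

template = ChatPromptTemplate(
    message_templates=[
        ChatMessage(role=MessageRole.SYSTEM, content="You document Python code."),
        ChatMessage(role=MessageRole.USER, content="Summarize this function:\n{code}"),
    ]
)

messages = template.format_messages(code="def add(a, b):\n    return a + b")
response = llm.chat(messages)

print(response.message.content)
# The raw provider response carries usage data, so no local tokenizer is needed;
# the exact attribute layout depends on the installed openai/llama-index versions.
print(getattr(response.raw, "usage", None))
```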
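
A second sketch, again hypothetical rather than RepoAgent's implementation, combines `llama_index.vector_stores.chroma`, `llama_index.embeddings.openai`, and semantic chunking to index generated documentation. The collection name, document text, and model choices are illustrative assumptions.

```python
# Hypothetical sketch: index documents in Chroma with OpenAI embeddings and
# semantic chunking. Requires OPENAI_API_KEY for embeddings and queries.
import chromadb
from llama_index.core import Document, StorageContext, VectorStoreIndex
from llama_index.core.node_parser import SemanticSplitterNodeParser
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.vector_stores.chroma import ChromaVectorStore

embed_model = OpenAIEmbedding(model="text-embedding-3-small")

# Semantic chunking: split where embedding similarity between adjacent sentences
# drops, instead of cutting at a fixed token count.
splitter = SemanticSplitterNodeParser(embed_model=embed_model)

collection = chromadb.EphemeralClient().create_collection("repo_docs")
storage_context = StorageContext.from_defaults(
    vector_store=ChromaVectorStore(chroma_collection=collection)
)

docs = [Document(text="RepoAgent generates documentation for every module ...")]
index = VectorStoreIndex.from_documents(
    docs,
    storage_context=storage_context,
    embed_model=embed_model,
    transformations=[splitter],
)

print(index.as_query_engine().query("What does this project document?"))
```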

v0.1.4

fix(settings): resolve `PydanticUserError` in `repo_agent/settings.py` (#80)

- Update dependencies to latest versions.
- Switch to `llama-index-llms-openai`.
- Specify the minimum Python version as 3.11.
- Export dependencies to `requirements.txt`.
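
The commit subject above mentions a `PydanticUserError` raised from `repo_agent/settings.py`. As a rough, hypothetical illustration of a Pydantic v2 settings module (field names, defaults, and the environment prefix are assumptions, not taken from the project's actual file):

```python
# Hypothetical sketch of a pydantic-settings (Pydantic v2) model; names and
# defaults are illustrative only.
from pydantic_settings import BaseSettings, SettingsConfigDict


class ChatSettings(BaseSettings):
    model_config = SettingsConfigDict(env_prefix="REPOAGENT_")

    # Pydantic v2 requires a type annotation on every declared field.
    openai_api_key: str = ""
    openai_base_url: str = "https://api.openai.com/v1"
    model: str = "gpt-4o-mini"
    temperature: float = 0.2


settings = ChatSettings()  # values can be overridden via REPOAGENT_* env vars
print(settings.model)
```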

v0.1.3

docs: update image URLs in `README` and `README_CN`

- Bump version to v0.1.3.
- Add `release.yml` to upload releases to PyPI.