Tags: OpenBMB/RepoAgent
feat(settings): remove Config Manager and optimize initialization (#83)

- Introduce lazy loading so resources are initialized only when first needed, improving startup performance (sketched after this list).
- Remove `config_manager` and the config file saving functionality.
- Update default model names and remove context window handling.
- Introduce `llama_index.llms.openai` for LLM communication, improving integration and compatibility with updated LLM protocols (sketched after this list).
- Add `ChatPromptTemplate` and `PromptTemplate` for customizable, more consistent prompts.
- Streamline the `ChatEngine` implementation for readability and maintainability.
- Fix an issue where the logger failed to capture the full error stack trace in certain cases.
- Remove the `tiktoken` dependency and retrieve token usage directly from the API response (sketched after this list).
- Add import sorting for readability and consistency.
- Remove deprecated test cases and the deprecated `config.toml.template` file.
- Remove `sync_func` to prevent a TypeError and ensure files are saved.
- Remove the model name restriction.
- Update the README to mention GitHub Actions support.
- Migrate to LlamaIndex: integrate `llama_index.vector_stores.chroma` and `llama_index.embeddings.openai` for vector storage and embeddings, and add semantic chunking for better document segmentation (sketched after this list).
- Add an `initialize_with_params` method for dynamic initialization.
- Change `print_hierarchy` from a subcommand to an option on the `run` command.
- Define `chat-with-repo` as an optional dependency in `pyproject.toml`.
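
A minimal sketch of the lazy-loading item above. It assumes a pydantic-settings based `Settings` model and a `get_settings()` accessor; the class, field names, and defaults are illustrative, not RepoAgent's actual API.

```python
from functools import lru_cache

from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    """Illustrative settings model; field names and defaults are assumptions."""

    openai_api_key: str = ""
    openai_base_url: str = "https://api.openai.com/v1"
    model: str = "gpt-4o-mini"


@lru_cache(maxsize=1)
def get_settings() -> Settings:
    # Constructed only on first call, so importing the package
    # no longer triggers configuration loading or file I/O.
    return Settings()
```

Callers fetch configuration via `get_settings()` at the point of use; `lru_cache` keeps a single instance without eager module-level construction.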
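
A hedged sketch of talking to the model through `llama_index.llms.openai` with a `ChatPromptTemplate`. The prompt text and model name are placeholders, not the templates RepoAgent actually ships in `ChatEngine`.

```python
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.llms.openai import OpenAI

# Placeholder template; the real prompts live in RepoAgent's ChatEngine.
doc_prompt = ChatPromptTemplate(
    message_templates=[
        ChatMessage(role=MessageRole.SYSTEM, content="You write concise docstrings."),
        ChatMessage(role=MessageRole.USER, content="Document this object:\n{code}"),
    ]
)

llm = OpenAI(model="gpt-4o-mini")  # model name is an assumption
messages = doc_prompt.format_messages(code="def add(a, b): return a + b")
response = llm.chat(messages)
print(response.message.content)
```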
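
With `tiktoken` gone, token counts come from the provider's response rather than local tokenization. A minimal sketch with the `openai` client; the exact plumbing inside RepoAgent may differ.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # model name is an assumption
    messages=[{"role": "user", "content": "Summarize this repository."}],
)

# The API reports usage directly, so no local token counting is needed.
usage = response.usage
print(usage.prompt_tokens, usage.completion_tokens, usage.total_tokens)
```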
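
A sketch of the LlamaIndex wiring implied by the vector store, embedding, and semantic chunking items. The collection name, document text, and query are placeholders.

```python
import chromadb
from llama_index.core import Document, StorageContext, VectorStoreIndex
from llama_index.core.node_parser import SemanticSplitterNodeParser
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.vector_stores.chroma import ChromaVectorStore

embed_model = OpenAIEmbedding()

# Semantic chunking: split on embedding-similarity breakpoints rather than fixed sizes.
splitter = SemanticSplitterNodeParser(
    buffer_size=1, breakpoint_percentile_threshold=95, embed_model=embed_model
)

# Chroma-backed vector store; "repo_docs" is a placeholder collection name.
chroma_client = chromadb.EphemeralClient()
collection = chroma_client.get_or_create_collection("repo_docs")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

docs = [Document(text="Example documentation produced by RepoAgent.")]
nodes = splitter.get_nodes_from_documents(docs)
index = VectorStoreIndex(nodes, storage_context=storage_context, embed_model=embed_model)

print(index.as_query_engine().query("What does this repository do?"))
```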