LiteLLM model list not updating #3613
Roo Code is designed to automatically fetch and list the available models from your LiteLLM server, eliminating the need for manual configuration [1][2]. However, if the model list is not updating as expected, it might be due to a caching mechanism or a bug. Unfortunately, the documentation does not specify any caching or error-handling behavior related to LiteLLM model list updates [1]. To address this, check the Roo Code issue tracker for any reported bugs related to LiteLLM model list updates. If the problem persists, consider submitting a bug report to help the maintainers investigate further.
In my case there is just no model listed. With the OpenAI Compatible provider, LiteLLM works well.
In my case, the OpenAI Compatible provider doesn't fetch models at all. It used to work.
@slytechnical is working on this |
ohhh ok cool thanks @mrubens |
App Version
3.16.6
API Provider
Not Applicable / Other
Model Used
N/A
🔁 Steps to Reproduce
Using LiteLLM as a local LLM proxy, Roo does not query LiteLLM for an updated model list on VS Code start or when adding a new LiteLLM profile in Roo. Roo appears to query LiteLLM for a model list once and persist that information permanently.
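As a sanity check when reproducing this, it can help to confirm what the proxy itself is serving, independent of Roo's cache. A minimal sketch, assuming a LiteLLM proxy reachable at a local address (the URL and key below are placeholders, not values from this issue) and its OpenAI-compatible `/v1/models` endpoint:

```python
import json
import urllib.request


def extract_model_ids(payload: dict) -> list[str]:
    """Pull model IDs out of an OpenAI-style /v1/models response.

    Expected shape: {"object": "list", "data": [{"id": "..."}, ...]}
    """
    return [m["id"] for m in payload.get("data", [])]


def fetch_model_ids(base_url: str, api_key: str = "") -> list[str]:
    """Query a LiteLLM proxy's /v1/models endpoint directly."""
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/models",
        headers={"Authorization": f"Bearer {api_key}"} if api_key else {},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_model_ids(json.load(resp))


# Example usage (adjust to your proxy address and key):
# print(fetch_model_ids("http://localhost:4000", api_key="sk-1234"))
```

If this returns the newly added models while Roo still shows the old list, the stale data is on Roo's side rather than the proxy's.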
💥 Outcome Summary (Optional)
No response
📄 Relevant Logs or Errors