8000 Local model API request fails if prompt ingestion takes more than 10 minutes · Issue #3621 · RooCodeInc/Roo-Code · GitHub

Open

FrancoFun opened this issue May 14, 2025 · 2 comments · May be fixed by #4076
Assignees
Labels
bug: Something isn't working
Issue - In Progress: Someone is actively working on this. Should link to a PR soon.

Comments

@FrancoFun

App Version

3.16.6

API Provider

LM Studio

Model Used

Qwen3-32B

🔁 Steps to Reproduce

I am trying to use a local Qwen3-32B model via llama.cpp. To do so, I use the LM Studio integration, pointed at the local server. Everything works fine, but after 10 minutes (600 seconds) the connection is dropped and I get an "API Request Failed" message. The inference runs on CPU and is quite slow, but I would be happy to let it crunch while I'm doing something else. If I use the tiny Qwen3 0.6B model, inference is fast enough and everything works as expected (although with very mediocre results).

When the request fails, llama.cpp finishes processing the prompt anyway, so a retry succeeds because the prompt is already cached.

💥 Outcome Summary (Optional)

No response

📄 Relevant Logs or Errors

@FrancoFun FrancoFun added the bug Something isn't working label May 14, 2025
@hannesrudolph hannesrudolph moved this from New to Issue [Needs Scoping] in Roo Code Roadmap May 16, 2025
@hannesrudolph hannesrudolph added the Issue - Needs Scoping Valid, but needs effort estimate or design input before work can start. label May 16, 2025
@hannesrudolph hannesrudolph moved this from New to Issue [Needs Scoping] in Roo Code Roadmap May 20, 2025
@FrancoFun
Author
FrancoFun commented May 24, 2025

This may come from the OpenAI module's default 10-minute timeout:
https://github.com/openai/openai-python?tab=readme-ov-file#timeouts

A custom timeout would need to be passed for this provider. Other OpenAI-compatible providers, such as Ollama and OpenAI Compatible, appear to hit the same issue when used with a slower local model and a large prompt.

@github-project-automation github-project-automation bot moved this from Issue [Needs Scoping] to Done in Roo Code Roadmap May 24, 2025
@FrancoFun
Author

Sorry. Closed by accident.

@FrancoFun FrancoFun reopened this May 25, 2025
@github-project-automation github-project-automation bot moved this from Done to New in Roo Code Roadmap May 25, 2025
@hannesrudolph hannesrudolph added Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. and removed Issue - Needs Scoping Valid, but needs effort estimate or design input before work can start. labels May 27, 2025
@hannesrudolph hannesrudolph moved this from Triage to Issue [In Progress] in Roo Code Roadmap May 28, 2025
@hannesrudolph hannesrudolph added Issue - In Progress Someone is actively working on this. Should link to a PR soon. and removed Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. labels May 28, 2025
Projects
Status: Issue [In Progress]
Development

Successfully merging a pull request may close this issue.

2 participants