Use context7 with open-source LLM and third-party MCP clients #165
Whether or not the MCP server is used is entirely up to the LLM. You can try being more explicit in your prompt or use rules files to tell the LLM to use context7.
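For example, with clients that support project rules files (Cursor's .cursor/rules, Windsurf's .windsurfrules, or a plain system prompt for a CLI client like mcphost), a rule along these lines usually helps; the exact file name and location depend on the client, so treat this as a sketch:

    Always use the context7 MCP server when answering questions about
    library setup, APIs, or code generation. First call resolve-library-id
    to identify the library, then call get-library-docs to fetch current
    documentation before writing any code.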
Dear @enesakar and authors/contributors,
Thank you so much for this repo! And sorry for opening an issue.
I'm new to MCP and I'm having fun playing with context7. However, while it works smoothly with Claude Desktop, when I try it with other MCP clients things don't go as well.
For example, I was trying it with mcphost (https://github.com/mark3labs/mcphost) and Ollama running qwen2.5 (which is said to support MCP), even with the example prompt from this repo:
Create a basic Next.js project with app router. use context7
it sometimes doesn't realize it needs to use the MCP server, and sometimes it does call the server but still returns something irrelevant. Do you know whether this is a problem with the LLM or more of a problem with the MCP client? My goal is to use context7 with an open-source local LLM; are you aware of a good model and a good MCP client to use with context7?
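For reference, my setup is roughly the following (simplified from memory, so the config path and flags may not be exact; please check the mcphost README):

    ~/.mcp.json (the config file mcphost reads, if I remember correctly):

    {
      "mcpServers": {
        "context7": {
          "command": "npx",
          "args": ["-y", "@upstash/context7-mcp@latest"]
        }
      }
    }

    then running something like:

    mcphost -m ollama:qwen2.5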
Thank you so much for your time and help!