Add OpenRouter chat model #448
Conversation
Looks good to me! Some typechecking errors from mypy in the failing GitHub Action - I think many will be from overriding the OpenaiChatModel methods and will require ignore comments to silence.
Thanks for the feedback and the suggestions, @jackmpcollins. I've updated the PR. Mypy is not expected to complain anymore.
Hey @jackmpcollins, would it be possible to approve the workflow run to validate that I've fixed the linting errors, and to review/merge this PR?
Thanks for the PR!
@piiq Released now as part of https://github.com/jackmpcollins/magentic/releases/tag/v0.40.0
This PR introduces an OpenRouterChatModel to enable support for OpenRouter-specific features that are not accessible through OpenaiChatModel, even when used via the Chat interface.

Motivation: Key OpenRouter capabilities were inaccessible using the base OpenaiChatModel, particularly those critical for tool usage and structured output with models like Qwen and DeepSeek.

Features Implemented
Notes:
- OpenRouterChatModel subclasses OpenaiChatModel.

Additional benefits:
- OpenRouter was not working with the @prompt decorator; now it is.
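As a rough illustration of the Chat interface usage mentioned above, a minimal sketch could look like the following. The OpenRouterChatModel import path, its constructor arguments, and the model id are assumptions, not details taken from this PR.

```python
# A minimal sketch, assuming OpenRouterChatModel lives in
# magentic.chat_model.openrouter_chat_model (path, constructor args,
# and model id are assumptions, not confirmed by this PR).
from magentic import Chat, UserMessage
from magentic.chat_model.openrouter_chat_model import OpenRouterChatModel

chat = Chat(
    messages=[UserMessage("Say hello!")],
    model=OpenRouterChatModel("deepseek/deepseek-chat"),  # illustrative model id
)
chat = chat.submit()  # send the conversation and get the assistant reply
print(chat.last_message.content)
```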
Here are a few screenshots demoing the functionality:
@prompt decorator + Model return type

Usage with RetryChatModel

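For the retry demo above, wrapping the new model in magentic's RetryChatModel resubmits the query with the validation error when structured output fails to parse. A sketch under the same import-path assumption:

```python
# Retrying structured output with RetryChatModel. The OpenRouterChatModel
# import path and model id are assumptions.
from magentic import prompt
from magentic.chat_model.openrouter_chat_model import OpenRouterChatModel
from magentic.chat_model.retry_chat_model import RetryChatModel


@prompt(
    "Pick a number between {low} and {high}.",
    model=RetryChatModel(
        chat_model=OpenRouterChatModel("deepseek/deepseek-chat"),  # illustrative id
        max_retries=3,  # resubmit with the validation error up to 3 times
    ),
)
def pick_number(low: int, high: int) -> int: ...
```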
Streaming response

Reasoning tokens + streaming response
