Add LiteLLM provider for OpenAI API compatible models #2606
base: main
Conversation
*,
api_key: str | None = None,
api_base: str | None = None,
custom_llm_provider: str | None = None,
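A minimal sketch of what a provider constructor with the signature shown in this diff might look like. This is an illustration only, not the PR's actual code: the class body, defaults, and stored attributes are assumptions.

```python
from __future__ import annotations


class LiteLLMProvider:
    """Hypothetical sketch of the provider __init__ under review."""

    def __init__(
        self,
        *,
        api_key: str | None = None,
        api_base: str | None = None,
        custom_llm_provider: str | None = None,
    ) -> None:
        # Store the settings; a real provider would build an
        # OpenAI-compatible client pointed at the LiteLLM endpoint here.
        self.api_key = api_key
        self.api_base = api_base
        self.custom_llm_provider = custom_llm_provider
```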
What do you think about calling this provider_name, like in HuggingFaceProvider?
custom_llm_provider seems more aligned with LiteLLM's support for setting one's own provider.
Force-pushed from 4e48bb7 to 785ffbe
@DouweM The pipeline is failing due to:
I am not sure what to do.
This MR adds support for LiteLLM in Pydantic AI. The idea is to use a LiteLLMProvider with OpenAIModel, since the LiteLLM API is OpenAI API compatible. Closes #1496
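A rough sketch of why reusing OpenAIModel is enough here: a LiteLLM proxy serves the same `/v1/chat/completions` request shape as the OpenAI API, so a provider only needs to point the client at a different base URL. The base URL, model name, and helper below are placeholders for illustration, not values from the PR.

```python
import json


def chat_completion_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for an OpenAI-style chat completion.

    A LiteLLM proxy (assumed at localhost:4000 here) accepts exactly this
    payload shape and routes it to the underlying provider.
    """
    base_url = "http://localhost:4000/v1"  # assumed local LiteLLM proxy
    body = {
        "model": model,  # LiteLLM can route names like "anthropic/claude-3-haiku"
        "messages": [{"role": "user", "content": prompt}],
    }
    return f"{base_url}/chat/completions", json.dumps(body).encode()


url, payload = chat_completion_request("gpt-4o", "Hello")
```

Because the wire format is identical, the provider can hand this endpoint to the existing OpenAI client rather than reimplementing a model class.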