
Conversation

mochow13 commented:

This PR adds support for LiteLLM in Pydantic AI. The idea is to use a `LiteLLMProvider` for `OpenAIModel`, since the LiteLLM API is OpenAI-compatible.

Closes #1496
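
The approach, roughly: a LiteLLM proxy speaks the OpenAI wire protocol, so a provider only needs to point the stock OpenAI client at it. A minimal sketch of the idea, using the parameter names visible in the review diff below; the class shape is illustrative, not the merged implementation:

```python
from openai import AsyncOpenAI


class LiteLLMProvider:
    """Routes OpenAIModel traffic through LiteLLM (sketch only)."""

    def __init__(
        self,
        *,
        api_key: str | None = None,
        api_base: str | None = None,
        custom_llm_provider: str | None = None,
    ):
        # LiteLLM exposes an OpenAI-compatible endpoint, so the regular
        # AsyncOpenAI client works unchanged once base_url points at it.
        self.custom_llm_provider = custom_llm_provider
        self.client = AsyncOpenAI(api_key=api_key, base_url=api_base)
```

Usage would then look like any other provider, assuming pydantic-ai's usual `OpenAIModel(..., provider=...)` pattern (the proxy URL below is the LiteLLM proxy's default port, used here for illustration):

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

model = OpenAIModel(
    'gpt-4o',
    provider=LiteLLMProvider(api_base='http://localhost:4000'),  # local LiteLLM proxy
)
agent = Agent(model)
```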

Review thread on the provider signature:

```python
*,
api_key: str | None = None,
api_base: str | None = None,
custom_llm_provider: str | None = None,
```
A collaborator commented:

What do you think about calling this `provider_name`, like in `HuggingFaceProvider`?

mochow13 (Author) replied:

`custom_llm_provider` seems more aligned with LiteLLM's support for setting one's own provider.
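
For context, LiteLLM's own SDK takes the same knob, which motivates matching its name. An illustrative call (the model name and endpoint URL are made up):

```python
import litellm

# custom_llm_provider tells LiteLLM which provider protocol to speak when
# the model name alone is ambiguous, e.g. a self-hosted OpenAI-compatible server.
response = litellm.completion(
    model='my-self-hosted-model',
    custom_llm_provider='openai',
    api_base='http://localhost:8000/v1',
    messages=[{'role': 'user', 'content': 'Hello'}],
)
print(response.choices[0].message.content)
```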

@mochow13 force-pushed the mottakin/litellm-integration branch from 4e48bb7 to 785ffbe on August 27, 2025 at 22:19.
mochow13 commented:

@DouweM The pipeline is failing due to:

```
Name                               Stmts   Miss Branch BrPart   Cover   Missing
-------------------------------------------------------------------------------
tests/models/test_model_names.py      68      6      8      0  92.11%   78-98, 140-141
-------------------------------------------------------------------------------
TOTAL                              28027      6   4472      0  99.98%

236 files skipped due to complete coverage.
Coverage failure: total of 99.98 is less than fail-under=100.00
make: *** [test] Error 2
```

I am not sure what to do. There is no code related to OpenRouter or Together in test_model_names.py, so I'm unsure whether we should add any tests for LiteLLM there or not.
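
If LiteLLM model names do need coverage there, a test might look something like the sketch below. The `litellm:` prefix and the `infer_model` helper are assumptions about this codebase, not taken from the PR:

```python
from pydantic_ai.models import infer_model


def test_litellm_model_name():
    # Hypothetical: check that a 'litellm:'-prefixed name resolves to a
    # model backed by the new provider and keeps the bare model name.
    model = infer_model('litellm:gpt-4o')
    assert model.model_name == 'gpt-4o'
```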

Merging this pull request may close: "Adding LiteLLM as model wrap just like how google-adk does it" (#1496)