OpenAI web search and reasoning #2783
BlairWoods asked this question in Q&A · Unanswered
Hey Team
I've got an interesting challenge that I can't seem to find an answer for. We're utilising LiteLLM so we can use OpenAI's GPT-5 model for one of our agents. The specific challenge is that parameters accepted by the OpenAI Python SDK don't seem to work with LiteLLM. Here is some code to explain this further:
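(The original snippet did not survive; below is a hedged reconstruction of the failing setup. It assumes the agent forwards `model_parameters` straight into `litellm.completion()`, and the `reasoning`/`include` values follow the OpenAI Responses API; the exact values are illustrative.)

```python
# Hedged reconstruction of the failing model_parameters object. The keys
# mirror the OpenAI Responses API; the values shown are illustrative.
model_parameters = {
    "tools": [{"type": "web_search_preview"}],   # works when passed via LiteLLM
    "reasoning": {"effort": "high"},             # -> Unknown parameter: 'reasoning'
    "include": ["reasoning.encrypted_content"],  # -> Unknown parameter: 'include'
}

# The equivalent call that raises litellm.BadRequestError (not executed here,
# since it needs an API key and network access):
#   import litellm
#   litellm.completion(
#       model="openai/gpt-5",
#       messages=[{"role": "user", "content": "..."}],
#       **model_parameters,
#   )

# The Responses-API-only keys that LiteLLM's chat-completions path rejects:
RESPONSES_ONLY = {"reasoning", "include"}
rejected = sorted(k for k in model_parameters if k in RESPONSES_ONLY)
print(rejected)  # ['include', 'reasoning']
```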
This code will throw an error for the "reasoning" and "include" parameters in the model_parameters object:
("litellm.BadRequestError: OpenAIException - Unknown parameter: 'reasoning'.",)
("litellm.BadRequestError: OpenAIException - Unknown parameter: 'include'.",)
However, if I only include "tools" in the model_parameters object, it works. All of this works correctly if I go directly through the native OpenAI Python SDK. Being able to control the reasoning level is important for this use case. The current workaround is to create an agent with a tool that utilises the native OpenAI Python SDK, but it would be great to be able to do this directly through an agent without dropping down to the SDK.
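One hedged alternative to the tool-based workaround, assuming (as newer LiteLLM releases document) that `litellm.completion()` accepts a flat `reasoning_effort` argument even though it rejects the nested Responses-API `reasoning` object, is a small pre-call shim. `include` has no chat-completions equivalent here, so it is dropped; all names below are illustrative.

```python
# Hypothetical shim: translate Responses-API-style model_parameters into the
# shape LiteLLM's chat-completions path is assumed to accept.
def adapt_for_litellm(model_parameters: dict) -> dict:
    params = dict(model_parameters)  # leave the caller's dict untouched
    reasoning = params.pop("reasoning", None)
    if isinstance(reasoning, dict) and "effort" in reasoning:
        # Assumed mapping: nested {"effort": ...} -> flat reasoning_effort
        params["reasoning_effort"] = reasoning["effort"]
    params.pop("include", None)  # Responses-API-only; no LiteLLM mapping
    return params

adapted = adapt_for_litellm({
    "tools": [{"type": "web_search_preview"}],
    "reasoning": {"effort": "high"},
    "include": ["reasoning.encrypted_content"],
})
print(adapted)  # reasoning_effort replaces the nested object; include is gone
```

The adapted dict could then be splatted into `litellm.completion(model="openai/gpt-5", messages=..., **adapted)`; whether `reasoning_effort` is honoured for GPT-5 depends on the installed LiteLLM version.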
My questions for you all:
Thanks in advance :)