Help with tool invocation using McpClient #6834
-
Using OpenAI, McpClient, and MEAI to configure tool calling against an MCP server:

```csharp
using Microsoft.Extensions.AI;
using ModelContextProtocol.Client;
using OpenAI;
using OpenAI.Chat;

using ChatMessage = Microsoft.Extensions.AI.ChatMessage;

IChatClient client = new ChatClientBuilder(
        new OpenAIClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY")!)
            .GetChatClient("gpt-4o")
            .AsIChatClient())
    .UseFunctionInvocation()
    .Build();

var mcpClient = await McpClientFactory.CreateAsync(
    new SseClientTransport(new()
    {
        Endpoint = new Uri("https://learn.microsoft.com/api/mcp"),
        TransportMode = HttpTransportMode.AutoDetect,
        Name = "msdocs"
    }));

var tools = await mcpClient.ListToolsAsync();
Console.WriteLine("Tools Found: " + string.Join(", ", tools.Select(t => t.Name)));

var chatOptions = new ChatOptions
{
    AllowMultipleToolCalls = true,
    Instructions = "Use the following tools to answer the user's question. If you don't know the answer, use the 'Search Microsoft Docs' tool to find relevant information.",
    Tools = [.. tools]
};

ChatResponse response = await client.GetResponseAsync(
    [new ChatMessage(ChatRole.User, "What is the Microsoft Docs platform?")],
    chatOptions);

Console.WriteLine(response.Text);
Console.WriteLine("\n Tools uage: "+ ((ChatCompletion)response.RawRepresentation!).ToolCalls.Count); I can see there 2 tools found in the MCP server, however after getting the response the ToolCalls.Count is retuned as wondering if this is the best way to integrate the McpClient with MEAI. |
-
The `UseFunctionInvocation` call employs middleware that handles any tool call requests, invoking those tools and sending another request back to the server, all hidden behind your single `GetResponseAsync`. The final `ChatResponse` you get back is for the nth request/response, after all function calls have been processed. Thus, the OpenAI `ChatCompletion` you're getting back in that final response is for a response in which no tool calls happened.

If you want to see what's going on, try changing the pipeline so that the traffic between the function-invocation client and the underlying OpenAI client is surfaced, e.g. by adding logging between them.
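One way to do that, as a minimal sketch, is to slot the Microsoft.Extensions.AI logging middleware between the function-invocation client and the OpenAI client; the console `ILoggerFactory` wiring here is illustrative and not necessarily the exact change suggested in the reply:

```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Logging;
using OpenAI;

// Illustrative console logger; Trace level is what surfaces the full message payloads.
using ILoggerFactory loggerFactory = LoggerFactory.Create(b => b
    .AddConsole()
    .SetMinimumLevel(LogLevel.Trace));

IChatClient client = new ChatClientBuilder(
        new OpenAIClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY")!)
            .GetChatClient("gpt-4o")
            .AsIChatClient())
    .UseFunctionInvocation()
    .UseLogging(loggerFactory) // registered after UseFunctionInvocation, so it sits between the
                               // function-invocation client and the OpenAI client and logs each
                               // intermediate request/response
    .Build();
```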
You'll end up seeing all the requests/responses flowing from the function-invocation `IChatClient` through to the OpenAI one. When I do that and run your example, it highlights that the model isn't making any tool call requests for that prompt. However, if I change the prompt to a question that actually requires looking something up in the docs, the model does issue tool call requests, and the tools get invoked.
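For example (a hypothetical prompt, not one taken from the reply), a question that can only be answered from the docs is much more likely to produce tool call requests:

```csharp
// Hypothetical prompt; anything that genuinely needs a docs lookup is more likely
// to make the model request the MCP tools exposed via ChatOptions.Tools.
ChatResponse response = await client.GetResponseAsync(
    [new ChatMessage(ChatRole.User,
        "Using Microsoft Learn, how do I enable response caching in ASP.NET Core minimal APIs?")],
    chatOptions);
```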
In short, your usage is fine; your attempt to view the internal workings via `RawRepresentation` is just giving you back a skewed answer.
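If the goal is simply to count how many tool calls actually happened, a sketch that stays within the MEAI abstractions, assuming the function-invocation middleware includes the intermediate messages in the returned `ChatResponse.Messages`, could look like this:

```csharp
using System.Linq;
using Microsoft.Extensions.AI;

// Count the function calls the model requested across the whole exchange.
// This assumes the function-invocation middleware surfaces the intermediate
// assistant messages (with FunctionCallContent) in response.Messages.
int toolCallCount = response.Messages
    .SelectMany(m => m.Contents)
    .OfType<FunctionCallContent>()
    .Count();

Console.WriteLine($"Tool calls made: {toolCallCount}");
```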