Microsoft Agent Framework consists of two core packages: Microsoft.Extensions.AI (abstractions) and Microsoft.Agents.AI (workflow orchestration). Together, they provide production-ready AI agent infrastructure for .NET.
This guide teaches you to use these frameworks for PMCR-O agents, including tool calling, structured output, and error recovery.
Package Overview
Microsoft.Extensions.AI
Provides abstractions for AI clients:
- IChatClient: Unified interface for LLM interactions
- ChatClientBuilder: Fluent API for configuring clients
- ChatOptions: Request configuration (temperature, response format, etc.)
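These three types compose as follows. A minimal sketch (baseClient stands for any IChatClient implementation, such as the Ollama client configured in later steps):

```csharp
// ChatClientBuilder decorates a base IChatClient with middleware;
// ChatOptions configures each individual request
IChatClient client = new ChatClientBuilder(baseClient)
    .UseFunctionInvocation()
    .Build();

var reply = await client.GetResponseAsync(
    new List<ChatMessage> { new(ChatRole.User, "Hello") },
    new ChatOptions { Temperature = 0.2f });

Console.WriteLine(reply.Text);
```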
Microsoft.Agents.AI
Provides workflow orchestration:
- Agent workflows: Multi-step agent execution
- Tool invocation: Function calling middleware
- Error recovery: Retry and fallback patterns
Step 1: Install Packages
cd YourAgentService
dotnet add package Microsoft.Agents.AI
dotnet add package Microsoft.Extensions.AI
dotnet add package Microsoft.Extensions.Http.Resilience
dotnet add package OllamaSharp
Microsoft.Extensions.Http.Resilience supplies the AddStandardResilienceHandler extension used in Step 2.
Step 2: Register IChatClient
The IChatClient interface abstracts LLM interactions. Here's how to register it with Ollama:
using Microsoft.Extensions.AI;
using OllamaSharp;

var builder = WebApplication.CreateBuilder(args);

// Get Ollama connection string (injected by Aspire)
var ollamaUri = builder.Configuration.GetConnectionString("ollama")
    ?? "http://localhost:11434";
var modelId = "qwen2.5-coder:7b";

// Configure HttpClient for Ollama
builder.Services.AddHttpClient("ollama", client =>
{
    client.BaseAddress = new Uri(ollamaUri);
    client.Timeout = Timeout.InfiniteTimeSpan; // LLM inference can take time
})
.AddStandardResilienceHandler(options =>
{
    options.AttemptTimeout.Timeout = TimeSpan.FromMinutes(3);
    options.TotalRequestTimeout.Timeout = TimeSpan.FromMinutes(5);
    options.Retry.MaxRetryAttempts = 2;
});

// Register IChatClient with function invocation middleware
builder.Services.AddSingleton<IChatClient>(sp =>
{
    var httpClient = sp.GetRequiredService<IHttpClientFactory>().CreateClient("ollama");
    var baseClient = new OllamaApiClient(httpClient, modelId);
    return new ChatClientBuilder(baseClient)
        .UseFunctionInvocation() // ✅ Critical: Enables tool calling
        .Build();
});
Step 3: Use IChatClient in Your Agent
Inject IChatClient into your agent service:
using Grpc.Core;
using Microsoft.Extensions.AI;

public class PlannerAgent : AgentService.AgentServiceBase
{
    private readonly IChatClient _chatClient;
    private readonly ILogger<PlannerAgent> _logger;

    public PlannerAgent(IChatClient chatClient, ILogger<PlannerAgent> logger)
    {
        _chatClient = chatClient;
        _logger = logger;
    }

    public override async Task<AgentResponse> ExecuteTask(
        AgentRequest request,
        ServerCallContext context)
    {
        _logger.LogInformation("🧭 I AM the Planner. I am analyzing: {Intent}", request.Intent);

        // Use IChatClient to interact with the LLM
        var response = await _chatClient.GetResponseAsync(
            new List<ChatMessage>
            {
                new(ChatRole.System, "I AM the Planner. I create minimal viable plans."),
                new(ChatRole.User, request.Intent)
            },
            new ChatOptions
            {
                Temperature = 0.7f,
                ResponseFormat = ChatResponseFormat.Json // Structured output
            });

        return new AgentResponse
        {
            Content = response.Text,
            Success = true
        };
    }
}
Step 4: Enable Tool Calling
The .UseFunctionInvocation() middleware enables tool calling. Define tools with AIFunctionFactory.Create and pass them to each request via ChatOptions.Tools:
// Define a tool from a plain delegate; the JSON schema for its
// parameters is inferred from the method signature
AIFunction fileReadTool = AIFunctionFactory.Create(
    async (string path) => await File.ReadAllTextAsync(path),
    name: "read_file",
    description: "Reads the contents of a file");

// Supply the tool on the request; UseFunctionInvocation() intercepts
// the model's tool calls, runs the delegate, and feeds the result back
var response = await chatClient.GetResponseAsync(
    new List<ChatMessage> { new(ChatRole.User, "Read README.md and summarize it.") },
    new ChatOptions { Tools = [fileReadTool] });
Step 5: Structured Output with JSON Schema
Microsoft.Extensions.AI supports structured output: ChatResponseFormat.Json requests generic JSON, while ChatResponseFormat.ForJsonSchema constrains the response to a specific schema:
// Build a JSON schema describing the expected plan shape
var schema = JsonSerializer.SerializeToElement(new
{
    type = "object",
    properties = new
    {
        plan = new { type = "string" },
        steps = new
        {
            type = "array",
            items = new
            {
                type = "object",
                properties = new
                {
                    action = new { type = "string" },
                    rationale = new { type = "string" }
                }
            }
        },
        estimated_complexity = new { type = "string", @enum = new[] { "low", "medium", "high" } }
    },
    required = new[] { "plan", "steps" }
});

var chatOptions = new ChatOptions
{
    ResponseFormat = ChatResponseFormat.ForJsonSchema(
        schema,
        schemaName: "plan_response",
        schemaDescription: "A minimal viable plan with ordered steps")
};

var response = await _chatClient.GetResponseAsync(history, chatOptions);
var plan = JsonSerializer.Deserialize<PlanResponse>(response.Text);
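The PlanResponse type is not defined above; a minimal sketch matching the schema (type and property names are assumptions for illustration, and System.Text.Json.Serialization must be imported) could look like:

```csharp
// Hypothetical DTOs matching the schema above; C# property names are
// mapped to the snake_case JSON fields via JsonPropertyName
public sealed record PlanResponse(
    [property: JsonPropertyName("plan")] string Plan,
    [property: JsonPropertyName("steps")] List<PlanStep> Steps,
    [property: JsonPropertyName("estimated_complexity")] string? EstimatedComplexity);

public sealed record PlanStep(
    [property: JsonPropertyName("action")] string Action,
    [property: JsonPropertyName("rationale")] string Rationale);
```

Only plan and steps are marked required in the schema, so estimated_complexity is nullable here.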
Step 6: Workflow Orchestration (Microsoft.Agents.AI)
For multi-agent workflows, use Microsoft.Agents.AI. The sketch below illustrates the sequential-steps-with-fallback pattern; the exact builder type and method names have shifted across preview releases, so treat it as pseudocode and check the current package documentation for the shipping API:
using Microsoft.Agents.AI;

// Create workflow builder (illustrative API)
var workflow = new AgentWorkflowBuilder()
    .AddStep("planner", async (context) =>
    {
        // Planner agent logic
        return await plannerAgent.ExecuteAsync(context);
    })
    .AddStep("maker", async (context) =>
    {
        // Maker agent logic (runs after planner)
        return await makerAgent.ExecuteAsync(context);
    })
    .AddStep("checker", async (context) =>
    {
        // Checker agent logic (runs after maker)
        return await checkerAgent.ValidateAsync(context);
    })
    .WithErrorRecovery((error, context) =>
    {
        // On error, route to the reflector
        return "reflector";
    })
    .Build();

// Execute workflow
var result = await workflow.ExecuteAsync(seedIntent);
Key Benefits
- Runtime Abstraction: Switch between Ollama, OpenAI, Azure OpenAI without changing agent code
- Tool Calling: Native function invocation middleware
- Structured Output: JSON schema validation built-in
- Error Recovery: Workflow-level retry and fallback
- Production-Ready: Resilience handlers, timeouts, circuit breakers
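For example, the runtime-abstraction benefit means the agent from Step 3 is untouched when you swap providers; only the registration changes. A sketch, assuming the Microsoft.Extensions.AI.OpenAI, Azure.AI.OpenAI, and Azure.Identity packages and variables (httpClient, endpoint) from earlier steps:

```csharp
// Ollama (local): the base client from Step 2
builder.Services.AddSingleton<IChatClient>(sp =>
    new ChatClientBuilder(new OllamaApiClient(httpClient, "qwen2.5-coder:7b"))
        .UseFunctionInvocation()
        .Build());

// Azure OpenAI: same IChatClient registration, different base client
builder.Services.AddSingleton<IChatClient>(sp =>
    new ChatClientBuilder(
            new AzureOpenAIClient(new Uri(endpoint), new DefaultAzureCredential())
                .GetChatClient("gpt-4o")   // deployment name is an example
                .AsIChatClient())
        .UseFunctionInvocation()
        .Build());
```

PlannerAgent keeps depending on IChatClient and never learns which provider is behind it.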
Next Steps
- Read Creating Your First PMCR-O Agent
- Read Structured Output with JSON Schema
- Explore the complete article library
Build Your Own Strange Loop
The PMCR-O framework is open. Star the repository. Fork it. Seed your own intent.