
PMCR-O and Vector Databases: Enhancing Memory with pgvector

PMCR-O agents don't just execute tasks—they remember. Every plan, every artifact, every reflection becomes part of an externalized cognitive trail. But how do you store and retrieve this knowledge efficiently at scale?

The answer is pgvector—PostgreSQL's vector extension that enables semantic search over millions of cognitive trails. This guide shows you how to integrate pgvector with PMCR-O for production-ready RAG (Retrieval-Augmented Generation).

Why Vector Databases for PMCR-O?

Traditional databases retrieve data by exact matches. Vector databases retrieve data by meaning. When a PMCR-O agent needs to recall a similar problem it solved before, semantic search finds it even if the wording differs.

💡 The Cognitive Trail Advantage: pgvector enables agents to query their own history semantically. "How did I solve authentication before?" finds relevant solutions even if the exact keywords don't match.
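Under the hood, "meaning" is compared with cosine similarity between embedding vectors. A minimal sketch of what pgvector's `vector_cosine_ops` computes, in plain C# with no database involved:

```csharp
// Cosine similarity between two embedding vectors: ≈1.0 means the same
// direction (same meaning), ≈0.0 means orthogonal (unrelated).
static double CosineSimilarity(float[] a, float[] b)
{
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
}

var similar = CosineSimilarity(new float[] { 1, 2, 3 }, new float[] { 2, 4, 6 });  // ≈ 1.0 (parallel)
var unrelated = CosineSimilarity(new float[] { 1, 0 }, new float[] { 0, 1 });      // ≈ 0.0 (orthogonal)
Console.WriteLine($"{similar:F2} {unrelated:F2}");
```

In production the vectors have hundreds of dimensions and the comparison happens inside PostgreSQL, but the arithmetic is exactly this.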

Step 1: Set Up PostgreSQL with pgvector


Use Docker (recommended), or install PostgreSQL with the pgvector extension manually:

Bash
# Using Docker with pgvector image
docker run -d \
  --name pmcro-postgres \
  -e POSTGRES_DB=knowledge \
  -e POSTGRES_USER=pmcro \
  -e POSTGRES_PASSWORD=pmcro123 \
  -p 5432:5432 \
  pgvector/pgvector:pg16

# Verify pgvector extension
docker exec pmcro-postgres psql -U pmcro -d knowledge -c "CREATE EXTENSION IF NOT EXISTS vector;"

Step 2: Create the Knowledge Service


Create a new webapi project for the Knowledge service:

Bash
dotnet new webapi -n ProjectName.KnowledgeService
cd ProjectName.KnowledgeService

# Add required packages
dotnet add package Npgsql.EntityFrameworkCore.PostgreSQL
dotnet add package Pgvector.EntityFrameworkCore
dotnet add package Pgvector
dotnet add package OllamaSharp
dotnet add package Microsoft.Extensions.AI

# Add project references
dotnet add reference ../ProjectName.ServiceDefaults/ProjectName.ServiceDefaults.csproj
dotnet add reference ../ProjectName.Shared/ProjectName.Shared.csproj

Step 3: Configure Entity Framework with pgvector


Create the database context with vector support. Add Data/KnowledgeDbContext.cs:

C#
using Microsoft.EntityFrameworkCore;
using Npgsql.EntityFrameworkCore.PostgreSQL;
using Pgvector;
using Pgvector.EntityFrameworkCore;

namespace ProjectName.KnowledgeService.Data;

public class KnowledgeDbContext : DbContext
{
    public KnowledgeDbContext(DbContextOptions<KnowledgeDbContext> options) : base(options) { }

    public DbSet<KnowledgeItem> KnowledgeItems => Set<KnowledgeItem>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Enable pgvector extension
        modelBuilder.HasPostgresExtension("vector");

        modelBuilder.Entity<KnowledgeItem>(entity =>
        {
            entity.HasKey(k => k.Id);
            
            // Configure vector column (768 dimensions for nomic-embed-text)
            entity.Property(k => k.Embedding)
                .HasColumnType("vector(768)")
                .HasColumnName("embedding");

            // Create IVFFlat index for fast similarity search
            entity.HasIndex(k => k.Embedding)
                .HasMethod("ivfflat")
                .HasOperators("vector_cosine_ops")
                .HasAnnotation("Npgsql:StorageParameter:lists", 100);
        });
    }
}

public class KnowledgeItem
{
    public Guid Id { get; set; }
    public string Content { get; set; } = "";
    public Vector Embedding { get; set; } = new Vector(Array.Empty<float>());
    public string? Metadata { get; set; }
    public DateTime Timestamp { get; set; } = DateTime.UtcNow;
    public string? Source { get; set; } // e.g., "planner", "maker", "reflector"
}
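With the context in place, scaffold the initial migration so `MigrateAsync` (called at startup in Program.cs) has something to apply. A sketch, assuming the `dotnet-ef` tool and a migration named `InitialCreate`:

```shell
# Install the EF CLI once, if not already available
dotnet tool install --global dotnet-ef

# The design-time package is required for migrations
dotnet add package Microsoft.EntityFrameworkCore.Design

# Scaffold the migration that creates the vector column and ivfflat index
dotnet ef migrations add InitialCreate
```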

Step 4: Implement Embedding Service


Create a service to generate embeddings using Ollama. Add Services/EmbeddingService.cs:

C#
using OllamaSharp;
using Pgvector;

namespace ProjectName.KnowledgeService.Services;

public class EmbeddingService
{
    private readonly OllamaApiClient _ollamaClient;
    private readonly ILogger<EmbeddingService> _logger;
    private const string EmbeddingModel = "nomic-embed-text";

    public EmbeddingService(OllamaApiClient ollamaClient, ILogger<EmbeddingService> logger)
    {
        _ollamaClient = ollamaClient;
        _logger = logger;
    }

    public async Task<Vector> GenerateEmbeddingAsync(string text, CancellationToken cancellationToken = default)
    {
        try
        {
            // Call the Ollama embeddings API (request/response type names
            // vary between OllamaSharp versions; this targets the
            // EmbeddingsAsync/EmbeddingRequest shape)
            var response = await _ollamaClient.EmbeddingsAsync(
                new EmbeddingRequest
                {
                    Model = EmbeddingModel,
                    Prompt = text
                },
                cancellationToken);

            // Ollama returns doubles; pgvector's Vector type expects floats
            var embedding = response.Embedding?.Select(d => (float)d).ToArray()
                ?? Array.Empty<float>();
            return new Vector(embedding);
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "Failed to generate embedding for text: {Text}", text);
            throw;
        }
    }
}
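The embedding model must be present in Ollama before the first call; nomic-embed-text is what produces the 768-dimensional vectors the schema above expects. Pull it once:

```shell
# Download the embedding model (768 dimensions)
ollama pull nomic-embed-text
```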

Step 5: Implement Knowledge Vault Service


Create the service that stores and retrieves knowledge. Add Services/KnowledgeVaultService.cs:

C#
using Microsoft.EntityFrameworkCore;
using ProjectName.KnowledgeService.Data;
using Pgvector;

namespace ProjectName.KnowledgeService.Services;

public class KnowledgeVaultService
{
    private readonly KnowledgeDbContext _db;
    private readonly EmbeddingService _embeddingService;
    private readonly ILogger<KnowledgeVaultService> _logger;

    public KnowledgeVaultService(
        KnowledgeDbContext db,
        EmbeddingService embeddingService,
        ILogger<KnowledgeVaultService> logger)
    {
        _db = db;
        _embeddingService = embeddingService;
        _logger = logger;
    }

    /// <summary>
    /// Store a cognitive trail item with automatic embedding generation
    /// </summary>
    public async Task<Guid> StoreAsync(string content, string? source = null, string? metadata = null)
    {
        // Generate embedding
        var embedding = await _embeddingService.GenerateEmbeddingAsync(content);

        // Store in database
        var item = new KnowledgeItem
        {
            Id = Guid.NewGuid(),
            Content = content,
            Embedding = embedding,
            Source = source,
            Metadata = metadata,
            Timestamp = DateTime.UtcNow
        };

        _db.KnowledgeItems.Add(item);
        await _db.SaveChangesAsync();

        _logger.LogInformation("📦 Stored knowledge item: {Id} from source: {Source}", item.Id, source);
        return item.Id;
    }

    /// <summary>
    /// Semantic search: Find similar cognitive trails
    /// </summary>
    public async Task<List<KnowledgeItem>> SearchAsync(string query, int topK = 5, float threshold = 0.7f)
    {
        // Generate embedding for query
        var queryEmbedding = await _embeddingService.GenerateEmbeddingAsync(query);

        // pgvector exposes cosine *distance* (1 - cosine similarity) via the
        // CosineDistance extension from Pgvector.EntityFrameworkCore
        var maxDistance = 1 - threshold;
        var results = await _db.KnowledgeItems
            .Select(k => new
            {
                Item = k,
                Distance = k.Embedding.CosineDistance(queryEmbedding)
            })
            .Where(x => x.Distance <= maxDistance)
            .OrderBy(x => x.Distance)
            .Take(topK)
            .Select(x => x.Item)
            .ToListAsync();

        _logger.LogInformation("🔍 Found {Count} similar items for query: {Query}", results.Count, query);
        return results;
    }

    /// <summary>
    /// Retrieve cognitive trails by source (e.g., all Planner decisions)
    /// </summary>
    public async Task<List<KnowledgeItem>> GetBySourceAsync(string source, int limit = 100)
    {
        return await _db.KnowledgeItems
            .Where(k => k.Source == source)
            .OrderByDescending(k => k.Timestamp)
            .Take(limit)
            .ToListAsync();
    }
}

Step 6: Configure Program.cs

Set up the Knowledge service in Program.cs:

C#
var builder = WebApplication.CreateBuilder(args);

// Add service defaults (OpenTelemetry, health checks, etc.)
builder.AddServiceDefaults();

// PostgreSQL with pgvector (Aspire provides connection string as "knowledge")
var connectionString = builder.Configuration.GetConnectionString("knowledge") 
    ?? "Host=localhost;Database=knowledge;Username=pmcro;Password=pmcro123";

builder.Services.AddDbContext<KnowledgeDbContext>(options =>
    options.UseNpgsql(connectionString, npgsqlOptions =>
    {
        npgsqlOptions.UseVector();
    }));

// Ollama for embeddings
var ollamaUri = builder.Configuration.GetConnectionString("ollama") 
    ?? "http://localhost:11434";

if (!Uri.TryCreate(ollamaUri, UriKind.Absolute, out var uri))
{
    uri = new Uri("http://localhost:11434");
}

builder.Services.AddSingleton(_ => new OllamaApiClient(uri));
builder.Services.AddScoped<EmbeddingService>();
builder.Services.AddScoped<KnowledgeVaultService>();

// Add controllers and Swagger (AddSwaggerGen requires the
// Swashbuckle.AspNetCore package, included by the webapi template)
builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

var app = builder.Build();

// Configure the HTTP request pipeline
app.MapDefaultEndpoints();

if (app.Environment.IsDevelopment())
{
    app.UseSwagger();
    app.UseSwaggerUI();
}

app.MapControllers();

// Initialize database (requires at least one EF migration to exist)
using (var scope = app.Services.CreateScope())
{
    var db = scope.ServiceProvider.GetRequiredService<KnowledgeDbContext>();
    await db.Database.MigrateAsync();
}

app.Run();

Step 7: Create REST API Endpoints

Add Controllers/KnowledgeController.cs for HTTP access:

C#
using Microsoft.AspNetCore.Mvc;
using ProjectName.KnowledgeService.Services;

namespace ProjectName.KnowledgeService.Controllers;

[ApiController]
[Route("api/[controller]")]
public class KnowledgeController : ControllerBase
{
    private readonly KnowledgeVaultService _vault;
    private readonly ILogger<KnowledgeController> _logger;

    public KnowledgeController(KnowledgeVaultService vault, ILogger<KnowledgeController> logger)
    {
        _vault = vault;
        _logger = logger;
    }

    /// <summary>
    /// Store a cognitive trail item
    /// </summary>
    [HttpPost("store")]
    public async Task<IActionResult> Store([FromBody] StoreRequest request)
    {
        var id = await _vault.StoreAsync(
            request.Content,
            request.Source,
            request.Metadata);

        return Ok(new { Id = id, Message = "Knowledge item stored successfully" });
    }

    /// <summary>
    /// Semantic search for similar cognitive trails
    /// </summary>
    [HttpGet("search")]
    public async Task<IActionResult> Search(
        [FromQuery] string query,
        [FromQuery] int topK = 5,
        [FromQuery] float threshold = 0.7f)
    {
        var results = await _vault.SearchAsync(query, topK, threshold);
        return Ok(results);
    }

    /// <summary>
    /// Get cognitive trails by source (e.g., all Planner decisions)
    /// </summary>
    [HttpGet("source/{source}")]
    public async Task<IActionResult> GetBySource(string source, [FromQuery] int limit = 100)
    {
        var results = await _vault.GetBySourceAsync(source, limit);
        return Ok(results);
    }
}

public record StoreRequest(string Content, string? Source = null, string? Metadata = null);
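Once the service is running, the endpoints can be exercised with curl. A sketch, assuming the service listens on port 5002 (your launch profile may differ):

```shell
# Store a cognitive trail item
curl -X POST http://localhost:5002/api/knowledge/store \
  -H "Content-Type: application/json" \
  -d '{"content": "Solved JWT auth with refresh-token rotation", "source": "planner"}'

# Semantic search over stored trails
curl "http://localhost:5002/api/knowledge/search?query=authentication&topK=3"

# All trails recorded by the Planner
curl http://localhost:5002/api/knowledge/source/planner
```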

Step 8: Integrate with PMCR-O Agents

Update your Planner service to store cognitive trails. In PlannerService/Services/PlannerAgentService.cs:

C#
public override async Task<AgentResponse> ExecuteTask(
    AgentRequest request,
    ServerCallContext context)
{
    _logger.LogInformation("🧭 I AM the Planner. Analyzing: {Intent}", request.Intent);

    // Query the knowledge vault for similar past solutions
    // (the Knowledge service URL/port is deployment-specific)
    var httpClient = _httpClientFactory.CreateClient();
    var knowledgeResponse = await httpClient.GetAsync(
        $"http://localhost:5002/api/knowledge/search?query={Uri.EscapeDataString(request.Intent)}&topK=3");

    var pastSolutions = await knowledgeResponse.Content.ReadFromJsonAsync<List<KnowledgeItem>>() ?? new();

    // Build context from past solutions
    var contextPrompt = pastSolutions.Any()
        ? $"Similar past solutions:\n{string.Join("\n", pastSolutions.Select(s => $"- {s.Content}"))}"
        : "No similar past solutions found.";

    // Generate plan
    var systemPrompt = @"I AM the Planner.
I TRANSFORM complex intent into minimal viable plans.
I EVOLVE through reflection on execution outcomes.";

    // Microsoft.Extensions.AI message types; exact IChatClient method
    // names vary across preview versions
    var history = new List<ChatMessage>
    {
        new ChatMessage(ChatRole.System, systemPrompt),
        new ChatMessage(ChatRole.User, $"{request.Intent}\n\nContext: {contextPrompt}")
    };

    var response = await _chatClient.CompleteChatAsync(history, new ChatOptions
    {
        ResponseFormat = ChatResponseFormat.Json
    });

    // Store this plan in knowledge vault for future reference
    await httpClient.PostAsJsonAsync("http://localhost:5002/api/knowledge/store", new
    {
        Content = response.Content,
        Source = "planner",
        Metadata = JsonSerializer.Serialize(new { Intent = request.Intent, Timestamp = DateTime.UtcNow })
    });

    return new AgentResponse
    {
        Content = response.Content,
        Success = true
    };
}

Performance Optimization

IVFFlat Index Configuration

The IVFFlat index speeds up similarity search. Adjust the lists parameter based on your data size:

SQL
-- pgvector guidance: lists ≈ rows / 1000 for up to ~1M rows,
-- and ≈ sqrt(rows) beyond that. Build the index after loading data,
-- since IVFFlat clusters the rows that already exist.

CREATE INDEX ON "KnowledgeItems" 
USING ivfflat (embedding vector_cosine_ops) 
WITH (lists = 100);
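Recall at query time is controlled by `ivfflat.probes`, the number of lists scanned per query. Raising it improves recall at the cost of speed; the pgvector default is 1. The vector literal below is abbreviated:

```sql
-- Scan more lists per query for better recall (session-level setting)
SET ivfflat.probes = 10;

-- Then run the similarity query as usual (<=> is cosine distance)
SELECT content FROM "KnowledgeItems"
ORDER BY embedding <=> '[...]'::vector
LIMIT 5;
```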

Batch Embedding Generation

For bulk ingestion, generate embeddings in batches:

C#
public async Task<List<Guid>> StoreBatchAsync(List<string> contents, string? source = null)
{
    // Generate embeddings concurrently (consider throttling large batches
    // so a single Ollama instance isn't overwhelmed)
    var embeddingTasks = contents.Select(c => _embeddingService.GenerateEmbeddingAsync(c));
    var embeddings = await Task.WhenAll(embeddingTasks);

    // Batch insert
    var items = contents.Zip(embeddings).Select(tuple => new KnowledgeItem
    {
        Id = Guid.NewGuid(),
        Content = tuple.First,
        Embedding = tuple.Second,
        Source = source,
        Timestamp = DateTime.UtcNow
    }).ToList();

    _db.KnowledgeItems.AddRange(items);
    await _db.SaveChangesAsync();

    return items.Select(i => i.Id).ToList();
}

Real-World Use Cases

1. Cognitive Trail Persistence

Store every PMCR-O cycle outcome for future reference:

  • Planner decisions: What plans worked? What failed?
  • Maker artifacts: Code patterns that succeeded
  • Checker validations: Common error patterns
  • Reflector insights: Meta-level learnings

2. RAG-Enhanced Agent Responses

Agents query their own history before generating responses:

C#
// Before generating a plan, check if we've solved this before
var similarPlans = await _knowledgeVault.SearchAsync(
    "authentication system with JWT tokens",
    topK: 3,
    threshold: 0.8f);

if (similarPlans.Any())
{
    // Use past solution as context
    var context = $"We solved this before: {similarPlans[0].Content}";
    // Generate new plan with context
}

3. Multi-Agent Knowledge Sharing

Different agents can learn from each other's cognitive trails:

  • Maker learns from Checker's validation patterns
  • Reflector learns from Planner's decision history
  • Orchestrator learns from all agents' outcomes

Monitoring & Analytics

Track knowledge vault health and usage:

C#
[HttpGet("stats")]
public async Task<IActionResult> GetStats()
{
    var totalItems = await _db.KnowledgeItems.CountAsync();
    var bySource = await _db.KnowledgeItems
        .GroupBy(k => k.Source)
        .Select(g => new { Source = g.Key, Count = g.Count() })
        .ToListAsync();

    return Ok(new
    {
        TotalItems = totalItems,
        BySource = bySource
    });
}

Production Deployment

For production use:

  • Connection Pooling: Configure Npgsql connection pool size
  • Index Maintenance: Rebuild IVFFlat indexes periodically
  • Backup Strategy: Regular pg_dump with vector data
  • Monitoring: Track embedding generation latency
  • Scaling: Consider read replicas for search queries
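For the connection-pooling item: Npgsql pools by default, and the pool is tuned through connection-string keywords. A sketch with assumed sizes; adjust to your workload:

```csharp
// Pool sizes here are tuning assumptions, not recommendations.
var pooledConnectionString =
    "Host=localhost;Database=knowledge;Username=pmcro;Password=pmcro123;" +
    "Minimum Pool Size=5;Maximum Pool Size=50;Timeout=15";
Console.WriteLine(pooledConnectionString);
```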

🎉 What You've Built

Your PMCR-O system now has:

  • Semantic Memory: Agents can find similar past solutions
  • Externalized Cognition: Cognitive trails persist beyond sessions
  • RAG Capabilities: Context-aware responses from history
  • Multi-Agent Learning: Agents learn from each other's trails

Next Steps

Extend your knowledge vault with:

  • Metadata Filtering: Search by source, timestamp, or custom tags
  • Hybrid Search: Combine vector similarity with keyword matching
  • Time-Based Decay: Weight recent cognitive trails higher
  • Knowledge Consolidation: Periodically merge similar trails
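The hybrid-search idea can be sketched in SQL by blending PostgreSQL full-text ranking with vector similarity. The 0.5 weights and the query terms are illustrative assumptions, and the vector literal is abbreviated:

```sql
-- Blend keyword rank with vector similarity (weights are illustrative)
SELECT content,
       0.5 * ts_rank(to_tsvector('english', content),
                     plainto_tsquery('english', 'jwt authentication'))
     + 0.5 * (1 - (embedding <=> '[...]'::vector)) AS score
FROM "KnowledgeItems"
ORDER BY score DESC
LIMIT 5;
```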



About Shawn Delaine Bellazan

Resilient Architect & PMCR-O Framework Creator

Shawn is the creator of the PMCR-O framework, a self-referential AI architecture that embodies the strange loop it describes. With 15+ years in enterprise software development, Shawn specializes in building resilient systems at the intersection of philosophy and technology. His work focuses on autonomous AI agents that evolve through vulnerability and expression.