The semantic layer that MCP is missing

The semantic layer
for AI agents.

MCP without typed context fails. We provide the schema layer—structured memory, versioned contracts, searchable context. No data plane required.

$ claude "create a payment service with process, refund, status"
→ Generated payment.proto (PaymentService, 3 methods)
→ Validated against buf lint rules
→ Published to registry.protobuf.ai/payments@v1

$ curl streams.protobuf.ai/payments/context/customer-123
{"last_payment": "2m ago", "total_spent": 12450, "risk_score": 0.12}

What you don't need

🚫 Kafka clusters
🚫 CLI tooling
🚫 Schema configs
🚫 Broken pipelines

Just POST /streams/{name}/messages and GET /context/{key}
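A minimal client sketch of those two endpoints. The base URL matches the demo above; the payload fields (`customer_id`, `amount`) and the use of Python's standard `urllib` are illustrative assumptions, not a documented SDK.

```python
import json
import urllib.request

BASE = "https://streams.protobuf.ai"  # host from the demo above; illustrative

def publish(stream: str, message: dict) -> urllib.request.Request:
    """Build the POST /streams/{name}/messages request."""
    return urllib.request.Request(
        f"{BASE}/streams/{stream}/messages",
        data=json.dumps(message).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def context(key: str) -> urllib.request.Request:
    """Build the GET /context/{key} request."""
    return urllib.request.Request(f"{BASE}/context/{key}")

# Hypothetical payload: publish a payment event, then read back context.
req = publish("payments", {"customer_id": "customer-123", "amount": 4200})
```

Two verbs, two URLs: that is the entire surface area the "no data plane" claim is about.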

Built for the agent era

The protobuf ecosystem is fragmenting. We're the neutral layer that works with all of it.

🔌 MCP Native

First-class Model Context Protocol support. Claude can generate, validate, and publish schemas directly.

📦 Schema Registry

Version-controlled schemas with semantic search. Find schemas by meaning, not just name.

registry.protobuf.ai
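A sketch of what searching by meaning could look like. The `/search` path and the `q`/`limit` parameters are assumptions for illustration; only the registry host comes from the page above.

```python
import urllib.parse

REGISTRY = "https://registry.protobuf.ai"  # host from the card above

def search_url(query: str, limit: int = 5) -> str:
    """Build a hypothetical semantic-search URL against the registry."""
    qs = urllib.parse.urlencode({"q": query, "limit": limit})
    return f"{REGISTRY}/search?{qs}"

# A meaning-based query: should surface PaymentService even if no
# schema is literally named "refunds".
url = search_url("payment refunds")
```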
🔍 Vector Search

Semantic similarity across your context. Find relevant past interactions, not just exact matches.

🚀 Stream Processing

Benthos-powered transforms. Raw messages become materialized context views.
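To make "raw messages become materialized context views" concrete, here is an illustrative fold in plain Python. The output field names mirror the `/context/customer-123` example above; the aggregation logic (refund ratio as a stand-in risk score) is an assumption, not the actual transform.

```python
def materialize(messages: list[dict]) -> dict:
    """Fold raw payment messages into a materialized context view."""
    total = sum(m["amount"] for m in messages)
    refunds = sum(1 for m in messages if m.get("type") == "refund")
    return {
        "last_payment": messages[-1]["ts"],           # most recent event
        "total_spent": total,                          # running sum
        "risk_score": round(refunds / max(len(messages), 1), 2),
    }

# Hypothetical raw stream for one customer.
view = materialize([
    {"ts": "2025-01-01T12:00:00Z", "amount": 9950, "type": "charge"},
    {"ts": "2025-01-02T09:30:00Z", "amount": 2500, "type": "charge"},
])
```

In the hosted layer this fold runs continuously over the stream, so `GET /context/{key}` reads a precomputed view instead of replaying messages.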

Ready to build?

Open source. MIT licensed. The neutral layer for protobuf in the agent era.

Get in touch

Building with protobuf? Exploring schema tooling? Let's talk.