LLM Control Plane

Identity, trust, and governance for LLM applications

The LLM Control Plane is the architectural layer that turns LLM runtimes and tools into user-scoped, policy-aware systems. Built on the Gatewaystack project, it sits between ChatGPT Apps SDK / MCP and your backend, enforcing authentication, scopes, and secure user data access.

While integrating my app Inner into ChatGPT via the OpenAI Apps SDK, I found that modern AI applications need a consistent layer for user identity, scope enforcement, and data governance.

LLMs can call tools, and tools can talk to your backend — but without a policy layer in the middle, you can't safely determine which user is calling which tool, or which data those tools may access.

The LLM Control Plane provides that missing layer, built from the same components powering the open-source Gatewaystack initiative.
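As a sketch of what such a policy layer decides, consider the check below. It is a hypothetical illustration only — the `ToolCall` shape, the `REQUIRED_SCOPES` table, and the scope names are assumptions for this example, not Gatewaystack's actual API: given the subject and scopes from an already-verified token, it either rejects the tool call or rewrites it into a query constrained to that user.

```python
from dataclasses import dataclass

# Hypothetical per-tool scope requirements (illustrative names, not Gatewaystack's API).
REQUIRED_SCOPES = {
    "journal.read": {"inner.journal:read"},
    "journal.write": {"inner.journal:read", "inner.journal:write"},
}

@dataclass
class ToolCall:
    tool: str             # tool the LLM runtime wants to invoke
    user_id: str          # subject ("sub" claim) from the verified JWT
    granted_scopes: set   # scopes the user consented to

def authorize(call: ToolCall) -> dict:
    """Return a user-scoped backend query if the call is allowed, else raise."""
    required = REQUIRED_SCOPES.get(call.tool)
    if required is None:
        raise PermissionError(f"unknown tool: {call.tool}")
    missing = required - call.granted_scopes
    if missing:
        raise PermissionError(f"missing scopes: {sorted(missing)}")
    # The key move: every backend query is constrained to the authenticated user.
    return {"tool": call.tool, "filter": {"owner": call.user_id}}
```

Note the return value: the control plane does not just allow or deny — it rewrites the request so the backend only ever sees user-scoped queries.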

Trust chain: User → LLM runtime → LLM Control Plane (Gatewaystack) → Your backend.

1. User — consent · scopes
2. LLM runtime — ChatGPT Apps SDK · Anthropic MCP
   ↓ OIDC · JWT (RS256)
3. LLM Control Plane — Gatewaystack · trust & policy gateway
   ↓ API calls · verified audience
4. Your backend & data — Cloud Run · Firestore · APIs

The control plane validates identity, JWTs, and scopes, and enforces user-scoped access.
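To make the "validates JWTs" step concrete, here is a minimal sketch of the claim checks that follow signature verification. In a real deployment the RS256 signature is verified first against the issuer's published JWKS (typically via a JWT library); the issuer and audience strings below are placeholder assumptions, not values from this project.

```python
import base64
import json
import time

def _b64url_decode(part: str) -> bytes:
    # JWT segments are base64url without padding; restore padding before decoding.
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def validate_claims(token: str, issuer: str, audience: str) -> dict:
    """Check the iss/aud/exp claims of a JWT.

    Assumes the RS256 signature has already been verified against the
    issuer's JWKS; this function only enforces the claim checks.
    """
    payload = json.loads(_b64url_decode(token.split(".")[1]))
    if payload.get("iss") != issuer:
        raise ValueError("untrusted issuer")
    if payload.get("aud") != audience:
        raise ValueError("token not intended for this audience")
    if payload.get("exp", 0) < time.time():
        raise ValueError("token expired")
    return payload
```

The audience check is what "verified audience" in the chain above refers to: a token minted for the LLM runtime cannot be replayed against an unrelated API.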

What the LLM Control Plane enables