LLM Control Plane
Identity, trust, and governance for LLM applications
The LLM Control Plane is the architectural layer that turns LLM runtimes and tools into user-scoped, policy-aware systems. Built on the Gatewaystack project, it sits between ChatGPT Apps SDK / MCP and your backend, enforcing authentication, scopes, and secure user data access.
While integrating my app Inner into ChatGPT via the OpenAI Apps SDK, I found that modern AI applications need a consistent layer for user identity, scope enforcement, and data governance.
LLMs can call tools, and tools can talk to your backend, but without a policy layer in the middle you cannot reliably determine which user is calling which tool, or what data each tool should be allowed to access.
The LLM Control Plane provides that missing layer, built from the same components powering the open-source Gatewaystack initiative.
What the LLM Control Plane enables
- OIDC / OAuth2 integration
- RS256 JWT validation
- User identity & session binding
- Per-tool scopes & consent
- User-scoped data access
- ChatGPT Apps SDK / MCP aware
- Tool governance & routing
- Cloud Run / Firestore ready
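The per-tool scopes item above can be sketched as a small policy check. This is an illustrative sketch, not Gatewaystack's actual API: it assumes an upstream gateway has already validated the caller's RS256 JWT, and the names (`ToolPolicy`, `check_tool_call`, the tool and scope strings) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ToolPolicy:
    """Scopes a caller must hold to invoke a given tool (illustrative)."""
    required_scopes: set

# Hypothetical registry: tool name -> policy.
POLICIES = {
    "journal.read": ToolPolicy({"journal:read"}),
    "journal.write": ToolPolicy({"journal:read", "journal:write"}),
}

def check_tool_call(claims: dict, tool: str) -> bool:
    """Return True if the validated JWT claims authorize this tool call.

    `claims` is assumed to come from an already-verified RS256 JWT,
    with OAuth2 scopes in the space-delimited `scope` claim.
    """
    policy = POLICIES.get(tool)
    if policy is None:
        return False  # deny unknown tools by default
    granted = set(claims.get("scope", "").split())
    return policy.required_scopes <= granted

claims = {"sub": "user-123", "scope": "journal:read"}
print(check_tool_call(claims, "journal.read"))   # True
print(check_tool_call(claims, "journal.write"))  # False
```

Denying unregistered tools by default keeps the gateway fail-closed: a new tool only becomes callable once someone has explicitly decided which scopes it requires.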