ctx| MCP

Model Context Protocol — endpoint, transport, authentication, and the ctx_advisor tool.

ctx| exposes a Model Context Protocol server at /mcp. Any MCP-compatible agent (Cursor, Claude Code, GitHub Copilot, custom tooling) can connect and query the knowledge graph during planning and execution.

Endpoint

https://app.ctxpipe.ai/mcp?orgSlug=<your-org-slug>

The orgSlug query parameter is required. Requests without it are rejected with a -32600 JSON-RPC error. Your org slug is visible in the ctx| dashboard under Settings -> Organisation.

For self-hosted instances, replace the base URL with your own.
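As a sketch, the endpoint URL can be assembled like this (`buildMcpUrl` is an illustrative helper, not part of any published SDK):

```typescript
// Build the ctx| MCP endpoint URL for a given org.
// buildMcpUrl is an illustrative helper, not part of any SDK.
function buildMcpUrl(orgSlug: string, base = "https://app.ctxpipe.ai"): string {
  if (!orgSlug) {
    // The server rejects requests without orgSlug with a JSON-RPC
    // -32600 (Invalid Request) error, so fail fast client-side too.
    throw new Error("orgSlug is required");
  }
  const url = new URL("/mcp", base);
  url.searchParams.set("orgSlug", orgSlug);
  return url.toString();
}
```

For self-hosted instances, pass your own base URL as the second argument.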

Transport

ctx| uses Streamable HTTP transport - the stateless, HTTP-native MCP transport introduced in MCP 2025-03-26. This means:

  • No persistent WebSocket or SSE connection is required.
  • Each tool call is an independent HTTP POST.
  • Progress notifications are streamed back inline using notifications/progress while a long-running tool invocation is in flight.

Most MCP clients from mid-2025 onwards support Streamable HTTP. If your client only supports legacy SSE transport, raise an issue - we track client adoption.
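Because each tool call is an independent POST, a raw invocation is just a JSON-RPC request body in a single HTTP request. A sketch (the `id`, prompt, and `buildToolCall` helper are illustrative, not part of any SDK):

```typescript
// One tool call = one JSON-RPC request in one HTTP POST.
// buildToolCall is an illustrative helper, not part of any SDK.
function buildToolCall(id: number, prompt: string) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: {
      name: "ctx_advisor",
      arguments: { prompt },
    },
  };
}

// No session to establish or tear down - each POST stands alone:
// await fetch("https://app.ctxpipe.ai/mcp?orgSlug=your-org", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     "Accept": "application/json, text/event-stream",
//     "Authorization": "Bearer YOUR_API_KEY",
//   },
//   body: JSON.stringify(buildToolCall(1, "How do we handle auth?")),
// });
```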

Authentication

Pass an API key in the Authorization header:

{
  "mcpServers": {
    "ctxpipe": {
      "url": "https://app.ctxpipe.ai/mcp?orgSlug=your-org",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Generate API keys from Settings -> API Keys in the ctx| dashboard. Keys are scoped to your user and carry the permissions of your membership in the target organisation.

As an alternative to API keys, ctx| is also a fully compliant OAuth 2.0 authorisation server. The protected-resource metadata is discoverable at:

GET /.well-known/oauth-protected-resource/mcp

And the authorisation server metadata at:

GET /.well-known/oauth-authorization-server

OAuth clients can register dynamically - no manual client registration step is needed. The MCP endpoint is listed as a valid audience in issued tokens.
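A small sketch of deriving the discovery URLs from a base URL, useful for self-hosted instances (`discoveryUrls` is an illustrative helper; the metadata documents themselves follow the standard RFC 8414 / RFC 9728 field names):

```typescript
// Derive the OAuth discovery URLs from a base URL.
// discoveryUrls is an illustrative helper, not part of any SDK.
function discoveryUrls(base = "https://app.ctxpipe.ai") {
  return {
    protectedResource: new URL("/.well-known/oauth-protected-resource/mcp", base).toString(),
    authorizationServer: new URL("/.well-known/oauth-authorization-server", base).toString(),
  };
}
```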

Device Authorization (CLI / headless agents)

For non-interactive environments, use the OAuth 2.0 Device Authorization Grant:

POST /.auth/api/v1/auth/device/code
POST /.auth/api/v1/auth/device/token

The device verification URI is /.auth/device. This flow is suitable for CI pipelines and CLI tooling where a browser redirect is not available.
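The polling side of the flow can be sketched as a small decision function over each response from the token endpoint (response field names follow RFC 8628; `nextPollAction` is an illustrative helper, not part of any SDK):

```typescript
// Decide the next step after polling POST /.auth/api/v1/auth/device/token.
// Field names follow RFC 8628; nextPollAction is illustrative, not an SDK API.
type PollAction =
  | { kind: "done"; token: string }
  | { kind: "retry"; waitMs: number }
  | { kind: "fail"; error: string };

function nextPollAction(body: Record<string, unknown>, intervalMs: number): PollAction {
  if (typeof body.access_token === "string") {
    return { kind: "done", token: body.access_token };
  }
  if (body.error === "authorization_pending") {
    return { kind: "retry", waitMs: intervalMs }; // user hasn't approved yet
  }
  if (body.error === "slow_down") {
    return { kind: "retry", waitMs: intervalMs + 5000 }; // back off per RFC 8628 §3.5
  }
  return { kind: "fail", error: String(body.error) }; // expired_token, access_denied, ...
}
```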

Tools

ctx_advisor

The primary interface for agent guidance. Call it during planning and throughout execution - not just at the start.

Input schema

{
  prompt: string  // min length: 1
}

Behaviour

  1. Creates a new conversation thread with source "mcp".
  2. Routes the prompt through the LangGraph chatGraph pipeline, which queries the knowledge graph, applies your organisation's instruction hierarchy (AGENTS.md, skills, ADRs), and synthesises a response.
  3. Streams progress back via notifications/progress while the graph is running, so clients that support progress tokens see incremental output.
  4. Returns the final synthesised answer as a text content block.
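Step 4's result can be unpacked with a small helper; it assumes the standard MCP content-block shape (`answerText` is an illustrative name, not part of any SDK):

```typescript
// Pull the final synthesised answer out of a tools/call result,
// which arrives as text content blocks. Illustrative helper, not an SDK API.
type ToolResult = { content: Array<{ type: string; text?: string }> };

function answerText(result: ToolResult): string {
  return result.content
    .filter((c) => c.type === "text" && typeof c.text === "string")
    .map((c) => c.text)
    .join("");
}
```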

When to call it

| Situation | Signal to call |
| --- | --- |
| Beginning a new feature | Always - establish architectural context first |
| Encountering an unfamiliar subsystem | Ask before reading wide areas of code |
| Evaluating implementation approaches | Get feedback on trade-offs against org patterns |
| Validating a decision | Confirm against ADRs and existing conventions |
| Major decision changes mid-task | Re-call with updated context |

Example prompt

We're adding rate limiting to the MCP endpoint. The backend uses Hono.
What patterns does this organisation use for cross-cutting middleware, and are
there any architectural constraints I should be aware of?

Progress streaming

When the MCP client provides a _meta.progressToken, ctx_advisor sends incremental deltas via notifications/progress as text is generated. Each notification has:

{
  method: "notifications/progress",
  params: {
    progressToken: string | number,
    progress: number,      // monotonically increasing counter
    message: string        // text delta since last notification
  }
}

Clients that don't supply a progress token receive only the final response.
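A client-side sketch of consuming these notifications, using the monotonic `progress` counter to drop stale or duplicate deliveries (`ProgressAccumulator` is an illustrative name, not part of any SDK):

```typescript
type ProgressNotification = {
  method: "notifications/progress";
  params: { progressToken: string | number; progress: number; message?: string };
};

// Accumulate text deltas from notifications/progress. Because `progress`
// is a monotonically increasing counter, anything at or below the last
// value seen can be safely ignored. Illustrative sketch, not an SDK API.
class ProgressAccumulator {
  private last = -1;
  text = "";

  handle(n: ProgressNotification): void {
    if (n.params.progress <= this.last) return; // stale or duplicate
    this.last = n.params.progress;
    this.text += n.params.message ?? "";
  }
}
```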