Documentation Index

Fetch the complete documentation index at: https://docs.amps.ai/llms.txt

Use this file to discover all available pages before exploring further.

Two MCP servers, one install pattern. Pick the surface you need, drop the snippet into your tool’s MCP config, restart, and run the verify prompt at the bottom. The Documentation MCP is hosted at https://docs.amps.ai/mcp. Connect via URL; nothing to install. The Execution MCP is in development and will publish as @amps-ai/mcp on npm. The snippets below cover both, with the Execution MCP marked accordingly.

Documentation MCP

Claude Desktop

Edit the Claude Desktop config file. The location depends on the operating system:

- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/Claude/claude_desktop_config.json

Add the amps-docs entry under mcpServers:
{
  "mcpServers": {
    "amps-docs": {
      "url": "https://docs.amps.ai/mcp"
    }
  }
}
If the file does not exist, create it with the snippet above. If mcpServers already has other entries, add amps-docs alongside them. Do not replace the block. Quit Claude Desktop fully (right-click the menubar icon, Quit) and reopen it. A hammer icon appears in the chat input bar. Click it to confirm both Documentation MCP tools are listed: search_amps_ai_documentation and query_docs_filesystem_amps_ai_documentation.

Cursor

Open Cursor settings: Cmd + , (macOS) or Ctrl + , (Windows / Linux), then Features > MCP. Click Add new MCP server. Or edit ~/.cursor/mcp.json directly:
{
  "mcpServers": {
    "amps-docs": {
      "url": "https://docs.amps.ai/mcp"
    }
  }
}
Reload Cursor. The Documentation MCP tools are available to Cursor’s agent and chat.

Windsurf

Edit the Windsurf MCP config at ~/.codeium/windsurf/mcp_config.json:
{
  "mcpServers": {
    "amps-docs": {
      "url": "https://docs.amps.ai/mcp"
    }
  }
}
Reload the editor. Windsurf’s Cascade agent picks up the tools on the next session.

Continue (VS Code extension)

Continue supports MCP through its experimental.modelContextProtocolServers config. Edit ~/.continue/config.json and add the experimental block:
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "streamableHttp",
          "url": "https://docs.amps.ai/mcp"
        }
      }
    ]
  }
}
Reload VS Code. Continue’s chat agent resolves Amps queries against the live docs.

Custom MCP clients

The Documentation MCP server is a standard JSON-RPC MCP endpoint over HTTP / streamable HTTP. Any client built against the Model Context Protocol specification can connect. The base URL is https://docs.amps.ai/mcp. Initialise with a standard initialize call, then tools/list to enumerate the two tools, or tools/call to invoke them. A minimal probe with curl:
curl -sL -X POST \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"my-agent","version":"1.0"}}}' \
  https://docs.amps.ai/mcp
The response describes both tools and their input schemas. From there, follow the spec for tools/call. See the tools reference for working examples.
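The same handshake can be driven programmatically. A minimal sketch in Python that builds the JSON-RPC envelopes for initialize and tools/list; the payload fields mirror the curl probe above, and the POST step is described rather than executed:

```python
import json

MCP_URL = "https://docs.amps.ai/mcp"

def rpc(method, params, id):
    """Build a JSON-RPC 2.0 request envelope for the MCP endpoint."""
    return {"jsonrpc": "2.0", "id": id, "method": method, "params": params}

# Handshake, then tool discovery -- the same sequence the curl probe starts.
init = rpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "my-agent", "version": "1.0"},
}, id=1)
tools_list = rpc("tools/list", {}, id=2)

# Each envelope is POSTed to MCP_URL with the Content-Type and Accept
# headers shown in the curl probe; responses arrive as JSON or as a
# text/event-stream, depending on the server's transport negotiation.
for msg in (init, tools_list):
    print(json.dumps(msg))
```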

Execution MCP

Coming soon. The Execution MCP is in development. The snippets below show the planned shape; the @amps-ai/mcp package is not yet published.
The Execution MCP runs as a stdio process spawned by your MCP client, or as a hosted HTTP service for multi-tenant deployments. Auth is via the integrator’s Amps API key in env, plus a per-request user identifier.

Claude Desktop (planned)

{
  "mcpServers": {
    "amps": {
      "command": "npx",
      "args": ["-y", "@amps-ai/mcp"],
      "env": {
        "AMPS_API_KEY": "sk_live_...",
        "AMPS_USER_ID": "user_..."
      }
    }
  }
}
AMPS_USER_ID identifies which of your integrator account’s end-users the agent is acting on behalf of. For multi-tenant deployments, the user identity travels in a per-request header instead.

Cursor, Windsurf, Continue (planned)

Same shape: command: "npx", args: ["-y", "@amps-ai/mcp"], and an env block with the API key and user identifier. In each tool, use the same config file path as for the Documentation MCP above.
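For copy-paste convenience, the same planned stdio entry, repeated here. As above, this is the planned shape; the @amps-ai/mcp package is not yet published:

```json
{
  "mcpServers": {
    "amps": {
      "command": "npx",
      "args": ["-y", "@amps-ai/mcp"],
      "env": {
        "AMPS_API_KEY": "sk_live_...",
        "AMPS_USER_ID": "user_..."
      }
    }
  }
}
```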

Hosted HTTP (planned)

For Vercel AI SDK and other hosted-agent runtimes, run the Execution MCP as a long-lived HTTP service. One process serves many concurrent users. Per-request x-user-id headers carry user identity from the integrator’s app to the MCP server.
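A minimal sketch of the per-request identity flow. Only the x-user-id header name comes from this page; the endpoint URL, helper name, and bearer-style Authorization header are illustrative assumptions, not a published API:

```python
# Hypothetical hosted Execution MCP endpoint -- not a real URL.
AMPS_MCP_URL = "https://execution-mcp.example.com/mcp"

def mcp_headers(api_key: str, user_id: str) -> dict:
    """Per-request headers: integrator credentials plus end-user identity.

    The auth scheme is an assumption; x-user-id carries which of the
    integrator's end-users this request acts on behalf of.
    """
    return {
        "Authorization": f"Bearer {api_key}",   # assumed auth scheme
        "Content-Type": "application/json",
        "x-user-id": user_id,                    # per-request user identity
    }

headers = mcp_headers("sk_live_...", "user_...")
print(headers["x-user-id"])
```

One long-lived process serves many users because identity travels per request, not per process, so no per-user state needs to live in the server's environment.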

Verify it works

Paste this prompt into your AI tool and watch it call the Documentation MCP server:
What modes does the Amps API support for batteries, and which one optimises for self-consumption?
A connected agent calls search_amps_ai_documentation for “battery modes”, reads the canonical actions page, and answers with the canonical names: charge, discharge, idle, auto.balanced, auto.reserve, auto.export. auto.balanced is the self-consumption answer. If the agent answers without calling the tools, the server is not connected. Check the config file path, restart the client, and confirm the hammer icon (or the equivalent in your tool) is present.
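Under the hood, the agent's search resembles the following JSON-RPC tools/call body. The arguments schema here is an assumption; confirm the real input schema via tools/list:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_amps_ai_documentation",
    "arguments": { "query": "battery modes" }
  }
}
```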

What next

- Tools reference: input shapes and call patterns for every MCP tool.
- Examples: concrete agent workflows that exercise the tools.
- Documentation MCP: what the docs MCP is and why it matters.
- Battery cheat sheet: canonical Amps vocabulary your agent will encounter in responses.