Documentation

CommanderMCP is the universal middleware that connects your AI agents to the real world. Turn any API into an AI-ready toolset instantly using our powerful custom integration engine.

💳 Subscription Plan Required
🔑 OpenAI API Key Required (for Agent Logic)

🧩 The Platform Core

The Universal Adapter

Unlike traditional platforms that rely on hardcoded integrations, CommanderMCP uses a Generic REST Adapter. This means we don't need to "build" an integration for GoHighLevel, Square, or Salesforce internally.

Instead, you define the Shape of the API (Authentication, Headers, Endpoints), and our server dynamically translates requests from your LLM (ChatGPT, Claude) into valid API calls.
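
A minimal sketch of what such a definition might look like (the field names below are illustrative, not the platform's exact configuration schema):

{
  "name": "example_crm",
  "base_url": "https://api.example-crm.com/v1",
  "auth": { "type": "bearer", "token": "YOUR_API_KEY" },
  "headers": { "Content-Type": "application/json" },
  "endpoints": [
    {
      "tool": "search_contacts",
      "method": "GET",
      "path": "/contacts",
      "query_params": ["query"]
    }
  ]
}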

🔍 Observability & API Logs

AI can be unpredictable. We provide complete X-Ray vision into what your agents are actually doing.

Live API Logs

Every tool execution is recorded. View exactly what JSON the AI generated, what the external API returned, and the latency of every request.
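
For illustration, a single log entry captures roughly the following information (a hypothetical record; the actual layout in the dashboard may differ):

{
  "tool": "search_contacts",
  "request_body": { "query": "jane@example.com" },
  "response_status": 200,
  "response_body": { "results": [] },
  "latency_ms": 412,
  "timestamp": "2026-01-15T10:32:07Z"
}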

Inline Chat Debug Bar

When testing in our built-in chat, a debug bar appears under every tool call. Click it to expand a diff view of the Input vs. Output. This is critical for debugging "hallucinations" or malformed requests.

🛠️ Building Integrations

We provide three powerful ways to add custom tools to your project.

1. Template Library

Don't want to start from scratch? Our community and team maintain a library of Integration Presets.

When you select a template (e.g., "Generic CRM" or "Payment Gateway"), the system pre-fills the complex configuration (Base URLs, Header logic, Pagination rules). You simply input your API Key or Location ID, and the tools are instantly live.

2. AI Assisted Builder (Magic Mode) ✨

This is the fastest way to add a completely new tool that isn't in our library.

  1. Find Documentation: Go to the developer docs of the service you want to add (e.g., "Notion API Create Page").
  2. Copy & Paste: Copy the raw text or curl example from their documentation page.
  3. Processing: Paste it into our "AI Builder" text area. Our internal LLM parses the text, extracts the Endpoint, Method (POST/GET), required JSON body parameters, and data types.
  4. Result: It generates a valid JSON Schema automatically. You can review it, save it, and your AI agent can now use that tool.
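
For example, pasting a curl snippet for a hypothetical "Create Page" endpoint could yield a tool definition along these lines (shown here in the common MCP tool shape with an inputSchema; the builder's exact output format may differ):

{
  "name": "create_page",
  "description": "Create a new page in the connected workspace",
  "inputSchema": {
    "type": "object",
    "properties": {
      "parent_id": { "type": "string", "description": "ID of the parent page or database" },
      "title": { "type": "string", "description": "Title of the new page" }
    },
    "required": ["parent_id", "title"]
  }
}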

3. OpenAPI Spec Import

Have a Swagger or OpenAPI 3.0 URL? You can import endpoints in bulk.

Simply paste the URL to the openapi.json file. CommanderMCP will fetch the definition, parse all available endpoints, and let you select which ones to import as tools. This is perfect for modern SaaS platforms that publish their API specs publicly.
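
For reference, the importer works from a standard OpenAPI 3.0 document; a minimal excerpt like the one below would surface roughly one selectable tool per path/method pair.

{
  "openapi": "3.0.0",
  "info": { "title": "Example API", "version": "1.0.0" },
  "paths": {
    "/contacts": {
      "get": {
        "operationId": "search_contacts",
        "parameters": [
          { "name": "query", "in": "query", "schema": { "type": "string" } }
        ],
        "responses": { "200": { "description": "A list of matching contacts" } }
      }
    }
  }
}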

🤖 Chat Agents & Voice

CommanderMCP isn't just a backend; it allows you to create full-featured AI personas.

Built-in Chat Agents

Create dedicated "Chat Concepts" or Personas (e.g., "Support Bot", "Sales Rep"). Each agent can be assigned:

  • System Prompt: Instructions on how to behave ("You are a helpful assistant...").
  • Tool Access: Which integrations this specific agent is allowed to use.
  • Memory: Persistent context awareness.
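
Conceptually, a persona bundles those three settings; a hypothetical configuration (illustrative only, not the platform's actual export format) might look like:

{
  "name": "Support Bot",
  "system_prompt": "You are a helpful assistant for our support team...",
  "tools": ["search_contacts", "create_ticket"],
  "memory": { "enabled": true }
}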

Voice & SMS

Twilio Integration

Take your Chat Agent beyond the browser by connecting it to a real phone number via Twilio.

Voice Mode: When a customer calls your Twilio number, our system answers. It transcribes the user's speech, sends it to your Agent, executes any necessary tools (like checking a calendar or looking up an order), and speaks the response back using Text-to-Speech.

SMS Mode: The same agent can handle text message conversations, maintaining context just like a web chat.

🔌 Connectivity Options

Once your tools are defined, how do you use them?

Option A: MCP (Model Context Protocol)

Our server is fully compliant with the Model Context Protocol standard. This makes it compatible with any MCP-enabled client, including:

  • AI Editors: Cursor, Windsurf, Zed
  • Desktop Agents: Claude Desktop App
  • Enterprise Platforms: Any custom business solution supporting MCP

# Example Generic MCP Configuration
command: "uvx"
args: ["mcp-server-commander", "--token", "YOUR_PROJECT_TOKEN"]
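
For instance, in the Claude Desktop App the same server would normally be registered in claude_desktop_config.json under the mcpServers key (the server name "commander" is arbitrary):

{
  "mcpServers": {
    "commander": {
      "command": "uvx",
      "args": ["mcp-server-commander", "--token", "YOUR_PROJECT_TOKEN"]
    }
  }
}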

Option B: ChatGPT (Custom GPTs)

Any standard ChatGPT user can connect your MCP server tools to a custom GPT by simply importing the JSON configuration.

  1. Get the Configuration:
    Go to your Project Dashboard > Schema.
    Copy the MCP Server JSON provided by our platform.
  2. Configure the GPT:
    Log in to ChatGPT and select Explore GPTs > + Create.
    Switch to the Configure tab and click the Add Actions button.
  3. Import Tools:
    Paste the copied MCP Server JSON directly into the Schema box.
    ChatGPT will parse the file and recognize your available tools.
  4. Ready to Use:
    You can now chat with the GPT, and it will trigger your API actions in real-time.

Option C: Unified REST API

Building your own frontend? You don't need to implement 10 different SDKs. Use our unified endpoint to call any tool you've configured.

POST https://api.commandermcp.com/v1/execute
Authorization: Bearer YOUR_PROJECT_TOKEN
Content-Type: application/json

{
  "tool": "search_contacts",
  "arguments": { "query": "[email protected]" }
}
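
The response is the tool's output wrapped in JSON. The exact envelope depends on the tool you call; a hypothetical successful response might look like:

{
  "status": "success",
  "result": {
    "contacts": [
      { "id": "c_123", "name": "Jane Doe", "email": "jane@example.com" }
    ]
  }
}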