MCPO (MCP-to-OpenAPI)

Open WebUI

Converts any MCP server into a standard REST API with automatic OpenAPI documentation. Makes MCP tools compatible with existing HTTP-based tools and workflows.

By Open WebUI. A simple, secure MCP-to-OpenAPI proxy server. Expose any MCP server as a REST API with automatic OpenAPI documentation. 4,000+ GitHub stars.


What it does

  • Convert MCP servers to REST APIs
  • Generate automatic OpenAPI documentation
  • Add authentication to MCP tools
  • Support stdio, SSE, and HTTP MCP servers
  • Create interactive API documentation
  • Proxy MCP calls over HTTP

Best for

  • Integrating MCP tools with existing API workflows
  • Making MCP servers accessible to web applications
  • Adding security to stdio-based MCP servers
  • Creating standardized interfaces for AI tools

Zero configuration required. Works with any MCP server. 4,000+ GitHub stars.

About MCPO (MCP-to-OpenAPI)

MCPO (MCP-to-OpenAPI) is a community-built MCP server published by Open WebUI that provides AI assistants with tools and capabilities via the Model Context Protocol. It is a simple, secure proxy that exposes any MCP server as a REST API with automatic OpenAPI documentation. It is categorized under developer tools.

How to install

You can install MCPO (MCP-to-OpenAPI) in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

MCPO (MCP-to-OpenAPI) is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

⚡️ mcpo

Expose any MCP tool as an OpenAPI-compatible HTTP server—instantly.

mcpo is a dead-simple proxy that takes an MCP server command and makes it accessible via standard RESTful OpenAPI, so your tools "just work" with LLM agents and apps expecting OpenAPI servers.

No custom protocol. No glue code. No hassle.

🤔 Why Use mcpo Instead of Native MCP?

MCP servers usually speak over raw stdio, which is:

  • 🔓 Inherently insecure
  • ❌ Incompatible with most tools
  • 🧩 Missing standard features like docs, auth, error handling, etc.

mcpo solves all of that—without extra effort:

  • ✅ Works instantly with OpenAPI tools, SDKs, and UIs
  • 🛡 Adds security, stability, and scalability using trusted web standards
  • 🧠 Auto-generates interactive docs for every tool, no config needed
  • 🔌 Uses pure HTTP—no sockets, no glue code, no surprises

What feels like "one more step" is really fewer steps with better outcomes.

mcpo makes your AI tools usable, secure, and interoperable—right now, with zero hassle.

🚀 Quick Usage

We recommend using uv for lightning-fast startup and zero config.

uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command

Or, if you’re using Python:

pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command

To use an SSE-compatible MCP server, simply specify the server type and endpoint:

mcpo --port 8000 --api-key "top-secret" --server-type "sse" -- http://127.0.0.1:8001/sse

You can also provide headers for the SSE connection:

mcpo --port 8000 --api-key "top-secret" --server-type "sse" --header '{"Authorization": "Bearer token", "X-Custom-Header": "value"}' -- http://127.0.0.1:8001/sse

To use a Streamable HTTP-compatible MCP server, specify the server type and endpoint:

mcpo --port 8000 --api-key "top-secret" --server-type "streamable-http" -- http://127.0.0.1:8002/mcp

You can also run mcpo via Docker with no installation:

docker run -p 8000:8000 ghcr.io/open-webui/mcpo:main --api-key "top-secret" -- your_mcp_server_command

Example:

uvx mcpo --port 8000 --api-key "top-secret" -- uvx mcp-server-time --local-timezone=America/New_York

That’s it. Your MCP tool is now available at http://localhost:8000 with a generated OpenAPI schema — test it live at http://localhost:8000/docs.
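As a sketch of what a client call looks like, here is a stdlib-only Python request against the time example above. The get_current_time endpoint name and its timezone parameter are assumptions based on mcp-server-time; check the generated /docs page for the actual tool names and schemas.

```python
import json
import urllib.request

# Call a tool proxied by mcpo. mcpo exposes each MCP tool as its own
# POST endpoint; the name below assumes the mcp-server-time example.
API_KEY = "top-secret"
url = "http://localhost:8000/get_current_time"
body = json.dumps({"timezone": "America/New_York"}).encode()

req = urllib.request.Request(
    url,
    data=body,
    headers={
        "Authorization": f"Bearer {API_KEY}",  # matches the --api-key flag
        "Content-Type": "application/json",
    },
)

try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print(json.loads(resp.read()))
except OSError as exc:  # connection refused if mcpo is not running
    print(f"request failed: {exc}")
```

Any HTTP client works the same way; the point is that no MCP-specific SDK is involved.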

🤝 To integrate with Open WebUI after launching the server, check our docs.

🌐 Serving Under a Subpath (--root-path)

If you need to serve mcpo behind a reverse proxy or under a subpath (e.g., /api/mcpo), use the --root-path argument:

mcpo --port 8000 --root-path "/api/mcpo" --api-key "top-secret" -- your_mcp_server_command

All routes will be served under the specified root path, e.g. http://localhost:8000/api/mcpo/memory.

🔄 Using a Config File

You can serve multiple MCP tools via a single config file that follows the Claude Desktop format.

Start via:

mcpo --config /path/to/config.json

Or enable hot-reload mode with --hot-reload to watch your config file for changes and reload servers without downtime:

mcpo --config /path/to/config.json --hot-reload

Example config.json:

{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=America/New_York"],
      "disabledTools": ["convert_time"] // Disable specific tools if needed
    },
    "mcp_sse": {
      "type": "sse", // Explicitly define type
      "url": "http://127.0.0.1:8001/sse",
      "headers": {
        "Authorization": "Bearer token",
        "X-Custom-Header": "value"
      }
    },
    "mcp_streamable_http": {
      "type": "streamable-http",
      "url": "http://127.0.0.1:8002/mcp"
    } // Streamable HTTP MCP Server
  }
}

Each tool will be accessible under its own unique route (e.g. http://localhost:8000/memory, http://localhost:8000/time), each with a dedicated OpenAPI schema and proxy handler. Access the full schema UI at http://localhost:8000/<tool>/docs (e.g. /memory/docs, /time/docs).
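Given the config above, the per-tool routes can be derived mechanically. A small sketch (the URL shape follows the /<tool>/docs pattern shown here; it is not mcpo's internal routing code):

```python
import json

# A trimmed copy of the example config.json above.
config = json.loads("""
{
  "mcpServers": {
    "memory": {"command": "npx", "args": ["-y", "@modelcontextprotocol/server-memory"]},
    "time": {"command": "uvx", "args": ["mcp-server-time", "--local-timezone=America/New_York"]}
  }
}
""")

base = "http://localhost:8000"
# Each configured server name becomes a route prefix with its own docs page.
routes = {name: f"{base}/{name}/docs" for name in config["mcpServers"]}
for name, docs_url in routes.items():
    print(f"{name}: {docs_url}")
```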

🔐 OAuth 2.1 Authentication

mcpo supports OAuth 2.1 authentication for MCP servers that require it. The implementation defaults to dynamic client registration, so most servers only need minimal configuration:

{
  "mcpServers": {
    "oauth-protected-server": {
      "type": "streamable-http",
      "url": "http://localhost:8000/mcp",
      "oauth": {
        "server_url": "http://localhost:8000"
      }
    }
  }
}

OAuth Configuration Options

Basic Options:

  • server_url (required): OAuth server base URL
  • storage_type: "file" (persistent) or "memory" (session-only, default: "file")
  • callback_port: Local port for OAuth callback (default: 3030)
  • use_loopback: Auto-open browser for auth (default: true)

Advanced Options (rarely needed): For servers that don't support dynamic client registration, you can specify static client metadata:

{
  "mcpServers": {
    "legacy-oauth-server": {
      "type": "streamable-http", 
      "url": "http://api.example.com/mcp",
      "oauth": {
        "server_url": "http://api.example.com",
        "client_metadata": {
          "client_name": "My MCPO Client",
          "redirect_uris": ["http://localhost:3030/callback"]
        }
      }
    }
  }
}

Note: Avoid setting scope, authorization_endpoint, or token_endpoint in the config. These are automatically discovered from the server's OAuth metadata during the dynamic registration flow.

On first connection, mcpo will:

  1. Perform dynamic client registration (if supported)
  2. Open your browser for authorization
  3. Capture the OAuth callback automatically
  4. Store tokens securely (in ~/.mcpo/tokens/ for file storage)
  5. Use tokens for all subsequent requests
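To verify that file-based token storage worked after the first authorization, you can inspect the storage directory mentioned in step 4. A minimal sketch, assuming the ~/.mcpo/tokens/ path stated above:

```python
from pathlib import Path

# Token storage location for storage_type "file", as documented above.
token_dir = Path.home() / ".mcpo" / "tokens"

if token_dir.is_dir():
    print("cached tokens:", [p.name for p in token_dir.iterdir()])
else:
    print("no cached tokens yet at", token_dir)
```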

OAuth is supported for streamable-http server types. See OAUTH_GUIDE.md for detailed documentation.

🔧 Requirements

  • Python 3.8+
  • uv (optional, but highly recommended for performance + packaging)

🛠️ Development & Testing

To contribute or run tests locally:

  1. Set up the environment:

    # Clone the repository
    git clone https://github.com/open-webui/mcpo.git
    cd mcpo
    
    # Install dependencies (including dev dependencies)
    uv sync --dev
    
  2. Run tests:

    uv run pytest
    
  3. Running Locally with Active Changes:

    To run mcpo with your local modifications from a specific branch (e.g., my-feature-branch):

    # Ensure you are on your development branch
    git checkout my-feature-branch
    
    # Make your code changes in the src/mcpo directory or elsewhere
    
    # Run mcpo using uv, which will use your local, modified code
    # This command starts mcpo on port 8000 and proxies your_mcp_server_command
    uv run mcpo --port 8000 -- your_mcp_server_command
    
    # Example with a test MCP server (like mcp-server-time):
    # uv run mcpo --port 8000 -- uvx mcp-server-time --local-timezone=America/New_York
    

    This allows you to test your changes interactively before committing or creating a pull request. Access your locally running mcpo instance at http://localhost:8000 and the auto-generated docs at http://localhost:8000/docs.

🪪 License

MIT

🤝 Contributing

We welcome and strongly encourage contributions from the community!

Whether you're fixing a bug, adding features, improving documentation, or just sharing ideas—your input is incredibly valuable and helps make mcpo better for everyone.

Getting started is easy:

  • Fork the repo
  • Create a new branch
  • Make your changes
  • Open a pull request

Not sure where to start? Feel free to open an issue or ask a question—we’re happy to help you find a good first task.

✨ Star History

Star History Chart

✨ Let's build the future of interoperable AI tooling together!
