Ultra (Multi-AI Provider)

realmikechong

Unified server providing access to OpenAI O3, Google Gemini 2.5 Pro, and Azure OpenAI models with automatic usage tracking, cost estimation, and nine specialized development tools for code analysis, debugging, and documentation generation.

What it does

  • Query OpenAI O3 and GPT models
  • Access Google Gemini 2.5 Pro
  • Use Azure OpenAI services
  • Track token usage and costs automatically
  • Generate code documentation
  • Analyze and debug code

Best for

  • Developers using Claude or Cursor who need multiple AI models
  • Teams wanting to compare outputs across different LLMs
  • Projects requiring cost tracking for AI usage
  • Zero setup with npx ultra-mcp
  • Built-in web dashboard
  • Local usage analytics

About Ultra (Multi-AI Provider)

Ultra (Multi-AI Provider) is a community-built MCP server published by realmikechong that provides AI assistants with tools and capabilities via the Model Context Protocol. Ultra (Multi-AI Provider) unifies OpenAI, Gemini, and Azure models, tracks usage, estimates costs, and offers nine specialized development tools. It is categorized under AI/ML and developer tools.

How to install

You can install Ultra (Multi-AI Provider) in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

Ultra (Multi-AI Provider) is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

Ultra MCP

All Models. One Interface. Zero Friction.

🚀 Ultra MCP - A Model Context Protocol server that exposes OpenAI, Gemini, Azure OpenAI, and xAI Grok AI models through a single MCP interface for use with Claude Code and Cursor.

Stop wasting time in meetings with humans. Now it's time to ask AI models to do this.

Inspiration

This project is inspired by:

  • Agent2Agent (A2A) by Google - Thank you Google for pioneering agent-to-agent communication protocols
  • Zen MCP - The AI orchestration server that enables Claude to collaborate with multiple AI models

Why Ultra MCP?

While inspired by zen-mcp-server, Ultra MCP offers several key advantages:

🚀 Easier to Use

  • No cloning required - Just run npx ultra-mcp to get started
  • NPM package - Install globally with npm install -g ultra-mcp
  • Interactive setup - Guided configuration with npx ultra-mcp config
  • Zero friction - From zero to AI-powered coding in under a minute

📊 Built-in Usage Analytics

  • Local SQLite database - All usage data stored locally using libSQL
  • Automatic tracking - Every LLM request is tracked with token counts and costs
  • Usage statistics - View your AI usage with npx ultra-mcp db:stats
  • Privacy first - Your data never leaves your machine
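Conceptually, per-request cost tracking is a multiplication of token counts by a pricing table. The sketch below illustrates the idea; the model names and per-million-token rates are assumed values for illustration, not Ultra MCP's actual pricing data:

```typescript
// Illustrative per-1M-token rates (assumed values, not real provider pricing).
const PRICING: Record<string, { input: number; output: number }> = {
  "gpt-5": { input: 1.25, output: 10.0 },
  "gemini-2.5-pro": { input: 1.25, output: 10.0 },
};

// Estimate the USD cost of one LLM request from its token counts.
function estimateCost(model: string, inputTokens: number, outputTokens: number): number {
  const rate = PRICING[model];
  if (!rate) throw new Error(`No pricing entry for model: ${model}`);
  return (inputTokens / 1_000_000) * rate.input + (outputTokens / 1_000_000) * rate.output;
}
```

Numbers like these, recorded per request in the local database, are what the stats commands later aggregate.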

๐ŸŒ Modern Web Dashboard

  • Beautiful UI - React dashboard with Tailwind CSS
  • Real-time stats - View usage trends, costs by provider, and model distribution
  • Easy access - Just run npx ultra-mcp dashboard
  • Configuration UI - Manage API keys and model priorities from the web

🔧 Additional Benefits

  • Simplified tools - Maximum 4 parameters per tool (vs zen's 10-15)
  • Smart defaults - Optimal model selection out of the box
  • TypeScript first - Full type safety and better developer experience
  • Regular updates - Active development with new features weekly

Features

  • 🤖 Multi-Model Support: Integrate OpenAI (GPT-5), Google Gemini (2.5 Pro), Azure OpenAI, and xAI Grok models
  • 🔌 MCP Protocol: Standard Model Context Protocol interface
  • 🎯 Discoverable Prompts: All 25 tools available as prompts in Claude Code (New in v0.7.0)
  • 🧠 Deep Reasoning Tools: Access GPT-5 for complex problem-solving
  • 🔍 Investigation & Research: Built-in tools for thorough investigation and research
  • 🌐 Google Search Integration: Gemini 2.5 Pro with real-time web search
  • ⚡ Real-time Streaming: Live model responses via Vercel AI SDK
  • 🔧 Zero Config: Interactive setup with smart defaults
  • 🔑 Secure Configuration: Local API key storage with conf library
  • 🧪 TypeScript: Full type safety and modern development experience

Quick Start

Installation

# Install globally via npm
npm install -g ultra-mcp

# Or run directly with npx
npx -y ultra-mcp config

Configuration

Set up your API keys interactively:

npx -y ultra-mcp config

This will:

  1. Show current configuration status
  2. Present a provider-first menu to select which AI provider to configure
  3. Guide you through setting API keys, base URLs, and preferred models
  4. Store configuration securely on your system
  5. Auto-load settings when the server starts

New in v0.5.10:

  • 🎯 Provider-first configuration - Select a specific provider to configure
  • 🤖 OpenAI-Compatible support - Configure Ollama (local) or OpenRouter (400+ models)
  • 📋 Model selection - Choose your preferred model from categorized lists

Running the Server

# Run the MCP server
npx -y ultra-mcp

# Or after building locally
bun run build
node dist/cli.js

CLI Commands

Ultra MCP provides several powerful commands:

config - Interactive Configuration

npx -y ultra-mcp config

Configure API keys interactively with a user-friendly menu system.

dashboard - Web Dashboard

npx -y ultra-mcp dashboard

# Custom port
npx -y ultra-mcp dashboard --port 4000

# Development mode
npx -y ultra-mcp dashboard --dev

Launch the web dashboard to view usage statistics, manage configurations, and monitor AI costs.

install - Install for Claude Code

npx -y ultra-mcp install

Automatically install Ultra MCP as an MCP server for Claude Code.

doctor - Health Check

npx -y ultra-mcp doctor

# Test connections to providers
npx -y ultra-mcp doctor --test

Check installation health and test API connections.

chat - Interactive Chat

npx -y ultra-mcp chat

# Specify model and provider
npx -y ultra-mcp chat -m gpt-5 -p openai
npx -y ultra-mcp chat -m grok-4 -p grok

Chat interactively with AI models from the command line.

Database Commands

db:show - Show Database Info

npx -y ultra-mcp db:show

Display database file location and basic statistics.

db:stats - Usage Statistics

npx -y ultra-mcp db:stats

Show detailed usage statistics for the last 30 days including costs by provider.
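A per-provider cost breakdown like this is, conceptually, a simple aggregation over stored usage rows. This sketch assumes a hypothetical row shape; the actual database schema may differ:

```typescript
// Assumed shape of a stored usage record (hypothetical, not the real schema).
type UsageRow = { provider: string; model: string; costUsd: number };

// Sum costs per provider, as a 30-day stats view might do.
function costByProvider(rows: UsageRow[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const row of rows) {
    totals.set(row.provider, (totals.get(row.provider) ?? 0) + row.costUsd);
  }
  return totals;
}
```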

db:view - Database Viewer

npx -y ultra-mcp db:view

Launch Drizzle Studio to explore the usage database interactively.

Integration with Claude Code

Automatic Installation (Recommended)

# Install Ultra MCP for Claude Code
npx -y ultra-mcp install

This command will:

  • Detect Claude Code installation
  • Add Ultra MCP as an MCP server
  • Configure for user or project scope
  • Verify API key configuration

Manual Installation

Add to your Claude Code settings:

{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp@latest"]
    }
  }
}

Integration with Cursor

First configure your API keys:

npx -y ultra-mcp config

Then add to your Cursor MCP settings:

{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp@latest"]
    }
  }
}

Ultra MCP will automatically use the API keys you configured with the config command.

MCP Tools & Prompts

Ultra MCP provides powerful AI tools accessible through Claude Code and Cursor. New in v0.7.0: All tools are now also available as discoverable prompts in Claude Code.

🎯 Prompts Support (New in v0.7.0)

All Ultra MCP tools are now exposed as discoverable prompts in Claude Code, making them even easier to use:

  • 25 discoverable prompts corresponding to all existing tools
  • Parameter guidance built into each prompt template
  • Natural language interface for all AI capabilities
  • Automatic discovery by Claude Code and other MCP clients

How to use prompts:

  1. Type / in Claude Code to see available prompts
  2. Select any Ultra MCP prompt (e.g., "Deep Reasoning", "Code Review", "Debug Issue")
  3. Fill in the parameters through the guided interface
  4. Claude automatically generates the appropriate instruction

This makes Ultra MCP's powerful AI capabilities more accessible than ever!

🧠 Deep Reasoning (deep-reasoning)

Leverage advanced AI models for complex problem-solving and analysis.

  • Default: GPT-5 for OpenAI/Azure, Gemini 2.5 Pro with Google Search, Grok-4 for xAI
  • Use Cases: Complex algorithms, architectural decisions, deep analysis

๐Ÿ” Investigate (investigate)

Thoroughly investigate topics with configurable depth levels.

  • Depth Levels: shallow, medium, deep
  • Google Search: Enabled by default for Gemini
  • Use Cases: Research topics, explore concepts, gather insights

📚 Research (research)

Conduct comprehensive research with multiple output formats.

  • Output Formats: summary, detailed, academic
  • Use Cases: Literature reviews, technology comparisons, documentation

📋 List Models (list-ai-models)

View all available AI models and their configuration status.

Example Usage

// In Claude Code or Cursor with MCP
await use_mcp_tool('ultra-mcp', 'deep-reasoning', {
  provider: 'openai',
  prompt: 'Design a distributed caching system for microservices',
  reasoningEffort: 'high',
});

Development

# Clone the repository
git clone https://github.com/RealMikeChong/ultra-mcp
cd ultra-mcp

# Install dependencies
bun install

# Build TypeScript
bun run build

# Run tests
bun run test

# Development mode with watch
bun run dev

# Test with MCP Inspector
npx @modelcontextprotocol/inspector node dist/cli.js

Architecture

Ultra MCP acts as a bridge between multiple AI model providers and MCP clients:

  1. MCP Protocol Layer: Implements Model Context Protocol for Claude Code/Cursor communication
  2. Model Providers: Integrates OpenAI, Google (Gemini), Azure OpenAI, and xAI Grok via Vercel AI SDK
  3. Unified Interface: Single MCP interface to access multiple AI models
  4. Configuration Management: Secure local storage with schema validation
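The bridge role described above can be sketched as a small provider registry behind one entry point. The interface and class names here are illustrative assumptions, not Ultra MCP's actual internals:

```typescript
// Minimal sketch of a unified multi-provider interface (names are hypothetical).
interface ModelProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Routes each request to the named provider behind a single entry point,
// mirroring how one MCP interface can front several AI backends.
class ProviderRegistry {
  private providers = new Map<string, ModelProvider>();

  register(provider: ModelProvider): void {
    this.providers.set(provider.name, provider);
  }

  async complete(providerName: string, prompt: string): Promise<string> {
    const provider = this.providers.get(providerName);
    if (!provider) throw new Error(`Unknown provider: ${providerName}`);
    return provider.complete(prompt);
  }
}
```

The design keeps MCP protocol handling on one side and provider-specific API calls on the other, so adding a backend means registering one more adapter.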

Key Components

  • src/cli.ts - CLI entry point with commander
  • src/server.ts - MCP server implementation
  • src/config/ - Configuration management with schema validation
  • src/handlers/ - MCP protocol handlers
  • src/providers/ - Model provider implementations
  • src/utils/ - Shared utilities for streaming and error handling

Configuration Storage

Ultra MCP stores configuration in your system's default config directory:

  • macOS: ~/Library/Preferences/ultra-mcp-nodejs/
  • Linux: ~/.config/ultra-mcp/
  • Windows: %APPDATA%\ultra-mcp-nodejs\

Environment Variables

You can also set API keys and base URLs via environment variables:

  • OPENAI_API_KEY

README truncated. View full README on GitHub.
