
PAL MCP Server
Acts as a proxy that lets you query multiple AI models (OpenAI, Gemini, Claude, etc.) from a single MCP session and bridge external AI CLIs together.
Local (stdio)
What it does
- Query multiple AI models in one session
- Connect external AI CLIs like Gemini CLI and Codex
- Spawn isolated CLI subagents with specialized roles
- Switch between OpenAI, Gemini, Grok, Ollama and other providers
- Bridge different AI tools within the same workflow
- Access custom endpoints and on-device models
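As a local stdio server, PAL would typically be registered in an MCP client's configuration file. The snippet below is an illustrative sketch only: the command, package name `pal-mcp-server`, and environment variable names are assumptions, not taken from the project's documentation — check the project's README for the actual install and launch instructions.

```json
{
  "mcpServers": {
    "pal": {
      "command": "npx",
      "args": ["-y", "pal-mcp-server"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "GEMINI_API_KEY": "your-gemini-key"
      }
    }
  }
}
```

With a config like this, the MCP client spawns the server as a local subprocess and communicates over stdin/stdout, which matches the "Local (stdio)" transport noted above.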
Best for
- Developers wanting to compare responses across AI models
- Teams using multiple AI CLIs in complex workflows
- AI-assisted development with specialized role agents
- 11,000+ GitHub stars
- CLI-to-CLI bridging with subagents
- Supports 8+ AI providers