Mindbridge
Provides a unified interface to route requests across multiple LLM providers (OpenAI, Anthropic, Google, DeepSeek, Ollama, etc.) and compare responses between different models.
Local (stdio)
What it does
- Route queries to any supported LLM provider
- Compare responses across multiple models simultaneously
- Switch between different AI models mid-conversation
- Auto-detect and configure available providers
- Access local Ollama models alongside cloud APIs
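The routing and comparison ideas above can be sketched with a small dispatcher: a registry maps provider names to callables, and the router either sends a prompt to one provider or fans it out to several and collects the answers. This is a minimal illustration of the concept, not Mindbridge's actual code; all names here are hypothetical, and the stub lambdas stand in for real provider API clients.

```python
from typing import Callable, Dict, List, Optional

# A provider is anything that takes a prompt and returns a response string.
Provider = Callable[[str], str]

class Router:
    def __init__(self) -> None:
        self._providers: Dict[str, Provider] = {}

    def register(self, name: str, provider: Provider) -> None:
        self._providers[name] = provider

    def route(self, name: str, prompt: str) -> str:
        # Send the prompt to a single named provider.
        return self._providers[name](prompt)

    def compare(self, prompt: str, names: Optional[List[str]] = None) -> Dict[str, str]:
        # Fan the same prompt out to several providers and collect the answers.
        names = names or list(self._providers)
        return {n: self._providers[n](prompt) for n in names}

# Stub providers standing in for real cloud/local API clients.
router = Router()
router.register("openai", lambda p: f"openai: {p}")
router.register("ollama", lambda p: f"ollama: {p}")

print(router.route("ollama", "hello"))
print(router.compare("hello"))
```

Switching models mid-conversation then reduces to calling `route` with a different provider name on the next turn.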
Best for
- Agent builders needing model flexibility
- Developers comparing AI model outputs
- Teams avoiding vendor lock-in
- Applications requiring specialized reasoning models
- 6+ LLM providers supported
- Built-in second-opinion tool
- OpenAI-compatible API layer
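The second-opinion idea can be sketched as a two-step flow: ask a primary model for an answer, then hand that answer to a different model for critique. This is an assumed illustration of the pattern, not Mindbridge's documented tool; the function names are hypothetical, and the stubs stand in for real provider calls.

```python
from typing import Callable, Dict

# A model is anything that maps a prompt to a response string.
Model = Callable[[str], str]

def second_opinion(question: str, primary: Model, reviewer: Model) -> Dict[str, str]:
    # Step 1: get an answer from the primary model.
    answer = primary(question)
    # Step 2: ask a second model to critique that answer.
    critique = reviewer(
        f"Question: {question}\nProposed answer: {answer}\nCritique this answer."
    )
    return {"answer": answer, "critique": critique}

# Stubs in place of real provider clients (e.g. one cloud, one local model).
def primary_stub(prompt: str) -> str:
    return "42"

def reviewer_stub(prompt: str) -> str:
    return f"review of: {prompt.splitlines()[1]}"

print(second_opinion("meaning of life?", primary_stub, reviewer_stub))
```

In practice the two models would come from different providers, which is where routing across vendors pays off.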