
Funnel
A proxy server that sits between AI assistants and multiple MCP servers, intelligently filtering and discovering tools to cut context token usage from hundreds of tools down to just 3-4 core discovery tools.
Funnel is a TypeScript-based proxy that aggregates multiple MCP servers behind a single interface. Instead of exposing hundreds of tools, it exposes only 3-4 core discovery tools, letting AI assistants search for and dynamically enable just the tools they need at runtime, which dramatically reduces context token consumption.
What it does
- Aggregate multiple MCP servers into one interface
- Filter tools dynamically based on assistant needs
- Reduce context token consumption by 95%+
- Enable runtime tool discovery and activation
- Proxy requests between assistants and MCP servers
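The discovery-and-activation pattern above can be sketched in TypeScript. This is an illustrative model only, not Funnel's actual API: the `ToolCatalog` class and its method names are assumptions made for the example. The idea is that the full tool set stays in a searchable catalog, and only explicitly enabled tools ever reach the assistant's context.

```typescript
// Hypothetical sketch of the core idea: rather than exposing every tool
// from every upstream MCP server, expose a small discovery interface that
// searches a catalog and activates tools on demand.
// All names here are illustrative, not Funnel's real API.

interface ToolInfo {
  server: string;      // which upstream MCP server provides the tool
  name: string;        // tool name as exposed by that server
  description: string; // used for keyword search
}

class ToolCatalog {
  private tools: ToolInfo[] = [];
  private active = new Set<string>();

  // Called once per tool when aggregating upstream servers.
  register(tool: ToolInfo): void {
    this.tools.push(tool);
  }

  // Discovery tool: keyword search over the full catalog.
  search(query: string): ToolInfo[] {
    const q = query.toLowerCase();
    return this.tools.filter(
      (t) =>
        t.name.toLowerCase().includes(q) ||
        t.description.toLowerCase().includes(q)
    );
  }

  // Activation tool: enable one specific tool for the session.
  enable(server: string, name: string): void {
    this.active.add(`${server}/${name}`);
  }

  // Only active tools are forwarded into the assistant's context.
  listActive(): string[] {
    return [...this.active];
  }
}
```

In this model the assistant's context only ever contains the handful of discovery tools plus whatever it has enabled, instead of the full aggregated catalog.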
About Funnel
Funnel is a community-built MCP server published by chris-schra that provides AI assistants with tools and capabilities via the Model Context Protocol. It is a TypeScript proxy server that aggregates MCP servers, intelligently filtering tools to optimize context token usage. It is categorized under developer tools.
How to install
You can install Funnel in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
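For manual setup, a stdio MCP server is typically registered in the client's configuration file. The fragment below is a sketch only: the package name `@chris-schra/funnel` and the `npx` invocation are assumptions made for illustration, so check Funnel's README or the install panel for the actual command.

```json
{
  "mcpServers": {
    "funnel": {
      "command": "npx",
      "args": ["-y", "@chris-schra/funnel"]
    }
  }
}
```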
License
Funnel is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
Related Skills
- UI design system toolkit for Senior UI Designer including design token generation, component documentation, responsive design calculations, and developer handoff tools. Use for creating design systems, maintaining visual consistency, and facilitating design-dev collaboration.
- Guide for building TypeScript CLIs with Bun. Use when creating command-line tools, adding subcommands to existing CLIs, or building developer tooling. Covers argument parsing, subcommand patterns, output formatting, and distribution.
- Use when working with the OpenAI API (Responses API) or OpenAI platform features (tools, streaming, Realtime API, auth, models, rate limits, MCP) and you need authoritative, up-to-date documentation (schemas, examples, limits, edge cases). Prefer the OpenAI Developer Documentation MCP server tools when available; otherwise guide the user to enable `openaiDeveloperDocs`.
- Master API documentation with OpenAPI 3.1, AI-powered tools, and modern developer experience practices. Create interactive docs, generate SDKs, and build comprehensive developer portals. Use PROACTIVELY for API documentation or developer portal creation.
- Integrate Vercel AI SDK applications with You.com tools (web search, AI agent, content extraction). Use when developer mentions AI SDK, Vercel AI SDK, generateText, streamText, or You.com integration with AI SDK.
- Use when building MCP servers or clients that connect AI systems with external tools and data sources. Invoke for MCP protocol compliance, TypeScript/Python SDKs, resource providers, tool functions.