
One Search
Provides unified web search across multiple search engines (SearxNG, Tavily, DuckDuckGo, Google, etc.) and scrapes/extracts content from websites using local browser automation, enabling flexible web data retrieval and structured information gathering.
What it does
- Search across multiple search engines simultaneously
- Scrape and extract content from websites
- Perform local browser-based searches
- Extract structured data from multiple URLs
- Crawl websites for comprehensive data gathering
About One Search
One Search is a community-built MCP server published by yokingma that provides AI assistants with tools and capabilities via the Model Context Protocol. One Search lets you scrape any website using advanced web scraping tools for efficient, structured web page extraction. It is categorized under search and web.
How to install
You can install One Search in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
One Search is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
🚀 OneSearch MCP Server: Web Search, Crawl, Scrape & Extract
A Model Context Protocol (MCP) server implementation that integrates with multiple search providers for web search, local browser search, and scraping capabilities with agent-browser.
Features
- Web search, scrape, crawl, and extract content from websites.
- Supports multiple search engines and web scrapers: SearXNG, Tavily, DuckDuckGo, Bing, Google, Zhipu (智谱), Exa, Bocha (博查), etc.
- Local web search (browser search) with support for multiple engines: Bing, Google, Baidu, Sogou, etc.
- Uses `agent-browser` for browser automation: free, no API keys required.
- Enabled tools: `one_search`, `one_scrape`, `one_map`, `one_extract`
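Once the server is running over stdio, an MCP client invokes these tools with standard JSON-RPC `tools/call` requests. Below is a minimal sketch of such a request for `one_search`; the `query` and `limit` argument names are assumptions for illustration, so check the tool's declared schema via the server's `tools/list` response.

```shell
# Sketch of the JSON-RPC message an MCP client sends over stdio to call
# the one_search tool. The "query"/"limit" argument names are assumptions;
# the real input schema is reported by the server's tools/list response.
req='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"one_search","arguments":{"query":"model context protocol","limit":5}}}'
printf '%s\n' "$req"
```

In practice your MCP client builds and sends these messages for you; the sketch is only to show what travels over the stdio transport.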
Migration from v1.1.0 and Earlier
Breaking Changes in v1.1.0:
- Firecrawl Removed: The Firecrawl integration has been removed in favor of `agent-browser`, which provides similar functionality without requiring external API services.
- New Browser Requirement: You must install a Chromium-based browser (see the Prerequisites section).
- Environment Variables: `FIRECRAWL_API_URL` and `FIRECRAWL_API_KEY` are no longer used.
What Changed:
- `one_scrape` and `one_map` now use `agent-browser` instead of Firecrawl
- The `one_extract` tool is now fully implemented for structured data extraction from multiple URLs
- All browser-based operations are now handled locally, providing better privacy and no API costs
Migration Steps:
- Install Chromium browser (see Prerequisites)
- Remove `FIRECRAWL_API_URL` and `FIRECRAWL_API_KEY` from your environment variables
- Update to the latest version:
npm install -g one-search-mcp@latest
Prerequisites
Browser Requirement: This server uses `agent-browser` for web scraping and local search, which requires a Chromium-based browser.
Good News: The server will automatically detect and use browsers already installed on your system:
- ✅ Google Chrome
- ✅ Microsoft Edge
- ✅ Chromium
- ✅ Google Chrome Canary
If you don't have any of these browsers installed, you can:
# Option 1: Install Google Chrome (Recommended)
# Download from: https://www.google.com/chrome/
# Option 2: Install Microsoft Edge
# Download from: https://www.microsoft.com/edge
# Option 3: Install Chromium via agent-browser
npx agent-browser install
# Option 4: Install Chromium directly
# Download from: https://www.chromium.org/getting-involved/download-chromium/
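As a quick sanity check before starting the server, you can look for a Chromium-based binary on your PATH. This is only a rough sketch: the binary names below are common defaults that vary by OS (macOS, for instance, installs Chrome as an app bundle rather than on PATH), and the server's own detection logic may look in other places.

```shell
# Rough check for a Chromium-based browser on PATH. Binary names vary by
# OS, and the server's own detection may also find app-bundle installs.
found=""
for b in google-chrome google-chrome-stable chromium chromium-browser microsoft-edge msedge; do
  if command -v "$b" >/dev/null 2>&1; then
    found="$b"
    break
  fi
done
if [ -n "$found" ]; then
  echo "Found Chromium-based browser: $found"
else
  echo "None found on PATH; install one of the options above or run: npx agent-browser install"
fi
```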
Installation
Using Claude Code CLI (Recommended)
# Add to Claude Code with default settings (local search)
claude mcp add one-search-mcp -- npx -y one-search-mcp
# Add with custom search provider (e.g., SearXNG)
claude mcp add one-search-mcp -e SEARCH_PROVIDER=searxng -e SEARCH_API_URL=http://127.0.0.1:8080 -- npx -y one-search-mcp
# Add with Tavily API
claude mcp add one-search-mcp -e SEARCH_PROVIDER=tavily -e SEARCH_API_KEY=your_api_key -- npx -y one-search-mcp
Manual Installation
# Install globally (Optional)
npm install -g one-search-mcp
# Or run directly with npx
npx -y one-search-mcp
Using Docker
The Docker image includes all dependencies (Chromium browser) pre-installed; no additional setup is required.
Pull the image:
# From GitHub Container Registry
docker pull ghcr.io/yokingma/one-search-mcp:latest
# Or from Docker Hub
docker pull zacma/one-search-mcp:latest
Configure with Claude Desktop:
{
"mcpServers": {
"one-search-mcp": {
"command": "docker",
"args": ["run", "-i", "--rm", "ghcr.io/yokingma/one-search-mcp:latest"],
"env": {
"SEARCH_PROVIDER": "local"
}
}
}
}
With custom search provider:
{
"mcpServers": {
"one-search-mcp": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-e", "SEARCH_PROVIDER=tavily",
"-e", "SEARCH_API_KEY=your_api_key",
"ghcr.io/yokingma/one-search-mcp:latest"
]
}
}
}
Environment Variables
Search Engine:
- `SEARCH_PROVIDER` (Optional): The search provider to use. Supports `searxng`, `duckduckgo`, `bing`, `tavily`, `google`, `zhipu`, `exa`, `bocha`, `local`; the default is `local`.
- `SEARCH_API_URL` (Optional): The URL of the SearXNG API, or the Google Custom Search Engine ID for `google`.
- `SEARCH_API_KEY` (Optional): The API key for the search provider; required for `tavily`, `bing`, `google`, `zhipu`, `exa`, `bocha`.
// supported search providers
export type SearchProvider = 'searxng' | 'duckduckgo' | 'bing' | 'tavily' | 'google' | 'zhipu' | 'exa' | 'bocha' | 'local';
Search Provider Configuration
| Provider | API Key Required | API URL Required | Notes |
|---|---|---|---|
| `local` | No | No | Free, uses browser automation |
| `duckduckgo` | No | No | Free, no API key needed |
| `searxng` | Optional | Yes | Self-hosted meta search engine |
| `bing` | Yes | No | Bing Search API |
| `tavily` | Yes | No | Tavily API |
| `google` | Yes | Yes (Search Engine ID) | Google Custom Search |
| `zhipu` | Yes | No | Zhipu (智谱) AI |
| `exa` | Yes | No | Exa AI |
| `bocha` | Yes | No | Bocha (博查) AI |
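Putting the table above into practice, the sketch below shows the minimal environment for two common setups. The API key value is a placeholder, not a working credential.

```shell
# Hosted provider: an API key is required (placeholder value shown)
export SEARCH_PROVIDER=tavily
export SEARCH_API_KEY=your_api_key

# Self-hosted provider: an API URL is required instead of a key
# export SEARCH_PROVIDER=searxng
# export SEARCH_API_URL=http://127.0.0.1:8080

echo "Using provider: $SEARCH_PROVIDER"
```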
Configuration for Other MCP Clients
Claude Desktop
Add to your Claude Desktop configuration file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"one-search-mcp": {
"command": "npx",
"args": ["-y", "one-search-mcp"],
"env": {
"SEARCH_PROVIDER": "local"
}
}
}
}
Cursor
Add to your mcp.json file:
{
"mcpServers": {
"one-search-mcp": {
"command": "npx",
"args": ["-y", "one-search-mcp"],
"env": {
"SEARCH_PROVIDER": "local"
}
}
}
}
Windsurf
Add to your ./codeium/windsurf/model_config.json file:
{
"mcpServers": {
"one-search-mcp": {
"command": "npx",
"args": ["-y", "one-search-mcp"],
"env": {
"SEARCH_PROVIDER": "local"
}
}
}
}
Self-hosting SearXNG (Optional)
If you want to use SearXNG as your search provider, you can deploy it locally using Docker:
Prerequisites:
- Docker installed and running (version 20.10.0 or higher)
- At least 4GB of RAM available
Quick Start:
# Clone SearXNG Docker repository
git clone https://github.com/searxng/searxng-docker.git
cd searxng-docker
# Start SearXNG
docker compose up -d
After deployment, SearXNG will be available at http://127.0.0.1:8080 by default.
Configure OneSearch to use SearXNG:
# Set environment variables
export SEARCH_PROVIDER=searxng
export SEARCH_API_URL=http://127.0.0.1:8080
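Before pointing OneSearch at the instance, it can help to confirm SearXNG is actually up. The probe below assumes the default URL from above and that `curl` is installed; whether the `/healthz` endpoint exists depends on your SearXNG version, so it falls back to fetching the root page.

```shell
# Probe the SearXNG instance; falls back to the root page if /healthz
# is unavailable (endpoint availability depends on the SearXNG version).
SEARCH_API_URL="${SEARCH_API_URL:-http://127.0.0.1:8080}"
if curl -fsS --max-time 5 "$SEARCH_API_URL/healthz" >/dev/null 2>&1 \
   || curl -fsS --max-time 5 "$SEARCH_API_URL" >/dev/null 2>&1; then
  echo "SearXNG reachable at $SEARCH_API_URL"
else
  echo "SearXNG not reachable at $SEARCH_API_URL"
fi
```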
For more details, see the official SearXNG Docker documentation.
Troubleshooting
Browser not found error
If you see an error like "Browser not found", the server couldn't detect any installed Chromium-based browser. Please install one of the following:
- Google Chrome: https://www.google.com/chrome/
- Microsoft Edge: https://www.microsoft.com/edge
- Chromium: https://www.chromium.org/getting-involved/download-chromium/
Or install via agent-browser:
npx agent-browser install
License
MIT License - see LICENSE file for details.