One Search

yokingma

Provides unified web search across multiple search engines (SearxNG, Tavily, DuckDuckGo, Google, etc.) and scrapes/extracts content from websites using local browser automation.

What it does

  • Search across multiple search engines simultaneously
  • Scrape and extract content from websites
  • Perform local browser-based searches
  • Extract structured data from multiple URLs
  • Crawl websites for comprehensive data gathering

Best for

  • Researchers needing comprehensive web search results
  • Data analysts extracting structured information
  • Content creators gathering information from multiple sources
  • Developers building search-powered applications

  • No API keys required for local search
  • 20+ search engines supported
  • Local browser automation for privacy

About One Search

One Search is a community-built MCP server published by yokingma that provides AI assistants with tools and capabilities via the Model Context Protocol. One Search lets you search the web and scrape websites using local browser-based tools for efficient page scraping and structured data extraction. It is categorized under search and web.

How to install

You can install One Search in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

One Search is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

🚀 OneSearch MCP Server: Web Search & Crawl & Scraper & Extract

A Model Context Protocol (MCP) server implementation that integrates multiple search providers for web search, plus local browser search and scraping via agent-browser.

Features

  • Search the web, then scrape, crawl, and extract content from websites.
  • Supports multiple search engines and web scrapers: SearXNG, Tavily, DuckDuckGo, Bing, Google, Zhipu (智谱), Exa, Bocha (博查), etc.
  • Local web search (browser search), supporting multiple engines: Bing, Google, Baidu, Sogou, etc.
    • Uses agent-browser for browser automation.
    • Free, no API keys required.
  • Enabled tools: one_search, one_scrape, one_map, one_extract
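As an illustration, an MCP client might invoke the search tool with a payload shaped like the following. The argument names `query` and `limit` are assumptions for illustration, not the server's documented schema:

```json
{
  "name": "one_search",
  "arguments": {
    "query": "model context protocol",
    "limit": 5
  }
}
```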

Migrating to v1.1.0 from Earlier Versions

Breaking Changes in v1.1.0:

  • Firecrawl Removed: The Firecrawl integration has been removed in favor of agent-browser, which provides similar functionality without requiring external API services.
  • New Browser Requirement: You must install Chromium browser (see Prerequisites section).
  • Environment Variables: FIRECRAWL_API_URL and FIRECRAWL_API_KEY are no longer used.

What Changed:

  • one_scrape and one_map now use agent-browser instead of Firecrawl
  • one_extract tool is now fully implemented for structured data extraction from multiple URLs
  • All browser-based operations are now handled locally, providing better privacy and no API costs

Migration Steps:

  1. Install Chromium browser (see Prerequisites)
  2. Remove FIRECRAWL_API_URL and FIRECRAWL_API_KEY from your environment variables
  3. Update to the latest version: npm install -g one-search-mcp@latest
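Step 2 can be double-checked in a POSIX shell. A minimal sketch, assuming the variables were exported in your current session (remember to also remove them from your shell profile or MCP client config):

```shell
# Drop the obsolete Firecrawl settings from the current session
unset FIRECRAWL_API_URL FIRECRAWL_API_KEY

# Confirm both are gone; prints "clean" when neither is set
if [ -z "$FIRECRAWL_API_URL" ] && [ -z "$FIRECRAWL_API_KEY" ]; then
  echo "clean"
fi
```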

Prerequisites

Browser Requirement: This server uses agent-browser for web scraping and local search, which requires a Chromium-based browser.

Good News: The server will automatically detect and use browsers already installed on your system:

  • ✅ Google Chrome
  • ✅ Microsoft Edge
  • ✅ Chromium
  • ✅ Google Chrome Canary

If you don't have any of these browsers installed, you can:

# Option 1: Install Google Chrome (Recommended)
# Download from: https://www.google.com/chrome/

# Option 2: Install Microsoft Edge
# Download from: https://www.microsoft.com/edge

# Option 3: Install Chromium via agent-browser
npx agent-browser install

# Option 4: Install Chromium directly
# Download from: https://www.chromium.org/getting-involved/download-chromium/
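The server's auto-detection can be approximated from the command line. A minimal sketch, assuming Linux-style binary names on PATH (the server's own detection also covers the standard macOS and Windows application directories):

```shell
# find_browser: print the first Chromium-based browser found on PATH, or "none".
find_browser() {
  for b in google-chrome chromium chromium-browser microsoft-edge; do
    if command -v "$b" >/dev/null 2>&1; then
      echo "$b"
      return 0
    fi
  done
  echo "none"
}

find_browser
```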

Installation

Using Claude Code CLI (Recommended)

# Add to Claude Code with default settings (local search)
claude mcp add one-search-mcp -- npx -y one-search-mcp

# Add with custom search provider (e.g., SearXNG)
claude mcp add one-search-mcp -e SEARCH_PROVIDER=searxng -e SEARCH_API_URL=http://127.0.0.1:8080 -- npx -y one-search-mcp

# Add with Tavily API
claude mcp add one-search-mcp -e SEARCH_PROVIDER=tavily -e SEARCH_API_KEY=your_api_key -- npx -y one-search-mcp

Manual Installation

# Install globally (Optional)
npm install -g one-search-mcp

# Or run directly with npx
npx -y one-search-mcp

Using Docker

The Docker image ships with all dependencies (including the Chromium browser) pre-installed, so no additional setup is required.

Pull the image:

# From GitHub Container Registry
docker pull ghcr.io/yokingma/one-search-mcp:latest

# Or from Docker Hub
docker pull zacma/one-search-mcp:latest

Configure with Claude Desktop:

{
  "mcpServers": {
    "one-search-mcp": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "ghcr.io/yokingma/one-search-mcp:latest"],
      "env": {
        "SEARCH_PROVIDER": "local"
      }
    }
  }
}

With custom search provider:

{
  "mcpServers": {
    "one-search-mcp": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "SEARCH_PROVIDER=tavily",
        "-e", "SEARCH_API_KEY=your_api_key",
        "ghcr.io/yokingma/one-search-mcp:latest"
      ]
    }
  }
}

Environment Variables

Search Engine:

  • SEARCH_PROVIDER (Optional): The search provider to use. Supported values: searxng, duckduckgo, bing, tavily, google, zhipu, exa, bocha, local. Defaults to local.
  • SEARCH_API_URL (Optional): The URL of the SearXNG API, or the Google Custom Search Engine ID when using google.
  • SEARCH_API_KEY (Optional): The API key for the search provider; required for tavily, bing, google, zhipu, exa, and bocha.
// supported search providers
export type SearchProvider = 'searxng' | 'duckduckgo' | 'bing' | 'tavily' | 'google' | 'zhipu' | 'exa' | 'bocha' | 'local';
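
Putting these variables together, a Claude Desktop entry that points the server at a self-hosted SearXNG instance might look like this (the URL assumes SearXNG's default local deployment):

```json
{
  "mcpServers": {
    "one-search-mcp": {
      "command": "npx",
      "args": ["-y", "one-search-mcp"],
      "env": {
        "SEARCH_PROVIDER": "searxng",
        "SEARCH_API_URL": "http://127.0.0.1:8080"
      }
    }
  }
}
```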

Search Provider Configuration

| Provider   | API Key Required | API URL Required       | Notes                          |
|------------|------------------|------------------------|--------------------------------|
| local      | No               | No                     | Free, uses browser automation  |
| duckduckgo | No               | No                     | Free, no API key needed        |
| searxng    | Optional         | Yes                    | Self-hosted meta search engine |
| bing       | Yes              | No                     | Bing Search API                |
| tavily     | Yes              | No                     | Tavily API                     |
| google     | Yes              | Yes (Search Engine ID) | Google Custom Search           |
| zhipu      | Yes              | No                     | Zhipu (智谱) AI                |
| exa        | Yes              | No                     | Exa AI                         |
| bocha      | Yes              | No                     | Bocha (博查) AI                |

Configuration for Other MCP Clients

Claude Desktop

Add to your Claude Desktop configuration file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "one-search-mcp": {
      "command": "npx",
      "args": ["-y", "one-search-mcp"],
      "env": {
        "SEARCH_PROVIDER": "local"
      }
    }
  }
}

Cursor

Add to your mcp.json file:

{
  "mcpServers": {
    "one-search-mcp": {
      "command": "npx",
      "args": ["-y", "one-search-mcp"],
      "env": {
        "SEARCH_PROVIDER": "local"
      }
    }
  }
}

Windsurf

Add to your ./codeium/windsurf/model_config.json file:

{
  "mcpServers": {
    "one-search-mcp": {
      "command": "npx",
      "args": ["-y", "one-search-mcp"],
      "env": {
        "SEARCH_PROVIDER": "local"
      }
    }
  }
}

Self-hosting SearXNG (Optional)

If you want to use SearXNG as your search provider, you can deploy it locally using Docker:

Prerequisites:

  • Docker installed and running (version 20.10.0 or higher)
  • At least 4GB of RAM available

Quick Start:

# Clone SearXNG Docker repository
git clone https://github.com/searxng/searxng-docker.git
cd searxng-docker

# Start SearXNG
docker compose up -d

After deployment, SearXNG will be available at http://127.0.0.1:8080 by default.

Configure OneSearch to use SearXNG:

# Set environment variables
export SEARCH_PROVIDER=searxng
export SEARCH_API_URL=http://127.0.0.1:8080

For more details, see the official SearXNG Docker documentation.

Troubleshooting

Browser not found error

If you see an error like "Browser not found", the server couldn't detect any installed Chromium-based browser. Install one of the browsers listed under Prerequisites (Google Chrome, Microsoft Edge, Chromium, or Chrome Canary), or install Chromium via agent-browser:

npx agent-browser install

License

MIT License - see LICENSE file for details.
