Provides privacy-focused web searches through the 4get meta search engine, which aggregates results from multiple sources without tracking users.

Integrates with the 4get meta search engine to provide privacy-focused web, image, and news searches. The server aggregates results from multiple sources while preserving user privacy, with built-in rate-limiting resilience and response caching.


What it does

  • Search the web across multiple search engines
  • Find images from various sources
  • Search news articles
  • Access cached and optimized results

Best for

  • Privacy-conscious users avoiding tracking
  • Research requiring diverse search sources
  • Content creators needing image searches

  • Privacy-focused with no user tracking
  • Built-in rate limiting and caching
  • Aggregates multiple search engines

About 4get

4get is a community-built MCP server published by yshalsager that provides AI assistants with tools and capabilities via the Model Context Protocol. The underlying 4get service is a privacy-focused meta search engine that aggregates web, image, and news results while protecting your data. The server is categorized under Search and Web.

How to install

You can install 4get in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

4get is released under the GPL-3.0 license.

4get MCP Server

An MCP server that provides seamless access to the 4get meta search engine API for LLM clients via FastMCP.


✨ Features

  • 🔍 Multi Search Functions: Web, image, and news search with comprehensive result formatting
  • ⚡ Smart Caching: TTL-based response caching with configurable size limits
  • 🔄 Retry Logic: Exponential backoff for rate-limited and network errors
  • 🏗️ Production Ready: Connection pooling, comprehensive error handling, and validation
  • 📊 Rich Responses: Featured answers, related searches, pagination support, and more
  • 🧪 Well Tested: Extensive test suite including integration tests with real API, unit tests, and more
  • ⚙️ Highly Configurable: 11+ environment variables for fine-tuning
  • 🎯 Engine Shorthands: Pick a 4get scraper via the engine parameter without memorizing query strings

📋 Requirements

  • Python 3.13+
  • uv for dependency management

Quick Start

# Install dependencies
uv sync

# Run the server
uv run -m mcp_4get

# Or use mise
mise run

⚙️ Configuration

The server is highly configurable via environment variables. All settings have sensible defaults for the public https://4get.ca instance.

Core Settings

| Variable | Description | Default |
|---|---|---|
| FOURGET_BASE_URL | Base URL for the 4get instance | https://4get.ca |
| FOURGET_PASS | Optional pass token for rate-limited instances | unset |
| FOURGET_USER_AGENT | Override User-Agent header | mcp-4get/<version> |
| FOURGET_TIMEOUT | Request timeout in seconds | 20.0 |

Caching & Performance

| Variable | Description | Default |
|---|---|---|
| FOURGET_CACHE_TTL | Cache lifetime in seconds | 600.0 |
| FOURGET_CACHE_MAXSIZE | Maximum cached responses | 128 |
| FOURGET_CONNECTION_POOL_MAXSIZE | Max concurrent connections | 10 |
| FOURGET_CONNECTION_POOL_MAX_KEEPALIVE | Max persistent connections | 5 |

Retry & Resilience

| Variable | Description | Default |
|---|---|---|
| FOURGET_MAX_RETRIES | Maximum retry attempts | 3 |
| FOURGET_RETRY_BASE_DELAY | Base retry delay in seconds | 1.0 |
| FOURGET_RETRY_MAX_DELAY | Maximum retry delay in seconds | 60.0 |
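The three retry settings combine in the usual exponential-backoff pattern. A minimal sketch of how the per-attempt delay might be derived from them (the function name and jitter strategy are assumptions for illustration, not the package's actual implementation):

```python
import random

def retry_delay(attempt: int, base: float = 1.0, max_delay: float = 60.0,
                jitter: bool = False) -> float:
    """Exponential backoff: base * 2^attempt, capped at max_delay.

    With jitter enabled, a random fraction of the delay is used instead,
    which spreads retries from concurrent clients apart.
    """
    delay = min(base * (2 ** attempt), max_delay)
    return random.uniform(0, delay) if jitter else delay

# Attempts 0..3 with the defaults: 1.0, 2.0, 4.0, 8.0 seconds
print([retry_delay(n) for n in range(4)])
```

With FOURGET_MAX_RETRIES=3 and the default delays, a rate-limited request is retried at roughly 1, 2, and 4 seconds before failing.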

🚀 Running the Server

Local Development

uv run -m mcp_4get

Production Deployment

# With custom configuration
export FOURGET_BASE_URL="https://my-4get-instance.com"
export FOURGET_PASS="my-secret-token"
export FOURGET_CACHE_TTL="300"
export FOURGET_MAX_RETRIES="5"

uv run -m mcp_4get

MCP Server Integration

You can integrate the 4get MCP server with popular IDEs and AI assistants. Here are configuration examples:

Cursor IDE

Add this to your Cursor MCP configuration (~/.cursor/mcp.json):

{
  "mcpServers": {
    "4get": {
      "command": "uvx",
      "args": [
        "mcp_4get@latest"
      ],
      "env": {
        "FOURGET_BASE_URL": "https://4get.ca"
      }
    }
  }
}

OpenAI Codex

Add this to your Codex MCP configuration (~/.codex/config.toml):

[mcp_servers.4get]
command = "uvx"
args = ["mcp_4get@latest"]
env = { FOURGET_BASE_URL = "https://4get.ca" }

Note: if you run the server from a local checkout instead of via uvx, replace the command and args with the path to your project directory.

🔧 MCP Tools

The server exposes three powerful search tools with comprehensive response formatting:

fourget_web_search

fourget_web_search(
    query: str,
    page_token: str = None,        # Use 'npt' from previous response
    extended_search: bool = False, # Enable extended search mode
    engine: str = None,             # Pick a scraper from the supported engine list
    extra_params: dict = None      # Language, region, etc.
)

Response includes: web[], answer[], spelling, related[], npt

fourget_image_search

fourget_image_search(
    query: str,
    page_token: str = None,   # Use 'npt' from previous response
    engine: str = None,       # Pick a scraper from the supported engine list
    extra_params: dict = None # Size, color, type filters
)

Response includes: image[], npt

fourget_news_search

fourget_news_search(
    query: str,
    page_token: str = None,   # Use 'npt' from previous response
    engine: str = None,       # Pick a scraper from the supported engine list
    extra_params: dict = None # Date range, source filters
)

Response includes: news[], npt

Engine shorthands

All MCP tools accept an optional engine argument that maps directly to the 4get scraper query parameter. This shorthand overrides any scraper value you may include in extra_params.

| Value | Engine |
|---|---|
| ddg | DuckDuckGo |
| brave | Brave |
| mullvad_brave | Mullvad (Brave) |
| yandex | Yandex |
| google | Google |
| google_cse | Google CSE |
| mullvad_google | Mullvad (Google) |
| startpage | Startpage |
| qwant | Qwant |
| ghostery | Ghostery |
| yep | Yep |
| greppr | Greppr |
| crowdview | Crowdview |
| mwmbl | Mwmbl |
| mojeek | Mojeek |
| baidu | Baidu |
| coccoc | Coc Coc |
| solofield | Solofield |
| marginalia | Marginalia |
| wiby | wiby |
| curlie | Curlie |

If you need to pass additional 4get query parameters (such as country or language), continue to supply them through extra_params.

📄 Pagination

All tools support pagination via the npt (next page token):

# Get first page
result = await client.web_search("python programming")

# Get next page if available
if result.get('npt'):
    next_page = await client.web_search("ignored", page_token=result['npt'])

🐍 Using the Async Client Directly

You can reuse the bundled async client outside MCP for direct API access:

import asyncio
from mcp_4get.client import FourGetClient
from mcp_4get.config import Config

async def main() -> None:
    client = FourGetClient(Config.from_env())
    data = await client.web_search(
        "model context protocol",
        options={"scraper": "mullvad_brave"},
    )
    for result in data.get("web", []):
        print(result["title"], "->", result["url"])

asyncio.run(main())

This allows you to integrate 4get search capabilities directly into your Python applications without going through the MCP protocol.

🛡️ Error Handling & Resilience

Automatic Retry Logic

  • Rate Limiting (429): Exponential backoff with jitter
  • Network Errors: Connection failures and timeouts
  • Non-retryable: HTTP 404/500 errors fail immediately

Error Types

  • FourGetAuthError: Rate limited or invalid authentication
  • FourGetAPIError: API returned non-success status
  • FourGetTransportError: Network or HTTP protocol errors
  • FourGetError: Generic client errors
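In calling code, the hierarchy above suggests catching the specific classes before the generic FourGetError. The sketch below defines minimal stand-ins mirroring the documented names so it runs without the package installed; in real code you would import them from the package instead (the exact import path is an assumption):

```python
# Stand-ins mirroring the documented hierarchy; assumes all
# specific errors subclass FourGetError.
class FourGetError(Exception): ...
class FourGetAuthError(FourGetError): ...
class FourGetAPIError(FourGetError): ...
class FourGetTransportError(FourGetError): ...

def classify(exc: FourGetError) -> str:
    """Map an error to a coarse handling strategy."""
    if isinstance(exc, FourGetAuthError):
        return "back off or supply FOURGET_PASS"
    if isinstance(exc, FourGetTransportError):
        return "retry: network-level failure"
    if isinstance(exc, FourGetAPIError):
        return "inspect API status, do not retry blindly"
    return "generic client error"

print(classify(FourGetAuthError("rate limited")))
```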

Configuration Validation

All settings are validated on startup with clear error messages for misconfigurations.
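Startup validation of this kind typically coerces each environment variable and range-checks it with a readable message. A sketch of what that might look like for the numeric settings (variable names from the tables above; the package's actual checks may differ):

```python
import os

def load_float(name: str, default: float, minimum: float = 0.0) -> float:
    """Read a float env var, falling back to a default and rejecting
    malformed or out-of-range values with a clear message."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    try:
        value = float(raw)
    except ValueError:
        raise ValueError(f"{name} must be a number, got {raw!r}")
    if value <= minimum:
        raise ValueError(f"{name} must be > {minimum}, got {value}")
    return value

os.environ["FOURGET_TIMEOUT"] = "20.0"
print(load_float("FOURGET_TIMEOUT", 20.0))  # 20.0
```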

📊 Response Format

Based on the real 4get API, responses include rich metadata:

{
  "status": "ok",
  "web": [
    {
      "title": "Example Result",
      "description": "Result description...",
      "url": "https://example.com",
      "date": 1640995200,
      "type": "web"
    }
  ],
  "answer": [
    {
      "title": "Featured Answer",
      "description": [{"type": "text", "value": "Answer content..."}],
      "url": "https://source.com",
      "table": {"Key": "Value"}
    }
  ],
  "spelling": {
    "type": "no_correction",
    "correction": null
  },
  "related": ["related search", "terms"],
  "npt": "pagination_token_here"
}
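Pulling the useful fields out of such a response is straightforward. A sketch using a trimmed copy of the sample payload above (field names come from that sample; real responses may carry additional keys):

```python
sample = {
    "status": "ok",
    "web": [{"title": "Example Result", "url": "https://example.com",
             "description": "Result description...", "date": 1640995200,
             "type": "web"}],
    "answer": [{"title": "Featured Answer",
                "description": [{"type": "text", "value": "Answer content..."}]}],
    "related": ["related search", "terms"],
    "npt": "pagination_token_here",
}

def summarize(resp: dict) -> dict:
    """Flatten a 4get response into titles, answer text, and related terms."""
    answer_text = " ".join(
        part["value"]
        for ans in resp.get("answer", [])
        for part in ans.get("description", [])
        if part.get("type") == "text"
    )
    return {
        "titles": [r["title"] for r in resp.get("web", [])],
        "answer": answer_text,
        "related": resp.get("related", []),
        "next_page": resp.get("npt"),
    }

print(summarize(sample))
```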

Development

This project uses several tools to streamline the development process:

mise

mise is used for managing project-level dependencies and environment variables. mise helps ensure consistent development environments across different machines.

To get started with mise:

  1. Install mise by following the instructions on the official website.
  2. Run mise install in the project root to set up the development environment.

Environment Variables


README truncated. View full README on GitHub.

19