
4get
Provides privacy-focused web, image, and news searches through the 4get meta search engine, a search aggregator that pulls results from multiple sources without tracking users, with built-in rate-limiting resilience and response caching.
What it does
- Search the web across multiple search engines
- Find images from various sources
- Search news articles
- Access cached and optimized results
About 4get
4get is a community-built MCP server published by yshalsager that provides AI assistants with tools and capabilities via the Model Context Protocol. 4get is a privacy-focused search engine that aggregates web, image, and news results while protecting your data. It is categorized under Search and Web.
How to install
You can install 4get in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
4get is released under the GPL-3.0 license.
4get MCP Server
An MCP server that provides seamless access to the 4get meta search engine API for LLM clients via FastMCP.
✨ Features
- 🔍 Multi-Search Functions: Web, image, and news search with comprehensive result formatting
- ⚡ Smart Caching: TTL-based response caching with configurable size limits
- 🔄 Retry Logic: Exponential backoff for rate-limited and network errors
- 🏗️ Production Ready: Connection pooling, comprehensive error handling, and validation
- 📊 Rich Responses: Featured answers, related searches, pagination support, and more
- 🧪 Well Tested: Extensive test suite including integration tests with real API, unit tests, and more
- ⚙️ Highly Configurable: 11+ environment variables for fine-tuning
- 🎯 Engine Shorthands: Pick a 4get scraper via the `engine` parameter without memorizing query strings
📋 Requirements
- Python 3.13+
- uv for dependency management
Quick Start
```shell
# Install dependencies
uv sync

# Run the server
uv run -m mcp_4get

# Or use mise
mise run
```
⚙️ Configuration
The server is highly configurable via environment variables. All settings have sensible defaults for the public https://4get.ca instance.
Core Settings
| Variable | Description | Default |
|---|---|---|
| `FOURGET_BASE_URL` | Base URL for the 4get instance | `https://4get.ca` |
| `FOURGET_PASS` | Optional pass token for rate-limited instances | unset |
| `FOURGET_USER_AGENT` | Override User-Agent header | `mcp-4get/<version>` |
| `FOURGET_TIMEOUT` | Request timeout in seconds | `20.0` |
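For illustration, env-driven configuration with documented defaults could be parsed along these lines (a sketch only — the package's actual `Config.from_env` may be structured differently):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class FourGetSettings:
    """Hypothetical mirror of a few of the server's settings."""
    base_url: str
    timeout: float
    max_retries: int

    @classmethod
    def from_env(cls) -> "FourGetSettings":
        # Fall back to the documented defaults when a variable is unset.
        return cls(
            base_url=os.environ.get("FOURGET_BASE_URL", "https://4get.ca"),
            timeout=float(os.environ.get("FOURGET_TIMEOUT", "20.0")),
            max_retries=int(os.environ.get("FOURGET_MAX_RETRIES", "3")),
        )
```

Numeric variables are read as strings and converted, which is why startup validation (see below) matters: a typo like `FOURGET_TIMEOUT=twenty` should fail fast rather than at request time.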
Caching & Performance
| Variable | Description | Default |
|---|---|---|
| `FOURGET_CACHE_TTL` | Cache lifetime in seconds | `600.0` |
| `FOURGET_CACHE_MAXSIZE` | Maximum cached responses | `128` |
| `FOURGET_CONNECTION_POOL_MAXSIZE` | Max concurrent connections | `10` |
| `FOURGET_CONNECTION_POOL_MAX_KEEPALIVE` | Max persistent connections | `5` |
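A TTL-plus-size-bounded response cache like the one these settings describe can be sketched as follows (illustrative only, not the server's actual implementation):

```python
import time
from collections import OrderedDict


class TTLCache:
    """Sketch of a cache bounded by both entry lifetime and entry count."""

    def __init__(self, ttl: float = 600.0, maxsize: int = 128) -> None:
        self.ttl = ttl
        self.maxsize = maxsize
        self._store: "OrderedDict[str, tuple[float, object]]" = OrderedDict()

    def get(self, key: str):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # entry outlived FOURGET_CACHE_TTL
            return None
        return value

    def set(self, key: str, value: object) -> None:
        self._store[key] = (time.monotonic() + self.ttl, value)
        self._store.move_to_end(key)
        while len(self._store) > self.maxsize:
            self._store.popitem(last=False)  # evict the oldest entry
```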
Retry & Resilience
| Variable | Description | Default |
|---|---|---|
| `FOURGET_MAX_RETRIES` | Maximum retry attempts | `3` |
| `FOURGET_RETRY_BASE_DELAY` | Base retry delay in seconds | `1.0` |
| `FOURGET_RETRY_MAX_DELAY` | Maximum retry delay in seconds | `60.0` |
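The exact backoff formula is not documented here, but a typical capped exponential backoff with jitter using these three settings would look like:

```python
import random


def retry_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Illustrative capped exponential backoff with full jitter.

    `base` and `cap` correspond to FOURGET_RETRY_BASE_DELAY and
    FOURGET_RETRY_MAX_DELAY; the server's real formula may differ.
    """
    exponential = min(cap, base * (2 ** attempt))
    return random.uniform(0.0, exponential)
```

Jitter spreads out retries from concurrent clients so a rate-limited instance is not hit by a synchronized thundering herd.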
🚀 Running the Server
Local Development
```shell
uv run -m mcp_4get
```
Production Deployment
```shell
# With custom configuration
export FOURGET_BASE_URL="https://my-4get-instance.com"
export FOURGET_PASS="my-secret-token"
export FOURGET_CACHE_TTL="300"
export FOURGET_MAX_RETRIES="5"
uv run -m mcp_4get
```
MCP Server Integration
You can integrate the 4get MCP server with popular IDEs and AI assistants. Here are configuration examples:
Cursor IDE
Add this to your Cursor MCP configuration (~/.cursor/mcp.json):
```json
{
  "mcpServers": {
    "4get": {
      "command": "uvx",
      "args": ["mcp_4get@latest"],
      "env": {
        "FOURGET_BASE_URL": "https://4get.ca"
      }
    }
  }
}
```
OpenAI Codex
Add this to your Codex MCP configuration (~/.codex/config.toml):
```toml
[mcp_servers.4get]
command = "uvx"
args = ["mcp_4get@latest"]
env = { FOURGET_BASE_URL = "https://4get.ca" }
```
🔧 MCP Tools
The server exposes three powerful search tools with comprehensive response formatting:
fourget_web_search
```python
fourget_web_search(
    query: str,
    page_token: str = None,        # Use 'npt' from previous response
    extended_search: bool = False, # Enable extended search mode
    engine: str = None,            # Pick a scraper from the supported engine list
    extra_params: dict = None,     # Language, region, etc.
)
```

Response includes: `web[]`, `answer[]`, `spelling`, `related[]`, `npt`
fourget_image_search
```python
fourget_image_search(
    query: str,
    page_token: str = None,    # Use 'npt' from previous response
    engine: str = None,        # Pick a scraper from the supported engine list
    extra_params: dict = None, # Size, color, type filters
)
```

Response includes: `image[]`, `npt`
fourget_news_search
```python
fourget_news_search(
    query: str,
    page_token: str = None,    # Use 'npt' from previous response
    engine: str = None,        # Pick a scraper from the supported engine list
    extra_params: dict = None, # Date range, source filters
)
```

Response includes: `news[]`, `npt`
Engine shorthands
All MCP tools accept an optional engine argument that maps directly to the 4get scraper query parameter. This shorthand overrides any scraper value you may include in extra_params.
| Value | Engine |
|---|---|
| `ddg` | DuckDuckGo |
| `brave` | Brave |
| `mullvad_brave` | Mullvad (Brave) |
| `yandex` | Yandex |
| `google` | Google |
| `google_cse` | Google CSE |
| `mullvad_google` | Mullvad (Google) |
| `startpage` | Startpage |
| `qwant` | Qwant |
| `ghostery` | Ghostery |
| `yep` | Yep |
| `greppr` | Greppr |
| `crowdview` | Crowdview |
| `mwmbl` | Mwmbl |
| `mojeek` | Mojeek |
| `baidu` | Baidu |
| `coccoc` | Coc Coc |
| `solofield` | Solofield |
| `marginalia` | Marginalia |
| `wiby` | wiby |
| `curlie` | Curlie |
If you need to pass additional 4get query parameters (such as country or language), continue to supply them through extra_params.
📄 Pagination
All tools support pagination via `npt` (next page token):
```python
# Get first page
result = await client.web_search("python programming")

# Get next page if available
if result.get("npt"):
    next_page = await client.web_search("ignored", page_token=result["npt"])
```
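Extending that pattern, a helper that follows `npt` tokens across several pages might look like this (a sketch assuming a `client` object with the `web_search` signature shown above):

```python
async def collect_pages(client, query: str, max_pages: int = 3) -> list[dict]:
    """Follow 'npt' tokens until they run out or max_pages is reached."""
    results: list[dict] = []
    token = None
    for _ in range(max_pages):
        page = await client.web_search(query, page_token=token)
        results.extend(page.get("web", []))
        token = page.get("npt")
        if not token:  # no further pages available
            break
    return results
```

Capping `max_pages` keeps a chatty LLM client from walking the pagination chain indefinitely.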
🐍 Using the Async Client Directly
You can reuse the bundled async client outside MCP for direct API access:
```python
import asyncio

from mcp_4get.client import FourGetClient
from mcp_4get.config import Config


async def main() -> None:
    client = FourGetClient(Config.from_env())
    data = await client.web_search(
        "model context protocol",
        options={"scraper": "mullvad_brave"},
    )
    for result in data.get("web", []):
        print(result["title"], "->", result["url"])


asyncio.run(main())
```
This allows you to integrate 4get search capabilities directly into your Python applications without going through the MCP protocol.
🛡️ Error Handling & Resilience
Automatic Retry Logic
- Rate Limiting (429): Exponential backoff with jitter
- Network Errors: Connection failures and timeouts
- Non-retryable: HTTP 404/500 errors fail immediately
Error Types
- `FourGetAuthError`: Rate limited or invalid authentication
- `FourGetAPIError`: API returned non-success status
- `FourGetTransportError`: Network or HTTP protocol errors
- `FourGetError`: Generic client errors
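One plausible shape for this hierarchy, with a dispatch helper showing how a caller might react to each type (the subclass relationships here are an assumption — check the package source for the actual definitions):

```python
class FourGetError(Exception):
    """Base client error (sketch of the documented hierarchy)."""

class FourGetTransportError(FourGetError):
    """Network or HTTP protocol failure -- usually worth retrying."""

class FourGetAPIError(FourGetError):
    """API returned a non-success status."""

class FourGetAuthError(FourGetAPIError):
    """Rate limited or invalid authentication (assumed subclass of APIError)."""


def classify(exc: Exception) -> str:
    """Map an exception to a coarse handling strategy."""
    # Check the most specific types first.
    if isinstance(exc, FourGetAuthError):
        return "auth"
    if isinstance(exc, FourGetTransportError):
        return "retry-network"
    if isinstance(exc, FourGetAPIError):
        return "api"
    return "unknown"
```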
Configuration Validation
All settings are validated on startup with clear error messages for misconfigurations.
📊 Response Format
Based on the real 4get API, responses include rich metadata:
```json
{
  "status": "ok",
  "web": [
    {
      "title": "Example Result",
      "description": "Result description...",
      "url": "https://example.com",
      "date": 1640995200,
      "type": "web"
    }
  ],
  "answer": [
    {
      "title": "Featured Answer",
      "description": [{"type": "text", "value": "Answer content..."}],
      "url": "https://source.com",
      "table": {"Key": "Value"}
    }
  ],
  "spelling": {
    "type": "no_correction",
    "correction": null
  },
  "related": ["related search", "terms"],
  "npt": "pagination_token_here"
}
```
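Given that shape, pulling plain text out of a featured answer's typed description fragments might look like this (a sketch; the helper name is hypothetical):

```python
def answer_text(answer: dict) -> str:
    """Join the 'text' fragments of an answer's description list.

    Fragments of other types (e.g. images) are skipped.
    """
    parts = [
        frag.get("value", "")
        for frag in answer.get("description", [])
        if frag.get("type") == "text"
    ]
    return " ".join(p for p in parts if p)
```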
Development
This project uses several tools to streamline the development process:
mise
mise is used for managing project-level dependencies and environment variables. mise helps ensure consistent development environments across different machines.
To get started with mise:
- Install mise by following the instructions on the official website.
- Run `mise install` in the project root to set up the development environment.