
FetchSERP
Official. Integrates with the FetchSERP API to provide SEO analysis, SERP data retrieval, web scraping, keyword research, backlink analysis, and domain intelligence across the Google, Bing, Yahoo, and DuckDuckGo search engines.
What it does
- Retrieve search engine results from multiple providers
- Analyze keyword rankings and search performance
- Extract backlink data for domains
- Scrape web pages for SEO analysis
- Research competitor keywords and rankings
- Monitor SERP positions over time
About FetchSERP
FetchSERP is an official MCP server published by fetchserp that provides AI assistants with tools and capabilities via the Model Context Protocol. FetchSERP delivers advanced keyword research and SEO analysis by integrating with Google Keyword Planner and top search engines. It is categorized under Search & Web.
How to install
You can install FetchSERP in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
FetchSERP is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
FetchSERP MCP Server
A Model Context Protocol (MCP) server that exposes the FetchSERP API for SEO, SERP analysis, web scraping, and keyword research.
Features
This MCP server provides access to all FetchSERP API endpoints:
SEO & Analysis
- Domain Analysis: Get backlinks, domain info (DNS, WHOIS, SSL, tech stack)
- Keyword Research: Search volume, suggestions, long-tail keyword generation
- SEO Analysis: Comprehensive webpage SEO analysis
- AI Analysis: AI-powered webpage analysis with custom prompts
- Moz Integration: Domain authority and Moz metrics
SERP & Search
- Search Results: Get SERP results from Google, Bing, Yahoo, DuckDuckGo
- AI Overview: Google's AI overview with JavaScript rendering
- Enhanced Results: SERP with HTML or text content
- Ranking Check: Domain ranking for specific keywords
- Indexation Check: Verify if pages are indexed
Web Scraping
- Basic Scraping: Scrape webpages without JavaScript
- JS Scraping: Execute custom JavaScript on pages
- Proxy Scraping: Scrape with country-specific proxies
- Domain Scraping: Scrape multiple pages from a domain
User Management
- Account Info: Check API credits and user information
Installation
No installation required! This MCP server runs directly from GitHub using npx.
Get your FetchSERP API token: sign up at https://www.fetchserp.com. New users get 250 free credits to get started!
Usage
Transport Modes
This MCP server supports two transport modes:
npx mode (Option 1):
- ✅ Zero installation required
- ✅ Always gets latest version from GitHub
- ✅ Perfect for individual users
- ✅ Runs locally with Claude Desktop
HTTP mode (Option 2):
- ✅ Remote deployment capability
- ✅ Multiple clients can connect
- ✅ Better for enterprise/team environments
- ✅ Centralized server management
- ✅ Single API key authentication (FetchSERP token)
- ✅ Scalable architecture
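For HTTP mode, a remote-capable MCP client points at the hosted SSE endpoint (the same URL used in Option 2 below) and authenticates with the FetchSERP token. The exact configuration keys vary by client, so treat this as a sketch rather than a universal format:

```json
{
  "mcpServers": {
    "fetchserp": {
      "url": "https://www.fetchserp.com/sse",
      "headers": {
        "Authorization": "Bearer your_fetchserp_api_token_here"
      }
    }
  }
}
```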
Configuration
Option 1: Using npx (Local/Remote GitHub)
Add this server to your MCP client configuration. For example, in Claude Desktop, using the GitHub registry:
{
  "mcpServers": {
    "fetchserp": {
      "command": "npx",
      "args": [
        "github:FetchSERP-LLC/fetchserp-mcp-server-node"
      ],
      "env": {
        "FETCHSERP_API_TOKEN": "your_fetchserp_api_token_here"
      }
    }
  }
}
or, using the npm registry:
{
  "mcpServers": {
    "fetchserp": {
      "command": "npx",
      "args": ["fetchserp-mcp-server"],
      "env": {
        "FETCHSERP_API_TOKEN": "your_fetchserp_api_token_here"
      }
    }
  }
}
Option 2: Claude API with MCP Server
For programmatic usage with Claude's API and your deployed MCP server:
const claudeRequest = {
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [
    {
      role: "user",
      content: question
    }
  ],
  // MCP Server Configuration
  mcp_servers: [
    {
      type: "url",
      url: "https://www.fetchserp.com/sse",
      name: "fetchserp",
      authorization_token: FETCHSERP_API_TOKEN,
      tool_configuration: {
        enabled: true
      }
    }
  ]
};

const response = await httpRequest('https://api.anthropic.com/v1/messages', {
  method: 'POST',
  headers: {
    'x-api-key': CLAUDE_API_KEY,
    'anthropic-version': '2023-06-01',
    'anthropic-beta': 'mcp-client-2025-04-04',
    'content-type': 'application/json'
  }
}, JSON.stringify(claudeRequest));
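The snippet above assumes a project-local `httpRequest(url, options, body)` helper rather than a library function. With Node 18+ (which ships a global fetch), a minimal sketch of such a helper might look like this; it is an illustration, not part of the FetchSERP package:

```javascript
// Hypothetical httpRequest helper matching the call above.
// Merges the body into the request options, throws on non-2xx
// responses, and parses the JSON reply.
async function httpRequest(url, options, body) {
  const res = await fetch(url, { ...options, body });
  if (!res.ok) {
    throw new Error(`HTTP ${res.status}: ${await res.text()}`);
  }
  return res.json();
}
```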
Option 3: OpenAI API with MCP Server
For programmatic usage with OpenAI's API and your deployed MCP server:
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const res = await openai.responses.create({
  model: "gpt-4.1",
  tools: [
    {
      type: "mcp",
      server_label: "fetchserp",
      server_url: "https://www.fetchserp.com/sse",
      headers: {
        Authorization: `Bearer ${FETCHSERP_API_TOKEN}`
      }
    }
  ],
  input: question
});

// The Responses API returns the model's text on `output_text`,
// not on a Chat Completions-style `choices` array.
console.log(res.output_text);
Available Tools
Domain & SEO Analysis
get_backlinks
Get backlinks for a domain
- domain (required): Target domain
- search_engine: google, bing, yahoo, duckduckgo (default: google)
- country: Country code (default: us)
- pages_number: Pages to search 1-30 (default: 15)
get_domain_info
Get comprehensive domain information
- domain (required): Target domain
get_domain_emails
Extract emails from a domain
- domain (required): Target domain
- search_engine: Search engine (default: google)
- country: Country code (default: us)
- pages_number: Pages to search 1-30 (default: 1)
get_playwright_mcp
Use GPT-4.1 to remote control a browser via a Playwright MCP server
- prompt (required): The prompt to use for remote control of the browser
get_webpage_seo_analysis
Comprehensive SEO analysis of a webpage
- url (required): URL to analyze
get_webpage_ai_analysis
AI-powered webpage analysis
- url (required): URL to analyze
- prompt (required): Analysis prompt
generate_wordpress_content
Generate WordPress content using AI with customizable prompts and models
- user_prompt (required): The user prompt
- system_prompt (required): The system prompt
- ai_model: The AI model (default: gpt-4.1-nano)
Generates SEO-optimized WordPress content including title and content (800-1500 words) with keyword targeting in the first 100 words.
generate_social_content
Generate social media content using AI with customizable prompts and models
- user_prompt (required): The user prompt
- system_prompt (required): The system prompt
- ai_model: The AI model (default: gpt-4.1-nano)
Generates engaging social media content optimized for various platforms and audiences.
get_moz_analysis
Get Moz domain authority and metrics
- domain (required): Target domain
Keyword Research
get_keywords_search_volume
Get search volume for keywords
- keywords (required): Array of keywords
- country: Country code
get_keywords_suggestions
Get keyword suggestions
- url: URL to analyze (optional if keywords provided)
- keywords: Array of seed keywords (optional if url provided)
- country: Country code
get_long_tail_keywords
Generate long-tail keywords
- keyword (required): Seed keyword
- search_intent: informational, commercial, transactional, navigational (default: informational)
- count: Number to generate 1-500 (default: 10)
SERP & Search
get_serp_results
Get search engine results
- query (required): Search query
- search_engine: google, bing, yahoo, duckduckgo (default: google)
- country: Country code (default: us)
- pages_number: Pages to search 1-30 (default: 1)
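Once the server is connected, an MCP client invokes a tool like this one with a standard JSON-RPC `tools/call` request. The arguments below are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_serp_results",
    "arguments": {
      "query": "best running shoes",
      "search_engine": "google",
      "country": "us",
      "pages_number": 1
    }
  }
}
```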
get_serp_html
Get SERP results with HTML content
- Same parameters as get_serp_results
get_serp_text
Get SERP results with text content
- Same parameters as get_serp_results
get_serp_ai_mode
Get SERP with AI Overview and AI Mode response
- query (required): Search query
- country: Country code (default: us)
Returns AI overview and AI mode response for the query. Less reliable than the 2-step process but returns results in under 30 seconds.
check_page_indexation
Check if domain is indexed for keyword
- domain (required): Target domain
- keyword (required): Search keyword
get_domain_ranking
Get domain ranking for keyword
- keyword (required): Search keyword
- domain (required): Target domain
- search_engine: Search engine (default: google)
- country: Country code (default: us)
- pages_number: Pages to search 1-30 (default: 10)
Web Scraping
scrape_webpage
Scrape webpage without JavaScript
- url (required): URL to scrape
scrape_domain
Scrape multiple pages from domain
- domain (required): Target domain
- max_pages: Maximum pages to scrape, up to 200 (default: 10)
scrape_webpage_js
Scrape webpage with custom JavaScript
- url (required): URL to scrape
- js_script (required): JavaScript code to execute
scrape_webpage_js_proxy
Scrape webpage with JavaScript and proxy
- url (required): URL to scrape
- country (required): Proxy country
- js_script (required): JavaScript code to execute
User Management
get_user_info
Get user information and API credits
- No parameters required
API Token
You need a FetchSERP API token to use this server.
Getting your API token:
- Sign up at https://www.fetchserp.com
- New users automatically receive 250 free credits to get started
- Your API token will be available in your dashboard
Set the token as an environment variable:
export FETCHSERP_API_TOKEN="your_token_here"
Error Handling
The server includes comprehensive error handling:
- Missing API token validation
- API response error handling
- Input validation
- Proper MCP error responses
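As a rough illustration of the first item, missing-token validation amounts to failing fast at startup. This sketch is hypothetical and does not reflect the server's actual source:

```javascript
// Hypothetical startup check; name and message are illustrative.
// A missing token is surfaced before any tools are registered.
function requireToken(env) {
  const token = env.FETCHSERP_API_TOKEN;
  if (!token) {
    throw new Error("FETCHSERP_API_TOKEN environment variable is required");
  }
  return token;
}
```

Such a check would be called as `requireToken(process.env)` before the server accepts any tool calls.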
Docker deploy
docker build --platform=linux/amd64 -t us-east4-docker.pkg.dev/fetchserp-474019/fetchserp/mcp-server-node:latest --push .
npm login
npm publish --access public