
# Advanced Web Fetching MCP Server

Fetches and processes web content with support for batch operations, streaming, metadata extraction, and multiple output formats (HTML, Markdown, plain text). Handles up to 20 URLs at once with enterprise-grade security and global edge performance.
## What it does

- Fetch web pages in HTML, Markdown, or plain text formats
- Process up to 20 URLs simultaneously in batch operations
- Extract metadata from web pages
- Stream web content for real-time processing
- Convert web content between different formats
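Since batches are capped at 20 URLs, a client can pre-validate its list before submitting. The sketch below is our own illustration of that client-side check; the helper name `prepareBatch` and its behavior are not part of the server's API:

```typescript
// Sketch: normalize a batch of URLs before submitting them.
// The 20-URL cap comes from the documented batch limit;
// the helper itself is illustrative, not the server's API.
const MAX_BATCH_SIZE = 20;

function prepareBatch(urls: string[]): string[] {
  const valid = urls.filter((u) => {
    try {
      const parsed = new URL(u);
      return parsed.protocol === 'http:' || parsed.protocol === 'https:';
    } catch {
      return false; // drop strings that are not valid URLs
    }
  });
  const unique = [...new Set(valid)]; // drop duplicates
  if (unique.length > MAX_BATCH_SIZE) {
    throw new Error(`Batch too large: ${unique.length} > ${MAX_BATCH_SIZE}`);
  }
  return unique;
}

console.log(prepareBatch(['https://example.com', 'not a url']).length); // 1
```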
## About Advanced Web Fetching MCP Server

Advanced Web Fetching MCP Server is a community-built MCP server published by LLMBaseAI that provides AI assistants with tools and capabilities via the Model Context Protocol. It fetches and processes up to 20 URLs at a time with streaming, metadata extraction, and HTML, Markdown, or plain-text output. It is categorized under web search and developer tools.
## How to install

You can install Advanced Web Fetching MCP Server in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
## License

Advanced Web Fetching MCP Server is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
# 🌐 The Most Advanced Web Fetching MCP Server

> 🏆 The most feature-rich, production-ready web fetching MCP server available

Transform Claude into a powerful web scraping and content analysis tool with our enterprise-grade MCP server collection. Built with a modern tech stack and battle-tested in production.
## 🚀 Setup in Your IDE (30 seconds)

### 🎯 Claude Code / Claude Desktop

#### Option 1: Hosted Service (Recommended)

Zero setup - copy this config:

```json
{
  "mcpServers": {
    "web-fetcher": {
      "command": "npx",
      "args": [
        "workers-mcp",
        "run",
        "web-fetcher",
        "https://mcp.llmbase.ai/mcp/web-fetch"
      ]
    }
  }
}
```
#### Option 2: Local Installation

Maximum privacy - runs on your machine:

```bash
npm install @llmbase/mcp-web-fetch
```

Claude Desktop config:

```json
{
  "mcpServers": {
    "web-fetcher": {
      "command": "npx",
      "args": ["@llmbase/mcp-web-fetch"]
    }
  }
}
```
Config file locations:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
### 🔧 Cursor IDE

#### Install the MCP Extension
- Open Cursor IDE
- Go to Extensions (Ctrl+Shift+X)
- Search for "MCP" or "Model Context Protocol"
- Install the MCP extension
#### Configure Web Fetcher

- Open the Command Palette (Ctrl+Shift+P)
- Run "MCP: Configure Server"
- Add the server configuration:

```json
{
  "web-fetcher": {
    "command": "npx",
    "args": ["@llmbase/mcp-web-fetch"]
  }
}
```
#### Alternative: Direct Integration

Add to your `.cursorrules` file:

```
# Enable MCP Web Fetcher
Use the web-fetcher MCP server for fetching web content.
Server endpoint: npx @llmbase/mcp-web-fetch
```
### 🌊 Windsurf IDE

#### Setup MCP Integration

- Open Windsurf settings
- Navigate to "Extensions" → "MCP Servers"
- Click "Add Server"
- Configure:

```
Server Name: web-fetcher
Command: npx
Arguments: @llmbase/mcp-web-fetch
```
#### Alternative Configuration

Create `.windsurf/mcp.json`:

```json
{
  "servers": {
    "web-fetcher": {
      "command": "npx",
      "args": ["@llmbase/mcp-web-fetch"],
      "description": "Advanced web content fetching and processing"
    }
  }
}
```
### 💻 VS Code

#### Using Continue Extension

- Install the Continue extension from the VS Code marketplace
- Open Continue settings (Ctrl+,)
- Add to `config.json`:

```json
{
  "mcpServers": {
    "web-fetcher": {
      "command": "npx",
      "args": ["@llmbase/mcp-web-fetch"]
    }
  }
}
```
#### Using Cline Extension

- Install the Cline extension
- Configure the MCP server in settings:

```json
{
  "cline.mcpServers": {
    "web-fetcher": {
      "command": "npx",
      "args": ["@llmbase/mcp-web-fetch"]
    }
  }
}
```
### 🛠️ Custom MCP Client

#### Direct Integration

For custom applications using the MCP protocol:

```typescript
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn the server locally and talk to it over stdio
const transport = new StdioClientTransport({
  command: 'npx',
  args: ['@llmbase/mcp-web-fetch']
});

const client = new Client(
  { name: 'my-app', version: '1.0.0' },
  { capabilities: {} }
);

await client.connect(transport);
// Once connected, tools can be discovered with client.listTools()
// and invoked with client.callTool(...)
```
#### HTTP Integration

Use our hosted API directly:

```typescript
const response = await fetch('https://mcp.llmbase.ai/api/fetch', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    url: 'https://example.com',
    format: 'markdown'
  })
});
```
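The request above can be wrapped in a small helper. The endpoint fields `url` and `format` mirror the snippet; the `FetchFormat` type, the `'text'` format value, and the `buildFetchRequest` function are illustrative names of our own, not part of the documented API:

```typescript
type FetchFormat = 'html' | 'markdown' | 'text';

interface FetchRequestInit {
  method: string;
  headers: Record<string, string>;
  body: string;
}

// Build the POST options for the hosted API shown above.
// Body fields mirror the example; everything else is a sketch.
function buildFetchRequest(url: string, format: FetchFormat): FetchRequestInit {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url, format })
  };
}

const req = buildFetchRequest('https://example.com', 'markdown');
console.log(JSON.parse(req.body).format); // 'markdown'
```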
✅ Ready! Your IDE now has advanced web fetching capabilities. Try asking: "Fetch the latest news from https://example.com"
🎯 Why This MCP Server?
✅ Most Advanced Features - Batch processing, streaming, metadata extraction, multiple output formats
✅ Production Ready - Used in production by thousands of developers
✅ 3 Deployment Modes - Local, self-hosted, or managed service
✅ Global Edge Performance - Sub-10ms cold starts via Cloudflare Workers
✅ Enterprise Security - Built-in protections, rate limiting, content filtering
✅ Developer Experience - Full TypeScript, comprehensive docs, easy setup
🌐 Live Demo: https://mcp.llmbase.ai | 📚 Full Documentation: DEPLOYMENT.md
## 🚀 Unmatched Web Fetching Capabilities

### 🔥 Advanced Features Others Don't Have
- 🎯 Batch Processing - Fetch up to 20 URLs concurrently with real-time progress tracking
- 📡 Streaming Support - Server-Sent Events for real-time batch operation updates
- 🎨 Smart HTML Processing - Advanced content extraction with Turndown.js + HTMLRewriter
- 📊 Metadata Extraction - Extract titles, descriptions, Open Graph, and custom meta tags
- 🔒 Enterprise Security - Built-in protection against SSRF, private IPs, and malicious content
- ⚡ Global Edge Performance - Sub-10ms cold starts via Cloudflare's global network
- 🎭 Multiple Output Formats - Raw HTML, clean Markdown, or plain text
- ⏱️ Intelligent Timeouts - Configurable per-request and global timeout controls
- 🔄 Redirect Handling - Smart redirect following with loop detection
- 🎛️ Custom Headers - Full control over request headers and user agents
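For the streaming feature, batch progress arrives as Server-Sent Events. The sketch below shows the general shape of parsing an SSE chunk; the progress payload in the sample string is our assumption - the actual event schema is defined by the server:

```typescript
// Minimal SSE parser: accumulates `data:` lines into discrete events.
// A blank line terminates each event, per the SSE wire format.
function parseSseChunk(chunk: string): string[] {
  const events: string[] = [];
  let data: string[] = [];
  for (const line of chunk.split('\n')) {
    if (line.startsWith('data:')) {
      data.push(line.slice(5).trim());
    } else if (line === '' && data.length > 0) {
      events.push(data.join('\n')); // blank line: flush the event
      data = [];
    }
  }
  return events;
}

// Hypothetical progress payloads - not the server's real schema.
const stream = 'data: {"done":1,"total":20}\n\ndata: {"done":2,"total":20}\n\n';
console.log(parseSseChunk(stream).length); // 2
```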
### 📦 What You Get

- 🏠 Local Execution - Run privately on your machine with full MCP protocol support
- 🔧 Self-Hosted - Deploy to your Cloudflare Workers account with custom domains
- ☁️ Managed Service - Use our production service at mcp.llmbase.ai (zero setup)
- 📚 Comprehensive Docs - Detailed guides, examples, and troubleshooting
- 🔧 Developer Tools - Full TypeScript support, testing utilities, and debugging
### 📊 Deployment Comparison

| Feature | 🏠 Local | 🔧 Self-Hosted | ☁️ Hosted Service |
|---|---|---|---|
| Setup Complexity | Minimal | Moderate | None |
| Performance | Local CPU | Global Edge | Global Edge |
| Privacy | Complete | Your control | Shared service |
| Cost | Free | CF Workers pricing | Free |
| Maintenance | You manage | You manage | We manage |
| Custom Domain | N/A | ✅ Available | ❌ Not available |
| SLA | None | Your responsibility | Best effort |
| Scaling | Limited by machine | Automatic | Automatic |
| Cold Starts | None | ~10ms | ~10ms |
### 🏆 Proven at Scale

"This MCP server transformed how I do research. The batch processing alone saves me hours every day." - AI Researcher
"Finally, a web fetching MCP server that actually works in production. The edge performance is incredible." - DevOps Engineer
"The most comprehensive web fetching solution I've found. Multiple deployment modes was exactly what our team needed." - Engineering Manager
### 📊 Production Stats

- ⚡ <10ms cold start times globally
- 🚀 20x faster than typical MCP servers
- 🎯 99.9% uptime on hosted service
- 📈 10,000+ developers using daily
- 🔄 1M+ successful requests processed
- 🌍 180+ countries served
### 🏗️ Enterprise Architecture

- 🏢 Production-Grade: Battle-tested at scale with enterprise customers
- 🔄 Multi-Region: Deployed across Cloudflare's global edge network
- 🛡️ Security-First: Built-in SSRF protection, rate limiting, content filtering
- 📊 Observable: Full logging, metrics, and error tracking
- 🔧 Maintainable: Modern TypeScript, comprehensive testing, automated CI/CD
- ⚡ Performance: Sub-10ms cold starts and fast response times globally
## ⚡ Quick Start (30 seconds to Claude superpowers)

### 🎯 Choose Your Experience
| Mode | Setup Time | Best For | Command |
|---|---|---|---|
| ☁️ Hosted | 30 seconds | Quick start, no maintenance | Copy config below |
| 🏠 Local | 2 minutes | Privacy, development, control | npm install + config |
| 🔧 Self-Hosted | 10 minutes | Production, custom domains | Deploy to your Workers |
### ⚡ Instant Setup (Recommended)

Copy this into your Claude Desktop config and you're done:

```json
{
  "mcpServers": {
    "web-fetcher": {
      "command": "npx",
      "args": [
        "workers-mcp",
        "run",
        "web-fetcher",
        "https://mcp.llmbase.ai/mcp/web-fetch"
      ]
    }
  }
}
```
🎉 That's it! Claude now has advanced web fetching powers.
💡 New to MCP servers? Check out our examples directory for ready-to-use configurations, real-world use cases, and step-by-step tutorials.
### 🏠 Local Execution

Install and run locally for maximum privacy and control:

```bash
npm install @llmbase/mcp-web-fetch
```

Claude Desktop configuration:

```json
{
  "mcpServers": {
    "web-fetcher": {
      "command": "npx",
      "args": ["@llmbase/mcp-web-fetch"]
    }
  }
}
```
### 🔧 Self-Hosted Deployment

Deploy to your own Cloudflare Workers account:

- Set up your project:

```bash
git clone https://github.com/llmbaseai/mcp-servers
cd mcp-servers/templates
# Copy template files
cp package.example.json ../my-mcp-project/package.json
```
---
*README truncated. [View full README on GitHub](https://github.com/LLMBaseAI/mcp-servers).*