Scrapezy

Official MCP server published by scrapezy · Local (stdio)

Integrates with the Scrapezy API to extract structured data from websites based on user-specified prompts, returning organized information instead of raw HTML. This enables flexible web scraping for data collection, content aggregation, and automated research tasks.

What it does

  • Extract structured data from any website URL
  • Parse web content using custom prompts
  • Convert unstructured web data into organized formats
  • Scrape product information, prices, and descriptions
  • Collect contact details and business information

Best for

  • Market research and competitor analysis
  • Content aggregation and data collection
  • E-commerce product monitoring
  • Lead generation and contact scraping

Prompt-based extraction; requires a Scrapezy API key.

About Scrapezy

Scrapezy is an official MCP server published by scrapezy that provides AI assistants with tools and capabilities via the Model Context Protocol. Scrapezy lets you extract structured data from any website for web scraping, content aggregation, and automated research. It is categorized under search & web.

How to install

You can install Scrapezy in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

Scrapezy is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

@scrapezy/mcp MCP Server

A Model Context Protocol server for Scrapezy that enables AI models to extract structured data from websites.

Features

Tools

  • extract_structured_data - Extract structured data from a website
    • Takes URL and prompt as required parameters
    • Returns structured data extracted from the website based on the prompt
    • The prompt should clearly describe what data to extract from the website
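
Under the hood, an MCP client invokes this tool with a standard `tools/call` JSON-RPC request over stdio. A sketch of that message (the URL and prompt values are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "extract_structured_data",
    "arguments": {
      "url": "https://example.com/product",
      "prompt": "Extract the product name, price, and description."
    }
  }
}
```

MCP clients such as Claude Desktop construct this request for you; it is shown here only to clarify what the tool expects.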

Installation

Installing via Smithery

To install Scrapezy MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @Scrapezy/mcp --client claude

Manual Installation

npm install -g @scrapezy/mcp

Usage

API Key Setup

There are two ways to provide your Scrapezy API key:

  1. Environment Variable:

    export SCRAPEZY_API_KEY=your_api_key
    npx @scrapezy/mcp
    
  2. Command-line Argument:

    npx @scrapezy/mcp --api-key=your_api_key
    

To use with Claude Desktop, add the server config:

  • On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • On Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "scrapezy": {
      "command": "npx",
      "args": ["-y", "@scrapezy/mcp", "--api-key=your_api_key"]
    }
  }
}
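
If you prefer to keep the key off the command line, it can instead be supplied through the config's env field, which Claude Desktop's MCP configuration supports (a sketch using the server's SCRAPEZY_API_KEY variable):

```json
{
  "mcpServers": {
    "scrapezy": {
      "command": "npx",
      "args": ["-y", "@scrapezy/mcp"],
      "env": { "SCRAPEZY_API_KEY": "your_api_key" }
    }
  }
}
```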

Example Usage in Claude

You can use this tool in Claude with prompts like:

Please extract product information from this page: https://example.com/product
Extract the product name, price, description, and available colors.

Claude will use the MCP server to extract the requested structured data from the website.

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

npm run inspector

The Inspector will provide a URL to access debugging tools in your browser.

License

MIT

Related Skills

google-official-seo-guide

Official Google SEO guide covering search optimization, best practices, Search Console, crawling, indexing, and improving website search visibility based on official Google documentation

ux-writing

Create user-centered, accessible interface copy (microcopy) for digital products including buttons, labels, error messages, notifications, forms, onboarding, empty states, success messages, and help text. Use when writing or editing any text that appears in apps, websites, or software interfaces, designing conversational flows, establishing voice and tone guidelines, auditing product content for consistency and usability, reviewing UI strings, or improving existing interface copy. Applies UX writing best practices based on four quality standards — purposeful, concise, conversational, and clear. Includes accessibility guidelines, research-backed benchmarks (sentence length, comprehension rates, reading levels), expanded error patterns, tone adaptation frameworks, and comprehensive reference materials.

last30days

Research a topic from the last 30 days on Reddit + X + Web, become an expert, and write copy-paste-ready prompts for the user's target tool.

browser-automation

Automate web browser interactions using natural language via CLI commands. Use when the user asks to browse websites, navigate web pages, extract data from websites, take screenshots, fill forms, click buttons, or interact with web applications. Triggers include "browse", "navigate to", "go to website", "extract data from webpage", "screenshot", "web scraping", "fill out form", "click on", "search for on the web". When taking actions be as specific as possible.

seo-optimizer

Search Engine Optimization specialist for content strategy, technical SEO, keyword research, and ranking improvements. Use when optimizing website content, improving search rankings, conducting keyword analysis, or implementing SEO best practices. Expert in on-page SEO, meta tags, schema markup, and Core Web Vitals.

web-research

Use this skill for requests related to web research; it provides a structured approach to conducting comprehensive web research
