Integrates with the Linkd API to extract LinkedIn user and company profiles, search for users and companies, retrieve email addresses and phone numbers, and run deep research workflows for sales prospecting and recruitment.


What it does

  • Search LinkedIn users with filters
  • Search companies on LinkedIn
  • Extract detailed LinkedIn profile data
  • Retrieve email addresses and phone numbers
  • Scrape LinkedIn posts and comments

Best for

  • Sales professionals building prospect lists
  • Recruiters sourcing candidates
  • Business development teams researching leads
  • Marketing teams building contact databases

Requirements

  • Requires a Linkd API key
  • Credit-based usage model
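Usage is credit-based, with per-lookup costs listed in the Tools section below (1 credit for profile enrichment or contact retrieval, 2 for scraping). A minimal sketch for budgeting a batch of lookups; the cost table mirrors those listed costs, but the helper itself is illustrative and not part of the Linkd API. Tools without a listed cost (e.g. the search tools) default to 0 here, which is an assumption:

```python
# Per-lookup credit costs as listed in the Tools section.
CREDIT_COSTS = {
    "enrich_linkedin": 1,    # 1 credit per profile lookup
    "retrieve_contacts": 1,  # 1 credit per contact lookup
    "scrape_linkedin": 2,    # 2 credits per scrape request
}

def estimate_credits(planned_calls):
    """Estimate total credits for a batch, given {tool_name: call_count}.

    Tools with no listed cost are assumed (not confirmed) to be free.
    """
    return sum(CREDIT_COSTS.get(tool, 0) * count
               for tool, count in planned_calls.items())
```

For example, enriching 10 profiles and scraping 5 would cost an estimated 20 credits under this model.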

About Linkd

Linkd is a community-built MCP server published by automcp-app that provides AI assistants with tools and capabilities via the Model Context Protocol. Linkd uses the LinkedIn API to streamline recruitment: search contacts, extract profiles, and boost sales prospecting with deep research. It is categorized under web search and data analytics.

How to install

You can install Linkd in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

Linkd is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

Linkd MCP Server

This is an unofficial Model Context Protocol (MCP) server for Linkd.

More information about automcp can be found at automcp.app.

For detailed API documentation and usage examples, visit the official Linkd documentation.

More information about the Model Context Protocol can be found here.

Installation

Manual Installation

To install the server, run:

npx linkd-mcp <YOUR-LINKD-API-KEY>

Running on Cursor

Add to ~/.cursor/mcp.json like this:

{
  "mcpServers": {
    "linkd": {
      "command": "npx",
      "args": ["-y", "linkd-mcp"],
      "env": {
        "LINKD_API_KEY": "YOUR-API-KEY"
      }
    }
  }
}

Running on Windsurf

Add to your ~/.codeium/windsurf/model_config.json like this:

{
  "mcpServers": {
    "linkd": {
      "command": "npx",
      "args": ["-y", "linkd-mcp"],
      "env": {
        "LINKD_API_KEY": "YOUR-API-KEY"
      }
    }
  }
}

Claude Desktop app

Example configuration for the Claude Desktop client:

{
  "mcpServers": {
    "linkd": {
      "command": "npx",
      "args": ["--yes", "linkd-mcp"],
      "env": {
        "LINKD_API_KEY": "your-api-key"
      }
    }
  }
}

Tools

  • search_for_users - Search for LinkedIn users with filters like query, school, and match threshold
  • search_for_companies - Search for companies on Linkd using filters like query and match threshold
  • enrich_linkedin - Retrieves detailed profile information for a specific LinkedIn URL (1 credit per lookup)
  • retrieve_contacts - Retrieves email addresses and phone numbers for a LinkedIn profile (1 credit per lookup)
  • scrape_linkedin - Retrieves detailed profile data and posts with comments from a LinkedIn profile URL (2 credits per request)
  • research_profile - Research a profile using email or phone number
  • initiate_deep_research - Start a deep research job for comprehensive LinkedIn data gathering
  • check_deep_research_status - Check the status of an ongoing deep research job
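The two deep-research tools imply an asynchronous workflow: initiate_deep_research starts a job, and check_deep_research_status is then polled until the job finishes. A generic polling helper, sketched with an injected status callback because the exact response shape of check_deep_research_status is not documented here; the `status == "completed"` convention is an assumption:

```python
import time

def poll_until_complete(check_status, interval=2.0, max_attempts=30):
    """Call check_status() repeatedly until the job reports completion.

    check_status is any zero-argument callable returning a dict, e.g. a
    wrapper around the check_deep_research_status tool. We assume (not
    confirmed by the Linkd docs) that a finished job reports
    status == "completed".
    """
    for _ in range(max_attempts):
        result = check_status()
        if result.get("status") == "completed":
            return result
        time.sleep(interval)  # back off between polls
    raise TimeoutError("deep research job did not finish in time")
```

In practice an MCP client would call initiate_deep_research once, then pass a closure over check_deep_research_status (with the returned job ID) as `check_status`.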


Related Skills
ga4-analytics

Google Analytics 4, Search Console, and Indexing API toolkit. Analyze website traffic, page performance, user demographics, real-time visitors, search queries, and SEO metrics. Use when the user asks to: check site traffic, analyze page views, see traffic sources, view user demographics, get real-time visitor data, check search console queries, analyze SEO performance, request URL re-indexing, inspect index status, compare date ranges, check bounce rates, view conversion data, or get e-commerce revenue. Requires a Google Cloud service account with GA4 and Search Console access.

browser-automation

Automate web browser interactions using natural language via CLI commands. Use when the user asks to browse websites, navigate web pages, extract data from websites, take screenshots, fill forms, click buttons, or interact with web applications. Triggers include "browse", "navigate to", "go to website", "extract data from webpage", "screenshot", "web scraping", "fill out form", "click on", "search for on the web". When taking actions be as specific as possible.

content-trend-researcher

Advanced content and topic research skill that analyzes trends across Google Analytics, Google Trends, Substack, Medium, Reddit, LinkedIn, X, blogs, podcasts, and YouTube to generate data-driven article outlines based on user intent analysis

gpt-researcher

GPT Researcher is an autonomous deep research agent that conducts web and local research, producing detailed reports with citations. Use this skill when helping developers understand, extend, debug, or integrate with GPT Researcher - including adding features, understanding the architecture, working with the API, customizing research workflows, adding new retrievers, integrating MCP data sources, or troubleshooting research pipelines.

zotero

Manage Zotero reference libraries via the Web API. Search, list, add items by DOI/ISBN/PMID (with duplicate detection), delete/trash items, update metadata and tags, export in BibTeX/RIS/CSL-JSON, batch-add from files, check PDF attachments, cross-reference citations, find missing DOIs via CrossRef, and fetch open-access PDFs. Supports --json output for scripting. Use when the user asks about academic references, citation management, literature libraries, PDFs for papers, bibliography export, or Zotero specifically.

backend-dev-guidelines

Comprehensive backend development guide for Langfuse's Next.js 14/tRPC/Express/TypeScript monorepo. Use when creating tRPC routers, public API endpoints, BullMQ queue processors, services, or working with tRPC procedures, Next.js API routes, Prisma database access, ClickHouse analytics queries, Redis queues, OpenTelemetry instrumentation, Zod v4 validation, env.mjs configuration, tenant isolation patterns, or async patterns. Covers layered architecture (tRPC procedures → services, queue processors → services), dual database system (PostgreSQL + ClickHouse), projectId filtering for multi-tenant isolation, traceException error handling, observability patterns, and testing strategies (Jest for web, vitest for worker).
