Updated April 2026 · Comparison · 18 min read

Context7 vs DeepWiki vs Ref Tools vs Docfork: Tested Comparison (2026)

Four MCP servers, one job: stop your AI agent from hallucinating outdated APIs. The differences come down to where the docs live, what the trigger looks like, what they cost, and how much context they spend per call. We pulled every fact from the tools’ own docs, repos, and pricing pages — no review-aggregator slop.

Editorial illustration: a 2x2 grid of luminous teal documentation glyphs (books, wiki page, magnifying glass over docs, document split) connected by a soft cross of dot-and-dash light, on a midnight navy background.
On this page · 14 sections
  1. TL;DR + decision tree
  2. What docs-RAG MCP servers do
  3. Side-by-side matrix
  4. Context7 — install + recipe
  5. DeepWiki — install + recipe
  6. Ref Tools — install + recipe
  7. Docfork — install + recipe
  8. Pricing matrix
  9. Free / open-source alternatives
  10. Benchmark them yourself
  11. Common pitfalls
  12. Community signal
  13. FAQ
  14. Sources

TL;DR + decision tree

  • Need version-pinned framework snippets injected mid-prompt? Context7. Append use context7 to your prompt and you’re done.
  • Need to understand an unfamiliar public repo end-to-end? DeepWiki. It exposes Cognition’s auto-generated wiki for any public GitHub repo via three tools.
  • Need token-efficient docs search across public docs and your private repos / PDFs / uploads? Ref Tools. Closed-API, paid after 200 free credits, but the strongest signal-to-noise on long-running agent tasks.
  • Need stack-scoped docs that auto-limit search to your package.json dependencies? Docfork. dgrep init reads your manifest once, caches the canonical library IDs, and versions them via “Cabinets”.

We’ll cover each in detail below — feature matrix first, then per-tool install (the same canonical install card from each server’s detail page), pricing, free-tier reality, and a benchmark methodology you can run yourself.

What docs-RAG MCP servers actually do

These four servers solve the same underlying problem in different ways: an LLM’s training data is months stale, so when it writes Next.js 15 App Router code or React Query 5 hooks, it routinely invents APIs that don’t exist or uses signatures that were deprecated two minor versions ago. Documentation-RAG MCP servers fix this by retrieving fresh, version-specific docs at request time and injecting them into the model’s context window.

The differences come down to four axes:

  1. Where the docs live. Context7 indexes community-contributed library projects. DeepWiki auto-generates a wiki for every public GitHub repo. Ref crawls open web docs plus your own private sources. Docfork curates a popular-libraries catalog and lets you add custom-indexed repos.
  2. How the agent invokes it. Context7 has a prompt suffix (use context7). The other three expose tool calls the agent picks based on the model’s reasoning.
  3. How much context it spends. Context7 ships full snippet sets. Ref caps every page read at ~5k tokens via session-aware filtering. DeepWiki returns wiki sections. Docfork returns ranked sections via hybrid semantic + BM25.
  4. Money. Context7 is MIT and self-hostable. DeepWiki is hosted-free for public repos. Ref is closed and paid after 200 credits. Docfork is currently free (“Pro plans coming”).

If you’re new to the protocol underneath, our What is MCP primer covers the JSON-RPC wire format these servers run on. The rest of this post assumes you know that already.
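To make the wire format concrete: every documentation lookup from any of these four servers ultimately travels as a single JSON-RPC 2.0 `tools/call` request. The sketch below shows the generic shape; the tool name is DeepWiki's `ask_question`, and the argument keys (`repoName`, `question`) are assumptions modeled on that tool, not copied from a spec.

```shell
# Shape of one MCP tool call on the wire (JSON-RPC 2.0, per the MCP spec).
# Tool name and argument keys are illustrative, modeled on DeepWiki.
REQ='{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_question",
    "arguments": {
      "repoName": "modelcontextprotocol/python-sdk",
      "question": "Where is the StreamableHTTP transport implemented?"
    }
  }
}'
echo "$REQ"
```

The client sends this over stdio or streamable HTTP; the server answers with a `result.content` array the agent reads back into context.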

Side-by-side matrix

Every cell in this matrix is sourced from the tool’s own docs, repo, or pricing page (citations in the per-tool sections below). Live GitHub stats checked 2026-04-30.

| Dimension | Context7 | DeepWiki | Ref Tools | Docfork |
| --- | --- | --- | --- | --- |
| Author | Upstash | Cognition AI | Matthew Dailey | Docfork Corp |
| License | MIT | Hosted (no public repo) | MIT (shim only) | MIT |
| Self-hostable | ✅ Yes | ❌ Hosted only | ❌ Paid API | Unclear |
| Trigger | use context7 (prompt suffix) | Tool calls (3 tools) | Tool calls (2 tools) | Tool calls (2 tools) |
| Docs source | Community-contributed library projects | Public GitHub repos via wiki | Open web + private repos/PDFs/uploads | Curated catalog + custom repos |
| Remote URL | mcp.context7.com/mcp | mcp.deepwiki.com/mcp | api.ref.tools/mcp | mcp.docfork.com/mcp |
| Free tier | 1,000 calls/mo | Free, no auth (public repos) | 200 credits one-time | 1,000 req/mo (or all features free per pricing page) |
| Paid entry | $10/seat/mo (Pro) | Devin (separate product) | $19/mo (Basic) | Not yet published |
| GitHub stars (live) | 54,174 | (no public repo) | 1,092 | 471 |
| MCP.Directory page | /servers/context7 | /servers/deepwiki | /servers/ref-tools | /servers/docfork |

Three takeaways from the matrix. Context7’s star count (54,174) is roughly 50× any competitor. That’s the dominant ecosystem signal — every tutorial, every blog post, every “how do I MCP” reply on Reddit defaults to Context7. DeepWiki has no public repo because Cognition runs it as a hosted service; the only relevant licensing question is the terms-of-service on deepwiki.com itself. Ref’s MIT shim is misleading if you’re hunting for self-hosting — the search index is closed-source and lives behind a paid API key.

Context7 — install + recipe

Upstash’s Context7 leads on raw adoption. The pitch on their GitHub README: “Up-to-date Code Docs For Any Prompt.” The mechanism: append use context7 to a prompt; the MCP server first calls resolve-library-id to map the package name to a Context7-compatible /org/project ID, then query-docs returns the current docs and snippets, injected directly into the model’s context window.

Free tier. $0 includes 1,000 API calls per month, public repos, OAuth 2.0, and 20 bonus daily calls after the limit (verbatim from context7.com/plans). Pro is $10/seat/month for private repos and effectively unlimited calls (5,000 included per member, $10 per 1,000 overage). Enterprise adds SOC-2, SSO, and a self-hosted option.
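A quick back-of-envelope on what Pro actually costs at volume, using only the published numbers (5,000 calls included per member, $10/seat base, $10 per 1,000 overage) and assuming overage bills in whole 1,000-call blocks rather than prorated, which their page doesn't specify:

```shell
# Monthly Pro cost per seat for N calls, assuming overage is billed
# in whole 1,000-call blocks (numbers from context7.com/plans).
context7_pro_cost() {
  local calls=$1
  local over=$(( calls > 5000 ? calls - 5000 : 0 ))
  local blocks=$(( (over + 999) / 1000 ))   # round up to full blocks
  echo $(( 10 + blocks * 10 ))
}

context7_pro_cost 4000   # within the 5,000 included -> $10
context7_pro_cost 8000   # 3,000 over = 3 blocks -> $40
```

At 8,000 calls a month you're at roughly $40/seat; an agent loop that fires hundreds of lookups a day changes that math fast.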

License. MIT. Self-hostable — that’s the cleanest answer to the cluster of “free alternative to context7” queries: Context7 itself can be the free alternative, since the source is open and the container is shippable.

One-line install · Context7

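If you just want the hosted endpoint wired in, here is a minimal config sketch. This is the generic `mcpServers` remote-URL shape that clients like Cursor read; the file name and location vary by client, and some clients (Claude Desktop, notably) may need a stdio shim for remote URLs, so treat this as the shape rather than the exact file.

```shell
# Minimal remote-server entry for Context7's hosted endpoint
# (mcp.context7.com/mcp, from the matrix above). Merge into your
# client's own MCP config file; the path differs per editor.
cat > context7-mcp.json <<'EOF'
{
  "mcpServers": {
    "context7": {
      "url": "https://mcp.context7.com/mcp"
    }
  }
}
EOF
```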

Recipe — Next.js 15 App Router migration. In a Cursor chat, with Context7 installed:

Generate a Next.js 15 App Router page that streams a database query
from Drizzle ORM. Use server components, no "use client". Stream via
Suspense boundaries. use context7

The model invokes resolve-library-id for both next and drizzle-orm, fetches the version-current docs for App Router streaming and Drizzle’s async helpers, and writes a route that uses the actual 15.x API surface — not the App Router 13.x layout the model otherwise hallucinates from training data. This is the workflow Thoughtworks named in placing Context7 in the Trial ring of their Technology Radar.

DeepWiki — install + recipe

Cognition AI launched DeepWiki on May 5, 2025 as “the free public version of Devin Wiki and Devin Search.” The MCP server, launched May 22, exposes Cognition’s index of public GitHub repos via three tools: read_wiki_structure, read_wiki_contents, and ask_question. Cognition’s own pitch: at launch, the index covered “the top 50,000 most popular GitHub repositories”.

Free tier. Verbatim from the Devin docs: “a free, remote, no-authentication-required service.” Public repos only. Private-repo coverage requires a Devin account and the separate Devin MCP server.

License. N/A in the conventional sense — Cognition has no public source repo for the MCP server; it runs as a hosted service. There’s a community wrapper at github.com/regenrek/deepwiki-mcp that scrapes deepwiki.com via Markdown conversion (MIT, 1,338 stars), but the README now states “currently not working since DeepWiki has cut off the possibility to scrape it” — so the official MCP endpoint is the only programmatic path.

One-line install · DeepWiki

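Because the hosted endpoint needs no authentication for public repos, the config is the simplest of the four, just a remote URL. Same caveat as above: this is the generic `mcpServers` shape, and file location varies by client.

```shell
# DeepWiki's hosted endpoint (mcp.deepwiki.com/mcp) — no API key,
# no account, public repos only. Merge into your client's MCP config.
cat > deepwiki-mcp.json <<'EOF'
{
  "mcpServers": {
    "deepwiki": {
      "url": "https://mcp.deepwiki.com/mcp"
    }
  }
}
EOF
```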

Recipe — orient yourself in an unfamiliar repo. In Claude Code, with DeepWiki installed, you’re onboarded to a new repository and need a quick architectural overview before you start touching files:

Use the DeepWiki MCP. Read the wiki structure for the
modelcontextprotocol/python-sdk repo, then call ask_question:
"Where is the StreamableHTTP transport implemented and what
class names should I look at?" Cite the file paths in the
answer.

The model calls read_wiki_structure, picks the Architecture section, then ask_question drives a narrowed answer. This is the same pattern Andrej Karpathy reportedly used to extract a training feature from a library as a standalone module.

Honesty bit. DeepWiki’s auto-generated wikis are good for orientation but can confidently hallucinate on niche or complex codebases. On the launch HN thread, user buovjaga reported DeepWiki listed Buck as LibreOffice’s primary build system, which is factually wrong; jcranmer documented similar inaccuracies on the LLVM page. Use it to navigate, not to trust as authoritative on complex projects.

Ref Tools — install + recipe

Ref Tools is the smallest of the four by stars (1,092) but the most-cited when developers complain about Context7’s token bloat. The pitch on their homepage: “Context for your coding agent. Give your agent the docs it needs to succeed. Exactly the tokens you need, no bloat.” Two tools the agent calls: ref_search_documentation and ref_read_url. The token-efficiency mechanism is a 5k-token cap on every page read combined with session-aware filtering that never returns repeat results inside a single session.

Free tier. 200 credits — and importantly, that’s a one-time grant, not monthly (verified from Ref’s own JS bundle plans-CjSGq3Er.js: { initialCredits: 200, name: 'Free' }). Heavy daily users hit a paywall fast. Basic is $19/month for 2,000 credits; Pro $50/month for 6,000 credits; Max $200/mo for 30,000.

License. The npm shim (github.com/ref-tools/ref-tools-mcp) is MIT, but the search index lives behind api.ref.tools and requires REF_API_KEY. Not self-hostable. If “open source” means “run my own search index,” Ref doesn’t qualify.

One-line install · Ref Tools

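Ref is the one server here where a key is mandatory. A config sketch for the npm stdio shim, with two stated assumptions: the package name `ref-tools-mcp` is inferred from the repo name, and `REF_API_KEY` is the env var the shim's README documents. Replace `YOUR_KEY` with your actual key; check Ref's own install card if your client prefers the remote `api.ref.tools` URL instead.

```shell
# Ref Tools via the MIT npm shim (stdio transport). The search index
# itself stays behind api.ref.tools and requires the key.
cat > ref-mcp.json <<'EOF'
{
  "mcpServers": {
    "ref": {
      "command": "npx",
      "args": ["-y", "ref-tools-mcp"],
      "env": { "REF_API_KEY": "YOUR_KEY" }
    }
  }
}
EOF
```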

Recipe — long-running agent on a private repo. You’re running a Claude Opus background agent on a refactor that touches both your private codebase and the Stripe API docs. Context budget matters because the conversation will run for an hour.

Use ref-tools. Search documentation for "stripe customer
portal sessions" and read the most relevant page. Then search
my private repo @stripe-billing for existing CustomerPortalSession
usage. Pair the results into a migration plan that switches us
from the old Checkout flow.

Ref’s session-aware filtering means the agent won’t re-fetch the same Stripe page on its second search. The 5k-token cap on each page read keeps the budget tight enough that the agent can run hours without context rot. This is the workflow Ray Fernando demonstrated in his “Two MCPs That Save 97% of Your Context Window” walkthrough — Ref + Exa as the canonical doc-search pair for long-context agent runs.

Docfork — install + recipe

Docfork is the newest of the four and explicitly positions itself against Context7 — their README FAQ has a section titled “How is Docfork different from Context7?” Two angles set them apart: stack scoping (the dgrep init CLI reads your package.json and limits searches to your declared dependencies, cached in .dgrep/config.json) and hybrid retrieval (semantic search plus BM25 fused via Reciprocal Rank Fusion, with AST-aware chunking).

Free tier. The README says “Free: 1,000 requests/month per organization”. The pricing page currently states “We’re currently offering free access to all features. Pro plans with additional benefits will be available soon.” Both are primary sources; we keep both verbatim rather than reconciling them. Pro pricing is not yet published.

License. MIT (verified via the LICENSE file in github.com/docfork/docfork).

One-line install · Docfork

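The hosted endpoint follows the same remote-URL shape as the others; `dgrep init` (run separately in your project) handles the stack scoping. As before, this is the generic `mcpServers` shape and the config path varies by client.

```shell
# Docfork's hosted endpoint (mcp.docfork.com/mcp, from the matrix).
# Run `dgrep init` in the project afterwards to scope searches to
# your package.json dependencies.
cat > docfork-mcp.json <<'EOF'
{
  "mcpServers": {
    "docfork": {
      "url": "https://mcp.docfork.com/mcp"
    }
  }
}
EOF
```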

Recipe — stack-scoped Next.js + Tailwind + Drizzle help.

Use Docfork. Run dgrep init in this Next.js 15 app to map
package.json to canonical libraries. Then search_docs for
"Tailwind v4 container queries" and fetch_doc for the top
result. Apply the recipe to components/Pricing.tsx.

Because dgrep init resolves the manifest once and caches it, every subsequent search_docs call in the session is automatically scoped to your declared versions. You don’t get React Native results when you’re working on a Next.js project — a problem Context7 occasionally has when the model picks the wrong /org/project in resolve-library-id.

Honesty bit. Docfork’s third-party community signal is thin. We could only verify a single Hacker News comment — and it’s from frodo999, a member of the Docfork team announcing the launch (HN 44063703). The mechanics in the README are well documented, and the MIT-licensed repo is real and actively committed (last commit 2026-04-19), but if social proof is a hard requirement, prefer Context7.

Pricing matrix

All numbers below are pulled directly from each tool’s pricing page or live JS bundle on 2026-04-30. Treat as a snapshot — the vendors update independently.

| Tier | Context7 | DeepWiki | Ref Tools | Docfork |
| --- | --- | --- | --- | --- |
| Free | 1,000 calls/mo + OAuth | Free, no auth (public repos) | 200 credits one-time | 1,000 req/mo / all features (currently) |
| Entry paid | $10/seat/mo (Pro) | n/a — use Devin | $19/mo (Basic, 2k credits) | TBA |
| Mid | — | Devin (separate) | $50/mo (Pro, 6k credits) | TBA |
| Top | Custom (Enterprise) | — | $200/mo (Max, 30k credits) | TBA |
| Overage | $10 per 1,000 calls | n/a | $9–10 per 1,000 credits | TBA |
| Self-host | ✅ MIT, ship it yourself | ❌ Hosted only | ❌ Closed API | Unclear |

Free and open-source alternatives — what your real options are

The most-searched intent in this cluster is some variant of “free alternative to Context7”. The honest answer is that the framing is slightly off. Context7 itself is the open-source option. Here’s the actual decision tree:

Want to self-host docs RAG with the same UX?

Run Context7 itself. The repo is MIT, the streamable-http server is shippable, and you can point it at your own library catalog. It’s the canonical “self-hosted Context7 alternative.”

Want a hosted free option, no auth?

DeepWiki for public-repo Q&A — the Cognition-hosted endpoint at mcp.deepwiki.com/mcp requires no sign-in.

Want token efficiency AND open source?

There isn’t a perfect option in this set. Ref Tools optimizes hardest for tokens but is closed-API. Your best move is Context7 self-hosted with aggressive trimming on snippet length, or wait for Docfork to publish Pro pricing.

Want a free-forever paid tool with no usage cap?

Doesn’t exist. Every hosted option has a quota or a paid tier. The only path to “unbounded free” is self-hosting Context7.

Benchmark them yourself

We don’t publish a one-shot latency benchmark in this post. Region, time of day, model, and prompt shape all affect the result enough that a single run from one machine isn’t representative — and we’d rather show you the methodology than fabricate numbers. Spend 15 minutes:

# Pick 5 prompts that represent your real workflow.
# Example for a Next.js dev:
PROMPTS=(
  "How do I stream a Drizzle query in a Next.js 15 server component?"
  "Tailwind v4 container queries — example with a card grid"
  "Riverpod 3 AsyncNotifier migration from StateNotifier"
  "FastAPI lifespan events for background tasks"
  "Stripe customer portal session creation"
)

# Run each prompt through your agent 3 times per server.
# Capture: total turn latency, tokens consumed, whether the
# answer compiled / passed your linter, whether the cited
# version matched your installed version.

# Compare:
#   - Context7 (use context7 suffix)
#   - DeepWiki (read_wiki_structure → ask_question)
#   - Ref Tools (ref_search_documentation → ref_read_url)
#   - Docfork (dgrep init → search_docs → fetch_doc)
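Once you've captured one latency sample per line per server (however your agent harness logs them; the file names below are just examples), a few lines of awk summarize each run so the comparison is numbers rather than vibes:

```shell
# Summarize a file of latency samples in ms, one per line:
# sample count, mean, and worst case.
lat_summary() {
  awk '{ s += $1; if ($1 > mx) mx = $1; n++ }
       END { printf "n=%d mean=%.1f max=%d\n", n, s / n, mx }' "$1"
}

# Example with three fake samples:
printf '900\n1200\n1500\n' > context7.lat
lat_summary context7.lat   # n=3 mean=1200.0 max=1500
```

Run the same summary per server per prompt, and keep the compile/lint pass-fail column next to it; a fast server that cites the wrong version loses anyway.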

The result is workload-specific. We’ve seen Ref win on long agent runs (the 5k cap genuinely matters past ~30 turns), Context7 win on short single-prompt asks (its snippet quality is high when the library is well-indexed), DeepWiki win on architectural orientation, and Docfork win when stack-scoping prevents wrong-library matches. Your workload may shape the answer differently — run the methodology, don’t trust ours.

Common pitfalls (regardless of which one you pick)

Treating any of these as “the truth”

They’re retrieval layers, not authority. DeepWiki can hallucinate (Buck-as-LibreOffice-build-system). Context7’s coverage is community-contributed. Always sanity-check against the official docs when stakes are high.

Stacking three of them in one config

Multiplies tool descriptions in every prompt (Context7’s ~2 tools, DeepWiki’s 3, Ref’s 2, Docfork’s 2 — that’s 9 tool descriptions eating context). Pick one or at most two. Three is almost always wrong.

Forgetting the trigger suffix

Context7 specifically requires you to type use context7 at the end of the prompt — the other three rely on the model picking the tool from its description. If Context7 isn’t firing, this is the reason 99% of the time.

Burning the Ref free tier in one weekend

The 200 credits are a one-time grant, not monthly. Heavy agent workflows can blow through this in a single session. If you’re evaluating, run a fixed, bounded test — don’t hand Ref to a long-running overnight loop.

Community signal

Four voices that capture the different reasons developers pick each tool. Verbatim, with sources.

“The skip-to-the-end answer: Context7 MCP is so good it seems like magic, even to many well-informed, highly capable hackers. Simply wildly good for libraries and SDKs.”

metadat · Hacker News. Top-level comment on an HN thread answering “what’s your must-have MCP server” — the strongest unsolicited Context7 endorsement.

“In our experience, Context7 has greatly reduced code hallucinations and reliance on stale training data.”

Thoughtworks Technology Radar · Blog. Thoughtworks placed Context7 in the Trial ring of their Technology Radar — credible enterprise-engineering signal.

“Context7 injects a *huge* amount of tokens into your context, which leads to a very low signal/noise ratio. I’m using https://ref.tools myself, it delivers much more targeted docs.”

stingraycharles · Hacker News. The canonical Context7-vs-Ref signal-to-noise critique on Hacker News.

“Deepwiki was instrumental in our refactor of a large codebase away from playwright to pure CDP...one of the few strictly net positive AI coding tools.”

nikisweeting · Hacker News. From the 231-point HN launch thread for DeepWiki. Captures the architectural-orientation use case.

Frequently asked questions

What's the simplest way to compare Context7, DeepWiki, Ref Tools, and Docfork?

Context7 injects version-specific code snippets when you append "use context7" to a prompt. DeepWiki turns any public GitHub repo into a queryable wiki. Ref Tools is a token-efficient docs search that works across public docs, your private repos, PDFs, and uploads. Docfork is a stack-scoped docs MCP that reads your package.json and limits search to your declared dependencies. Pick by intent, not by hype.

Is Context7 free? What's the free-tier limit?

Yes — Context7's Free plan ($0) includes 1,000 API calls per month, with 20 bonus daily calls after the limit, plus access to public repos and OAuth 2.0. Pro is $10 per seat per month with private repos and unlimited calls (5,000 included per member, $10 per 1,000 overage). Source: context7.com/plans.

Is DeepWiki free? Do I need a Cognition / Devin account?

Yes — Cognition's official DeepWiki MCP server at https://mcp.deepwiki.com/mcp is described in their own docs as "a free, remote, no-authentication-required service" for public repositories. You only need a Devin account if you want private-repo indexing through Devin's separate MCP server.

Is there a free alternative to Context7 I can self-host?

Context7 itself is MIT-licensed and self-hostable — that's the most direct answer. If you want a hosted free option for repo-specific Q&A, DeepWiki's official MCP server is free with no auth required for public repos. Ref Tools is closed behind a paid API key (the npm shim is MIT but the search index is not self-hostable). Docfork's pricing page currently states "all features are free" while Pro tiers are forthcoming.

Context7 vs DeepWiki — which is better for my codebase?

Different tools, different jobs. Use Context7 when you're coding against fast-moving frameworks (Next.js 15, React 19, Tailwind 4) and want version-pinned snippets injected mid-prompt. Use DeepWiki when you're trying to understand an unfamiliar repo — its tools (read_wiki_structure, read_wiki_contents, ask_question) are designed for codebase orientation, not snippet retrieval.

Why do people switch from Context7 to Ref Tools?

The published reason — verbatim from a Hacker News comment by user stingraycharles on a thread comparing the two — is signal-to-noise: "Context7 injects a huge amount of tokens into your context, which leads to a very low signal/noise ratio. I'm using https://ref.tools myself, it delivers much more targeted docs." Ref's own marketing claims roughly 55% fewer tokens at the same or better accuracy; that figure comes from Ref's docs and the founder, not an independent benchmark.

Can I use these MCP servers with Cursor, VS Code, and Claude Code?

Yes — all four expose either a remote MCP endpoint, a stdio command, or both. Each tool's install card on this page covers Cursor, VS Code, Claude Desktop, Claude Code, Gemini, Codex, Windsurf, ChatGPT, and a manual JSON config. The remote-URL transport (streamable HTTP) is the recommended path for hosted services like Context7 (mcp.context7.com), DeepWiki (mcp.deepwiki.com), Ref (api.ref.tools), and Docfork (mcp.docfork.com).

Are there other docs-RAG MCP servers worth knowing about?

Yes — for code-specific RAG, the Mermaid and Excalidraw MCP servers handle a different shape of the problem (they generate, not retrieve), and the Notion / Confluence / Atlassian MCP servers work as RAG sources if your team's docs already live there. For raw web search instead of curated docs, Tavily, Exa, and Brave Search MCP servers are the closest neighbours. Browse the full list on /servers or our /best-mcp-servers roundup.

Sources

Context7

DeepWiki

Ref Tools

Docfork
