
# @cyanheads/hn-mcp-server

MCP server for Hacker News. Feeds, threaded discussions, user profiles, and full-text search via the HN Firebase and Algolia APIs. Runs over stdio or HTTP.

Public Hosted Server: https://hn.caseyjhand.com/mcp
## Tools
Four read-only tools for accessing Hacker News data:
| Tool Name | Description |
|---|---|
| `hn_get_stories` | Fetch stories from an HN feed (top, new, best, ask, show, jobs) with pagination. |
| `hn_get_thread` | Get an item and its comment tree as a threaded discussion with depth/count controls. |
| `hn_get_user` | Fetch a user profile with karma, about, and optionally their recent submissions. |
| `hn_search_content` | Search stories and comments via Algolia with type, author, date, and score filters. |
### hn_get_stories
Fetch stories from any HN feed with pagination support.
- Six feed types: `top`, `new`, `best`, `ask`, `show`, `jobs`
- Configurable count (1–100, default 30) and offset for pagination
- Returns enriched story objects with title, URL, score, author, comment count, and body text
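A call might pass arguments like the following. The argument names here are illustrative guesses, not the server's published schema; check the tool definition for the exact names:

```json
{
  "feed": "show",
  "count": 20,
  "offset": 20
}
```

This would request the second page of 20 Show HN stories.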
### hn_get_thread
Retrieve an item and its full comment tree via ranked breadth-first traversal.
- Depth control (0–10, default 3) — depth 0 doubles as a single-item lookup
- Comment limit (1–200, default 50) caps total comments across all levels
- Breadth-first traversal preserves HN's ranking order
- Flat comment list with `depth`/`parentId` for tree reconstruction
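A client can rebuild the nested thread from that flat list. Here is a minimal sketch, assuming each comment carries `id`, `parentId`, and `depth` as described above and that parents appear before their children (which holds for breadth-first output); the `text` field and exact object shape are illustrative, not the server's schema:

```typescript
interface FlatComment {
  id: number;
  parentId: number;
  depth: number;
  text: string;
  children?: FlatComment[];
}

// Rebuild a nested tree from a flat, ranked comment list.
// Assumes parents always appear before their children (true for BFS output);
// comments whose parent is missing from the list are dropped silently.
function buildTree(flat: FlatComment[], rootId: number): FlatComment[] {
  const byId = new Map<number, FlatComment>();
  const roots: FlatComment[] = [];
  for (const c of flat) {
    const node = { ...c, children: [] as FlatComment[] };
    byId.set(node.id, node);
    if (node.parentId === rootId) {
      roots.push(node); // top-level comment on the story itself
    } else {
      byId.get(node.parentId)?.children!.push(node);
    }
  }
  return roots;
}

const flat: FlatComment[] = [
  { id: 2, parentId: 1, depth: 0, text: "first" },
  { id: 3, parentId: 1, depth: 0, text: "second" },
  { id: 4, parentId: 2, depth: 1, text: "reply to first" },
];
const tree = buildTree(flat, 1);
```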
### hn_get_user
Fetch a user profile with optional recent submission resolution.
- Profile includes karma, creation date, and about text (HTML stripped)
- Optionally resolves up to the 50 most recent submissions into full items
- Submission resolution filters out dead/deleted items
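The dead/deleted filtering mirrors the `dead` and `deleted` flags on HN Firebase API items. A sketch of that cleanup step, with the item shape abbreviated for illustration:

```typescript
interface HnItem {
  id: number;
  title?: string;
  dead?: boolean;    // flagged/killed items
  deleted?: boolean; // author-deleted items
}

// Keep only live items; nulls cover IDs that failed to resolve.
function liveItems(items: (HnItem | null)[]): HnItem[] {
  return items.filter(
    (it): it is HnItem => it !== null && !it.dead && !it.deleted,
  );
}
```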
### hn_search_content
Full-text search via the Algolia HN Search API.
- Filter by content type: `story`, `comment`, `ask_hn`, `show_hn`, `front_page`
- Filter by author, date range (ISO 8601), and minimum points
- Sort by relevance or date
- Pagination with page/count controls
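As a sketch, a filtered search request could combine the options above like this. The field names are illustrative, not the tool's actual schema; consult the tool definition for the real ones:

```json
{
  "query": "llm inference",
  "type": "story",
  "minPoints": 100,
  "dateStart": "2024-01-01T00:00:00Z",
  "sortBy": "date",
  "page": 0,
  "count": 20
}
```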
## Features
Built on @cyanheads/mcp-ts-core:
- Declarative tool definitions — single file per tool, framework handles registration and validation
- Unified error handling across all tools
- Structured logging with request correlation
- Runs locally (stdio/HTTP) from the same codebase
HN-specific:
- Concurrent batch fetching with configurable parallelism for item resolution
- HTML entity decoding and tag stripping with code block and link preservation
- No API keys required — HN APIs are public
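The concurrency-limited batch fetch can be sketched as a small worker pool. This is an illustrative pattern, not the server's actual implementation; `mapWithConcurrency` and its signature are hypothetical names:

```typescript
// Run async tasks over `items` with at most `limit` in flight at once.
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++; // claim the next index (safe: JS is single-threaded)
      results[i] = await fn(items[i]);
    }
  }
  // Spawn at most `limit` workers; each pulls the next unclaimed item.
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    () => worker(),
  );
  await Promise.all(workers);
  return results;
}
```

An item-resolution call would then look like `mapWithConcurrency(ids, HN_CONCURRENCY_LIMIT, fetchItem)`, where `fetchItem` is whatever function retrieves one item from the Firebase API.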
## Getting Started

### Public Hosted Instance
A public instance is available at https://hn.caseyjhand.com/mcp — no installation required. Point any MCP client at it via Streamable HTTP:
```json
{
  "mcpServers": {
    "hn": {
      "type": "streamable-http",
      "url": "https://hn.caseyjhand.com/mcp"
    }
  }
}
```
### Self-Hosted / Local
Add to your MCP client config (e.g., claude_desktop_config.json):
```json
{
  "mcpServers": {
    "hn-mcp-server": {
      "type": "stdio",
      "command": "bunx",
      "args": ["@cyanheads/hn-mcp-server@latest"]
    }
  }
}
```
Or with npx:
```json
{
  "mcpServers": {
    "hn-mcp-server": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@cyanheads/hn-mcp-server"]
    }
  }
}
```
### Prerequisites
- Bun v1.2.0 or higher (or Node.js >= 22)
### Installation
```shell
git clone https://github.com/cyanheads/hn-mcp-server.git
cd hn-mcp-server
bun install
```
## Configuration
All configuration is via environment variables. No API keys required — HN APIs are public.
| Variable | Description | Default |
|---|---|---|
| `HN_CONCURRENCY_LIMIT` | Max concurrent HTTP requests for batch item fetches (1–50). | `10` |
| `MCP_TRANSPORT_TYPE` | Transport: `stdio` or `http`. | `stdio` |
| `MCP_HTTP_PORT` | HTTP server port. | `3010` |
| `MCP_HTTP_HOST` | HTTP server host. | `localhost` |
| `MCP_LOG_LEVEL` | Log level: `debug`, `info`, `notice`, `warning`, `error`. | `info` |
| `LOGS_DIR` | Directory for log files (Node.js only). | `<project-root>/logs` |
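For example, a self-hosted HTTP instance with higher fetch parallelism could be started like this (a config sketch, assuming the production build described below):

```shell
MCP_TRANSPORT_TYPE=http MCP_HTTP_PORT=3010 HN_CONCURRENCY_LIMIT=25 bun run start:http
```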
## Running the Server

### Local Development
```shell
bun run dev:stdio   # Dev mode (stdio, auto-reload)
bun run dev:http    # Dev mode (HTTP, auto-reload)
bun run test        # Run test suite
bun run devcheck    # Lint + format + typecheck + audit
```
### Production
```shell
bun run build
bun run start:stdio   # or start:http
```
### Docker
```shell
docker build -t hn-mcp-server .
docker run -p 3010:3010 hn-mcp-server
```
## Project Structure
| Directory | Purpose |
|---|---|
| `src/index.ts` | `createApp()` entry point. |
| `src/config/` | Server-specific env var parsing with Zod. |
| `src/services/hn/` | HN Firebase + Algolia API client and domain types. |
| `src/mcp-server/tools/definitions/` | Tool definitions (`*.tool.ts`). |
## Development Guide
See CLAUDE.md for development guidelines and architectural rules. The short version:
- Handlers throw, framework catches: no `try/catch` in tool logic
- Use `ctx.log` for request-scoped logging
- All tools are read-only: no auth scopes required
## Contributing
Issues and pull requests are welcome. Run checks before submitting:
```shell
bun run devcheck
bun run test
```
## License
Apache-2.0 — see LICENSE for details.