
NotebookLM
Connects CLI agents like Claude and Cursor directly to Google's NotebookLM to get accurate, citation-backed answers from your own documents without hallucinations.
Empower your CLI agents with zero-hallucination answers from your own docs. NotebookLM MCP Server connects AI tools like Claude, Cursor, and Codex directly to Google’s NotebookLM, ensuring every answer is accurate, current, and citation-backed. Skip error-prone manual copy-paste—let your AI assistant research, synthesize, and reference across your documents for confident coding. All queries are grounded in your uploads, eliminating invented APIs or outdated info. Manage notebooks, automate deep research, and achieve seamless collaboration between multiple tools using a shared, always-relevant knowledge base. Spend less time debugging and more time building with trustworthy, source-based answers.
What it does
- Query NotebookLM notebooks for document-based answers
- Manage multiple NotebookLM notebooks and sessions
- Search across notebook libraries by topic or content
- Switch between different notebook contexts automatically
- Get usage statistics and session history
About NotebookLM
NotebookLM is a community-built MCP server published by PleasePrompto that provides AI assistants with tools and capabilities via the Model Context Protocol. It connects CLI agents to Google's NotebookLM for citation-backed answers grounded in your own documents. It is categorized under productivity. This server exposes 16 tools that AI clients can invoke during conversations and coding sessions.
How to install
You can install NotebookLM in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
NotebookLM is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
Tools (16)
# Conversational Research Partner (NotebookLM • Gemini 2.5 • Session RAG)

## No Active Notebook
- Visit https://notebooklm.google to create a notebook and get a share link
- Use **add_notebook** to add it to your library (explains how to get the link)
- Use **list_notebooks** to show available sources
- Use **select_notebook** to set one active

> Auth tip: If login is required, use the prompt 'notebooklm.auth-setup' and then verify with the 'get_health' tool. If authentication later fails (e.g., expired cookies), use the prompt 'notebooklm.auth-repair'.

Tip: Tell the user you can manage the NotebookLM library and ask which notebook to use for the current task.
PERMISSION REQUIRED — Only when user explicitly asks to add a notebook.

## Conversation Workflow (Mandatory)
When the user says: "I have a NotebookLM with X"
1) Ask URL: "What is the NotebookLM URL?"
2) Ask content: "What knowledge is inside?" (1–2 sentences)
3) Ask topics: "Which topics does it cover?" (3–5)
4) Ask use cases: "When should we consult it?"
5) Propose metadata and confirm:
   - Name: [suggested]
   - Description: [from user]
   - Topics: [list]
   - Use cases: [list]
   "Add it to your library now?"
6) Only after explicit "Yes" → call this tool

## Rules
- Do not add without user permission
- Do not guess metadata — ask concisely
- Confirm summary before calling the tool

## Example
User: "I have a notebook with n8n docs"
You: Ask URL → content → topics → use cases; propose summary
User: "Yes"
You: Call add_notebook

## How to Get a NotebookLM Share Link
Visit https://notebooklm.google/ → Login (free: 100 notebooks, 50 sources each, 500k words, 50 daily queries)
1) Click "+ New" (top right) → Upload sources (docs, knowledge)
2) Click "Share" (top right) → Select "Anyone with the link"
3) Click "Copy link" (bottom left) → Give this link to Claude
(Upgraded: Google AI Pro/Ultra gives 5x higher limits)
List all library notebooks with metadata (name, topics, use cases, URL). Use this to present options, then ask which notebook to use for the task.
Get detailed information about a specific notebook by ID
Set a notebook as the active default (used when ask_question has no notebook_id).

## When To Use
- User switches context: "Let's work on React now"
- User asks explicitly to activate a notebook
- Obvious task change requires another notebook

## Auto-Switching
- Safe to auto-switch if the context is clear and you announce it: "Switching to React notebook for this task..."
- If ambiguous, ask: "Switch to [notebook] for this task?"

## Example
User: "Now let's build the React frontend"
You: "Switching to React notebook..." (call select_notebook)
NotebookLM MCP Server
Let your CLI agents (Claude, Cursor, Codex...) chat directly with NotebookLM for zero-hallucination answers based on your own notebooks
Installation • Quick Start • Why NotebookLM • Examples • Claude Code Skill • Documentation
The Problem
When you tell Claude Code or Cursor to "search through my local documentation", here's what happens:
- Massive token consumption: Searching through documentation means reading multiple files repeatedly
- Inaccurate retrieval: Searches for keywords, misses context and connections between docs
- Hallucinations: When it can't find something, it invents plausible-sounding APIs
- Expensive & slow: Each question requires re-reading multiple files
The Solution
Let your local agents chat directly with NotebookLM — Google's zero-hallucination knowledge base powered by Gemini 2.5 that provides intelligent, synthesized answers from your docs.
Your Task → Local Agent asks NotebookLM → Gemini synthesizes answer → Agent writes correct code
The real advantage: No more manual copy-paste between NotebookLM and your editor. Your agent asks NotebookLM directly and gets answers straight back in the CLI. It builds deep understanding through automatic follow-ups — Claude asks multiple questions in sequence, each building on the last, getting specific implementation details, edge cases, and best practices. You can save NotebookLM links to your local library with tags and descriptions, and Claude automatically selects the relevant notebook based on your current task.
Why NotebookLM, Not Local RAG?
| Approach | Token Cost | Setup Time | Hallucinations | Answer Quality |
|---|---|---|---|---|
| Feed docs to Claude | 🔴 Very high (multiple file reads) | Instant | Yes - fills gaps | Variable retrieval |
| Web search | 🟡 Medium | Instant | High - unreliable sources | Hit or miss |
| Local RAG | 🟡 Medium-High | Hours (embeddings, chunking) | Medium - retrieval gaps | Depends on setup |
| NotebookLM MCP | 🟢 Minimal | 5 minutes | Zero - refuses if unknown | Expert synthesis |
What Makes NotebookLM Superior?
- Pre-processed by Gemini: Upload docs once, get instant expert knowledge
- Natural language Q&A: Not just retrieval — actual understanding and synthesis
- Multi-source correlation: Connects information across 50+ documents
- Citation-backed: Every answer includes source references
- No infrastructure: No vector DBs, embeddings, or chunking strategies needed
Installation
Claude Code
claude mcp add notebooklm npx notebooklm-mcp@latest
Codex
codex mcp add notebooklm -- npx notebooklm-mcp@latest
Gemini
gemini mcp add notebooklm npx notebooklm-mcp@latest
Cursor
Add to ~/.cursor/mcp.json:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["-y", "notebooklm-mcp@latest"]
}
}
}
amp
amp mcp add notebooklm -- npx notebooklm-mcp@latest
VS Code
code --add-mcp '{"name":"notebooklm","command":"npx","args":["notebooklm-mcp@latest"]}'
Other MCP clients
Generic MCP config:
{
"mcpServers": {
"notebooklm": {
"command": "npx",
"args": ["notebooklm-mcp@latest"]
}
}
}
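For clients not listed above, every config variant boils down to the same thing: spawn `npx notebooklm-mcp@latest` as a child process and exchange JSON-RPC 2.0 messages over its stdin/stdout (the stdio transport). As an illustration only — the client name, version, and protocol date below are placeholders, not values this server requires — here is roughly the first message an MCP client writes to the server's stdin:

```python
import json

# Sketch of an MCP client's opening "initialize" request over stdio.
# Field values are illustrative; consult the MCP specification for
# the protocol version your client actually negotiates.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# The stdio transport frames one JSON-RPC message per line on the
# server's stdin; the server answers on stdout the same way.
wire = json.dumps(initialize) + "\n"
print(wire, end="")
```

After the handshake, the client can list the server's 16 tools (`tools/list`) and invoke them (`tools/call`) — which is what Claude, Cursor, and Codex do for you behind the scenes.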
Alternative: Claude Code Skill
Prefer Claude Code Skills over MCP? This server is now also available as a native Claude Code Skill with a simpler setup:
NotebookLM Claude Code Skill - Clone to ~/.claude/skills and start using immediately
Key differences:
- MCP Server (this repo): Persistent sessions, works with Claude Code, Codex, Cursor, and other MCP clients
- Claude Code Skill: Simpler setup, Python-based, stateless queries, works only with local Claude Code
Both use the same browser automation technology and provide zero-hallucination answers from your NotebookLM notebooks.
Quick Start
1. Install the MCP server (see Installation above)
2. Authenticate (one-time)
Say in your chat (Claude/Codex):
"Log me in to NotebookLM"
A Chrome window opens → log in with Google
3. Create your knowledge base
Go to notebooklm.google.com → Create notebook → Upload your docs:
- 📄 PDFs, Google Docs, markdown files
- 🔗 Websites, GitHub repos
- 🎥 YouTube videos
- 📚 Multiple sources per notebook
Share: ⚙️ Share → Anyone with link → Copy
4. Let Claude use it
"I'm building with [library]. Here's my NotebookLM: [link]"
That's it. Claude now asks NotebookLM whatever it needs, building expertise before writing code.
Real-World Example
Building an n8n Workflow Without Hallucinations
Challenge: n8n's API is new — Claude hallucinates node names and functions.
Solution:
- Downloaded complete n8n documentation → merged into manageable chunks
- Uploaded to NotebookLM
- Told Claude: "Build me a Gmail spam filter workflow. Use this NotebookLM: [link]"
Watch the AI-to-AI conversation:
Claude → "How does Gmail integration work in n8n?"
NotebookLM → "Use Gmail Trigger with polling, or Gmail node with Get Many..."
Claude → "How to decode base64 email body?"
NotebookLM → "Body is base64url encoded in payload.parts, use Function node..."
Claude → "How to parse OpenAI response as JSON?"
NotebookLM → "Set responseFormat to json, use {{ $json.spam }} in IF node..."
Claude → "What about error handling if the API fails?"
NotebookLM → "Use Error Trigger node with Continue On Fail enabled..."
Claude → ✅ "Here's your complete workflow JSON..."
Result: Perfect workflow on first try. No debugging hallucinated APIs.
Core Features
Zero Hallucinations
NotebookLM refuses to answer if information isn't in your docs. No invented APIs.
Autonomous Research
Claude asks follow-up questions automatically, building complete understanding before coding.
Smart Library Management
Save NotebookLM links with tags and descriptions. Claude auto-selects the right notebook for your task.
"Add [link] to library tagged 'frontend, react, components'"
Deep, Iterative Research
- Claude automatically asks follow-up questions to build complete understanding
- Each answer triggers deeper questions until Claude has all the details
- Example: For n8n workflow, Claude asked multiple sequential questions about Gmail integration, error handling, and data transformation
Cross-Tool Sharing
Set up once, use everywhere. Claude Code, Codex, Cursor — all share the same library.
Deep Cleanup Tool
Fresh start anytime. Scans entire system for NotebookLM data with categorized preview.
Tool Profiles
Reduce token usage by loading only the tools you need. Each tool consumes context tokens — fewer tools = faster responses and lower costs.
Available Profiles
| Profile | Tools | Use Case |
|---|---|---|
| minimal | 5 | Query-only: ask_question, get_health, list_notebooks, select_notebook, get_notebook |
| standard | 10 | + Library management: setup_auth, list_sessions, add_notebook, update_notebook, search_notebooks |
| full | 16 | All tools including cleanup_data, re_auth, remove_notebook, reset_session, close_session, get_library_stats |
Configure via CLI
# Check current settings
npx notebooklm-mcp config get
# Set a profile
npx notebooklm-mcp config set profile minimal
npx notebooklm-mcp config set profile standard
npx notebooklm-mcp config set profile full
# Disable specific tools (comma-separated)
npx notebooklm-mcp config set disabled-tools "cleanup_data,re_auth"
# Reset to defaults
npx notebooklm-mcp config reset
Configure via Environment Variables
# Set profile
export NOTEBOOKLM_PROFILE=minimal
# Disable specific tools
export NOTEBOOKLM_DISABLED_TOOLS="cleanup_data,re_auth,remove_notebook"
Settings are saved to ~/.config/notebooklm-mcp/settings.json and persist across sessions. Environment variables override file settings.
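The resolution order described above can be sketched as follows. This is an illustrative model of the precedence, not the server's actual code, and the default profile name here is an assumption — run `npx notebooklm-mcp config get` to see the real default:

```python
DEFAULT_PROFILE = "standard"  # assumed default, not confirmed by the docs

def resolve_profile(env: dict, file_settings: dict) -> str:
    """Environment variable beats settings file beats built-in default."""
    return (
        env.get("NOTEBOOKLM_PROFILE")
        or file_settings.get("profile")
        or DEFAULT_PROFILE
    )

# The settings file applies when no env var is set...
print(resolve_profile({}, {"profile": "minimal"}))
# ...but the env var wins when both are present.
print(resolve_profile({"NOTEBOOKLM_PROFILE": "full"}, {"profile": "minimal"}))
```

The same precedence applies to `NOTEBOOKLM_DISABLED_TOOLS` versus the `disabled-tools` setting in the file.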
Architecture
graph LR
A[Your Task] --> B[Claude/Codex]
B --> C[MCP Server]
C --> D[Chrome Automation]
D --> E[NotebookLM]
E --> F[Gemini 2.5]
F --> G[Your Docs]
G --> F
F --> E
E --> D
D --> C
C --> B
B --> H[Accurate Code]
Common Commands
| Intent | Say | Result |
|---|---|---|
| Authenticate | "Open NotebookLM auth setup" or "Log me in to NotebookLM" | Chrome opens for login |
| Add notebook | "Add [link] to library" | Saves notebook with metadata |
| List notebooks | "Show our notebooks" | Lists all saved notebooks |
| Research first | "Research this in NotebookLM before coding" | Multi-question session |
| Select notebook | "Use the React notebook" | S |
README truncated. View full README on GitHub.