MotherDuck & DuckDB


Official server by motherduckdb

Connects AI assistants to DuckDB databases (local, MotherDuck, or S3) for SQL analytics and data queries.



What it does

  • Execute SQL queries on DuckDB databases
  • Query local DuckDB files and in-memory databases
  • Connect to MotherDuck cloud databases
  • Access S3-hosted DuckDB databases
  • Switch between database connections
  • Browse database catalogs and schemas

Best for

  • Data analysts querying structured datasets
  • SQL analytics through AI assistants
  • Data engineering workflows
  • Cross-database querying between local and cloud
  • Works with local files and cloud databases
  • Read-only mode by default for security
  • Alternative to the fully-managed remote option

About MotherDuck & DuckDB

MotherDuck & DuckDB is an official MCP server published by motherduckdb that provides AI assistants with tools and capabilities via the Model Context Protocol. It integrates MotherDuck and local DuckDB for flexible querying and analysis of structured data in MCP-compatible environments. It is categorized under databases and analytics data. This server exposes 1 tool that AI clients can invoke during conversations and coding sessions.

How to install

You can install MotherDuck & DuckDB in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

MotherDuck & DuckDB is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

Tools (1)

query

Use this to execute a query on the MotherDuck or DuckDB database

MotherDuck / DuckDB Local MCP Server

SQL analytics and data engineering for AI Assistants and IDEs.


Connect AI assistants to your data using DuckDB's powerful analytical SQL engine. The server supports connecting to local DuckDB files, in-memory databases, S3-hosted databases, and MotherDuck. It allows executing read and write SQL queries, browsing database catalogs, and switching between different database connections on the fly.

Looking for a fully-managed remote MCP server for MotherDuck? Go to the MotherDuck Remote MCP docs.

Remote vs Local MCP

|                  | Remote MCP           | Local MCP (this repo)                                                                       |
|------------------|----------------------|---------------------------------------------------------------------------------------------|
| Hosting          | Hosted by MotherDuck | Runs locally/self-hosted                                                                     |
| Setup            | Zero-setup           | Requires local installation                                                                  |
| Access           | Read-write supported | Read-write supported                                                                         |
| Local filesystem | -                    | Query across local and remote databases, ingest data from / export data to local filesystem  |

📝 Migrating from v0.x?

  • Read-only by default: The server now runs in read-only mode by default. Add --read-write to enable write access. See Securing for Production.
  • Default database changed: --db-path default changed from md: to :memory:. Add --db-path md: explicitly for MotherDuck.
  • MotherDuck read-only requires read-scaling token: MotherDuck connections in read-only mode require a read-scaling token. Regular tokens require --read-write.

Quick Start

Prerequisites: Install uv via pip install uv or brew install uv

Connecting to In-Memory DuckDB (Dev Mode)

{
  "mcpServers": {
    "DuckDB (in-memory, r/w)": {
      "command": "uvx",
      "args": ["mcp-server-motherduck", "--db-path", ":memory:", "--read-write", "--allow-switch-databases"]
    }
  }
}

Full flexibility with no guardrails — read-write access and the ability to switch to any database (local files, S3, or MotherDuck) at runtime.

Connecting to a Local DuckDB File in Read-Only Mode

{
  "mcpServers": {
    "DuckDB (read-only)": {
      "command": "uvx",
      "args": ["mcp-server-motherduck", "--db-path", "/absolute/path/to/your.duckdb"]
    }
  }
}

Connects to a specific DuckDB file in read-only mode. The server won't hold the file lock, so it is convenient to use alongside a write connection to the same DuckDB file. You can also connect to remote DuckDB files on S3 using s3://bucket/path.duckdb — see Environment Variables for S3 authentication. If you're considering third-party access to the MCP, see Securing for Production.

Connecting to MotherDuck in Read-Write Mode

{
  "mcpServers": {
    "MotherDuck (local, r/w)": {
      "command": "uvx",
      "args": ["mcp-server-motherduck", "--db-path", "md:", "--read-write"],
      "env": {
        "motherduck_token": "<YOUR_MOTHERDUCK_TOKEN>"
      }
    }
  }
}

See Command Line Parameters for more options, Securing for Production for deployment guidance, and Troubleshooting if you encounter issues.

Client Setup

| Client         | Config Location                                          | One-Click Install          |
|----------------|----------------------------------------------------------|----------------------------|
| Claude Desktop | Settings → Developer → Edit Config                       | .mcpb (MCP Bundle)         |
| Claude Code    | Use CLI commands below                                   | -                          |
| Codex CLI      | Use CLI commands below                                   | -                          |
| Cursor         | Settings → MCP → Add new global MCP server               | Install in Cursor          |
| VS Code        | Ctrl+Shift+P → "Preferences: Open User Settings (JSON)"  | Install with UV in VS Code |

Any MCP-compatible client can use this server. Add the JSON configuration from Quick Start to your client's MCP config file. Consult your client's documentation for the config file location.

Claude Code CLI commands

In-Memory DuckDB (Dev Mode):

claude mcp add --scope user duckdb --transport stdio -- uvx mcp-server-motherduck --db-path :memory: --read-write --allow-switch-databases

Local DuckDB (Read-Only):

claude mcp add --scope user duckdb --transport stdio -- uvx mcp-server-motherduck --db-path /absolute/path/to/db.duckdb

MotherDuck (Read-Write):

claude mcp add --scope user motherduck --transport stdio --env motherduck_token=YOUR_TOKEN -- uvx mcp-server-motherduck --db-path md: --read-write

Codex CLI commands

In-Memory DuckDB (Dev Mode):

codex mcp add duckdb -- uvx mcp-server-motherduck --db-path :memory: --read-write --allow-switch-databases

Local DuckDB (Read-Only):

codex mcp add duckdb -- uvx mcp-server-motherduck --db-path /absolute/path/to/db.duckdb

MotherDuck (Read-Write):

codex mcp add motherduck --env motherduck_token=YOUR_TOKEN -- uvx mcp-server-motherduck --db-path md: --read-write

Tools

| Tool                        | Description                                                         | Required Inputs | Optional Inputs      |
|-----------------------------|---------------------------------------------------------------------|-----------------|----------------------|
| execute_query               | Execute SQL query (DuckDB dialect)                                  | sql             | -                    |
| list_databases              | List all databases (useful for MotherDuck or multiple attached DBs) | -               | -                    |
| list_tables                 | List tables and views                                               | -               | database, schema     |
| list_columns                | List columns of a table/view                                        | table           | database, schema     |
| switch_database_connection* | Switch to different database                                        | path            | create_if_not_exists |

*Requires --allow-switch-databases flag

All tools return JSON. Results are limited to 1024 rows / 50,000 chars by default (configurable via --max-rows, --max-chars).
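A rough sketch of how such row/character limits can be applied to a result set (illustrative only, not the server's actual code; the helper name and return shape are made up):

```python
import json

MAX_ROWS = 1024
MAX_CHARS = 50_000

def clip_result(rows, max_rows=MAX_ROWS, max_chars=MAX_CHARS):
    """Serialize at most max_rows rows to JSON, cap the text at max_chars,
    and report whether anything was dropped."""
    kept = rows[:max_rows]
    text = json.dumps(kept)
    truncated = len(rows) > max_rows or len(text) > max_chars
    return text[:max_chars], truncated

# 10 rows against a 5-row limit: output is clipped and flagged as truncated
text, truncated = clip_result(list(range(10)), max_rows=5, max_chars=50)
print(text, truncated)  # [0, 1, 2, 3, 4] True
```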

Securing for Production

When giving third parties access to a self-hosted MCP server, read-only mode alone is not sufficient — it still allows access to the local filesystem, changing DuckDB settings, and other potentially sensitive operations.

For production deployments with third-party access, we recommend MotherDuck Remote MCP — zero-setup, read-write capable, and hosted by MotherDuck.

Self-hosting MotherDuck MCP: Fork this repo and customize as needed. Use a service account with read-scaling tokens and enable SaaS mode to restrict local file access.

Self-hosting DuckDB MCP: Use --init-sql to apply security settings. See the Securing DuckDB guide for available options.

Command Line Parameters

| Parameter                          | Default                                       | Description                                                                                                                                 |
|------------------------------------|-----------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------|
| --db-path                          | :memory:                                      | Database path: local file (absolute), md: (MotherDuck), or s3:// URL                                                                          |
| --motherduck-token                 | motherduck_token env var                      | MotherDuck access token                                                                                                                       |
| --read-write                       | False                                         | Enable write access                                                                                                                           |
| --motherduck-saas-mode             | False                                         | MotherDuck SaaS mode (restricts local access)                                                                                                 |
| --allow-switch-databases           | False                                         | Enable switch_database_connection tool                                                                                                        |
| --max-rows                         | 1024                                          | Max rows returned                                                                                                                             |
| --max-chars                        | 50000                                         | Max characters returned                                                                                                                       |
| --query-timeout                    | -1                                            | Query timeout in seconds (-1 = disabled)                                                                                                      |
| --init-sql                         | None                                          | SQL to execute on startup                                                                                                                     |
| --motherduck-connection-parameters | session_hint=mcp&dbinstance_inactivity_ttl=0s | Additional MotherDuck connection string parameters (key=value pairs separated by &)                                                           |
| --ephemeral-connections            | True                                          | Use temporary connections for read-only local files                                                                                           |
| --transport                        | stdio                                         | Transport type: stdio or http                                                                                                                 |
| --stateless-http                   | False                                         | For protocol compatibility only (e.g. with AWS Bedrock AgentCore Runtime). Server still maintains global state via the shared DatabaseClient. |
| --port                             | 8000                                          | Port for HTTP transport                                                                                                                       |
| --host                             | 127.0.0.1                                     | Host for HTTP transport                                                                                                                       |

Environment Variables

| Variable                             | Description                                                                      |
|--------------------------------------|----------------------------------------------------------------------------------|
| motherduck_token or MOTHERDUCK_TOKEN | MotherDuck access token (alternative to --motherduck-token)                      |
| HOME                                 | Used by DuckDB for extensions and config. Override with --home-dir if not set.   |
| AWS_ACCESS_KEY_ID                    | AWS access key for S3 database connections                                       |
| AWS_SECRET_ACCESS_KEY                | AWS secret key for S3 database connections                                       |
| AWS_SESSION_TOKEN                    | AWS session token for temporary credentials (IAM roles, SSO, EC2 instance profiles) |
`AWS_D

README truncated. View full README on GitHub.
