
MotherDuck & DuckDB
Official. Connects AI assistants to DuckDB databases (local, MotherDuck, or S3) for SQL analytics and data queries.
Integrates MotherDuck and local DuckDB databases for flexible querying and analysis of structured data in MCP-compatible environments.
What it does
- Execute SQL queries on DuckDB databases
- Query local DuckDB files and in-memory databases
- Connect to MotherDuck cloud databases
- Access S3-hosted DuckDB databases
- Switch between database connections
- Browse database catalogs and schemas
About MotherDuck & DuckDB
MotherDuck & DuckDB is an official MCP server published by motherduckdb that provides AI assistants with tools and capabilities via the Model Context Protocol. It integrates MotherDuck and local DuckDB for flexible querying and analysis of structured data in MCP-compatible environments. It is categorized under databases and analytics data. This server exposes 1 tool that AI clients can invoke during conversations and coding sessions.
How to install
You can install MotherDuck & DuckDB in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
MotherDuck & DuckDB is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
Tools (1)
Use this to execute a query on the MotherDuck or DuckDB database
DuckDB / MotherDuck Local MCP Server
SQL analytics and data engineering for AI Assistants and IDEs.
Connect AI assistants to your data using DuckDB's powerful analytical SQL engine. Supports connecting to local DuckDB files, in-memory databases, S3-hosted databases, and MotherDuck. Allows executing SQL read- and write-queries, browsing database catalogs, and switching between different database connections on-the-fly.
Looking for a fully-managed remote MCP server for MotherDuck? → Go to the MotherDuck Remote MCP docs
Remote vs Local MCP
| | Remote MCP | Local MCP (this repo) |
|---|---|---|
| Hosting | Hosted by MotherDuck | Runs locally/self-hosted |
| Setup | Zero-setup | Requires local installation |
| Access | Read-write supported | Read-write supported |
| Local filesystem | - | Query across local and remote databases, ingest data from / export data to local filesystem |
📝 Migrating from v0.x?
- Read-only by default: The server now runs in read-only mode by default. Add `--read-write` to enable write access. See Securing for Production.
- Default database changed: The `--db-path` default changed from `md:` to `:memory:`. Add `--db-path md:` explicitly for MotherDuck.
- MotherDuck read-only requires a read-scaling token: MotherDuck connections in read-only mode require a read-scaling token. Regular tokens require `--read-write`.
Quick Start
Prerequisites: Install uv via `pip install uv` or `brew install uv`
Connecting to In-Memory DuckDB (Dev Mode)
```json
{
  "mcpServers": {
    "DuckDB (in-memory, r/w)": {
      "command": "uvx",
      "args": ["mcp-server-motherduck", "--db-path", ":memory:", "--read-write", "--allow-switch-databases"]
    }
  }
}
```
Full flexibility with no guardrails — read-write access and the ability to switch to any database (local files, S3, or MotherDuck) at runtime.
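As a sketch, you can also launch the server directly from a terminal to verify the installation before wiring it into a client (assumes `uv` is installed and `uvx` is on your PATH):

```shell
# Start the MCP server on stdio with an in-memory DuckDB database.
# It waits for MCP protocol messages on stdin; press Ctrl+C to stop.
uvx mcp-server-motherduck --db-path :memory: --read-write --allow-switch-databases
```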
Connecting to a Local DuckDB File in Read-Only Mode
```json
{
  "mcpServers": {
    "DuckDB (read-only)": {
      "command": "uvx",
      "args": ["mcp-server-motherduck", "--db-path", "/absolute/path/to/your.duckdb"]
    }
  }
}
```
Connects to a specific DuckDB file in read-only mode. Won't hold on to the file lock, so convenient to use alongside a write connection to the same DuckDB file. You can also connect to remote DuckDB files on S3 using s3://bucket/path.duckdb — see Environment Variables for S3 authentication. If you're considering third-party access to the MCP, see Securing for Production.
Connecting to MotherDuck in Read-Write Mode
```json
{
  "mcpServers": {
    "MotherDuck (local, r/w)": {
      "command": "uvx",
      "args": ["mcp-server-motherduck", "--db-path", "md:", "--read-write"],
      "env": {
        "motherduck_token": "<YOUR_MOTHERDUCK_TOKEN>"
      }
    }
  }
}
```
See Command Line Parameters for more options, Securing for Production for deployment guidance, and Troubleshooting if you encounter issues.
Client Setup
| Client | Config Location | One-Click Install |
|---|---|---|
| Claude Desktop | Settings → Developer → Edit Config | .mcpb (MCP Bundle) |
| Claude Code | Use CLI commands below | - |
| Codex CLI | Use CLI commands below | - |
| Cursor | Settings → MCP → Add new global MCP server | - |
| VS Code | Ctrl+Shift+P → "Preferences: Open User Settings (JSON)" | - |
Any MCP-compatible client can use this server. Add the JSON configuration from Quick Start to your client's MCP config file. Consult your client's documentation for the config file location.
Claude Code CLI commands
In-Memory DuckDB (Dev Mode):

```shell
claude mcp add --scope user duckdb --transport stdio -- uvx mcp-server-motherduck --db-path :memory: --read-write --allow-switch-databases
```

Local DuckDB (Read-Only):

```shell
claude mcp add --scope user duckdb --transport stdio -- uvx mcp-server-motherduck --db-path /absolute/path/to/db.duckdb
```

MotherDuck (Read-Write):

```shell
claude mcp add --scope user motherduck --transport stdio --env motherduck_token=YOUR_TOKEN -- uvx mcp-server-motherduck --db-path md: --read-write
```
Codex CLI commands
In-Memory DuckDB (Dev Mode):

```shell
codex mcp add duckdb -- uvx mcp-server-motherduck --db-path :memory: --read-write --allow-switch-databases
```

Local DuckDB (Read-Only):

```shell
codex mcp add duckdb -- uvx mcp-server-motherduck --db-path /absolute/path/to/db.duckdb
```

MotherDuck (Read-Write):

```shell
codex mcp add motherduck --env motherduck_token=YOUR_TOKEN -- uvx mcp-server-motherduck --db-path md: --read-write
```
Tools
| Tool | Description | Required Inputs | Optional Inputs |
|---|---|---|---|
| `execute_query` | Execute SQL query (DuckDB dialect) | `sql` | - |
| `list_databases` | List all databases (useful for MotherDuck or multiple attached DBs) | - | - |
| `list_tables` | List tables and views | - | `database`, `schema` |
| `list_columns` | List columns of a table/view | `table` | `database`, `schema` |
| `switch_database_connection`* | Switch to a different database | `path` | `create_if_not_exists` |

*Requires the `--allow-switch-databases` flag

All tools return JSON. Results are limited to 1024 rows / 50,000 chars by default (configurable via `--max-rows`, `--max-chars`).
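If the default limits truncate large result sets, they can be raised at startup. A sketch using the flag names from the Command Line Parameters table:

```shell
# Allow up to 5000 rows and 200,000 characters per result.
uvx mcp-server-motherduck --db-path :memory: --max-rows 5000 --max-chars 200000
```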
Securing for Production
When giving third parties access to a self-hosted MCP server, read-only mode alone is not sufficient — it still allows access to the local filesystem, changing DuckDB settings, and other potentially sensitive operations.
For production deployments with third-party access, we recommend MotherDuck Remote MCP — zero-setup, read-write capable, and hosted by MotherDuck.
Self-hosting MotherDuck MCP: Fork this repo and customize as needed. Use a service account with read-scaling tokens and enable SaaS mode to restrict local file access.
Self-hosting DuckDB MCP: Use --init-sql to apply security settings. See the Securing DuckDB guide for available options.
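As a sketch of the `--init-sql` approach, the startup SQL can apply DuckDB's own security settings (e.g. `disabled_filesystems` and `lock_configuration`, both standard DuckDB configuration options); this is illustrative, not an exhaustive hardening recipe:

```shell
# Block local filesystem access, then freeze the configuration so
# queries cannot re-enable it at runtime.
uvx mcp-server-motherduck --db-path md: \
  --init-sql "SET disabled_filesystems='LocalFileSystem'; SET lock_configuration=true;"
```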
Command Line Parameters
| Parameter | Default | Description |
|---|---|---|
| `--db-path` | `:memory:` | Database path: local file (absolute), `md:` (MotherDuck), or `s3://` URL |
| `--motherduck-token` | `motherduck_token` env var | MotherDuck access token |
| `--read-write` | False | Enable write access |
| `--motherduck-saas-mode` | False | MotherDuck SaaS mode (restricts local access) |
| `--allow-switch-databases` | False | Enable `switch_database_connection` tool |
| `--max-rows` | 1024 | Max rows returned |
| `--max-chars` | 50000 | Max characters returned |
| `--query-timeout` | -1 | Query timeout in seconds (-1 = disabled) |
| `--init-sql` | None | SQL to execute on startup |
| `--motherduck-connection-parameters` | `session_hint=mcp&dbinstance_inactivity_ttl=0s` | Additional MotherDuck connection string parameters (key=value pairs separated by `&`) |
| `--ephemeral-connections` | True | Use temporary connections for read-only local files |
| `--transport` | `stdio` | Transport type: `stdio` or `http` |
| `--stateless-http` | False | For protocol compatibility only (e.g. with AWS Bedrock AgentCore Runtime). Server still maintains global state via the shared DatabaseClient. |
| `--port` | 8000 | Port for HTTP transport |
| `--host` | 127.0.0.1 | Host for HTTP transport |
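Putting the transport flags above together, a sketch of serving MCP over HTTP instead of stdio (port and host values are arbitrary examples):

```shell
# Serve MCP over HTTP on localhost:9000 with a read-only in-memory database.
uvx mcp-server-motherduck --transport http --host 127.0.0.1 --port 9000 --db-path :memory:
```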
Environment Variables
| Variable | Description |
|---|---|
| `motherduck_token` or `MOTHERDUCK_TOKEN` | MotherDuck access token (alternative to `--motherduck-token`) |
| `HOME` | Used by DuckDB for extensions and config. Override with `--home-dir` if not set. |
| `AWS_ACCESS_KEY_ID` | AWS access key for S3 database connections |
| `AWS_SECRET_ACCESS_KEY` | AWS secret key for S3 database connections |
| `AWS_SESSION_TOKEN` | AWS session token for temporary credentials (IAM roles, SSO, EC2 instance profiles) |
README truncated. View full README on GitHub.
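The AWS variables above can be combined to open a database hosted on S3. A hedged sketch; the bucket, object path, and credential values are placeholders:

```shell
export AWS_ACCESS_KEY_ID=<YOUR_ACCESS_KEY>
export AWS_SECRET_ACCESS_KEY=<YOUR_SECRET_KEY>
# AWS_SESSION_TOKEN is only needed for temporary credentials.
uvx mcp-server-motherduck --db-path s3://my-bucket/analytics.duckdb
```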