Databricks

characat0

Connects AI assistants to Databricks workspaces for browsing data catalogs and executing SQL queries. Query your Databricks tables and warehouses directly through natural language.

Provides a bridge between AI and Databricks workspaces, enabling interaction with data catalogs, schemas, tables, and SQL warehouses for direct querying and analysis of Databricks data.

Local (stdio)

What it does

  • Execute SQL queries on Databricks warehouses
  • Browse data catalogs and schemas
  • List available tables with filtering
  • Get detailed table information
  • Access SQL warehouse metadata
  • Query Databricks data structures

Best for

  • Data analysts exploring Databricks datasets
  • AI-powered data analysis workflows
  • Automated reporting from Databricks
  • Data discovery and catalog exploration
  • Direct SQL execution on warehouses
  • Full catalog browsing capabilities

About Databricks

Databricks is a community-built MCP server published by characat0 that provides AI assistants with tools and capabilities via the Model Context Protocol. It lets assistants interact with Databricks data catalogs, schemas, and SQL warehouses securely. It is categorized under databases and analytics data. This server exposes 6 tools that AI clients can invoke during conversations and coding sessions.

How to install

You can install Databricks in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

Databricks is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

Tools (6)

execute_sql

Executes SQL statements on a Databricks warehouse and returns the results

get_table

Gets detailed information about a single Databricks table

list_catalogs

Lists all catalogs available in the Databricks workspace

list_schemas

Lists all schemas in a specified Databricks catalog

list_tables

Lists all tables in a specified Databricks schema with optional filtering

list_warehouses

Lists all SQL warehouses available in the Databricks workspace

Databricks MCP Server

A Model Context Protocol (MCP) server for interacting with Databricks.

Installation

You can download the latest release for your platform from the Releases page.

VS Code

Install the Databricks MCP server in VS Code by clicking the following link:

Install in VS Code

Alternatively, you can register the server manually by running the following command:

# For VS Code
code --add-mcp '{"name":"databricks","command":"npx","args":["databricks-mcp-server@latest"]}'
# For VS Code Insiders
code-insiders --add-mcp '{"name":"databricks","command":"npx","args":["databricks-mcp-server@latest"]}'

Tools

The Databricks MCP Server provides a Model Context Protocol (MCP) interface to Databricks workspaces. It offers the following tools:

List Catalogs

Lists all catalogs available in the Databricks workspace.

Tool name: list_catalogs

Parameters: None

Returns: JSON array of catalog objects

List Schemas

Lists all schemas in a specified Databricks catalog.

Tool name: list_schemas

Parameters:

  • catalog (string, required): Name of the catalog to list schemas from

Returns: JSON array of schema objects

List Tables

Lists all tables in a specified Databricks schema with optional filtering.

Tool name: list_tables

Parameters:

  • catalog (string, required): Name of the catalog containing the schema
  • schema (string, required): Name of the schema to list tables from
  • filter_pattern (string, optional, default: ".*"): Regular expression pattern to filter table names

Returns: JSON array of table objects
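The `filter_pattern` parameter is a regular expression matched against table names. A minimal sketch of that matching behavior, using hypothetical table names (whether the server anchors the match is an implementation detail; an unanchored search is assumed here):

```python
import re

# Hypothetical table names, used only to illustrate filter_pattern semantics
tables = ["sales_2023", "sales_2024", "customers", "orders_archive"]

def filter_tables(names, filter_pattern=".*"):
    """Keep only names matching the regular expression (the default matches all)."""
    pattern = re.compile(filter_pattern)
    return [n for n in names if pattern.search(n)]

print(filter_tables(tables))              # default ".*" keeps every table
print(filter_tables(tables, r"^sales_"))  # ['sales_2023', 'sales_2024']
```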

Execute SQL

Executes SQL statements on a Databricks SQL warehouse and returns the results.

Tool name: execute_sql

Parameters:

  • statement (string, required): SQL statement to execute
  • timeout_seconds (number, optional, default: 60): Timeout in seconds for the statement execution
  • row_limit (number, optional, default: 100): Maximum number of rows to return in the result

Returns: JSON object containing columns and rows from the query result, along with information about the SQL warehouse used to execute the statement.
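A columns-plus-rows result can be re-keyed into records for easier downstream use. The payload below is a hypothetical example of that shape (the `warehouse` field names are assumptions), not actual server output:

```python
# Hypothetical execute_sql result in the columns/rows shape described above
result = {
    "columns": ["region", "revenue"],
    "rows": [["EMEA", 1250.0], ["AMER", 980.5]],
    "warehouse": {"id": "abc123", "name": "analytics-wh"},  # assumed field names
}

# Pair each row's values with the column names to build dict records
records = [dict(zip(result["columns"], row)) for row in result["rows"]]
print(records[0]["region"])  # EMEA
```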

List SQL Warehouses

Lists all SQL warehouses available in the Databricks workspace.

Tool name: list_warehouses

Parameters: None

Returns: JSON array of SQL warehouse objects
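A client might use this list to pick a usable warehouse before calling execute_sql. The objects below are hypothetical, though Databricks warehouse objects do carry `id`, `name`, and `state` fields:

```python
# Hypothetical list_warehouses result
warehouses = [
    {"id": "w1", "name": "etl-wh", "state": "STOPPED"},
    {"id": "w2", "name": "analytics-wh", "state": "RUNNING"},
]

# Prefer a warehouse that is already running to avoid cold-start latency
running = [w for w in warehouses if w["state"] == "RUNNING"]
chosen = running[0] if running else warehouses[0]
print(chosen["name"])  # analytics-wh
```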

Supported Platforms

  • Linux (amd64)
  • Windows (amd64)
  • macOS (Intel/amd64)
  • macOS (Apple Silicon/arm64)

Usage

Authentication

The application uses Databricks unified authentication. For details on how to configure authentication, please refer to the Databricks Authentication documentation.

Running the Server

Start the MCP server:

./databricks-mcp-server

The server will start and listen for MCP protocol commands on standard input/output.
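MCP messages are JSON-RPC 2.0 objects exchanged over the server's stdin/stdout. As a sketch, a `tools/call` request invoking `execute_sql` might be framed like this (the argument values are illustrative, and exact transport framing depends on the client):

```python
import json

# A JSON-RPC 2.0 request for the execute_sql tool
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "execute_sql",
        "arguments": {"statement": "SELECT 1", "row_limit": 10},
    },
}
line = json.dumps(request)  # the client writes this line to the server's stdin
print(line)
```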

Development

Prerequisites

  • Go 1.24 or later

Related Skills

literature-review

Conduct comprehensive, systematic literature reviews using multiple academic databases (PubMed, arXiv, bioRxiv, Semantic Scholar, etc.). This skill should be used when conducting systematic literature reviews, meta-analyses, research synthesis, or comprehensive literature searches across biomedical, scientific, and technical domains. Creates professionally formatted markdown documents and PDFs with verified citations in multiple citation styles (APA, Nature, Vancouver, etc.).

postgresql-psql

Comprehensive guide for PostgreSQL psql - the interactive terminal client for PostgreSQL. Use when connecting to PostgreSQL databases, executing queries, managing databases/tables, configuring connection options, formatting output, writing scripts, managing transactions, and using advanced psql features for database administration and development.

data-storytelling

Transform data into compelling narratives using visualization, context, and persuasive structure. Use when presenting analytics to stakeholders, creating data reports, or building executive presentations.

content-trend-researcher

Advanced content and topic research skill that analyzes trends across Google Analytics, Google Trends, Substack, Medium, Reddit, LinkedIn, X, blogs, podcasts, and YouTube to generate data-driven article outlines based on user intent analysis

sql-queries

Write correct, performant SQL across all major data warehouse dialects (Snowflake, BigQuery, Databricks, PostgreSQL, etc.). Use when writing queries, optimizing slow SQL, translating between dialects, or building complex analytical queries with CTEs, window functions, or aggregations.

data-scientist

Expert data scientist for advanced analytics, machine learning, and statistical modeling. Handles complex data analysis, predictive modeling, and business intelligence. Use PROACTIVELY for data analysis tasks, ML modeling, statistical analysis, and data-driven insights.
