
Redshift Utils
Provides Amazon Redshift database administration and monitoring through 40+ curated SQL scripts accessible via AWS Data API. Enables AI assistants to perform cluster health checks, query performance analysis, and diagnostic operations on Redshift warehouses.
What it does
- Monitor Redshift cluster health and performance
- Analyze query execution and workload patterns
- Execute diagnostic SQL scripts for database maintenance
- Access database schema and structure information
- Perform workload management operations
- Run production database health checks
About Redshift Utils
Redshift Utils is a community-built MCP server published by vinodismyname that provides AI assistants with tools and capabilities via the Model Context Protocol. Redshift Utils offers essential Amazon Redshift database administration tools for health monitoring, query analysis, and automated diagnostics. It is categorized under databases and analytics data.
How to install
You can install Redshift Utils in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
Redshift Utils is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
Redshift Utils MCP Server
Overview
This project implements a Model Context Protocol (MCP) server designed specifically to interact with Amazon Redshift databases.
It bridges the gap between Large Language Models (LLMs) or AI assistants (like those in Claude, Cursor, or custom applications) and your Redshift data warehouse, enabling secure, standardized data access and interaction. This allows users to query data, understand database structure, and perform monitoring and diagnostic operations using natural language or AI-driven prompts.
This server is for developers, data analysts, or teams looking to integrate LLM capabilities directly with their Amazon Redshift data environment in a structured and secure manner.
Features
- ✨ Secure Redshift Connection (via Data API): Connects to your Amazon Redshift cluster using the AWS Redshift Data API via Boto3, with credentials stored in AWS Secrets Manager and connection details supplied through environment variables.
- 🔍 Schema Discovery: Exposes MCP resources for listing schemas and tables within a specified schema.
- 📊 Metadata & Statistics: Provides a tool (`handle_inspect_table`) to gather detailed table metadata, statistics (like size, row counts, skew, stats staleness), and maintenance status.
- 📝 Read-Only Query Execution: Offers a secure MCP tool (`handle_execute_ad_hoc_query`) to execute arbitrary SELECT queries against the Redshift database, enabling data retrieval based on LLM requests.
- 📈 Query Performance Analysis: Includes a tool (`handle_diagnose_query_performance`) to retrieve and analyze the execution plan, metrics, and historical data for a specific query ID.
- 🔍 Table Inspection: Provides a tool (`handle_inspect_table`) to perform a comprehensive inspection of a table, including design, storage, health, and usage.
- 🩺 Cluster Health Check: Offers a tool (`handle_check_cluster_health`) to perform a basic or full health assessment of the cluster using various diagnostic queries.
- 🔒 Lock Diagnosis: Provides a tool (`handle_diagnose_locks`) to identify and report on current lock contention and blocking sessions.
- 📊 Workload Monitoring: Includes a tool (`handle_monitor_workload`) to analyze cluster workload patterns over a time window, covering WLM, top queries, and resource usage.
- 📝 DDL Retrieval: Offers a tool (`handle_get_table_definition`) to retrieve the `SHOW TABLE` output (DDL) for a specified table.
- 🛡️ Input Sanitization: Utilizes parameterized queries via the Boto3 Redshift Data API client where applicable to mitigate SQL injection risks.
- 🧩 Standardized MCP Interface: Adheres to the Model Context Protocol specification for seamless integration with compatible clients (e.g., Claude Desktop, Cursor IDE, custom applications).
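The read-only guarantee above ultimately rests on database permissions, but an application-level gate can reject obviously unsafe statements before they reach the Data API. The sketch below is a hypothetical illustration of such a gate (the function name and exact rules are assumptions, not the server's actual implementation):

```python
def is_read_only(sql: str) -> bool:
    """Naive read-only gate: accept a single SELECT (or WITH ... SELECT)
    statement and reject anything else, including stacked statements.
    Illustrative only -- real enforcement should also rely on a
    least-privilege Redshift user."""
    stripped = sql.strip().rstrip(";").strip()
    if not stripped or ";" in stripped:  # empty input or stacked statements
        return False
    first_keyword = stripped.split(None, 1)[0].upper()
    return first_keyword in ("SELECT", "WITH")
```

A check like this complements, but does not replace, granting the Redshift user only `SELECT` privileges.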
Prerequisites
Software:
- Python 3.10+
- `uv` (recommended package manager) or `pip`
Infrastructure & Access:
- Access to an Amazon Redshift cluster.
- An AWS account with permissions to use the Redshift Data API (`redshift-data:*`) and access the specified Secrets Manager secret (`secretsmanager:GetSecretValue`).
- A Redshift user account whose credentials are stored in AWS Secrets Manager. This user needs the necessary permissions within Redshift to perform the actions enabled by this server (e.g., `CONNECT` to the database, `SELECT` on target tables, `SELECT` on relevant system views like `pg_class`, `pg_namespace`, `svv_all_schemas`, `svv_tables`, `svv_table_info`). Using a role with the principle of least privilege is strongly recommended. See Security Considerations.
Credentials:
Your Redshift connection details are managed via AWS Secrets Manager, and the server connects using the Redshift Data API. You need:
- The Redshift cluster identifier.
- The database name within the cluster.
- The ARN of the AWS Secrets Manager secret containing the database credentials (username and password).
- The AWS region where the cluster and secret reside.
- Optionally, an AWS profile name if not using default credentials/region.
These details will be configured via environment variables as detailed in the Configuration section.
Installation
Install from PyPI (Recommended)
The easiest way to install the Redshift Utils MCP Server is directly from PyPI:
# Using pip
pip install redshift-utils-mcp
# Using uv (recommended)
uv pip install redshift-utils-mcp
Install from Source
Alternatively, you can install from the source repository:
# Clone the repository
git clone https://github.com/vinodismyname/redshift-utils-mcp.git
cd redshift-utils-mcp
# Install using uv (recommended)
uv sync
# Or install using pip
pip install -e .
Configuration
Set Environment Variables:
This server requires the following environment variables to connect to your Redshift cluster via the AWS Data API. You can set these directly in your shell, using a systemd service file, a Docker environment file, or by creating a .env file in the project's root directory (if using a tool like uv or python-dotenv that supports loading from .env).
Example using shell export:
export REDSHIFT_CLUSTER_ID="your-cluster-id"
export REDSHIFT_DATABASE="your_database_name"
export REDSHIFT_SECRET_ARN="arn:aws:secretsmanager:us-east-1:123456789012:secret:your-redshift-secret-XXXXXX"
export AWS_REGION="us-east-1" # Or AWS_DEFAULT_REGION
# export AWS_PROFILE="your-aws-profile-name" # Optional
Example .env file (see .env.example):
# .env file for Redshift MCP Server configuration
# Ensure this file is NOT committed to version control if it contains secrets. Add it to .gitignore.
REDSHIFT_CLUSTER_ID="your-cluster-id"
REDSHIFT_DATABASE="your_database_name"
REDSHIFT_SECRET_ARN="arn:aws:secretsmanager:us-east-1:123456789012:secret:your-redshift-secret-XXXXXX"
AWS_REGION="us-east-1" # Or AWS_DEFAULT_REGION
# AWS_PROFILE="your-aws-profile-name" # Optional
Required Variables Table:
| Variable Name | Required | Description | Example Value |
|---|---|---|---|
| REDSHIFT_CLUSTER_ID | Yes | Your Redshift cluster identifier. | my-redshift-cluster |
| REDSHIFT_DATABASE | Yes | The name of the database to connect to. | mydatabase |
| REDSHIFT_SECRET_ARN | Yes | AWS Secrets Manager ARN for Redshift credentials. | arn:aws:secretsmanager:us-east-1:123456789012:secret:mysecret-abcdef |
| AWS_REGION | Yes | AWS region for Data API and Secrets Manager. | us-east-1 |
| AWS_DEFAULT_REGION | No | Alternative to AWS_REGION for specifying the AWS region. | us-west-2 |
| AWS_PROFILE | No | AWS profile name to use from your credentials file (~/.aws/...). | my-redshift-profile |
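A fail-fast loader makes misconfiguration obvious at startup rather than at the first query. The sketch below validates the variables from the table above; it is an illustrative pattern, not the server's actual startup code:

```python
import os

REQUIRED = ("REDSHIFT_CLUSTER_ID", "REDSHIFT_DATABASE", "REDSHIFT_SECRET_ARN")

def load_config(env=os.environ) -> dict:
    """Collect connection settings from the environment and raise a clear
    error listing every missing required variable."""
    missing = [name for name in REQUIRED if not env.get(name)]
    # AWS_REGION and AWS_DEFAULT_REGION are interchangeable here
    region = env.get("AWS_REGION") or env.get("AWS_DEFAULT_REGION")
    if not region:
        missing.append("AWS_REGION (or AWS_DEFAULT_REGION)")
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return {
        "cluster_id": env["REDSHIFT_CLUSTER_ID"],
        "database": env["REDSHIFT_DATABASE"],
        "secret_arn": env["REDSHIFT_SECRET_ARN"],
        "region": region,
        "profile": env.get("AWS_PROFILE"),  # optional
    }
```

Accepting any mapping for `env` also makes the loader easy to unit-test without touching the real process environment.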
Note: Ensure the AWS credentials used by Boto3 (via environment, profile, or IAM role) have permissions to access the specified REDSHIFT_SECRET_ARN and use the Redshift Data API (redshift-data:*).
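For the permissions mentioned in the note above, a minimal IAM policy might look like the sketch below. The three `redshift-data` actions listed are the core calls for submitting a statement and fetching its results; the secret ARN is a placeholder you should replace with your own, and you may scope `Resource` for the Data API actions more tightly than `*` if your account's policies support it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "redshift-data:ExecuteStatement",
        "redshift-data:DescribeStatement",
        "redshift-data:GetStatementResult"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:your-redshift-secret-XXXXXX"
    }
  ]
}
```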
Usage
After installation, you can run the server directly from the command line:
# If installed from PyPI
redshift-utils-mcp
# Or using uvx (no installation required)
uvx redshift-utils-mcp
Connecting with Claude Desktop / Anthropic Console:
Add the following configuration block to your mcp.json file:
{
"mcpServers": {
"redshift-utils-mcp": {
"command": "u
---
*README truncated. [View full README on GitHub](https://github.com/vinodismyname/redshift-utils-mcp).*