Prometheus

idanfishman

Connects AI assistants to Prometheus monitoring systems for querying time-series metrics and analyzing performance data through natural language. Execute PromQL queries and retrieve operational metrics directly in your AI chat.

Integrates with Prometheus monitoring systems to provide direct access to time-series metrics through specialized tools for discovering available metrics and labels, retrieving metadata and target information, and executing PromQL queries for real-time performance analysis and operational intelligence.

What it does

  • Execute PromQL queries against Prometheus
  • Discover available metrics and labels
  • Retrieve target and metadata information
  • Analyze time-series performance data
  • Browse monitoring infrastructure through natural language

Best for

  • DevOps engineers monitoring system performance
  • SREs analyzing operational metrics
  • Developers debugging application performance
  • Teams wanting natural language access to metrics

  • Works with existing Prometheus servers
  • LLM-optimized JSON responses
  • Configurable tool permissions

About Prometheus

Prometheus is a community-built MCP server published by idanfishman that provides AI assistants with tools and capabilities via the Model Context Protocol. Integrate with Prometheus for real-time performance analysis, process monitoring, and advanced Prometheus 2.0 metric discovery. It is categorized under developer tools and analytics data.

How to install

You can install Prometheus in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

Prometheus is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

Prometheus MCP Server

A Model Context Protocol (MCP) server that provides seamless integration between AI assistants and Prometheus, enabling natural language interactions with your monitoring infrastructure. This server allows for effortless querying, discovery, and analysis of metrics through Visual Studio Code, Cursor, Windsurf, Claude Desktop, and other MCP clients.

Key Features

  • Fast and lightweight. Direct API integration with Prometheus, no complex parsing needed.
  • LLM-friendly. Structured JSON responses optimized for AI assistant consumption.
  • Configurable capabilities. Enable/disable tool categories based on your security and operational requirements.
  • Dual transport support. Works with both stdio and HTTP transports for maximum compatibility.

Requirements

  • Node.js 20.19.0 or newer
  • Access to a Prometheus server
  • VS Code, Cursor, Windsurf, Claude Desktop or any other MCP client

Getting Started

First, install the Prometheus MCP server with your client. A typical configuration looks like this:

{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}

Install in VS Code

# For VS Code
code --add-mcp '{"name":"prometheus","command":"npx","args":["prometheus-mcp@latest","stdio"],"env":{"PROMETHEUS_URL":"http://localhost:9090"}}'

# For VS Code Insiders
code-insiders --add-mcp '{"name":"prometheus","command":"npx","args":["prometheus-mcp@latest","stdio"],"env":{"PROMETHEUS_URL":"http://localhost:9090"}}'

After installation, the Prometheus MCP server will be available for use with your GitHub Copilot agent in VS Code.

Install in Cursor

Go to Cursor Settings → MCP → Add new MCP Server. Name it to your liking, use the command type with the command npx prometheus-mcp. You can also verify the config or add command arguments by clicking Edit.

{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}

Install in Windsurf

Follow Windsurf MCP documentation. Use the following configuration:

{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}

Install in Claude Desktop

Claude Desktop supports two installation methods:

Option 1: DXT Extension

The easiest way to install is using the pre-built DXT extension:

  1. Download the latest .dxt file from the releases page
  2. Double-click the downloaded file to install automatically
  3. Configure your Prometheus URL in the extension settings

Option 2: Developer Settings

For advanced users or custom configurations, manually configure the MCP server:

  1. Open Claude Desktop settings
  2. Navigate to the Developer section
  3. Add the following MCP server configuration:
{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}

Configuration

Prometheus MCP server supports the following arguments. They can be provided in the JSON configuration above, as part of the "args" list:

> npx prometheus-mcp@latest --help

Commands:
  stdio  Start Prometheus MCP server using stdio transport
  http   Start Prometheus MCP server using HTTP transport

Options:
  --help     Show help                          [boolean]
  --version  Show version number                [boolean]

Environment Variables

You can also configure the server using environment variables:

  • PROMETHEUS_URL - Prometheus server URL
  • ENABLE_DISCOVERY_TOOLS - Set to "false" to disable discovery tools (default: true)
  • ENABLE_INFO_TOOLS - Set to "false" to disable info tools (default: true)
  • ENABLE_QUERY_TOOLS - Set to "false" to disable query tools (default: true)
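
These ENABLE_* flags follow a simple convention: any value other than "false" leaves the category enabled. A minimal sketch of that parsing logic, assuming the documented defaults (the env_flag helper is illustrative, not part of the server):

```python
import os

def env_flag(name: str, default: bool = True) -> bool:
    """Treat an unset variable as the default; only the literal "false" disables."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() != "false"

# Mirror the three tool categories documented above.
enabled_categories = {
    "discovery": env_flag("ENABLE_DISCOVERY_TOOLS"),
    "info": env_flag("ENABLE_INFO_TOOLS"),
    "query": env_flag("ENABLE_QUERY_TOOLS"),
}
```

With none of the variables set, all three categories come out enabled, matching the documented defaults.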

Standalone MCP Server

When running in server environments or when you need HTTP transport, run the MCP server with the http command:

npx prometheus-mcp@latest http --port 3000

Then, in your MCP client config, point the client at the HTTP endpoint; stdio-only clients can bridge to it with the mcp-remote proxy:

{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:3000/mcp"]
    }
  }
}

Docker

Run the Prometheus MCP server using Docker:

{
  "mcpServers": {
    "prometheus": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "--init",
        "--pull=always",
        "-e",
        "PROMETHEUS_URL=http://host.docker.internal:9090",
        "ghcr.io/idanfishman/prometheus-mcp",
        "stdio"
      ]
    }
  }
}

Tools

The Prometheus MCP server provides 10 tools organized into three configurable categories:

Discovery

Tools for exploring your Prometheus infrastructure:

  • prometheus_list_metrics

    • Description: List all available Prometheus metrics
    • Parameters: None
    • Read-only: true
  • prometheus_metric_metadata

    • Description: Get metadata for a specific Prometheus metric
    • Parameters:
      • metric (string): Metric name to get metadata for
    • Read-only: true
  • prometheus_list_labels

    • Description: List all available Prometheus labels
    • Parameters: None
    • Read-only: true
  • prometheus_label_values

    • Description: Get all values for a specific Prometheus label
    • Parameters:
      • label (string): Label name to get values for
    • Read-only: true
  • prometheus_list_targets

    • Description: List all Prometheus scrape targets
    • Parameters: None
    • Read-only: true
  • prometheus_scrape_pool_targets

    • Description: Get targets for a specific scrape pool
    • Parameters:
      • scrapePool (string): Scrape pool name
    • Read-only: true

Info

Tools for accessing Prometheus server information:

  • prometheus_runtime_info

    • Description: Get Prometheus runtime information
    • Parameters: None
    • Read-only: true
  • prometheus_build_info

    • Description: Get Prometheus build information
    • Parameters: None
    • Read-only: true

Query

Tools for executing Prometheus queries:

  • prometheus_query

    • Description: Execute an instant Prometheus query
    • Parameters:
      • query (string): Prometheus query expression
      • time (string, optional): Time parameter for the query (RFC3339 format)
    • Read-only: true
  • prometheus_query_range

    • Description: Execute a Prometheus range query
    • Parameters:
      • query (string): Prometheus query expression
      • start (string): Start timestamp (RFC3339 or unix timestamp)
      • end (string): End timestamp (RFC3339 or unix timestamp)
      • step (string): Query resolution step width
    • Read-only: true
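
The two query tools correspond to Prometheus's HTTP API endpoints /api/v1/query and /api/v1/query_range. A sketch of how the range-query parameters above translate into an API request URL (nothing is sent; the base URL is an assumption matching the default local setup):

```python
from urllib.parse import urlencode

PROMETHEUS_URL = "http://localhost:9090"  # assumption: default local server

def range_query_url(query: str, start: str, end: str, step: str) -> str:
    """Build the /api/v1/query_range URL the tool parameters map onto."""
    params = urlencode({"query": query, "start": start, "end": end, "step": step})
    return f"{PROMETHEUS_URL}/api/v1/query_range?{params}"

url = range_query_url(
    "rate(http_requests_total[5m])",
    "2024-01-01T00:00:00Z",  # start, RFC3339
    "2024-01-01T01:00:00Z",  # end, RFC3339
    "60s",                   # resolution step width
)
```

The same four parameters you pass to prometheus_query_range appear here as URL query parameters, which is useful when debugging a query directly against the Prometheus API.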

Example Usage

Here are some example interactions you can have with your AI assistant:

Basic Queries

  • "Show me all available metrics in Prometheus"
  • "What's the current CPU usage across all instances?"
  • "Get the memory usage for the last hour"

Discovery and Exploration

  • "List all scrape targets and their status"
  • "What labels are available for the http_requests_total metric?"
  • "Show me all metrics related to 'cpu'"

Advanced Analysis

  • "Compare CPU usage between production and staging environments"
  • "Show me the top 10 services by memory consumption"
  • "What's the error rate trend for the API service over the last 24 hours?"
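
For reference, prompts like these typically resolve to PromQL along the following lines; the metric and label names (container_memory_usage_bytes, http_requests_total, service, status) are assumptions that depend on your exporters:

```python
# Hypothetical PromQL an assistant might run via the query tools;
# metric/label names are assumptions, not guaranteed to exist in your setup.
examples = {
    "top 10 services by memory consumption":
        "topk(10, sum by (service) (container_memory_usage_bytes))",
    "error rate for the API service":
        'sum(rate(http_requests_total{service="api",status=~"5.."}[5m]))'
        ' / sum(rate(http_requests_total{service="api"}[5m]))',
}
```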

Security Considerations

  • Network Access: The server requires network access to your Prometheus instance
  • Resource Usage: Range queries can be resource-intensive; monitor your Prometheus server load

Troubleshooting

Connection Issues

  • Verify your Prometheus server is accessible at the configured URL
  • Check firewall settings and network connectivity
  • Ensure Prometheus API is enabled (default on port 9090)
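
A quick way to script the first check: probe Prometheus's built-in /-/healthy endpoint. A standard-library sketch (pass whatever you set PROMETHEUS_URL to):

```python
from urllib.error import URLError
from urllib.request import urlopen

def prometheus_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if Prometheus answers its /-/healthy endpoint with HTTP 200."""
    try:
        with urlopen(f"{base_url}/-/healthy", timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False
```

prometheus_reachable("http://localhost:9090") distinguishes a down or unreachable server from an MCP configuration problem before you dig into client settings.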

Permission Errors

  • Verify the MCP server has network access to Prometheus
  • Check if authentication is required for your Prometheus setup

Tool Availability

  • If certain tools are missing, check if they've been disabled via configuration

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

  • GitHub Issues: Report bugs or request features
