astro-airflow-mcp

Official · astronomer

Connects AI assistants to Apache Airflow's REST API to manage workflows, monitor tasks, and diagnose system issues. Provides comprehensive Airflow operations through conversational interface.

An MCP server that enables AI assistants to interact with Apache Airflow's REST API for DAG management, task monitoring, and system diagnostics. It provides comprehensive tools for triggering workflows, retrieving logs, and inspecting system health across Airflow 2.x and 3.x versions.


What it does

  • Trigger DAG runs
  • Retrieve task and workflow logs
  • Monitor DAG and task statuses
  • Check system health
  • List and inspect DAGs
  • Query workflow execution history

Best for

  • Data engineers managing Airflow pipelines
  • DevOps teams monitoring workflow systems
  • Debugging failed DAG runs
  • Airflow system administration

Supports Airflow 2.x and 3.x · Built by Astronomer · One-click IDE installation

About astro-airflow-mcp

astro-airflow-mcp is an official MCP server published by astronomer that provides AI assistants with tools and capabilities via the Model Context Protocol. It gives AI assistants access to the Apache Airflow REST API for DAG management, task monitoring, logs, and diagnostics. It is categorized under developer tools.

How to install

You can install astro-airflow-mcp in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

astro-airflow-mcp is released under the Apache-2.0 license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

> [!WARNING]
> This project has been relocated to the Astronomer agents monorepo.


Airflow MCP Server

CI · Python 3.10+ · PyPI · License: Apache 2.0

A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.

Quickstart

IDEs

Install in VS Code Add to Cursor

Manual configuration

Add to your MCP settings (Cursor: ~/.cursor/mcp.json, VS Code: .vscode/mcp.json):

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```

CLI Tools

Claude Code:

```shell
claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```

Gemini CLI:

```shell
gemini mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```

Codex CLI:

```shell
codex mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```

Desktop Apps

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```

Other MCP Clients

Manual JSON Configuration

Add to your MCP configuration file:

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```

Or connect to a running HTTP server by using "url": "http://localhost:8000/mcp" in place of "command" and "args".

Note: No installation required - uvx runs directly from PyPI. The --transport stdio flag is required because the server defaults to HTTP mode.

Configuration

By default, the server connects to http://localhost:8080 (Astro CLI default). Set environment variables for custom Airflow instances:

| Variable | Description |
| --- | --- |
| `AIRFLOW_API_URL` | Airflow webserver URL |
| `AIRFLOW_USERNAME` | Username (Airflow 3.x uses OAuth2 token exchange) |
| `AIRFLOW_PASSWORD` | Password |
| `AIRFLOW_AUTH_TOKEN` | Bearer token (alternative to username/password) |
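For example, a custom Airflow instance could be configured through the environment before launching the server (the URL and credentials below are placeholders, not real defaults):

```shell
# Point the server at a remote Airflow instance (placeholder values)
export AIRFLOW_API_URL="https://airflow.example.com"
export AIRFLOW_USERNAME="admin"
export AIRFLOW_PASSWORD="admin"

# Or use a bearer token instead of username/password:
# export AIRFLOW_AUTH_TOKEN="<your-token>"

uvx astro-airflow-mcp --transport stdio
```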

Example with auth (Claude Code):

```shell
claude mcp add airflow \
  -e AIRFLOW_API_URL=https://your-airflow.example.com \
  -e AIRFLOW_USERNAME=admin \
  -e AIRFLOW_PASSWORD=admin \
  -- uvx astro-airflow-mcp --transport stdio
```

Features

  • Airflow 2.x and 3.x Support: Automatic version detection with adapter pattern
  • MCP Tools for accessing Airflow data:
    • DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
    • Task management (list, get details, get task instances, get logs)
    • Pool management (list, get details)
    • Variable management (list, get specific variables)
    • Connection management (list connections with credentials excluded)
    • Asset/Dataset management (unified naming across versions, data lineage)
    • Plugin and provider information
    • Configuration and version details
  • Consolidated Tools for agent workflows:
    • explore_dag: Get comprehensive DAG information in one call
    • diagnose_dag_run: Debug failed DAG runs with task instance details
    • get_system_health: System overview with health, errors, and warnings
  • MCP Resources: Static Airflow info exposed as resources (version, providers, plugins, config)
  • MCP Prompts: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
  • Dual deployment modes:
    • Standalone server: Run as an independent MCP server
    • Airflow plugin: Integrate directly into Airflow 3.x webserver
  • Flexible Authentication:
    • Bearer token (Airflow 2.x and 3.x)
    • Username/password with automatic OAuth2 token exchange (Airflow 3.x)
    • Basic auth (Airflow 2.x)

Available Tools

Consolidated Tools (Agent-Optimized)

| Tool | Description |
| --- | --- |
| `explore_dag` | Get comprehensive DAG info: metadata, tasks, recent runs, source code |
| `diagnose_dag_run` | Debug a DAG run: run details, failed task instances, logs |
| `get_system_health` | System overview: health status, import errors, warnings, DAG stats |
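Conceptually, a consolidated tool bundles several core-tool calls into one round trip, which saves an agent multiple back-and-forth steps. A hypothetical sketch of that composition (the `client` object and its `call_tool` method are stand-ins, not this server's internals):

```python
def explore_dag(client, dag_id: str) -> dict:
    """Sketch: gather the sections explore_dag returns in one call."""
    return {
        "details": client.call_tool("get_dag_details", dag_id=dag_id),
        "tasks": client.call_tool("list_tasks", dag_id=dag_id),
        "recent_runs": client.call_tool("list_dag_runs", dag_id=dag_id),
        "source": client.call_tool("get_dag_source", dag_id=dag_id),
    }

class FakeClient:
    """Stand-in client that records which core tools were invoked."""
    def __init__(self):
        self.calls = []
    def call_tool(self, name, **kwargs):
        self.calls.append(name)
        return {"tool": name, **kwargs}

client = FakeClient()
result = explore_dag(client, "example_dag")
print(sorted(result))  # the four sections of the consolidated response
```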

Core Tools

| Tool | Description |
| --- | --- |
| `list_dags` | Get all DAGs and their metadata |
| `get_dag_details` | Get detailed info about a specific DAG |
| `get_dag_source` | Get the source code of a DAG |
| `get_dag_stats` | Get DAG run statistics (Airflow 3.x only) |
| `list_dag_warnings` | Get DAG import warnings |
| `list_import_errors` | Get import errors from DAG files that failed to parse |
| `list_dag_runs` | Get DAG run history |
| `get_dag_run` | Get specific DAG run details |
| `trigger_dag` | Trigger a new DAG run (start a workflow execution) |
| `pause_dag` | Pause a DAG to prevent new scheduled runs |
| `unpause_dag` | Unpause a DAG to resume scheduled runs |
| `list_tasks` | Get all tasks in a DAG |
| `get_task` | Get details about a specific task |
| `get_task_instance` | Get task instance execution details |
| `get_task_logs` | Get logs for a specific task instance execution |
| `list_pools` | Get all resource pools |
| `get_pool` | Get details about a specific pool |
| `list_variables` | Get all Airflow variables |
| `get_variable` | Get a specific variable by key |
| `list_connections` | Get all connections (credentials excluded for security) |
| `list_assets` | Get assets/datasets (unified naming across versions) |
| `list_plugins` | Get installed Airflow plugins |
| `list_providers` | Get installed provider packages |
| `get_airflow_config` | Get Airflow configuration |
| `get_airflow_version` | Get Airflow version information |

MCP Resources

| Resource URI | Description |
| --- | --- |
| `airflow://version` | Airflow version information |
| `airflow://providers` | Installed provider packages |
| `airflow://plugins` | Installed Airflow plugins |
| `airflow://config` | Airflow configuration |

MCP Prompts

| Prompt | Description |
| --- | --- |
| `troubleshoot_failed_dag` | Guided workflow for diagnosing DAG failures |
| `daily_health_check` | Morning health check routine |
| `onboard_new_dag` | Guide for understanding a new DAG |

Advanced Usage

Running as Standalone Server

For HTTP-based integrations or connecting multiple clients to one server:

```shell
# Run server (HTTP mode is default)
uvx astro-airflow-mcp --airflow-url https://my-airflow.example.com --username admin --password admin
```

Connect MCP clients to: http://localhost:8000/mcp

Airflow Plugin Mode

Install into your Airflow 3.x environment to expose MCP at http://your-airflow:8080/mcp/v1:

```shell
# Add to your Astro project
echo astro-airflow-mcp >> requirements.txt
```

CLI Options

| Flag | Environment Variable | Default | Description |
| --- | --- | --- | --- |
| `--transport` | `MCP_TRANSPORT` | `stdio` | Transport mode (`stdio` or `http`) |
| `--host` | `MCP_HOST` | `localhost` | Host to bind to (HTTP mode only) |
| `--port` | `MCP_PORT` | `8000` | Port to bind to (HTTP mode only) |
| `--airflow-url` | `AIRFLOW_API_URL` | Auto-discovered or `http://localhost:8080` | Airflow webserver URL |
| `--airflow-project-dir` | `AIRFLOW_PROJECT_DIR` | `$PWD` | Astro project directory for auto-discovering the Airflow URL from `.astro/config.yaml` |
| `--auth-token` | `AIRFLOW_AUTH_TOKEN` | None | Bearer token for authentication |
| `--username` | `AIRFLOW_USERNAME` | None | Username for authentication (Airflow 3.x uses OAuth2 token exchange) |
| `--password` | `AIRFLOW_PASSWORD` | None | Password for authentication |

Architecture

The server is built using FastMCP with an adapter pattern for Airflow version compatibility:

Core Components

  • Adapters (adapters/): Version-specific API implementations
    • AirflowAdapter (base): Abstract interface for all Airflow API operations
    • AirflowV2Adapter: Airflow 2.x API (/api/v1) with basic auth
    • AirflowV3Adapter: Airflow 3.x API (/api/v2) with OAuth2 token exchange
  • Version Detection: Automatic detection at startup by probing API endpoints
  • Models (models.py): Pydantic models for type-safe API responses
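The adapter components above can be sketched roughly as follows. Class names match the README's description, but the method and `pick_adapter` helper are illustrative assumptions, not the package's actual API; real version detection probes the API at startup rather than taking a parameter:

```python
from abc import ABC, abstractmethod

class AirflowAdapter(ABC):
    """Abstract interface over version-specific REST endpoints."""
    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip("/")

    @abstractmethod
    def dags_endpoint(self) -> str: ...

class AirflowV2Adapter(AirflowAdapter):
    # Airflow 2.x exposes the stable REST API under /api/v1
    def dags_endpoint(self) -> str:
        return f"{self.base_url}/api/v1/dags"

class AirflowV3Adapter(AirflowAdapter):
    # Airflow 3.x moved the public API to /api/v2
    def dags_endpoint(self) -> str:
        return f"{self.base_url}/api/v2/dags"

def pick_adapter(major_version: int, base_url: str) -> AirflowAdapter:
    """Illustrative: select the adapter once the major version is known."""
    cls = AirflowV3Adapter if major_version >= 3 else AirflowV2Adapter
    return cls(base_url)
```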

Version Handling Strategy

  1. Major versions (2.x vs 3.x): Adapter pattern with runtime version detection
  2. Minor versions (3.1 vs 3.2): Runtime feature detection with graceful fallbacks
  3. New API parameters: Pass-th

README truncated. View full README on GitHub.
