
astro-airflow-mcp
Connects AI assistants to Apache Airflow's REST API to manage workflows, monitor tasks, and diagnose system issues. Provides comprehensive Airflow operations through a conversational interface.
An MCP server that enables AI assistants to interact with Apache Airflow's REST API for DAG management, task monitoring, and system diagnostics. It provides comprehensive tools for triggering workflows, retrieving logs, and inspecting system health across Airflow 2.x and 3.x versions.
What it does
- Trigger DAG runs
- Retrieve task and workflow logs
- Monitor DAG and task statuses
- Check system health
- List and inspect DAGs
- Query workflow execution history
About astro-airflow-mcp
astro-airflow-mcp is an official MCP server published by Astronomer that gives AI assistants access to the Apache Airflow REST API for DAG management, task monitoring, logs, and diagnostics via the Model Context Protocol. It is categorized under developer tools.
How to install
You can install astro-airflow-mcp in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
astro-airflow-mcp is released under the Apache-2.0 license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
> [!WARNING]
> This project has been relocated to the Astronomer agents monorepo.
Airflow MCP Server
A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.
Quickstart
IDEs
Manual configuration
Add to your MCP settings (Cursor: `~/.cursor/mcp.json`, VS Code: `.vscode/mcp.json`):
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
CLI Tools
Claude Code
```bash
claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```
Gemini CLI
```bash
gemini mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```
Codex CLI
```bash
codex mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```
Desktop Apps
Claude Desktop
Add to `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
Other MCP Clients
Manual JSON Configuration
Add to your MCP configuration file:
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
Or connect to a running HTTP server with `"url": "http://localhost:8000/mcp"` in place of `command`/`args`.
Note: No installation is required; `uvx` runs the package directly from PyPI. The `--transport stdio` flag is required because the server defaults to HTTP mode.
Configuration
By default, the server connects to `http://localhost:8080` (the Astro CLI default). Set environment variables for custom Airflow instances:
| Variable | Description |
|---|---|
| `AIRFLOW_API_URL` | Airflow webserver URL |
| `AIRFLOW_USERNAME` | Username (Airflow 3.x uses OAuth2 token exchange) |
| `AIRFLOW_PASSWORD` | Password |
| `AIRFLOW_AUTH_TOKEN` | Bearer token (alternative to username/password) |
Example with auth (Claude Code):
```bash
claude mcp add airflow \
  -e AIRFLOW_API_URL=https://your-airflow.example.com \
  -e AIRFLOW_USERNAME=admin \
  -e AIRFLOW_PASSWORD=admin \
  -- uvx astro-airflow-mcp --transport stdio
```
Features
- Airflow 2.x and 3.x Support: Automatic version detection with adapter pattern
- MCP Tools for accessing Airflow data:
- DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
- Task management (list, get details, get task instances, get logs)
- Pool management (list, get details)
- Variable management (list, get specific variables)
- Connection management (list connections with credentials excluded)
- Asset/Dataset management (unified naming across versions, data lineage)
- Plugin and provider information
- Configuration and version details
- Consolidated Tools for agent workflows:
  - `explore_dag`: Get comprehensive DAG information in one call
  - `diagnose_dag_run`: Debug failed DAG runs with task instance details
  - `get_system_health`: System overview with health, errors, and warnings
- MCP Resources: Static Airflow info exposed as resources (version, providers, plugins, config)
- MCP Prompts: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
- Dual deployment modes:
- Standalone server: Run as an independent MCP server
- Airflow plugin: Integrate directly into Airflow 3.x webserver
- Flexible Authentication:
- Bearer token (Airflow 2.x and 3.x)
- Username/password with automatic OAuth2 token exchange (Airflow 3.x)
- Basic auth (Airflow 2.x)
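The Airflow 3.x username/password flow above trades credentials for a bearer token instead of sending basic auth on every request. Below is a minimal stdlib sketch of that pattern, assuming a `POST /auth/token` endpoint that returns an `access_token` field; the endpoint path and payload shape are assumptions for illustration, not details confirmed by this README:

```python
import json
import urllib.request

def bearer_headers(token: str) -> dict:
    """Headers for authenticated Airflow REST calls once a token is held."""
    return {"Authorization": f"Bearer {token}", "Accept": "application/json"}

def exchange_token(base_url: str, username: str, password: str) -> str:
    """Trade credentials for a bearer token (assumed Airflow 3.x token endpoint)."""
    body = json.dumps({"username": username, "password": password}).encode()
    req = urllib.request.Request(
        f"{base_url}/auth/token",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

print(bearer_headers("abc123")["Authorization"])  # Bearer abc123
```

Once exchanged, the same bearer header also works for the token-based auth path on Airflow 2.x.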
Available Tools
Consolidated Tools (Agent-Optimized)
| Tool | Description |
|---|---|
| `explore_dag` | Get comprehensive DAG info: metadata, tasks, recent runs, source code |
| `diagnose_dag_run` | Debug a DAG run: run details, failed task instances, logs |
| `get_system_health` | System overview: health status, import errors, warnings, DAG stats |
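Conceptually, a consolidated tool fans several core lookups out to the version adapter and merges the results into one payload an agent can consume in a single call. The sketch below is purely illustrative; the adapter methods and the stub class are hypothetical stand-ins, not the server's actual implementation:

```python
def explore_dag(adapter, dag_id: str) -> dict:
    """Aggregate several core lookups into one agent-friendly response."""
    return {
        "dag": adapter.get_dag_details(dag_id),
        "tasks": adapter.list_tasks(dag_id),
        "recent_runs": adapter.list_dag_runs(dag_id, limit=5),
        "source": adapter.get_dag_source(dag_id),
    }

class FakeAdapter:
    """Stub standing in for a real version-specific adapter."""
    def get_dag_details(self, dag_id): return {"dag_id": dag_id}
    def list_tasks(self, dag_id): return [{"task_id": "extract"}]
    def list_dag_runs(self, dag_id, limit): return []
    def get_dag_source(self, dag_id): return "# dag source"

print(explore_dag(FakeAdapter(), "etl")["dag"])  # {'dag_id': 'etl'}
```

The benefit for agent workflows is fewer round trips: one tool call returns everything needed to reason about a DAG.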
Core Tools
| Tool | Description |
|---|---|
| `list_dags` | Get all DAGs and their metadata |
| `get_dag_details` | Get detailed info about a specific DAG |
| `get_dag_source` | Get the source code of a DAG |
| `get_dag_stats` | Get DAG run statistics (Airflow 3.x only) |
| `list_dag_warnings` | Get DAG import warnings |
| `list_import_errors` | Get import errors from DAG files that failed to parse |
| `list_dag_runs` | Get DAG run history |
| `get_dag_run` | Get specific DAG run details |
| `trigger_dag` | Trigger a new DAG run (start a workflow execution) |
| `pause_dag` | Pause a DAG to prevent new scheduled runs |
| `unpause_dag` | Unpause a DAG to resume scheduled runs |
| `list_tasks` | Get all tasks in a DAG |
| `get_task` | Get details about a specific task |
| `get_task_instance` | Get task instance execution details |
| `get_task_logs` | Get logs for a specific task instance execution |
| `list_pools` | Get all resource pools |
| `get_pool` | Get details about a specific pool |
| `list_variables` | Get all Airflow variables |
| `get_variable` | Get a specific variable by key |
| `list_connections` | Get all connections (credentials excluded for security) |
| `list_assets` | Get assets/datasets (unified naming across versions) |
| `list_plugins` | Get installed Airflow plugins |
| `list_providers` | Get installed provider packages |
| `get_airflow_config` | Get Airflow configuration |
| `get_airflow_version` | Get Airflow version information |
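Each core tool wraps an Airflow REST endpoint; `list_dags`, for instance, corresponds to `GET /api/v1/dags` on Airflow 2.x, whose JSON response carries a `dags` array. A stdlib sketch of that underlying call with basic auth (the URL and credentials are placeholders, and the helper names are not from this project):

```python
import base64
import json
import urllib.request

def build_dags_request(base_url: str, username: str, password: str) -> urllib.request.Request:
    """Build a GET /api/v1/dags request with HTTP basic auth (Airflow 2.x style)."""
    creds = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{base_url}/api/v1/dags",
        headers={"Authorization": f"Basic {creds}", "Accept": "application/json"},
    )

def list_dags(base_url: str, username: str, password: str) -> list:
    """Fetch DAG metadata; the response body carries a 'dags' array."""
    req = build_dags_request(base_url, username, password)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["dags"]

req = build_dags_request("http://localhost:8080", "admin", "admin")
print(req.full_url)  # http://localhost:8080/api/v1/dags
```

The MCP tool layer adds value on top of such raw calls by normalizing responses across Airflow versions and stripping credentials from connection listings.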
MCP Resources
| Resource URI | Description |
|---|---|
| `airflow://version` | Airflow version information |
| `airflow://providers` | Installed provider packages |
| `airflow://plugins` | Installed Airflow plugins |
| `airflow://config` | Airflow configuration |
MCP Prompts
| Prompt | Description |
|---|---|
| `troubleshoot_failed_dag` | Guided workflow for diagnosing DAG failures |
| `daily_health_check` | Morning health check routine |
| `onboard_new_dag` | Guide for understanding a new DAG |
Advanced Usage
Running as Standalone Server
For HTTP-based integrations or connecting multiple clients to one server:
```bash
# Run server (HTTP mode is default)
uvx astro-airflow-mcp --airflow-url https://my-airflow.example.com --username admin --password admin
```
Connect MCP clients to: `http://localhost:8000/mcp`
Airflow Plugin Mode
Install into your Airflow 3.x environment to expose MCP at `http://your-airflow:8080/mcp/v1`:
```bash
# Add to your Astro project
echo astro-airflow-mcp >> requirements.txt
```
CLI Options
| Flag | Environment Variable | Default | Description |
|---|---|---|---|
| `--transport` | `MCP_TRANSPORT` | `stdio` | Transport mode (`stdio` or `http`) |
| `--host` | `MCP_HOST` | `localhost` | Host to bind to (HTTP mode only) |
| `--port` | `MCP_PORT` | `8000` | Port to bind to (HTTP mode only) |
| `--airflow-url` | `AIRFLOW_API_URL` | Auto-discovered or `http://localhost:8080` | Airflow webserver URL |
| `--airflow-project-dir` | `AIRFLOW_PROJECT_DIR` | `$PWD` | Astro project directory for auto-discovering the Airflow URL from `.astro/config.yaml` |
| `--auth-token` | `AIRFLOW_AUTH_TOKEN` | None | Bearer token for authentication |
| `--username` | `AIRFLOW_USERNAME` | None | Username for authentication (Airflow 3.x uses OAuth2 token exchange) |
| `--password` | `AIRFLOW_PASSWORD` | None | Password for authentication |
Architecture
The server is built using FastMCP with an adapter pattern for Airflow version compatibility:
Core Components
- Adapters (`adapters/`): Version-specific API implementations
  - `AirflowAdapter` (base): Abstract interface for all Airflow API operations
  - `AirflowV2Adapter`: Airflow 2.x API (`/api/v1`) with basic auth
  - `AirflowV3Adapter`: Airflow 3.x API (`/api/v2`) with OAuth2 token exchange
- Version Detection: Automatic detection at startup by probing API endpoints
- Models (`models.py`): Pydantic models for type-safe API responses
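The adapter layer above can be pictured as an abstract base class with one concrete subclass per major Airflow version; version detection then selects the right subclass at startup. This is an illustrative sketch only (the `pick_adapter` helper and the stubbed method bodies are assumptions, not the server's actual code):

```python
from abc import ABC, abstractmethod

class AirflowAdapter(ABC):
    """Abstract interface every version-specific adapter implements."""
    @abstractmethod
    def api_prefix(self) -> str: ...
    @abstractmethod
    def list_dags(self) -> list: ...

class AirflowV2Adapter(AirflowAdapter):
    def api_prefix(self) -> str:
        return "/api/v1"   # Airflow 2.x REST API
    def list_dags(self) -> list:
        return []          # would call GET {prefix}/dags with basic auth

class AirflowV3Adapter(AirflowAdapter):
    def api_prefix(self) -> str:
        return "/api/v2"   # Airflow 3.x REST API
    def list_dags(self) -> list:
        return []          # would call GET {prefix}/dags with a bearer token

def pick_adapter(major_version: int) -> AirflowAdapter:
    """Select an adapter at runtime; the real server probes the API instead."""
    return AirflowV3Adapter() if major_version >= 3 else AirflowV2Adapter()

print(pick_adapter(3).api_prefix())  # /api/v2
```

Because every tool talks to the `AirflowAdapter` interface rather than a concrete API version, new Airflow releases only require a new adapter, not changes to the tools themselves.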
Version Handling Strategy
- Major versions (2.x vs 3.x): Adapter pattern with runtime version detection
- Minor versions (3.1 vs 3.2): Runtime feature detection with graceful fallbacks
- New API parameters: Pass-through
README truncated. View full README on GitHub.