
Axiom
Connects to Axiom to execute APL queries and manage datasets, enabling AI agents to perform log analysis, anomaly detection, and data-driven decision making on your Axiom data.
What it does
- Execute APL queries on Axiom datasets
- List available datasets
- Perform log analysis and filtering
- Detect anomalies in data
- Query time-series data
- Generate data-driven insights
About Axiom
Axiom is a community-built MCP server published by thetabird that provides AI assistants with tools and capabilities via the Model Context Protocol. Integrate it to run APL queries, analyze logs, detect anomalies, and make data-driven decisions. It is categorized under analytics and data.
How to install
You can install Axiom in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
Axiom is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
MCP Server for Axiom
A JavaScript port of the official Axiom MCP server that enables AI agents to query data using Axiom Processing Language (APL).
This implementation provides the same functionality as the original Go version but packaged as an npm module for easier integration with Node.js environments.
Installation & Usage
MCP Configuration
You can run this MCP server directly using npx. Add the following configuration to your MCP configuration file:
{
  "axiom": {
    "command": "npx",
    "args": ["-y", "mcp-server-axiom"],
    "env": {
      "AXIOM_TOKEN": "<AXIOM_TOKEN_HERE>",
      "AXIOM_URL": "https://api.axiom.co",
      "AXIOM_ORG_ID": "<AXIOM_ORG_ID_HERE>"
    }
  }
}
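For clients such as Claude Desktop, this entry typically sits nested under a top-level `mcpServers` key. A hedged example (the exact file name and location vary by client):

```json
{
  "mcpServers": {
    "axiom": {
      "command": "npx",
      "args": ["-y", "mcp-server-axiom"],
      "env": {
        "AXIOM_TOKEN": "<AXIOM_TOKEN_HERE>",
        "AXIOM_URL": "https://api.axiom.co",
        "AXIOM_ORG_ID": "<AXIOM_ORG_ID_HERE>"
      }
    }
  }
}
```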
Local Development & Testing
Installation
npm install -g mcp-server-axiom
Environment Variables
The server can be configured using environment variables:
- AXIOM_TOKEN (required): Your Axiom API token
- AXIOM_ORG_ID (required): Your Axiom organization ID
- AXIOM_URL (optional): Custom Axiom API URL (defaults to https://api.axiom.co)
- AXIOM_QUERY_RATE (optional): Queries per second limit (default: 1)
- AXIOM_QUERY_BURST (optional): Query burst capacity (default: 1)
- AXIOM_DATASETS_RATE (optional): Dataset list operations per second (default: 1)
- AXIOM_DATASETS_BURST (optional): Dataset list burst capacity (default: 1)
- PORT (optional): Server port (default: 3000)
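As a sketch of how these variables and defaults could be resolved in Node.js (the variable names come from the list above; `loadConfig` and its shape are illustrative, not the server's actual internals):

```javascript
// Illustrative sketch: resolve configuration from environment variables,
// applying the documented defaults. Not the package's real implementation.
function loadConfig(env = process.env) {
  // AXIOM_TOKEN and AXIOM_ORG_ID are required; everything else has a default.
  if (!env.AXIOM_TOKEN) throw new Error("AXIOM_TOKEN is required");
  if (!env.AXIOM_ORG_ID) throw new Error("AXIOM_ORG_ID is required");
  const num = (value, fallback) =>
    value !== undefined ? Number(value) : fallback;
  return {
    token: env.AXIOM_TOKEN,
    orgId: env.AXIOM_ORG_ID,
    url: env.AXIOM_URL ?? "https://api.axiom.co",
    queryRate: num(env.AXIOM_QUERY_RATE, 1),
    queryBurst: num(env.AXIOM_QUERY_BURST, 1),
    datasetsRate: num(env.AXIOM_DATASETS_RATE, 1),
    datasetsBurst: num(env.AXIOM_DATASETS_BURST, 1),
    port: num(env.PORT, 3000),
  };
}
```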
Running the Server Locally
- Using environment variables:
export AXIOM_TOKEN=your_token
export AXIOM_ORG_ID=your_org_id
mcp-server-axiom
- Using a config file:
mcp-server-axiom config.json
Example config.json:
{
  "token": "your_token",
  "url": "https://custom.axiom.co",
  "orgId": "your_org_id",
  "queryRate": 2,
  "queryBurst": 5,
  "datasetsRate": 1,
  "datasetsBurst": 2
}
API Endpoints
GET /: Get server implementation infoGET /tools: List available toolsPOST /tools/:name/call: Call a specific tool- Available tools:
queryApl: Execute APL querieslistDatasets: List available datasets
- Available tools:
Example Tool Calls
- Query APL:
curl -X POST http://localhost:3000/tools/queryApl/call \
  -H "Content-Type: application/json" \
  -d '{
    "arguments": {
      "query": "['\''logs'\''] | where ['\''severity'\''] == \"error\" | limit 10"
    }
  }'
(The '\'' sequences keep the literal single quotes around dataset and field names inside the single-quoted shell string; without them, the shell would strip the quotes and break the APL query.)
- List Datasets:
curl -X POST http://localhost:3000/tools/listDatasets/call \
  -H "Content-Type: application/json" \
  -d '{
    "arguments": {}
  }'
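The same calls can be made from Node.js with the built-in fetch (Node 18+). A minimal sketch: the endpoint path and payload shape mirror the curl examples above, while `buildToolCall` and `queryApl` are illustrative helpers, not part of the package's API:

```javascript
// Build the JSON body expected by POST /tools/:name/call.
function buildToolCall(args = {}) {
  return { arguments: args };
}

// Illustrative helper: run an APL query against a locally running server.
async function queryApl(query, base = "http://localhost:3000") {
  const res = await fetch(`${base}/tools/queryApl/call`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildToolCall({ query })),
  });
  if (!res.ok) throw new Error(`queryApl failed with status ${res.status}`);
  return res.json();
}
```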
License
MIT