GigAPI

Integrates with GigAPI Timeseries Lake to execute SQL queries, manage databases and tables, and write time-series data using InfluxDB Line Protocol for analytics and IoT monitoring applications. Runs locally via the stdio transport.

What it does

  • Execute SQL queries on GigAPI clusters
  • List databases and tables
  • Get table schema information
  • Write time-series data using InfluxDB Line Protocol
  • Check server health and connectivity

Best for

  • IoT data monitoring and analysis
  • Time-series analytics workflows
  • Database administration on GigAPI clusters
  • Real-time data ingestion systems
  • InfluxDB Line Protocol support
  • Safe query execution via HTTP API

About GigAPI

GigAPI is a community-built MCP server published by gigapi that provides AI assistants with tools and capabilities via the Model Context Protocol. It integrates with GigAPI Timeseries Lake for time-series database management, analytics, and monitoring, and is categorized under databases and analytics data.

How to install

You can install GigAPI in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

GigAPI is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.


GigAPI MCP Server


An MCP server for GigAPI Timeseries Lake that provides seamless integration with Claude Desktop and other MCP-compatible clients.

Features

GigAPI Tools

  • run_select_query
    • Execute SQL queries on your GigAPI cluster.
    • Input: sql (string): The SQL query to execute, database (string): The database to execute against.
    • All queries are executed safely through GigAPI's HTTP API with NDJSON format.
  • list_databases
    • List all databases on your GigAPI cluster.
    • Input: database (string): The database to use for the SHOW DATABASES query (defaults to "mydb").
  • list_tables
    • List all tables in a database.
    • Input: database (string): The name of the database.
  • get_table_schema
    • Get schema information for a specific table.
    • Input: database (string): The name of the database, table (string): The name of the table.
  • write_data
    • Write data using InfluxDB Line Protocol format.
    • Input: database (string): The database to write to, data (string): Data in InfluxDB Line Protocol format.
  • health_check
    • Check the health status of the GigAPI server.
  • ping
    • Ping the GigAPI server to check connectivity.

Quick Start

1. Install the MCP Server

Option A: From PyPI (Recommended)

# The package will be available on PyPI after the first release
# Users can install it directly with uv
uv run --with mcp-gigapi --python 3.11 mcp-gigapi --help

Option B: From Source

# Clone the repository
git clone https://github.com/gigapi/mcp-gigapi.git
cd mcp-gigapi

# Install dependencies
uv sync

2. Configure Claude Desktop

  1. Open the Claude Desktop configuration file located at:
    • On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • On Windows: %APPDATA%/Claude/claude_desktop_config.json
  2. Add the following configuration:

For the Public Demo (Recommended for Testing)

{
  "mcpServers": {
    "mcp-gigapi": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp-gigapi",
        "--python",
        "3.13",
        "mcp-gigapi"
      ],
      "env": {
        "GIGAPI_HOST": "gigapi.fly.dev",
        "GIGAPI_PORT": "443",
        "GIGAPI_TIMEOUT": "30",
        "GIGAPI_VERIFY_SSL": "true",
        "GIGAPI_DEFAULT_DATABASE": "mydb"
      }
    }
  }
}

For Local Development

{
  "mcpServers": {
    "mcp-gigapi": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp-gigapi",
        "--python",
        "3.13",
        "mcp-gigapi"
      ],
      "env": {
        "GIGAPI_HOST": "localhost",
        "GIGAPI_PORT": "7971",
        "GIGAPI_TIMEOUT": "30",
        "GIGAPI_VERIFY_SSL": "false",
        "GIGAPI_DEFAULT_DATABASE": "mydb"
      }
    }
  }
}

With Authentication

{
  "mcpServers": {
    "mcp-gigapi": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp-gigapi",
        "--python",
        "3.13",
        "mcp-gigapi"
      ],
      "env": {
        "GIGAPI_HOST": "your-gigapi-server",
        "GIGAPI_PORT": "7971",
        "GIGAPI_USERNAME": "your_username",
        "GIGAPI_PASSWORD": "your_password",
        "GIGAPI_TIMEOUT": "30",
        "GIGAPI_VERIFY_SSL": "true",
        "GIGAPI_DEFAULT_DATABASE": "your_database"
      }
    }
  }
}
  3. Important: Replace the uv command with the absolute path to your uv executable:
    which uv  # Find the path

  4. Restart Claude Desktop to apply the changes.

API Compatibility

This MCP server is designed to work with GigAPI's HTTP API endpoints:

Query Endpoints

  • POST /query?db={database}&format=ndjson - Execute SQL queries with NDJSON response format
  • All queries return NDJSON (Newline Delimited JSON) format for efficient streaming

Write Endpoints

  • POST /write?db={database} - Write data using InfluxDB Line Protocol

Administrative Endpoints

  • GET /health - Health check
  • GET /ping - Simple ping

Example Usage

Writing Data

Use InfluxDB Line Protocol format:

curl -X POST "http://localhost:7971/write?db=mydb" --data-binary @/dev/stdin << EOF
weather,location=us-midwest,season=summer temperature=82
weather,location=us-east,season=summer temperature=80
weather,location=us-west,season=summer temperature=99
EOF
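
The same write can be issued from Python. The sketch below builds Line Protocol records from tags and fields and posts them to the /write endpoint shown above; it uses only the standard library, and the host/port defaults simply mirror the curl example:

```python
import urllib.request


def to_line_protocol(measurement, tags, fields):
    """Render one Line Protocol record: measurement,tag=v field=v."""
    tag_part = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_part = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_part} {field_part}"


def write_points(lines, host="localhost", port=7971, db="mydb"):
    """POST newline-joined Line Protocol records to GigAPI's /write endpoint."""
    body = "\n".join(lines).encode()
    req = urllib.request.Request(
        f"http://{host}:{port}/write?db={db}", data=body, method="POST"
    )
    return urllib.request.urlopen(req)


line = to_line_protocol("weather", {"location": "us-midwest"}, {"temperature": 82})
# line == "weather,location=us-midwest temperature=82"
```

Note that this minimal builder does no escaping of spaces or commas in tag values; for production writes, the official Line Protocol escaping rules apply.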

Reading Data

Execute SQL queries via JSON POST with NDJSON format:

curl -X POST "http://localhost:7971/query?db=mydb&format=ndjson" \
  -H "Content-Type: application/json" \
  -d '{"query": "SELECT time, temperature FROM weather WHERE time >= epoch_ns('\''2025-04-24T00:00:00'\''::TIMESTAMP)"}'
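
An equivalent query from Python, decoding the NDJSON body line by line (a stdlib-only sketch; the endpoint shape and payload follow the curl call above):

```python
import json
import urllib.request


def parse_ndjson(text):
    """Each non-empty line of an NDJSON body is one JSON object."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]


def run_query(sql, host="localhost", port=7971, db="mydb"):
    """POST a SQL query to GigAPI and return the decoded NDJSON rows."""
    req = urllib.request.Request(
        f"http://{host}:{port}/query?db={db}&format=ndjson",
        data=json.dumps({"query": sql}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return parse_ndjson(resp.read().decode())


rows = parse_ndjson('{"time": 1, "temperature": 82}\n{"time": 2, "temperature": 80}\n')
# rows[0]["temperature"] == 82
```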

Show Databases/Tables

# Show databases
curl -X POST "http://localhost:7971/query?db=mydb&format=ndjson" \
  -H "Content-Type: application/json" \
  -d '{"query": "SHOW DATABASES"}'

# Show tables  
curl -X POST "http://localhost:7971/query?db=mydb&format=ndjson" \
  -H "Content-Type: application/json" \
  -d '{"query": "SHOW TABLES"}'

# Count records
curl -X POST "http://localhost:7971/query?db=mydb&format=ndjson" \
  -H "Content-Type: application/json" \
  -d '{"query": "SELECT count(*), avg(temperature) FROM weather"}'

Environment Variables

Required Variables

  • GIGAPI_HOST: The hostname of your GigAPI server
  • GIGAPI_PORT: The port number of your GigAPI server (default: 7971)

Optional Variables

  • GIGAPI_USERNAME or GIGAPI_USER: The username for authentication (if required)
  • GIGAPI_PASSWORD or GIGAPI_PASS: The password for authentication (if required)
  • GIGAPI_TIMEOUT: Request timeout in seconds (default: 30)
  • GIGAPI_VERIFY_SSL: Enable/disable SSL certificate verification (default: true)
  • GIGAPI_DEFAULT_DATABASE: Default database to use for queries (default: mydb)
  • GIGAPI_MCP_SERVER_TRANSPORT: Sets the transport method for the MCP server (default: stdio)
  • GIGAPI_ENABLED: Enable/disable GigAPI functionality (default: true)
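
The variables above can be collected into a single settings object. The sketch below uses the documented defaults; the class and field names are illustrative, not the server's actual internals:

```python
import os
from dataclasses import dataclass


@dataclass
class GigAPISettings:
    host: str
    port: int = 7971
    timeout: int = 30
    verify_ssl: bool = True
    default_database: str = "mydb"


def settings_from_env():
    """Read GIGAPI_* variables, falling back to the documented defaults.

    GIGAPI_HOST is required by the server; "localhost" here is only an
    illustrative fallback.
    """
    return GigAPISettings(
        host=os.environ.get("GIGAPI_HOST", "localhost"),
        port=int(os.environ.get("GIGAPI_PORT", "7971")),
        timeout=int(os.environ.get("GIGAPI_TIMEOUT", "30")),
        verify_ssl=os.environ.get("GIGAPI_VERIFY_SSL", "true").lower() == "true",
        default_database=os.environ.get("GIGAPI_DEFAULT_DATABASE", "mydb"),
    )
```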

Example Configurations

For Local Development

# Required variables
GIGAPI_HOST=localhost
GIGAPI_PORT=7971

# Optional: Override defaults for local development
GIGAPI_VERIFY_SSL=false
GIGAPI_TIMEOUT=60
GIGAPI_DEFAULT_DATABASE=mydb

For Production with Authentication

# Required variables
GIGAPI_HOST=your-gigapi-server
GIGAPI_PORT=7971
GIGAPI_USERNAME=your_username
GIGAPI_PASSWORD=your_password

# Optional: Production settings
GIGAPI_VERIFY_SSL=true
GIGAPI_TIMEOUT=30
GIGAPI_DEFAULT_DATABASE=your_database

For Public Demo

GIGAPI_HOST=gigapi.fly.dev
GIGAPI_PORT=443
GIGAPI_VERIFY_SSL=true
GIGAPI_DEFAULT_DATABASE=mydb

Data Format

GigAPI uses Hive partitioning with the structure:

/data
  /mydb
    /weather
      /date=2025-04-10
        /hour=14
          *.parquet
          metadata.json
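
Given that layout, the partition directory for any timestamp can be derived directly. The helper below is hypothetical (GigAPI manages these paths itself) and only illustrates how the date=/hour= keys map to directories:

```python
from datetime import datetime
from pathlib import Path


def partition_dir(base, db, table, ts):
    """Map a timestamp to its Hive-style date=/hour= partition directory."""
    return Path(base) / db / table / f"date={ts:%Y-%m-%d}" / f"hour={ts:%H}"


p = partition_dir("/data", "mydb", "weather", datetime(2025, 4, 10, 14, 30))
# p == Path("/data/mydb/weather/date=2025-04-10/hour=14")
```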

Development

Setup Development Environment

  1. Install dependencies:

    uv sync --all-extras --dev
    source .venv/bin/activate
    
  2. Create a .env file in the root of the repository:

    GIGAPI_HOST=localhost
    GIGAPI_PORT=7971
    GIGAPI_USERNAME=your_username
    GIGAPI_PASSWORD=your_password
    GIGAPI_TIMEOUT=30
    GIGAPI_VERIFY_SSL=false
    GIGAPI_DEFAULT_DATABASE=mydb
    
  3. For testing with the MCP Inspector:

    fastmcp dev mcp_gigapi/mcp_server.py
    

Running Tests

# Run all tests
uv run pytest -v

# Run only unit tests
uv run pytest -v -m "not integration"

# Run only integration tests
uv run pytest -v -m "integration"

# Run linting
uv run ruff check .

# Test with public demo
python test_demo.py

Testing with Public Demo

The repository includes a test script that validates the MCP server against the public GigAPI demo:

python test_demo.py

This will test:

  • ✅ Health check and connectivity
  • ✅ Database listing (SHOW DATABASES)
  • ✅ Table listing (SHOW TABLES)
  • ✅ Data queries (SELECT count(*) FROM table)
  • ✅ Sample data retrieval

PyPI Publishing

This package is automatically published to PyPI on each GitHub release. The publishing process is handled by GitHub Actions workflows:

  • CI Workflow (.github/workflows/ci.yml): Runs tests on pull requests and pushes to main
  • Publish Workflow (.github/workflows/publish.yml): Publishes to PyPI when a release is created

For Users

Once published, users can install the package directly from PyPI:

# Install and run the MCP server
uv run --with mcp-gigapi --python 3.11 mcp-gigapi

For Maintainers

To publish a new version:

  1. Update the version in pyproject.toml
  2. Create a GitHub release
  3. The workflow will automatically publish to PyPI

See RELEASING.md for detailed release instructions.

Troubleshooting

Common Issues

  1. Connection refused: Check that GigAPI is running and the host/port are correct
  2. Authentication failed: Verify username/password are correct
  3. SSL certificate errors: Set GIGAPI_VERIFY_SSL=false for self-signed certificates
  4. No databases found: Ensure you're using the correct default database (usually "mydb")

Debug Mode

Enable debug logging by setting the log level:

import logging
logging.basicConfig(level=logging.DEBUG)

License

Apache-2.0 license

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Submit a pull request

Support

