Groq

Official MCP server by groq

Official Groq MCP server for ultra-fast LLM inference. Connect Claude or Cursor to Groq's LPU-powered API for running Llama, Mixtral, and other open-source models at extreme speed.

6,445 views · Local (stdio)

About Groq

Groq is an official MCP server published by groq that provides AI assistants with tools and capabilities via the Model Context Protocol. It is categorized under AI/ML. This server exposes 15 tools that AI clients can invoke during conversations and coding sessions.

How to install

You can install Groq in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
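For manual setup, a local stdio MCP server is registered through the client's configuration file. The entry below is a sketch of the usual shape; the exact command, package name, and config file location vary by client and are assumptions here, not details taken from this page:

```json
{
  "mcpServers": {
    "groq": {
      "command": "npx",
      "args": ["-y", "groq-mcp"],
      "env": {
        "GROQ_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

Restart the client after editing the file so it launches the server process and discovers its tools.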

License

Groq is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

Tools (15)

list_models

Retrieve available LLM models from Groq's API including Llama, Mixtral, and other supported open-source models

create_completion

Generate text completion using specified Groq model with ultra-fast LPU inference

create_chat_completion

Generate conversational responses using chat-formatted input with Groq's high-speed inference

get_model_details

Fetch detailed information about a specific model including parameters, context length, and capabilities

create_streaming_completion

Generate streaming text completion for real-time response delivery
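The completion tools above wrap Groq's OpenAI-compatible chat completions endpoint. As a minimal sketch of the request body a call like create_chat_completion would send (the model id and message shape are illustrative assumptions; the server's actual tool arguments may differ):

```python
def build_chat_request(model: str, messages: list[dict], stream: bool = False) -> dict:
    """Assemble an OpenAI-compatible chat completion payload, the general
    shape accepted by Groq's chat completions API. Setting stream=True
    corresponds to the create_streaming_completion tool's behavior."""
    return {
        "model": model,
        "messages": messages,
        "stream": stream,
    }


payload = build_chat_request(
    "llama-3.1-8b-instant",  # illustrative model id, not confirmed by this page
    [{"role": "user", "content": "Say hello"}],
)
```

The same payload with `"stream": True` asks the API to deliver tokens incrementally instead of as a single response.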

Related Skills

groq-debug-bundle

Collect Groq debug evidence for support tickets and troubleshooting. Use when encountering persistent issues, preparing support tickets, or collecting diagnostic information for Groq problems. Trigger with phrases like "groq debug", "groq support bundle", "collect groq logs", "groq diagnostic".

groq-deploy-integration

Deploy Groq integrations to Vercel, Fly.io, and Cloud Run platforms. Use when deploying Groq-powered applications to production, configuring platform-specific secrets, or setting up deployment pipelines. Trigger with phrases like "deploy groq", "groq Vercel", "groq production deploy", "groq Cloud Run", "groq Fly.io".

groq-rate-limits

Implement Groq rate limiting, backoff, and idempotency patterns. Use when handling rate limit errors, implementing retry logic, or optimizing API request throughput for Groq. Trigger with phrases like "groq rate limit", "groq throttling", "groq 429", "groq retry", "groq backoff".

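The rate-limit skill above concerns 429 handling. A minimal retry-with-exponential-backoff-and-jitter pattern (purely illustrative, not Groq's SDK or this skill's implementation) looks like:

```python
import random
import time


class RateLimitError(Exception):
    """Stand-in for an HTTP 429 (rate limited) response."""


def with_backoff(call, max_retries=5, base_delay=0.5, sleep=time.sleep):
    """Retry `call` on RateLimitError with exponential backoff and full jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # budget exhausted; surface the error to the caller
            # full jitter: random delay in [0, base_delay * 2^attempt]
            sleep(random.uniform(0, base_delay * 2 ** attempt))
```

Full jitter spreads retries from many concurrent clients across the delay window, which avoids the synchronized retry bursts that fixed delays produce.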
groq-security-basics

Apply Groq security best practices for secrets and access control. Use when securing API keys, implementing least privilege access, or auditing Groq security configuration. Trigger with phrases like "groq security", "groq secrets", "secure groq", "groq API key security".

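In the spirit of the security skill above, keep the API key out of source control and load it from the environment. A small sketch (the variable name GROQ_API_KEY is the conventional one for Groq tooling, taken here as an assumption):

```python
import os


def load_groq_key(env=os.environ):
    """Read the Groq API key from the environment, failing fast if absent."""
    key = env.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError(
            "GROQ_API_KEY is not set; export it before starting the server"
        )
    return key
```

Failing fast at startup gives a clear error instead of an opaque authentication failure on the first API call.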
groq-ci-integration

Configure Groq CI/CD integration with GitHub Actions and testing. Use when setting up automated testing, configuring CI pipelines, or integrating Groq tests into your build process. Trigger with phrases like "groq CI", "groq GitHub Actions", "groq automated tests", "CI groq".

groq-core-workflow-b

Execute Groq secondary workflow: Core Workflow B. Use when implementing a secondary use case or complementing the primary workflow. Trigger with phrases like "groq secondary workflow", "secondary task with groq".
