
OpenFeature MCP Server
Official. Integrates OpenFeature functionality into AI coding assistants, providing SDK installation guidance and feature flag evaluation capabilities. Connects AI assistants to any OFREP-compatible feature flag service.
Provides OpenFeature SDK installation guidance for various programming languages and enables feature flag evaluation through the OpenFeature Remote Evaluation Protocol (OFREP). Supports multiple AI clients and can connect to any OFREP-compatible feature flag service.
What it does
- Generate SDK installation guidance for multiple programming languages
- Evaluate feature flags through OpenFeature Remote Evaluation Protocol
- Connect to OFREP-compatible feature flag services
- Provide OpenFeature code generation assistance
- Bridge AI assistants with feature flag management tools
About OpenFeature MCP Server
OpenFeature MCP Server is an official MCP server published by open-feature that provides AI assistants with tools and capabilities via the Model Context Protocol. It lets you install OpenFeature SDKs, evaluate feature flags remotely via OFREP, and connect AI clients to any OFREP-compatible flag service. It is categorized under developer tools.
How to install
You can install OpenFeature MCP Server in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
OpenFeature MCP Server is released under the Apache-2.0 license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
OpenFeature MCP
OpenFeature is an open specification that provides a vendor-agnostic, community-driven API for feature flagging that works with your favorite feature flag management tool or in-house solution.
Overview
⚠️ Active Development: This project is in active development. Features and APIs may change.
The OpenFeature Model Context Protocol (MCP) Server enables AI coding assistants to interact with OpenFeature through a standardized protocol. It provides SDK installation guidance and feature flag evaluation capabilities directly within your AI-powered development environment.
The OpenFeature MCP Server is a local tool that connects AI coding assistants (like Cursor, Claude Code, VS Code, and Windsurf) to OpenFeature functionality. It acts as a bridge between your AI assistant and OpenFeature capabilities, enabling intelligent code generation and migration, SDK installation guidance, and feature flag evaluation.
This server is published to the MCP Registry under dev.openfeature/mcp.
⚠️ AI Agent Behavior: AI agents are non-deterministic and may not complete tasks correctly. Always manually review their changes before committing. If you encounter issues, please open an issue with details about your AI agent (e.g., Claude Code + Sonnet 4.5, Cursor + gpt-5-codex), the commands you used, and the behavior you saw.
Quick Start
NPX Install
The easiest way to use the OpenFeature MCP Server is through NPX, which requires no installation:
```json
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
```
NPM Global Install
You can install the MCP server globally:
```shell
npm install -g @openfeature/mcp
```
Then configure your AI assistant to use the global installation:
```json
{
  "mcpServers": {
    "OpenFeature": {
      "command": "openfeature-mcp"
    }
  }
}
```
AI Assistant Configuration
Cursor
To open Cursor and automatically add the OpenFeature MCP, click the install button above.
Alternatively, navigate to Cursor Settings -> Tools & MCP -> New MCP Server and add to ~/.cursor/mcp_settings.json:
```json
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
```
VS Code
To open VS Code and automatically add the OpenFeature MCP, click the install button above. For more details, see the VS Code MCP documentation.
Alternatively, add to .vscode/mcp.json in your project:
```json
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"],
      // Optional environment variables
      "env": {
        "OPENFEATURE_OFREP_BASE_URL": "<your-base-url>",
        "OPENFEATURE_OFREP_API_KEY": "<your-api-key>"
      }
    }
  }
}
```
Claude Code
Add the server via the Claude Code CLI:
```shell
claude mcp add --transport stdio openfeature npx -y @openfeature/mcp
```
Then manage the connection with /mcp in the CLI.
Windsurf
In the Manage MCP servers raw config, add:
```json
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
```
Codex CLI
Edit ~/.codex/config.toml:
```toml
[mcp_servers.openfeature]
command = "npx"
args = ["-y", "@openfeature/mcp"]
```
Restart Codex CLI after saving.
Gemini CLI
Edit ~/.gemini/settings.json:
```json
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
```
Restart Gemini CLI after saving.
Claude Desktop
Edit your Claude Desktop config at:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
Add the following configuration:
```json
{
  "mcpServers": {
    "openfeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
```
Restart Claude Desktop after saving.
Available Tools
The OpenFeature MCP Server provides two main tools accessible to AI assistants:
SDK Installation Guide: install_openfeature_sdk
Fetches installation instructions for OpenFeature SDKs in various languages and frameworks. Optionally includes provider-specific setup documentation.
SDK Tool Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| technology | string | Yes | Target language/framework (see supported list below) |
| providers | string[] | No | Provider identifiers whose installation instructions to include |
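For example, when asked to set up the Node.js SDK with the flagd provider, an assistant might invoke the tool with arguments like the following (the exact call shape depends on your MCP client; identifiers are taken from the tables in this section):

```json
{
  "technology": "nodejs",
  "providers": ["flagd"]
}
```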
Supported Technologies
The technologies list is built from the available prompts/*.md files and is updated automatically by scripts/build-prompts.js.
| Technology | SDK |
|---|---|
| android | Android Kotlin SDK |
| dotnet | .NET SDK |
| go | Go SDK |
| ios | iOS Swift SDK |
| java | Java SDK |
| javascript | JavaScript Web SDK |
| nestjs | NestJS SDK |
| nodejs | Node.js SDK |
| php | PHP SDK |
| python | Python SDK |
| react | React SDK |
| ruby | Ruby SDK |
Supported Providers
The provider list is automatically sourced from the OpenFeature ecosystem (open-feature/openfeature.dev repo).
See scripts/build-providers.js for details on how the provider list is maintained.
OFREP Flag Evaluation: ofrep_flag_eval
Evaluate feature flags using the OpenFeature Remote Evaluation Protocol (OFREP). Supports both single flag and bulk evaluation.
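Single and bulk evaluation map to two different OFREP endpoint shapes. As a minimal sketch (paths follow the OFREP specification; the base URL and flag key are placeholders for whatever your flag service exposes):

```typescript
// Sketch of the two OFREP endpoint shapes: with a flag key, a single flag
// is evaluated; without one, all flags are evaluated in bulk.
function ofrepEndpoint(baseUrl: string, flagKey?: string): string {
  const root = baseUrl.replace(/\/+$/, ""); // drop trailing slashes
  return flagKey
    ? `${root}/ofrep/v1/evaluate/flags/${encodeURIComponent(flagKey)}` // single flag
    : `${root}/ofrep/v1/evaluate/flags`; // bulk evaluation
}

console.log(ofrepEndpoint("https://flags.example.com/", "new-checkout-flow"));
// https://flags.example.com/ofrep/v1/evaluate/flags/new-checkout-flow
```

Both endpoints take a POST with the evaluation context in the request body; the tool assembles that request for you from the parameters below.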
OFREP Tool Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| base_url | string | No | Base URL of your OFREP-compatible flag service |
| flag_key | string | No | Flag key for single evaluation (omit for bulk) |
| context | object | No | Evaluation context (e.g., { targetingKey: "user-123" }) |
| etag | string | No | ETag for bulk-evaluation caching |
| auth | object | No | Authentication configuration |
| auth.bearer_token | string | No | Bearer token for authorization |
| auth.api_key | string | No | API key for authorization |
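As an illustration, a single-flag evaluation call might pass arguments like these (all values are hypothetical; omit flag_key to evaluate every flag in bulk):

```json
{
  "base_url": "https://flags.example.com",
  "flag_key": "new-checkout-flow",
  "context": { "targetingKey": "user-123" },
  "auth": { "api_key": "<your-api-key>" }
}
```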
OFREP Configuration
To use OFREP flag evaluation features, configure authentication and endpoint details. The server checks configuration in this priority order:
1. Environment variables: OPENFEATURE_OFREP_BASE_URL or OFREP_BASE_URL, OPENFEATURE_OFREP_BEARER_TOKEN or OFREP_BEARER_TOKEN, OPENFEATURE_OFREP_API_KEY or OFREP_API_KEY
2. Configuration file: ~/.openfeature-mcp.json
Example ~/.openfeature-mcp.json:
```json
{
  "OFREP": {
    "baseUrl": "https://flags.example.com",
    "bearerToken": "<your-token>",
    "apiKey": "<your-api-key>"
  }
}
```
You can override the config file path using the OPENFEATURE_MCP_CONFIG_PATH environment variable.
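As a quick sketch of the environment-variable route (values are placeholders for your own service):

```shell
# Point the server at an OFREP endpoint via environment variables;
# these take priority over ~/.openfeature-mcp.json.
export OPENFEATURE_OFREP_BASE_URL="https://flags.example.com"
export OPENFEATURE_OFREP_API_KEY="example-api-key"

# The server picks these up at startup, e.g.:
# npx -y @openfeature/mcp
```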
Note: All logs are written to stderr. The MCP protocol messages use stdout.
MCP Usage Examples
SDK Installation Example
"install the OpenFeature SDK for Node.js with the flagd provider"
The AI will use the MCP to fetch relevant installation instructions and attempt to install the OpenFeature SDK with the correct provider.
Flag Evaluation Example
When interacting with your AI assistant:
"Can you check the value of the 'new-checkout-flow' feature flag for 'user-123'?"
The AI will use the MCP to evaluate the flag using OFREP and provide you with the result, along with additional metadata like variant and reason.
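Under the hood this maps to an OFREP evaluation request, whose successful response carries the value plus metadata, roughly like this (field names follow the OFREP specification; values are illustrative):

```json
{
  "key": "new-checkout-flow",
  "value": true,
  "variant": "treatment",
  "reason": "TARGETING_MATCH"
}
```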
Resources
- NPM Package: @openfeature/mcp
- MCP Registry: dev.openfeature/mcp
- OpenFeature Documentation: openfeature.dev
- CNCF Slack: Join #openfeature
Get Involved
The OpenFeature MCP Server is an open-source project maintained by the OpenFeature community. We welcome contributions:
- Report Issues: GitHub Issues
- Contribute Code: Contributing Guide
- Suggest Features: Share ideas in the CNCF Slack #openfeature channel
- Improve Documentation: Help us improve SDK installation guides and examples
Join the CNCF Slack to get involved.