JetBrains IDE WebSocket Monitor

dortegau

A fork of the JetBrains MCP Server that adds WebSocket monitoring to the MCP–IDE integration, broadcasting all tool calls between MCP clients and JetBrains IDEs in real time for debugging and for extending the integration.


What it does

  • Monitor MCP tool calls via WebSocket
  • Proxy MCP requests between clients and JetBrains IDEs
  • Broadcast real-time notifications on port 27042
  • Maintain full compatibility with original JetBrains MCP server

Best for

  • Debugging MCP integrations with JetBrains IDEs
  • Monitoring IDE automation workflows
  • Extending MCP functionality with custom tooling
  • Real-time WebSocket monitoring
  • Fork of the official JetBrains MCP server

About JetBrains IDE WebSocket Monitor

JetBrains IDE WebSocket Monitor is a community-built MCP server published by dortegau that provides AI assistants with tools and capabilities via the Model Context Protocol. It monitors all JetBrains IDE WebSocket tool calls in real time for advanced debugging and integration, and is categorized under developer tools.

How to install

You can install JetBrains IDE WebSocket Monitor in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

JetBrains IDE WebSocket Monitor is released under the Apache-2.0 license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

MCP Proxy Sidecar

A fork of the JetBrains MCP Server that adds WebSocket monitoring capabilities, created by @dortegau.

This project extends the original MCP server functionality with WebSocket support while maintaining compatibility with all features of the original implementation.

Architecture

graph LR
    A[MCP Client<br>e.g. Claude<br>Desktop App]
    B[MCP Proxy<br>Sidecar<br>with WebSocket]
    C[JetBrains IDE]
    D[WebSocket Clients<br>Monitoring]
    
    A <--MCP requests/responses--> B
    B <--IDE commands/responses--> C
    B --WebSocket notifications<br>port 27042--> D

    style A fill:#f5f5f5,stroke:#333,stroke-width:2px
    style B fill:#e1f5fe,stroke:#333,stroke-width:2px
    style C fill:#f5f5f5,stroke:#333,stroke-width:2px
    style D fill:#f5f5f5,stroke:#333,stroke-width:2px

The diagram above illustrates the system architecture and data flow:

  1. MCP Clients (like Claude Desktop App) communicate with the Sidecar using MCP protocol
  2. The Sidecar translates and forwards commands to JetBrains IDE
  3. Responses from the IDE are sent back through the Sidecar
  4. All tool calls are broadcast via WebSocket for monitoring purposes

Features

This fork adds WebSocket notifications that let you monitor all MCP tool calls in real time. Each tool call is broadcast over WebSocket with the endpoint name and call arguments.

WebSocket Message Format

interface MCPNotification {
  type: 'mcp-notification';
  payload: {
    endpoint: string;     // Tool name that was called
    content: any;         // Call arguments
    timestamp: string;    // ISO timestamp
  }
}
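As a sketch of consuming these messages, a runtime check matching the interface above can guard parsed frames before use. The field names come from the interface; the `isMCPNotification` helper itself is a hypothetical name of ours, not part of the package:

```typescript
interface MCPNotification {
  type: "mcp-notification";
  payload: {
    endpoint: string;   // Tool name that was called
    content: unknown;   // Call arguments
    timestamp: string;  // ISO timestamp
  };
}

// Narrow an arbitrary parsed JSON value to MCPNotification.
function isMCPNotification(value: unknown): value is MCPNotification {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  if (v.type !== "mcp-notification") return false;
  const p = v.payload as Record<string, unknown> | undefined;
  return (
    typeof p === "object" && p !== null &&
    typeof p.endpoint === "string" &&
    typeof p.timestamp === "string" &&
    "content" in p
  );
}
```

This lets a monitoring client reject unrelated frames instead of assuming every message on the socket follows the format.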

WebSocket Configuration

The WebSocket server listens on port 27042 by default. You can customize the port with the WS_PORT environment variable in your configuration, for example:

"env": {
  "WS_PORT": "8080"
}
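For illustration, a minimal monitoring client could derive its URL from the same variable and log incoming frames. This is a sketch, not part of the package: it assumes Node 22+ (where a global WebSocket client is available), and `monitorUrl` is a name we made up:

```typescript
// Build the monitor URL from WS_PORT, falling back to the default 27042.
function monitorUrl(env: Record<string, string | undefined> = process.env): string {
  const port = env.WS_PORT ?? "27042";
  return `ws://127.0.0.1:${port}`;
}

// Connect and print each broadcast tool call. Guarded behind MONITOR=1 so
// loading this file does not open a connection as a side effect.
if (process.env.MONITOR === "1") {
  const ws = new WebSocket(monitorUrl());
  ws.onmessage = (ev) => {
    const msg = JSON.parse(String(ev.data));
    console.log(`[${msg.payload?.timestamp}] ${msg.payload?.endpoint}`);
  };
}
```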

Usage

Install MCP Server Plugin

https://plugins.jetbrains.com/plugin/26071-mcp-server

Usage with Claude Desktop

To use this with Claude Desktop, add the following to your claude_desktop_config.json. On macOS the file is at ~/Library/Application Support/Claude/claude_desktop_config.json; on Windows it is %APPDATA%/Claude/claude_desktop_config.json.

{
  "mcpServers": {
    "ide": {
      "command": "npx",
      "args": ["-y", "mcp-proxy-sidecar"],
      "env": {
        "WS_PORT": "27042"
      }
    }
  }
}

The WS_PORT entry is optional; omit it to use the default port 27042. (JSON does not allow comments, so keep the file comment-free.)

Configuration Options

The following environment variables can be configured in your claude_desktop_config.json:

| Variable    | Description                      | Default                 |
|-------------|----------------------------------|-------------------------|
| WS_PORT     | Port for the WebSocket server    | 27042                   |
| IDE_PORT    | Specific port for IDE connection | Auto-scans 63342-63352  |
| HOST        | Host address for IDE connection  | 127.0.0.1               |
| LOG_ENABLED | Enable debug logging             | false                   |

Example configuration with all options:

{
  "mcpServers": {
    "ide": {
      "command": "npx",
      "args": ["-y", "mcp-proxy-sidecar"],
      "env": {
        "WS_PORT": "27042",
        "IDE_PORT": "63342",
        "HOST": "127.0.0.1",
        "LOG_ENABLED": "true"
      }
    }
  }
}

Note: If IDE_PORT is not specified, the sidecar will automatically scan ports 63342-63352 to find the IDE.
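Such a scan can be sketched as a TCP probe over the candidate range. This is illustrative only, not the sidecar's actual detection logic; `probe` and `findIdePort` are hypothetical names:

```typescript
import net from "node:net";

// The default candidate range used when IDE_PORT is not set.
function candidatePorts(start = 63342, end = 63352): number[] {
  return Array.from({ length: end - start + 1 }, (_, i) => start + i);
}

// Attempt a TCP connection; resolve true if the port accepts it.
function probe(port: number, host = "127.0.0.1", timeoutMs = 300): Promise<boolean> {
  return new Promise((resolve) => {
    const socket = net.connect({ port, host });
    const done = (ok: boolean) => { socket.destroy(); resolve(ok); };
    socket.setTimeout(timeoutMs, () => done(false));
    socket.once("connect", () => done(true));
    socket.once("error", () => done(false));
  });
}

// Return the first responsive port in the range, or null if none answer.
async function findIdePort(): Promise<number | null> {
  for (const port of candidatePorts()) {
    if (await probe(port)) return port;
  }
  return null;
}
```

Setting IDE_PORT explicitly skips this kind of scan and avoids connecting to the wrong IDE when several are running.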

Development

Requirements

  • Node.js 20.x
  • pnpm (latest version)

Build

  1. Install dependencies:
    pnpm install --frozen-lockfile
    
  2. Build the project:
    pnpm build
    

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Publishing

This package is published to npm with:

  • Provenance enabled for supply chain security
  • Automated releases via GitHub Actions when creating a new release
  • Public access on npm registry

To publish a new version:

  1. Update version in package.json
  2. Create and push a new tag matching the version
  3. Create a GitHub release from the tag
  4. The workflow will automatically build and publish to npm

Changelog

1.0.0

  • Initial fork from @jetbrains/mcp-proxy
  • Added WebSocket support for real-time tool call monitoring
  • Renamed package for clarity
  • Updated documentation and configuration examples

Credits

This is a fork of the JetBrains MCP Proxy Server. All credit for the original implementation goes to the JetBrains team.
