
JetBrains IDE WebSocket Monitor
A fork of the JetBrains MCP Server that proxies tool calls between MCP clients and JetBrains IDEs, broadcasting every call over WebSocket in real time for debugging and for extending the integration.
What it does
- Monitor MCP tool calls via WebSocket
- Proxy MCP requests between clients and JetBrains IDEs
- Broadcast real-time notifications on port 27042
- Maintain full compatibility with original JetBrains MCP server
About JetBrains IDE WebSocket Monitor
JetBrains IDE WebSocket Monitor is a community-built MCP server published by dortegau that provides AI assistants with tools and capabilities via the Model Context Protocol. It monitors all JetBrains IDE WebSocket tool calls in real time for advanced debugging and integration, and is categorized under developer tools.
How to install
You can install JetBrains IDE WebSocket Monitor in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
JetBrains IDE WebSocket Monitor is released under the Apache-2.0 license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
MCP Proxy Sidecar
A fork of the JetBrains MCP Server that adds WebSocket monitoring capabilities, created by @dortegau.
This project extends the original MCP server functionality with WebSocket support while maintaining compatibility with all features of the original implementation.
Architecture
```mermaid
graph LR
    A[MCP Client<br>e.g. Claude<br>Desktop App]
    B[MCP Proxy<br>Sidecar<br>with WebSocket]
    C[JetBrains IDE]
    D[WebSocket Clients<br>Monitoring]
    A <--MCP requests/responses--> B
    B <--IDE commands/responses--> C
    B --WebSocket notifications<br>port 27042--> D
    style A fill:#f5f5f5,stroke:#333,stroke-width:2px
    style B fill:#e1f5fe,stroke:#333,stroke-width:2px
    style C fill:#f5f5f5,stroke:#333,stroke-width:2px
    style D fill:#f5f5f5,stroke:#333,stroke-width:2px
```
The diagram above illustrates the system architecture and data flow:
- MCP Clients (like Claude Desktop App) communicate with the Sidecar using MCP protocol
- The Sidecar translates and forwards commands to JetBrains IDE
- Responses from the IDE are sent back through the Sidecar
- All tool calls are broadcast over WebSocket for monitoring purposes
Features
This fork adds WebSocket notifications that let you monitor all MCP tool calls in real time. Each tool call is broadcast over WebSocket with detailed information about the endpoint and arguments.
WebSocket Message Format
```typescript
interface MCPNotification {
  type: 'mcp-notification';
  payload: {
    endpoint: string;  // Tool name that was called
    content: any;      // Call arguments
    timestamp: string; // ISO timestamp
  };
}
```
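A monitoring client can check incoming messages against this shape before acting on them. The sketch below is illustrative only (the `parseNotification` helper is not part of the sidecar); it assumes messages arrive as JSON strings:

```typescript
// Shape of a sidecar notification, mirroring the documented interface.
interface MCPNotification {
  type: "mcp-notification";
  payload: {
    endpoint: string;  // Tool name that was called
    content: unknown;  // Call arguments
    timestamp: string; // ISO timestamp
  };
}

// Hypothetical helper: parse a raw WebSocket message and return the
// notification if it matches the expected shape, or null otherwise.
function parseNotification(raw: string): MCPNotification | null {
  let data: unknown;
  try {
    data = JSON.parse(raw);
  } catch {
    return null; // not JSON at all
  }
  const msg = data as Partial<MCPNotification>;
  if (
    msg?.type === "mcp-notification" &&
    typeof msg.payload?.endpoint === "string" &&
    typeof msg.payload?.timestamp === "string"
  ) {
    return msg as MCPNotification;
  }
  return null;
}
```

A client built on Node 22's global `WebSocket` (or the `ws` package) could call `parseNotification(event.data)` in its message handler and ignore anything that returns `null`.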
WebSocket Configuration
The WebSocket server runs on port 27042 by default. You can customize this port using the WS_PORT environment variable in your configuration:
"env": {
"WS_PORT": "<custom port number>" // Example: "8080"
}
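The fallback behavior described above can be sketched as a small function; `resolveWsPort` is a hypothetical illustration of the documented default, not the sidecar's actual code:

```typescript
// Pick the WebSocket port: use a valid WS_PORT if set, else default to 27042.
function resolveWsPort(env: Record<string, string | undefined>): number {
  const raw = env["WS_PORT"];
  const port = raw === undefined ? NaN : Number(raw);
  return Number.isInteger(port) && port > 0 && port < 65536 ? port : 27042;
}
```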
Usage
Install MCP Server Plugin
https://plugins.jetbrains.com/plugin/26071-mcp-server
Usage with Claude Desktop
To use this with Claude Desktop, add the following to your claude_desktop_config.json.
The full path on macOS is ~/Library/Application\ Support/Claude/claude_desktop_config.json; on Windows it is %APPDATA%/Claude/claude_desktop_config.json.
```json
{
  "mcpServers": {
    "ide": {
      "command": "npx",
      "args": ["-y", "mcp-proxy-sidecar"],
      "env": {
        "WS_PORT": "27042" // Optional: customize WebSocket port
      }
    }
  }
}
```
Configuration Options
The following environment variables can be configured in your claude_desktop_config.json:
| Variable | Description | Default |
|---|---|---|
| WS_PORT | Port for WebSocket server | 27042 |
| IDE_PORT | Specific port for IDE connection | Auto-scans 63342-63352 |
| HOST | Host address for IDE connection | 127.0.0.1 |
| LOG_ENABLED | Enable debug logging | false |
Example configuration with all options:
```json
{
  "mcpServers": {
    "ide": {
      "command": "npx",
      "args": ["-y", "mcp-proxy-sidecar"],
      "env": {
        "WS_PORT": "27042",
        "IDE_PORT": "63342",
        "HOST": "127.0.0.1",
        "LOG_ENABLED": "true"
      }
    }
  }
}
```
Note: If IDE_PORT is not specified, the sidecar will automatically scan ports 63342-63352 to find the IDE.
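The port-selection rule in the note above can be sketched as follows (a hypothetical helper based on the documented behavior; the real sidecar additionally has to probe each candidate port for a responding IDE):

```typescript
// Return the ports to try: an explicit, valid IDE_PORT wins; otherwise
// the documented scan range 63342-63352 (inclusive) is used.
function candidateIdePorts(idePort?: string): number[] {
  if (idePort !== undefined) {
    const p = Number(idePort);
    if (Number.isInteger(p) && p > 0 && p < 65536) return [p];
  }
  const ports: number[] = [];
  for (let p = 63342; p <= 63352; p++) ports.push(p);
  return ports;
}
```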
Development
Requirements
- Node.js 20.x
- pnpm (latest version)
Build
- Install dependencies: `pnpm install --frozen-lockfile`
- Build the project: `pnpm build`
Contributing
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Publishing
This package is published to npm with:
- Provenance enabled for supply chain security
- Automated releases via GitHub Actions when creating a new release
- Public access on npm registry
To publish a new version:
- Update version in package.json
- Create and push a new tag matching the version
- Create a GitHub release from the tag
- The workflow will automatically build and publish to npm
Changelog
1.0.0
- Initial fork from @jetbrains/mcp-proxy
- Added WebSocket support for real-time tool call monitoring
- Renamed package for clarity
- Updated documentation and configuration examples
Credits
This is a fork of the JetBrains MCP Proxy Server. All credit for the original implementation goes to the JetBrains team.