
System Information
Provides real-time system monitoring through psutil: CPU usage, memory and swap statistics, disk space, network status, running processes, and system uptime, exposed through a standardized MCP interface with cross-platform support and performance-focused caching.
What it does
- Monitor CPU usage and core information
- Track memory and swap statistics
- Check disk usage across mount points
- View network interface statistics
- List and filter running processes
- Get system uptime and temperature data
About System Information
System Information is a community-built MCP server published by dknell that provides AI assistants with tools and capabilities via the Model Context Protocol. It monitors CPU, memory, disk, and network activity in real time, and is categorized under developer tools and analytics data.
How to install
You can install System Information in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
System Information is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
System Information MCP Server
A Model Context Protocol (MCP) server that provides real-time system information and metrics. This server exposes CPU usage, memory statistics, disk information, network status, and running processes through a standardized MCP interface.
Features
Tools Available
- get_cpu_info: Retrieve CPU usage, core counts, frequency, and load average
- get_memory_info: Get virtual and swap memory statistics
- get_disk_info: Disk usage information for all mounts or specific paths
- get_network_info: Network interface information and I/O statistics
- get_process_list: Running processes with sorting and filtering options
- get_system_uptime: System boot time and uptime information
- get_temperature_info: Temperature sensors and fan speeds (when available)
Resources Available
- system://overview: Comprehensive system overview with all metrics
- system://processes: Current process list resource
Key Features
- Real-time metrics with configurable caching
- Cross-platform support (Windows, macOS, Linux)
- Security-focused with sensitive data filtering
- Performance optimized with intelligent caching
- Comprehensive error handling
- Environment variable configuration
Installation
Using uvx (Recommended)
The easiest way to use this MCP server is with uvx, which downloads and runs the package on demand, so no separate install step is required:
uvx mcp-system-info
Then configure it in your MCP client (like Claude Desktop):
{
"mcpServers": {
"system-info": {
"command": "uvx",
"args": ["mcp-system-info"]
}
}
}
Development Installation
For local development:
1. Clone the repository:
   git clone <repository-url>
   cd mcp-system-info
2. Install dependencies:
   uv sync
3. Run the server:
   uv run mcp-system-info
Development
Project Structure
mcp-system-info/
├── src/
│   └── system_info_mcp/
│       ├── __init__.py
│       ├── server.py       # Main FastMCP server
│       ├── tools.py        # Tool implementations
│       ├── resources.py    # Resource handlers
│       ├── config.py       # Configuration management
│       └── utils.py        # Utility functions
├── tests/                  # Comprehensive test suite
├── pyproject.toml          # Project configuration
└── README.md
Development Setup
1. Install development dependencies:
   uv sync --dev
2. Run tests:
   uv run pytest
3. Run tests with coverage:
   uv run pytest --cov=system_info_mcp --cov-report=term-missing
4. Format code:
   uv run black src/ tests/
5. Lint code:
   uv run ruff check src/ tests/
6. Type checking:
   uv run mypy src/
Building and Publishing
Build the Package
# Build distribution files
uv build
This creates distribution files in the dist/ directory:
- mcp_system_info-*.whl (wheel file)
- mcp_system_info-*.tar.gz (source distribution)
Local Testing with uvx
Test the package locally before publishing:
# Test running the command directly from wheel file
uvx --from ./dist/mcp_system_info-*.whl mcp-system-info
# Test with environment variables
SYSINFO_LOG_LEVEL=DEBUG uvx --from ./dist/mcp_system_info-*.whl mcp-system-info
Publishing to PyPI
# Publish to PyPI (requires PyPI account and token)
uv publish
# Or publish to TestPyPI first
uv publish --repository testpypi
Note: You'll need to:
- Create a PyPI account at https://pypi.org
- Generate an API token in your account settings
- Configure uv with your credentials or use environment variables
Environment Configuration
The server supports configuration through environment variables:
Core Settings
- SYSINFO_CACHE_TTL: Cache time-to-live in seconds (default: 5)
- SYSINFO_MAX_PROCESSES: Maximum processes to return (default: 100)
- SYSINFO_ENABLE_TEMP: Enable temperature sensors (default: true)
- SYSINFO_LOG_LEVEL: Logging level (default: INFO)
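The README does not show the server's config.py, but the variables above suggest parsing logic along these lines; `load_settings` and the dictionary keys are illustrative names, not the server's actual API:

```python
import os

def load_settings(environ=os.environ):
    """Parse SYSINFO_* environment variables with the documented defaults.

    Hypothetical sketch: the server's real config.py is not shown in this
    README and may use different names and validation.
    """
    return {
        "cache_ttl": int(environ.get("SYSINFO_CACHE_TTL", "5")),
        "max_processes": int(environ.get("SYSINFO_MAX_PROCESSES", "100")),
        # Accept common truthy spellings for the boolean flag
        "enable_temp": environ.get("SYSINFO_ENABLE_TEMP", "true").lower()
        in ("1", "true", "yes"),
        "log_level": environ.get("SYSINFO_LOG_LEVEL", "INFO").upper(),
    }

settings = load_settings({"SYSINFO_CACHE_TTL": "10"})
```

Passing a plain dict instead of os.environ keeps the function easy to unit-test, which matches the project's mocked-test philosophy.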
Transport Configuration
- SYSINFO_TRANSPORT: Transport protocol: stdio, sse, or streamable-http (default: stdio)
- SYSINFO_HOST: Host to bind to for HTTP transports (default: localhost)
- SYSINFO_PORT: Port to bind to for HTTP transports (default: 8001)
- SYSINFO_MOUNT_PATH: Mount path for SSE transport (default: /mcp)
Transport Modes
1. STDIO (Default)
# Uses standard input/output - no network port
uv run mcp-system-info
2. SSE (Server-Sent Events)
# HTTP server with real-time streaming
SYSINFO_TRANSPORT=sse SYSINFO_PORT=8001 uv run mcp-system-info
# Server will be available at http://localhost:8001/mcp
3. Streamable HTTP
# HTTP server with request/response
SYSINFO_TRANSPORT=streamable-http SYSINFO_PORT=9000 uv run mcp-system-info
Complete Example:
SYSINFO_TRANSPORT=sse \
SYSINFO_HOST=0.0.0.0 \
SYSINFO_PORT=8001 \
SYSINFO_CACHE_TTL=10 \
SYSINFO_LOG_LEVEL=DEBUG \
uv run mcp-system-info
Usage Examples
Tool Usage
Get CPU Information
# Basic CPU info
{
"name": "get_cpu_info_tool",
"arguments": {
"interval": 1.0,
"per_cpu": false
}
}
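The README does not show the tool's response payload. Assuming psutil's terminology, a get_cpu_info response might look roughly like this; every field name below is illustrative, not the server's actual schema:

```python
# Hypothetical response shape for get_cpu_info; the real schema is defined
# by the server and may differ in names and structure.
cpu_info = {
    "usage_percent": 23.4,            # overall utilization over the interval
    "physical_cores": 4,
    "logical_cores": 8,
    "frequency_mhz": 2400.0,
    "load_average": [1.2, 0.9, 0.7],  # 1, 5, and 15 minute load averages
}
```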
Get Process List
# Top 10 processes by memory usage
{
"name": "get_process_list_tool",
"arguments": {
"limit": 10,
"sort_by": "memory",
"filter_name": "python"
}
}
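The limit / sort_by / filter_name semantics above can be sketched independently of psutil. This is one plausible implementation of the selection step over already-collected process records; `select_processes` is a hypothetical name, not the server's actual function:

```python
def select_processes(procs, limit=100, sort_by="memory", filter_name=None):
    """Filter and rank process records the way the tool arguments describe.

    Hypothetical sketch of the documented semantics; the server's real
    implementation lives in tools.py and may differ.
    """
    if filter_name:  # case-insensitive substring match on the process name
        procs = [p for p in procs if filter_name.lower() in p["name"].lower()]
    key = {"memory": "memory_percent", "cpu": "cpu_percent", "name": "name"}[sort_by]
    # Highest usage first; sorting by name is alphabetical instead
    ranked = sorted(procs, key=lambda p: p[key], reverse=(sort_by != "name"))
    return ranked[:limit]

procs = [
    {"name": "python3", "memory_percent": 4.2, "cpu_percent": 1.0},
    {"name": "chrome", "memory_percent": 9.1, "cpu_percent": 3.5},
    {"name": "python-lsp", "memory_percent": 1.3, "cpu_percent": 0.2},
]
top = select_processes(procs, limit=2, sort_by="memory", filter_name="python")
```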
Get Disk Information
# All disk usage
{
"name": "get_disk_info_tool",
"arguments": {}
}
# Specific path
{
"name": "get_disk_info_tool",
"arguments": {
"path": "/home"
}
}
Resource Usage
System Overview
# Request comprehensive system overview
{
"uri": "system://overview"
}
Process List Resource
# Get top processes resource
{
"uri": "system://processes"
}
Integration with Claude Desktop
Adding to Claude Desktop
1. Locate your Claude Desktop config file:
   - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
   - Windows: %APPDATA%\Claude\claude_desktop_config.json
2. Add the MCP server configuration:
Using uvx (Recommended)
{
"mcpServers": {
"system-info": {
"command": "uvx",
"args": ["mcp-system-info"],
"env": {
"SYSINFO_CACHE_TTL": "10",
"SYSINFO_LOG_LEVEL": "INFO"
}
}
}
}
For Local Development
{
"mcpServers": {
"system-info": {
"command": "uv",
"args": [
"--directory",
"/path/to/mcp-system-info",
"run",
"mcp-system-info"
],
"env": {
"SYSINFO_TRANSPORT": "stdio",
"SYSINFO_CACHE_TTL": "10",
"SYSINFO_LOG_LEVEL": "INFO"
}
}
}
}
For HTTP Transport (SSE)
{
"mcpServers": {
"system-info-http": {
"command": "uvx",
"args": ["mcp-system-info"],
"env": {
"SYSINFO_TRANSPORT": "sse",
"SYSINFO_HOST": "localhost",
"SYSINFO_PORT": "8001",
"SYSINFO_MOUNT_PATH": "/mcp"
}
}
}
}
3. Restart Claude Desktop to load the new server.
Using with Claude
Once configured, you can ask Claude to:
- "What's my current CPU usage?"
- "Show me the top 10 processes using the most memory"
- "How much disk space is available?"
- "What's my system uptime?"
- "Give me a complete system overview"
Testing
Running Tests
# Run all tests
uv run pytest
# Run with verbose output
uv run pytest -v
# Run specific test file
uv run pytest tests/test_tools.py
# Run with coverage report
uv run pytest --cov=system_info_mcp --cov-report=html
Test Structure
- tests/test_config.py: Configuration validation tests
- tests/test_tools.py: Tool implementation tests
- tests/test_resources.py: Resource handler tests
- tests/test_utils.py: Utility function tests
All tests use mocked dependencies for consistent, fast execution across different environments.
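One common way to mock psutil so tests run identically everywhere is to substitute a fake module before import. The sketch below is a generic pattern, not this project's actual test code, and the fake's field values are made up:

```python
import sys
import types
from unittest import mock

# Stand-in psutil module so the example runs even where psutil is absent;
# real tests would typically mock.patch the actual psutil functions.
fake_psutil = types.SimpleNamespace(
    virtual_memory=mock.Mock(
        return_value=types.SimpleNamespace(total=8_589_934_592, percent=42.0)
    )
)

with mock.patch.dict(sys.modules, {"psutil": fake_psutil}):
    import psutil  # resolves to the fake while the patch is active

    mem = psutil.virtual_memory()
    report = {"total_bytes": mem.total, "used_percent": mem.percent}
```

Because the fake returns fixed values, assertions on `report` are deterministic regardless of the machine running the tests.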
Performance Considerations
- Caching: Intelligent caching reduces system calls and improves response times
- Configurable intervals: Adjust cache TTL based on your needs
- Lazy loading: Temperature sensors and other optional features load only when needed
- Async support: Built on FastMCP for efficient async operations
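The TTL caching described above can be sketched in a few lines. This is the generic pattern, not the server's actual cache implementation:

```python
import time

def ttl_cached(ttl_seconds=5.0, clock=time.monotonic):
    """Memoize a zero-argument function for ttl_seconds.

    Generic sketch of the caching strategy described above; the server's
    real cache may differ (e.g. per-tool keys, invalidation).
    """
    def decorator(fn):
        state = {"at": None, "value": None}
        def wrapper():
            now = clock()
            if state["at"] is None or now - state["at"] >= ttl_seconds:
                state["value"] = fn()  # the expensive system call happens here
                state["at"] = now
            return state["value"]
        return wrapper
    return decorator

calls = []

@ttl_cached(ttl_seconds=5.0)
def sample_metrics():
    calls.append(1)  # count how often we actually hit the "system"
    return {"cpu_percent": 12.5}
```

Raising SYSINFO_CACHE_TTL trades freshness for fewer system calls, which is exactly the knob the Environment Configuration section exposes.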
Security Features
- Read-only operations: No system modification capabilities
- Sensitive data filtering: Command-line arguments are filtered for passwords, tokens, etc.
- Input validation: All parameters are validated before processing
- Error isolation: Failures in one tool don't affect others
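Command-line redaction of the kind described above is typically pattern-based. The patterns and function name below are illustrative; the server's actual filter list is not documented in this README:

```python
import re

# Illustrative flag patterns only; the server's real filter set may differ.
SENSITIVE = re.compile(
    r"(--?(?:password|token|secret|api[-_]?key)[=\s]+)(\S+)",
    re.IGNORECASE,
)

def redact_cmdline(args):
    """Replace values of password/token-style flags with a placeholder."""
    joined = " ".join(args)
    return SENSITIVE.sub(r"\1[REDACTED]", joined)

safe = redact_cmdline(["myapp", "--password", "hunter2", "--verbose"])
```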
Platform Support
- macOS - Full support including temperature sensors on supported hardware
- Linux - Full support with hardware-dependent sensor availability
- Windows - Full support with platform-specific optimizations
Troubleshooting
Common Issues
- Permission errors: Some system information may require elevated privileges
- Missing sensors: Temperature/fan data availability varies by hardware
- Performance impact: Increase the cache TTL or limit process counts to reduce overhead
Debug Mode
Enable debug logging for troubleshooting:
SYSINFO_LOG_LEVEL=DEBUG uv run mcp-system-info
Verifying Installation
Test that tools work correctly:
uv run python -c "from system_info_mcp.tools import get_cpu_info; print(get_cpu_info())"
Contributing
- Fork the repository
- Create a feature branch
- Make your changes with tests
- Run the full test suite
- Submit a pull request
Code Standards
- Follow PEP 8 style guidelines
- Add type hints to all functions
- Write tests for new functionality
- Update documentation as needed