Riza

Official MCP server by riza-io

Provides a secure bridge between LLMs and Riza's isolated code interpreter API, enabling writing, saving, editing, and executing code safely in a sandboxed environment with persistent tool management across conversations.


What it does

  • Execute arbitrary code in isolated sandbox
  • Create and save reusable code tools
  • Edit existing saved tools
  • Fetch tool source code for modification
  • List all available saved tools
  • Run code safely without local execution risks

Best for

  • LLM applications requiring safe code execution
  • Building persistent coding assistants
  • Prototyping without local environment setup
  • Educational coding environments

Key properties: isolated sandbox execution, persistent tools across sessions, and no local code execution needed.

About Riza

Riza is an official MCP server published by riza-io that provides AI assistants with tools and capabilities via the Model Context Protocol. Riza offers a secure bridge between LLMs and a sandboxed code interpreter, allowing safe code execution and persistent tool management. It is categorized under developer tools.

How to install

You can install Riza in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

Riza is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

Riza MCP Server

Riza offers an isolated code interpreter for your LLM-generated code.

Our MCP server implementation wraps the Riza API and presents endpoints as individual tools.

Configure with Claude Desktop as below, or adapt as necessary for your MCP client. Get a free Riza API key in your Riza Dashboard.

{
  "mcpServers": {
    "riza-server": {
      "command": "npx",
      "args": [
        "@riza-io/riza-mcp"
      ],
      "env": {
        "RIZA_API_KEY": "your-api-key"
      }
    }
  }
}
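If you manage client configs programmatically, the entry above can be built in code so the API key is injected from the environment rather than committed to disk. A minimal TypeScript sketch (the `rizaServerEntry` helper is illustrative, not part of the Riza package):

```typescript
// Build the Claude Desktop config entry shown above, injecting the
// Riza API key at generation time instead of hard-coding it.
function rizaServerEntry(apiKey: string) {
  return {
    mcpServers: {
      "riza-server": {
        command: "npx",
        args: ["@riza-io/riza-mcp"],
        env: { RIZA_API_KEY: apiKey },
      },
    },
  };
}

const config = rizaServerEntry(process.env.RIZA_API_KEY ?? "your-api-key");
console.log(JSON.stringify(config, null, 2));
```

Writing the object with `JSON.stringify(config, null, 2)` into `claude_desktop_config.json` yields the same configuration as the hand-written JSON above.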

The Riza MCP server provides several tools to your LLM:

  • create_tool: Your LLM can write code and save it as a tool using the Riza Tools API. It can then execute these tools securely on Riza using execute_tool.
  • fetch_tool: Your LLM can fetch saved Riza tools, including source code, which can be useful for editing tools.
  • execute_tool: Executes a saved tool securely on Riza's code interpreter API.
  • edit_tool: Edits an existing saved tool.
  • list_tools: Lists available saved tools.
  • execute_code: Executes arbitrary code safely on Riza's code interpreter API, without saving it as a tool.

Related Skills

dotnet-backend

.NET/C# backend developer for ASP.NET Core APIs with Entity Framework Core. Builds REST APIs, minimal APIs, gRPC services, authentication with Identity/JWT, authorization, database operations, background services, SignalR real-time features. Activates for: .NET, C#, ASP.NET Core, Entity Framework Core, EF Core, .NET Core, minimal API, Web API, gRPC, authentication .NET, Identity, JWT .NET, authorization, LINQ, async/await C#, background service, IHostedService, SignalR, SQL Server, PostgreSQL .NET, dependency injection, middleware .NET.

ui-design-system

UI design system toolkit for Senior UI Designer including design token generation, component documentation, responsive design calculations, and developer handoff tools. Use for creating design systems, maintaining visual consistency, and facilitating design-dev collaboration.

math-tools

Deterministic mathematical computation using SymPy. Use for ANY math operation requiring exact/verified results - basic arithmetic, algebra (simplify, expand, factor, solve equations), calculus (derivatives, integrals, limits, series), linear algebra (matrices, determinants, eigenvalues), trigonometry, number theory (primes, GCD/LCM, factorization), and statistics. Ensures mathematical accuracy by using symbolic computation rather than LLM estimation.

ai-sdk

Answer questions about the AI SDK and help build AI-powered features. Use when developers: (1) Ask about AI SDK functions like generateText, streamText, ToolLoopAgent, embed, or tools, (2) Want to build AI agents, chatbots, RAG systems, or text generation features, (3) Have questions about AI providers (OpenAI, Anthropic, Google, etc.), streaming, tool calling, structured output, or embeddings, (4) Use React hooks like useChat or useCompletion. Triggers on: "AI SDK", "Vercel AI SDK", "generateText", "streamText", "add AI to my app", "build an agent", "tool calling", "structured output", "useChat".

api-documenter

Master API documentation with OpenAPI 3.1, AI-powered tools, and modern developer experience practices. Create interactive docs, generate SDKs, and build comprehensive developer portals. Use PROACTIVELY for API documentation or developer portal creation.

openai-knowledge

Use when working with the OpenAI API (Responses API) or OpenAI platform features (tools, streaming, Realtime API, auth, models, rate limits, MCP) and you need authoritative, up-to-date documentation (schemas, examples, limits, edge cases). Prefer the OpenAI Developer Documentation MCP server tools when available; otherwise guide the user to enable `openaiDeveloperDocs`.
