
ipybox
Runs Python code in sandboxed Docker containers with persistent IPython sessions. Includes file transfer capabilities and network security controls for safe AI agent code execution.
What it does
- Execute Python code in isolated Docker containers
- Maintain stateful IPython kernels across executions
- Upload files from host to container
- Download files from container to host
- Reset kernel to clean state
- Stream real-time code execution output
Tools (4)
Execute Python code in a stateful IPython kernel within a Docker container. The kernel maintains state across executions: variables, imports, and definitions persist between calls, so each execution builds on the previous one and complex workflows can be built step by step. Use '!pip install package_name' to install packages as needed. The kernel has an active asyncio event loop, so use 'await' directly for async code; do not call asyncio.run() or create new event loops. Executions are sequential (not concurrent) because they share kernel state. Use the reset() tool to clear the kernel state and start fresh. Returns the output text from the execution, or an empty string if there is no output.
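The statefulness contract can be sketched with an analogy: the kernel behaves like successive exec() calls sharing a single namespace (this is an illustration of the semantics, not the real kernel transport).

```python
# Illustration only: two "cells" executed against one shared namespace,
# the way successive execute calls share one IPython kernel.
ns = {}
cell_1 = "import math\nradius = 3"
cell_2 = "area = math.pi * radius ** 2"
exec(cell_1, ns)
exec(cell_2, ns)  # sees 'math' and 'radius' defined by the first cell
print(round(ns["area"], 2))  # → 28.27
```

Because later cells depend on names defined earlier, executions must run sequentially rather than concurrently.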
Upload a file from the host filesystem to the container's /app directory, making it available inside the container for code execution. The uploaded file can then be accessed in execute_ipython_cell at the path '/app/{relpath}'.
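A sketch of the upload contract, with a temporary directory standing in for the container's /app (the file name 'data.csv' is an assumed relpath for illustration):

```python
import os
import tempfile

# A temp dir stands in for the container's /app directory.
app_dir = tempfile.mkdtemp()
host_content = b"col1,col2\n1,2\n"
relpath = "data.csv"  # assumed relpath; files land at /app/{relpath}

# Upload step: the host file becomes visible inside the container.
with open(os.path.join(app_dir, relpath), "wb") as f:
    f.write(host_content)

# A later execute_ipython_cell call can read it at /app/{relpath}.
with open(os.path.join(app_dir, relpath), "rb") as f:
    data = f.read()
```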
Download a file from the container's /app directory to the host filesystem. Retrieves files created or modified during code execution from the container. The file at '/app/{relpath}' in the container will be saved to the specified location on the host. Parent directories are created automatically if they don't exist.
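The download direction mirrors the upload, with one extra guarantee: parent directories on the host are created automatically. A sketch, again using temp dirs as stand-ins for the container's /app and the host filesystem:

```python
import os
import tempfile

container_app = tempfile.mkdtemp()  # stands in for the container's /app
host_root = tempfile.mkdtemp()      # stands in for the host filesystem

# A file created during code execution at /app/{relpath}.
relpath = "results/output.txt"      # assumed relpath for illustration
src = os.path.join(container_app, relpath)
os.makedirs(os.path.dirname(src))
with open(src, "w") as f:
    f.write("done")

# Download step: missing parent directories on the host are created.
dst = os.path.join(host_root, "nested", "dir", "output.txt")
os.makedirs(os.path.dirname(dst), exist_ok=True)
with open(src) as f_in, open(dst, "w") as f_out:
    f_out.write(f_in.read())
```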
Reset the IPython kernel to a clean state. Creates a new kernel instance, clearing all variables, imports, and definitions from memory. Installed packages and files in the container filesystem are preserved. Useful for starting fresh experiments or clearing memory after processing large datasets.
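The key distinction is what reset() clears versus what it preserves, which this small sketch illustrates (in-memory names drop, filesystem contents survive):

```python
import os
import tempfile

app_dir = tempfile.mkdtemp()  # stands in for the container's /app

# Kernel state before reset: a large in-memory object plus a file
# written to the container filesystem.
kernel_ns = {"big_data": list(range(1_000_000))}
with open(os.path.join(app_dir, "checkpoint.txt"), "w") as f:
    f.write("saved")

# reset(): a fresh kernel means a fresh, empty namespace...
kernel_ns = {}

# ...but files (and installed packages) in the container persist.
with open(os.path.join(app_dir, "checkpoint.txt")) as f:
    restored = f.read()
```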