
Python Sandbox

A secure, containerized Python execution environment designed for data science and machine learning, featuring automatic persistence for matplotlib figures and integration with LM Studio.


Python Sandbox and Gradio UI for LM Studio

This project provides a complete ecosystem for executing Python code in a secure sandbox, tightly integrated with a Gradio-based web UI for interacting with Large Language Models from services like LM Studio.

Note: This setup has been tested and optimised for LM Studio v0.3.x using the qwen.qwen3-coder-30b-a3b-instruct model for robust tool-based code execution.

The full stack includes three main components:

  1. Python Sandbox: A secure, containerised environment for running Python code with a rich data science and machine learning stack.

  2. Artifact Server: A lightweight web server to store and serve files (plots, images, data) generated by the sandbox.

  3. Gradio UI: A user-friendly web interface to chat with an LLM, which can delegate code execution tasks to the Python Sandbox and display the results, including images and files.

This project is designed for the Linux ecosystem. I am using Ubuntu 22.04 LTS and have not tested it on Windows or Docker for Windows.

It is important that you read through these instructions carefully to get a successful outcome.

You will need some familiarity with Python, Docker, and LM Studio.

To get started, first clone the repo, then build the Docker image of the core MCP server. The Dockerfile is located under "MCP_core_server".

Run this in a terminal from the "MCP_core_server" directory: "docker build -t python-sandbox-integrated_v2:latest ." (note the trailing dot, which sets the build context).

If the build is successful, you can test it like this:

"docker run --rm -i -p 8000:8000 -e PUBLIC_BASE_URL=http://{YOUR_LOCAL_IP}:8000 python-sandbox-integrated_v2:latest python /app/mcp_server.py --stdio"
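If you want to poke the server by hand rather than through LM Studio, an MCP client opens the stdio session with a JSON-RPC "initialize" message. This is a sketch of what that first message looks like; the `protocolVersion` string and `clientInfo` values are assumptions based on the MCP spec, not taken from this repo:

```python
import json

# Minimal MCP "initialize" request, as a client such as LM Studio would
# send it over the container's stdin (one JSON-RPC message per line).
# protocolVersion and clientInfo are illustrative assumptions.
init_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "manual-test", "version": "0.1"},
    },
}
line = json.dumps(init_request)
print(line)  # pipe this into the `docker run ... --stdio` command above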

From this point there are two ways to run the Python sandbox.

I am not going into the details of how to configure LM Studio for MCP usage; there is plenty of documentation around this.

Option 1: Locally, within the LM Studio chat interface.

  1. Copy the mcp file provided and edit it as required for your local IP.
  2. Make sure that your local firewall has exceptions for the ports used, and that the local volumes defined in the JSON and Docker Compose files exist on your system.
  3. Make sure that you launch the side-car with Docker Compose before you start using the MCP server.
  4. It is as simple as "docker compose up -d"; the side-car provides image persistence.
  5. Using the mcp file provided, LM Studio will manage everything for you. No need to run Docker manually.
  6. For the Python environment to work, it is important that you use the system prompt provided in "system_prompt.txt".
  7. Make sure the model you are using is capable of tool calling, like "qwen.qwen3-coder-30b-a3b-instruct".
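The mcp file the steps above refer to typically follows the standard MCP server config shape. As a rough sketch mirroring the `docker run` command shown earlier (the server name "python-sandbox" is a placeholder, and the provided file in the repo is the source of truth):

```json
{
  "mcpServers": {
    "python-sandbox": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i", "-p", "8000:8000",
        "-e", "PUBLIC_BASE_URL=http://{YOUR_LOCAL_IP}:8000",
        "python-sandbox-integrated_v2:latest",
        "python", "/app/mcp_server.py", "--stdio"
      ]
    }
  }
}
```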

Now let's look at the second option: using the Gradio interface with the LM Studio developer API. This is a bit tricky, but doable. Note that your mcp.json file will not work in LM Studio's developer mode; it only applies to the local chat mode.

Navigate to "python_mcp_LMstudio_developers_api".

From this point we will assume that you have already built your base Python sandbox Docker image, so the focus will be on getting Gradio working.

  1. Make sure the LM Studio developer API is running and can listen on an external interface (NIC).
  2. Edit the .env file to match your local system.
  3. Before you proceed, edit the Docker Compose file to make sure that all local volumes and subfolders actually point to locations on your local system. It is necessary to use absolute paths.
  4. Execute "docker compose --env-file .env -f docker-compose.external_v3.yml build --no-cache ui".
  5. This will build your Gradio / Python-sandbox environment.
  6. To run it: "docker compose --env-file .env -f docker-compose.external_v3.yml up -d".
  7. Now check that each component is working properly.
  8. Execute "docker logs mcp_lmstudio_ui_ext --tail 120". You should see the UI listening on the configured interface.
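The .env file in step 2 holds the values that vary per machine. The variable names below are illustrative assumptions, not the repo's actual keys; keep the names already present in the repo's own .env file and only change the values. LM Studio's server defaults to port 1234:

```
# Illustrative sketch only -- use the variable names from the repo's .env
PUBLIC_BASE_URL=http://{YOUR_LOCAL_IP}:8000
LMSTUDIO_API_BASE=http://{YOUR_LOCAL_IP}:1234/v1
```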

Finally, some troubleshooting tips.

  1. I have tested this extensively in LM Studio local chat as well as the Gradio interface.
  2. If you run into any issues, check the following.
  3. Are you mounting the local volumes into Docker properly?
  4. Are you setting environment variables, such as your machine's IP address, properly?
  5. Allow exceptions for all ports used if you are using a firewall.
  6. Edit the .env file properly.
  7. Edit "docker-compose.external_v3.yml" so the local volumes are mounted properly.
  8. When running the sandbox in local LM Studio chat or the Gradio interface, the result may come back as a link such as "XGBoost Complete Analysis". Simply copy the URL part into your browser; this is not a bug.
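On that last point: the sandbox returns artifacts as markdown-style links, and some chat UIs render only the link text. If you need to recover the URL from a raw reply, a small helper like this works (the reply string and artifact path below are hypothetical examples, not output from this project):

```python
import re

def extract_urls(markdown: str) -> list[str]:
    """Return the URL part of every [text](url) markdown link."""
    return re.findall(r"\[[^\]]*\]\((https?://[^)\s]+)\)", markdown)

# Hypothetical sandbox reply containing a markdown link to a generated plot.
reply = "Here is your plot: [XGBoost Complete Analysis](http://192.168.1.10:8000/artifacts/xgb.png)"
print(extract_urls(reply))  # ['http://192.168.1.10:8000/artifacts/xgb.png']
```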
