Start long-running services, stream their output, send signals, and manage process lifecycles inside sandboxes.
Process operations use the sandbox proxy URL: https://<sandbox-id>.sandbox.tensorlake.ai
Start a Background Process
from tensorlake.sandbox import SandboxClient

client = SandboxClient()

with client.create_and_connect() as sandbox:
    # Start a background process
    proc = sandbox.start_process("python", ["-m", "http.server", "8080"])
    print(f"PID: {proc.pid}")
# Run a command in the background using shell syntax
tl sbx exec <sandbox-id> bash -c "python -m http.server 8080 &"
curl -X POST https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes \
-H "Authorization: Bearer $TL_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"command": "python",
"args": ["-m", "http.server", "8080"]
}'
Response (201 Created):

{
"pid": 42,
"status": "running",
"command": "python",
"args": ["-m", "http.server", "8080"],
"stdin_writable": false,
"started_at": 1710000000000
}
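The 13-digit `started_at` value appears to be a Unix timestamp in milliseconds. A small helper (illustrative, not part of the SDK) to convert it to a timezone-aware datetime:

```python
from datetime import datetime, timezone

def started_at_to_datetime(started_at_ms: int) -> datetime:
    """Convert a millisecond Unix timestamp to an aware UTC datetime."""
    return datetime.fromtimestamp(started_at_ms / 1000, tz=timezone.utc)

print(started_at_to_datetime(1710000000000))  # 2024-03-09 16:00:00+00:00
```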
Request options:

{
"command": "python",
"args": ["-m", "http.server", "8080"],
"env": {"PORT": "8080"},
"working_dir": "/workspace",
"stdin_mode": "pipe",
"stdout_mode": "capture",
"stderr_mode": "capture"
}
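All fields other than command appear to be optional. When calling the REST API directly, a helper like the following (illustrative, not part of the SDK; field names taken from the example above) assembles the body while omitting unset fields:

```python
import json

def build_process_request(command, args=None, env=None, working_dir=None,
                          stdin_mode=None, stdout_mode=None, stderr_mode=None):
    """Build the JSON body for POST /api/v1/processes, dropping unset fields."""
    fields = {
        "command": command,
        "args": args,
        "env": env,
        "working_dir": working_dir,
        "stdin_mode": stdin_mode,
        "stdout_mode": stdout_mode,
        "stderr_mode": stderr_mode,
    }
    return json.dumps({k: v for k, v in fields.items() if v is not None})

print(build_process_request("python", args=["-m", "http.server", "8080"]))
# → {"command": "python", "args": ["-m", "http.server", "8080"]}
```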
List Processes
# List all processes in the sandbox
procs = sandbox.list_processes()
for p in procs:
    print(f"PID {p.pid}: {p.status}")
curl https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes \
-H "Authorization: Bearer $TL_API_KEY"
Response:

{
"processes": [
{
"pid": 42,
"status": "running",
"command": "python",
"args": ["-m", "http.server", "8080"],
"stdin_writable": false,
"started_at": 1710000000000
}
]
}
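When using the REST API directly, the response is plain JSON. A small helper (illustrative) that pulls out the PIDs of processes still running:

```python
def running_pids(list_response: dict) -> list[int]:
    """Extract PIDs of processes whose status is 'running'."""
    return [p["pid"] for p in list_response.get("processes", [])
            if p["status"] == "running"]

sample = {"processes": [{"pid": 42, "status": "running"},
                        {"pid": 43, "status": "exited"}]}
print(running_pids(sample))  # [42]
```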
Stream Process Output
Monitor process output in real time as it produces stdout/stderr:
with client.create_and_connect() as sandbox:
    proc = sandbox.start_process("python", ["-c", """
import time
for i in range(10):
    print(f"Processing item {i+1}/10")
    time.sleep(1)
"""])

    # Stream output line by line
    for event in sandbox.follow_output(proc.pid):
        print(event.line, end="")
# Follow combined output via SSE
curl -N https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes/<pid>/output/follow \
-H "Authorization: Bearer $TL_API_KEY"
SSE stream:

event: output
data: {"line":"Processing item 1/10\n","timestamp":1710000000000,"stream":"stdout"}
event: output
data: {"line":"Processing item 2/10\n","timestamp":1710000001000,"stream":"stdout"}
event: eof
data: {}
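Outside the SDK, the SSE stream can be consumed with a minimal parser. The sketch below handles only the event:/data: lines shown above; a full SSE implementation would also handle id:, retry:, and comment lines:

```python
import json

def parse_sse(lines):
    """Parse SSE text lines into (event_name, parsed_json) pairs.

    Events are terminated by a blank line; multiple data: lines
    for one event are joined with newlines before JSON decoding.
    """
    event, data_parts = None, []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_parts.append(line[len("data:"):].strip())
        elif line == "" and event is not None:
            yield event, json.loads("\n".join(data_parts) or "{}")
            event, data_parts = None, []
```

Feeding it the stream above yields an ("output", {...}) pair per line of process output, followed by ("eof", {}).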
Send Signals
Send POSIX signals to running processes:
import signal

with client.create_and_connect() as sandbox:
    proc = sandbox.start_process("python", ["-m", "http.server", "8080"])

    # Gracefully stop the process
    sandbox.send_signal(proc.pid, signal.SIGTERM)
# Send SIGTERM (15)
curl -X POST https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes/<pid>/signal \
-H "Authorization: Bearer $TL_API_KEY" \
-H "Content-Type: application/json" \
-d '{"signal": 15}'
# Send SIGKILL (9)
curl -X POST https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes/<pid>/signal \
-H "Authorization: Bearer $TL_API_KEY" \
-H "Content-Type: application/json" \
-d '{"signal": 9}'
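The numeric values in the JSON body are standard POSIX signal numbers. On POSIX systems, Python's signal module supplies them by name, which avoids magic constants when building requests:

```python
import signal

# Standard POSIX signal numbers, as used in the API's JSON body
print(int(signal.SIGTERM))  # 15 — graceful termination, can be handled
print(int(signal.SIGKILL))  # 9  — forceful kill, cannot be caught
print(int(signal.SIGINT))   # 2  — interrupt, like Ctrl-C
```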
Kill a Process
sandbox.send_signal(proc.pid, signal.SIGKILL)
curl -X DELETE https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes/<pid> \
-H "Authorization: Bearer $TL_API_KEY"
Write to Stdin
Send input to a running process that was started with stdin_mode set to "pipe":
# Start a process with stdin pipe
curl -X POST https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes \
-H "Authorization: Bearer $TL_API_KEY" \
-H "Content-Type: application/json" \
-d '{"command": "python", "args": ["-i"], "stdin_mode": "pipe"}'
# Write to stdin ($'...' quoting turns \n into a real newline byte;
# a plain double-quoted "\n" would send a literal backslash and n)
curl -X POST https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes/<pid>/stdin \
-H "Authorization: Bearer $TL_API_KEY" \
-H "Content-Type: application/octet-stream" \
--data-binary $'print("hello")\n'
# Close stdin
curl -X POST https://<sandbox-id>.sandbox.tensorlake.ai/api/v1/processes/<pid>/stdin/close \
-H "Authorization: Bearer $TL_API_KEY"
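Because the stdin endpoint takes raw bytes, the caller is responsible for newlines: line-oriented programs such as the Python REPL will not act on input until one arrives. A small helper (illustrative) that encodes a line of input with a guaranteed trailing newline, suitable as an octet-stream request body:

```python
def encode_stdin(text: str) -> bytes:
    """Encode a line of input for the stdin endpoint as raw bytes,
    appending a newline if the caller did not supply one."""
    if not text.endswith("\n"):
        text += "\n"
    return text.encode("utf-8")

print(encode_stdin("print('hello')"))  # b"print('hello')\n"
```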
Learn More