This guide walks you through writing, deploying, and calling Tensorlake Applications. You will learn how to build a serverless agentic code interpreter with the OpenAI Agent SDK in under 5 minutes. Let’s start with a simple “Hello, World!” application to make sure your environment is set up correctly.
1. Install the Tensorlake SDK

pip install tensorlake
2. Get an API Key

You can get an API key from the Tensorlake Dashboard.
export TENSORLAKE_API_KEY=<your-api-key>
3. Create an application

Applications are defined by Python functions. Let’s start with a template that greets a user by name.
tensorlake new hello_world
This creates a file named hello_world/hello_world.py with the following content:
hello_world.py
from tensorlake.applications import application, function
@application()
@function()
def greet(name: str) -> str:
    return f"Hello, {name}!"
4. Deploy It

Deploy your application referencing your application’s source file.
tensorlake deploy hello_world/hello_world.py
That’s it — you now have a distributed app running in the cloud.

Call Applications

Tensorlake gives you an HTTP endpoint for calling your application remotely.
https://api.tensorlake.ai/applications/<app_name>
1. Get an API Key

Fetch a key from the Tensorlake Dashboard and export it as an environment variable:
export TENSORLAKE_API_KEY=<API_KEY>
2. Make a request

curl https://api.tensorlake.ai/applications/hello_world \
-H "Authorization: Bearer $TENSORLAKE_API_KEY" \
--json '"John"'
# {"request_id":"beae8736ece31ef9"}
This will return a request ID that you can use to track the progress of your request.
3. Check progress

Requests may run seconds to hours depending on your workload.
curl -X GET https://api.tensorlake.ai/applications/hello_world/requests/{request_id} \
-H "Authorization: Bearer $TENSORLAKE_API_KEY"
# {
#   "id":"B0IwzHibTTfn5mCXHPGsu",
#   "outcome":"success",
#   "failure_reason":null,
#   "request_error":null,
#   .... other fields ...
#}
The outcome field is success or failure once the request completes, and null while the request is still in progress.
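If you are polling from a script, the status check above boils down to inspecting the outcome field. A minimal sketch, using the field names from the response shown above (the helper name is our own):

```python
def is_finished(status: dict) -> bool:
    """Return True once a Tensorlake request has completed.

    Per the response above, `outcome` is "success" or "failure" when
    the request is done, and null (None in Python) while it runs.
    """
    return status.get("outcome") in ("success", "failure")

# Against the response shape shown above:
print(is_finished({"outcome": "success", "failure_reason": None}))  # True
print(is_finished({"outcome": None}))                               # False
```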
4. Get the output

curl -X GET https://api.tensorlake.ai/applications/hello_world/requests/{request_id}/output \
-H "Authorization: Bearer $TENSORLAKE_API_KEY"
# "Hello, John!"

Testing Locally

Tensorlake Applications can run locally on your laptop. You can run them like regular Python scripts.
hello_world.py
# At the end of the file
from tensorlake.applications import run_local_application, Request
if __name__ == "__main__":
    request: Request = run_local_application(greet, 'John')
    output = request.output()
    print(output)

Building an Agentic Code Interpreter

Now let’s build a real agentic application: a code interpreter agent using the OpenAI Agent SDK. The Tensorlake application function runs the main agentic loop, and a separate Tensorlake function executes code, exposed to the agent as a tool. Whenever the agent needs to run code, it invokes that function via a tool call; the function executes the code in an isolated container and returns the output to the agent.
1. Add Your OpenAI API Key as a Secret

The agent needs access to the OpenAI API. Add your API key as a secret using the Tensorlake CLI:
tensorlake secrets set OPENAI_API_KEY <your-openai-api-key>
This securely stores your API key so it can be injected into your application at runtime. The secret is referenced in the decorator of the function that uses the OpenAI Agent SDK, and is available inside that function as an environment variable.
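Inside a function deployed with secrets=["OPENAI_API_KEY"], the secret shows up like any other environment variable. A small sketch (the helper is our own; in practice the OpenAI Agent SDK reads this variable itself, so you rarely need to do this by hand):

```python
import os

def read_secret() -> str:
    # Secrets set via `tensorlake secrets set` are injected as environment
    # variables at runtime; fail early with a clear message if one is missing.
    api_key = os.environ.get("OPENAI_API_KEY")
    if api_key is None:
        raise RuntimeError("OPENAI_API_KEY is not set for this function")
    return api_key
```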
2. Create the Application

code_interpreter.py
import sys
from io import StringIO
from tensorlake.applications import application, function, Image

# Image for the code execution container - has data science libraries
code_exec_image = (
    Image(name="python:3.11-slim")
    .run("pip install numpy pandas matplotlib")
)

# Image for the agent container - has the OpenAI Agent SDK
agent_image = (
    Image(name="python:3.11-slim")
    .run("pip install openai-agents")
)

@function(image=code_exec_image, cpu=2, memory=4)
def execute_code(code: str) -> str:
    """Execute Python code in a secure sandbox and return the output."""
    stdout_capture = StringIO()
    old_stdout = sys.stdout

    try:
        # Redirect stdout so the executed code's print output can be captured
        sys.stdout = stdout_capture
        exec_globals = {"__builtins__": __builtins__}
        exec(code, exec_globals)
        return stdout_capture.getvalue()
    except Exception as e:
        return f"Error: {e}\nOutput: {stdout_capture.getvalue()}"
    finally:
        # Always restore stdout, even if exec raises
        sys.stdout = old_stdout

@application()
@function(image=agent_image, secrets=["OPENAI_API_KEY"])
def code_interpreter_agent(user_request: str) -> str:
    """Run the agentic loop and return the final answer."""

    from agents import Agent, Runner, function_tool
    
    @function_tool
    def execute_python(code: str) -> str:
        """Execute Python code in a secure sandbox. Use this for calculations or data analysis."""
        return execute_code(code)
    
    agent = Agent(
        name="Code interpreter",
        model="gpt-4o",
        instructions="You are a helpful assistant that can execute Python code to solve problems.",
        tools=[execute_python],
    )
    
    result = Runner.run_sync(agent, user_request)
    return result.final_output
3. Deploy and Run

Deploy your application and call it:
tensorlake deploy code_interpreter.py
curl https://api.tensorlake.ai/applications/code_interpreter_agent \
  -H "Authorization: Bearer $TENSORLAKE_API_KEY" \
  --json '"What is the square root of 273 * 312821 plus 1782?"'
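For a prompt like this, the agent typically emits a short snippet through the execute_python tool, along these lines (illustrative only; the generated code varies per run and depends on how the model parses the ambiguous phrasing):

```python
import math

# One plausible reading of "square root of 273 * 312821 plus 1782":
result = math.sqrt(273 * 312821 + 1782)
print(result)
```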
On Lambda or Vercel, running arbitrary code execution would require complex sandboxing, security policies, and resource management — all in the same container as your main application. With Tensorlake, the execute_code function runs in a completely isolated container with its own CPU, memory, and dependencies. If code execution needs heavy compute or specialized libraries, it scales independently from your agent logic. You get secure, isolated code execution without managing infrastructure.
Tensorlake handles the infrastructure complexity so you can focus on building powerful AI tools.

Next Steps

Here are some of the next things to learn about: