Orchestrate is a serverless runtime for adding data orchestration capabilities to Agents. You can build orchestration APIs without deploying containers, workers, or queues. Functions start running when they are called and scale down to zero after finishing work. Some use cases:
  1. Creating multi-stage tools that need to be retried until they complete.
  2. Data ingestion workflow APIs.
  3. Scale-out processing using distributed map and reduce.
from typing import List

from tensorlake.applications import application, function

@function()
def summarize(doc: str) -> str:
    # call_llm is a placeholder for your own LLM call.
    summary = call_llm(doc)
    return summary

@function()
def summarize_files(docs: List[str]) -> List[str]:
    # Fans out one summarize call per document.
    summaries = docs.map(summarize)
    return summaries
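Conceptually, the distributed map in summarize_files behaves like a local fan-out: each element is handed to a worker and the results are gathered in input order. A minimal local analogue using only Python's standard library (an illustration of the semantics, not the Tensorlake runtime — the names and the truncation stand-in for the LLM call are made up here):

```python
from concurrent.futures import ThreadPoolExecutor

def local_summarize(doc: str) -> str:
    # Stand-in for an LLM call: just keep the first 10 characters.
    return doc[:10]

def local_summarize_files(docs: list[str]) -> list[str]:
    # Fan out one call per document; pool.map preserves input order.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(local_summarize, docs))

print(local_summarize_files(["a long document", "another one"]))
# ['a long doc', 'another on']
```

In Orchestrate, each mapped call runs in its own sandbox with its own retries, rather than in local threads.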
  1. Tensorlake functions are units of compute, each executed in a sandbox and retried according to a user-provided retry policy.
  2. Functions decorated with @application become callable from external systems and are exposed as HTTP APIs.
  3. Function calls are automatically and durably queued when there is not enough compute to handle the requests.
  4. Each function’s inputs and outputs are durably checkpointed so calls can be retried.
  5. Each function can declare its own resource requirements, making it possible to allocate more resources to functions that are compute- or memory-intensive.

Quickstart

Let’s build a simple application that greets a user by name.
1. Install the Tensorlake SDK

pip install tensorlake
2. Get an API Key

You can get an API key from the Tensorlake Dashboard.
export TENSORLAKE_API_KEY=<your-api-key>
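The SDK and CLI read this key from the environment. If you script against the API yourself, a fail-fast check avoids confusing auth errors later (require_api_key is a hypothetical helper for your own code, not part of the SDK):

```python
import os

def require_api_key() -> str:
    # Fail fast with a clear message if the key is missing.
    key = os.environ.get("TENSORLAKE_API_KEY")
    if not key:
        raise RuntimeError("Set TENSORLAKE_API_KEY before deploying or invoking.")
    return key
```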
3. Create an application

Applications are defined by Python functions. Let’s start with a template that greets a user by name.
tl new hello_world
This creates a file named hello_world/hello_world.py with the following content:
hello_world.py
from tensorlake.applications import application, function
@application()
@function()
def greet(name: str) -> str:
    return f"Hello, {name}!"
4. Deploy It

Deploy your application by referencing its source file.
tl deploy hello_world/hello_world.py

Invoke Orchestrate Functions

Orchestrate endpoints can be invoked using HTTP requests or the Python SDK.

HTTP Endpoint

https://api.tensorlake.ai/applications/<app_name>
1. Make a request

curl https://api.tensorlake.ai/applications/hello_world \
-H "Authorization: Bearer $TENSORLAKE_API_KEY" \
--json '"John"'
# {"request_id":"beae8736ece31ef9"}
This will return a request ID that you can use to track the progress of your request.
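The same call can be made from Python with just the standard library. The sketch below only builds the request object so it stays self-contained (build_invoke_request is an illustrative helper, not part of the SDK); sending it with urllib.request.urlopen requires a valid API key:

```python
import json
import urllib.request

def build_invoke_request(app_name: str, payload, api_key: str) -> urllib.request.Request:
    # POST the JSON-encoded payload to the application's HTTP endpoint,
    # mirroring the curl call above.
    return urllib.request.Request(
        url=f"https://api.tensorlake.ai/applications/{app_name}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```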
2. Check progress

Requests may run seconds to hours depending on your workload.
curl -X GET https://api.tensorlake.ai/applications/hello_world/requests/{request_id} \
-H "Authorization: Bearer $TENSORLAKE_API_KEY"
# {
#   "id":"B0IwzHibTTfn5mCXHPGsu",
#   "outcome":"success",
#   "failure_reason":null,
#   "request_error":null,
#   .... other fields ...
#}
The outcome field will be success or failure depending on whether the request completed successfully. It will be null if the request is still in progress.
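Since requests can run for seconds to hours, clients typically poll the status endpoint until outcome is set. A minimal polling loop, with the HTTP call injected as fetch so the sketch is self-contained (in practice fetch would GET the requests endpoint above and parse the JSON):

```python
import time
from typing import Callable

def wait_for_outcome(fetch: Callable[[], dict], interval: float = 1.0,
                     timeout: float = 60.0) -> dict:
    # Poll until the request's outcome field is non-null, or give up.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch()
        if status.get("outcome") is not None:
            return status
        time.sleep(interval)
    raise TimeoutError("request did not finish before the timeout")
```

A backoff that grows the interval between polls would be kinder to long-running requests; the fixed interval keeps the sketch short.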
3. Get the output

curl -X GET https://api.tensorlake.ai/applications/hello_world/requests/{request_id}/output \
-H "Authorization: Bearer $TENSORLAKE_API_KEY"
# "Hello, John!"

Testing Locally

Tensorlake Applications can run locally on your laptop. You can run them like regular Python scripts.
hello_world.py
# At the end of the file
from tensorlake.applications import run_local_application, Request
if __name__ == "__main__":
    request: Request = run_local_application(greet, 'John')
    output: str = request.output()
    print(output)
# "Hello, John!"
Deploying agents that start complex workflows or multi-stage tool calls normally requires building complex distributed systems with queues, workers, or orchestration engines, which takes time away from building the agentic logic. Orchestrate solves this problem by letting you write orchestration endpoints while it handles the coordination of functions, retries, and autoscaling.

Examples

Deep Research Agent

Multi-agent research pipeline with parallel web search and report synthesis using OpenAI Agents SDK.

Code Interpreter

Execute LLM-generated code safely in isolated containers with data science libraries.

Agent with Tool Calling

Claude agentic loop that chains tool calls, each running in its own isolated container.

Personal Finance Manager

Parse bank statements, categorize transactions, and answer spending questions with Claude.

Web Scraper

Serverless web crawler that scrapes websites N levels deep using headless Chrome.

Weather Agent

Conversational weather agent powered by Claude, deployed as an HTTP API.

Next Steps

Deploy Your First Agent

Follow our quick start guide to build and deploy a serverless agentic code interpreter in under 5 minutes.