Tensorlake lets you deploy agentic apps, data workflows, and background tasks in plain Python, and exposes them as HTTP endpoints without you touching Kubernetes, queues, or any other infrastructure. If you have built a Python application that works on your laptop, you can deploy it as a scalable, fault-tolerant, and durable application in minutes. Tensorlake removes the need for the following backend services:
  • Queues like Kafka or SQS
  • Durable execution engines like Temporal
  • Kubernetes, Docker, Terraform, etc. for managing compute resources
  • Spark or Ray for distributed data processing
This guide walks you through writing, deploying, and calling Tensorlake applications.

Deploy your first application

1. Install the Tensorlake SDK

pip install tensorlake
2. Authenticate with Tensorlake

tensorlake login
3. Create an application

Applications are defined by Python functions. Let’s start with a template that greets a user by name.
tensorlake new hello_world
This creates a file named hello_world/hello_world.py with the following content:
hello_world.py
from tensorlake.applications import application, function
@application()
@function()
def greet(name: str) -> str:
    return f"Hello, {name}!"
4. Deploy it

Deploy your application by referencing its source file.
tensorlake deploy hello_world.py
That’s it — you now have a distributed app running in the cloud.

Call Applications

Tensorlake gives you an HTTP endpoint for calling your application remotely:
https://api.tensorlake.ai/applications/<app_name>
1. Get an API key

Fetch a key from the Tensorlake Dashboard and export it as an environment variable:
export TENSORLAKE_API_KEY=<API_KEY>
2. Make a request

curl https://api.tensorlake.ai/applications/hello_world \
-H "Authorization: Bearer $TENSORLAKE_API_KEY" \
--json '"John"'
# {"request_id":"beae8736ece31ef9"}
This returns a request ID that you can use to track the progress of your request. (The --json flag requires curl 7.82 or newer; on older versions, pass -H "Content-Type: application/json" -d '"John"' instead.)
3. Check progress

Requests may run seconds to hours depending on your workload.
curl https://api.tensorlake.ai/applications/hello_world/requests/{request_id} \
-H "Authorization: Bearer $TENSORLAKE_API_KEY"
# {
#   "id":"B0IwzHibTTfn5mCXHPGsu",
#   "outcome":"success",
#   "failure_reason":null,
#   "request_error":null,
#   .... other fields ...
#}
The outcome field reports whether the request completed successfully (success in the example above). It is null while the request is still in progress.
4. Get the output

curl https://api.tensorlake.ai/applications/hello_world/requests/{request_id}/output \
-H "Authorization: Bearer $TENSORLAKE_API_KEY"
# "Hello, John!"

Testing Locally

Tensorlake applications can run locally on your laptop. You can run them like regular Python scripts.
hello_world.py
# At the end of the file
from tensorlake.applications import run_local_application, Request
if __name__ == "__main__":
    request: Request = run_local_application(greet, 'John')
    output = request.output()
    print(output)

Complex Applications

If you are wondering what makes Tensorlake applications different from other serverless function platforms, consider this example:
complex_app.py
import re
from typing import List, Tuple
import urllib.request

from pydantic import BaseModel
from tensorlake.applications import application, function, Image

app_image = (Image(name="python:3.9-slim")
      .run("pip install pydantic")
    )

EXCLUDE = {
    "the","and","of","to","in","a","is","for","on","as","by","with",
    "that","from","at","an","be","it","or","are","was","this","which","also"
}

@function(description="Return top-K non-stopword tokens from a text blob")
def topk_words(text: str, k: int = 25) -> List[Tuple[str, int]]:
    counts = {}
    for tok in re.findall(r"[A-Za-z]{2,}", text.lower()):
        if tok in EXCLUDE:
            continue
        counts[tok] = counts.get(tok, 0) + 1
    return sorted(counts.items(), key=lambda x: x[1], reverse=True)[:k]

class RequestParams(BaseModel):
    rfc: str
    k: int = 25

@application()  # you can add retries/region/serializers here if you want
@function(image=app_image)
def rfc_topk(request: RequestParams) -> dict:
    url = f"https://www.ietf.org/rfc/{request.rfc}.txt"
    with urllib.request.urlopen(url, timeout=20) as resp:
        text = resp.read().decode("utf-8", errors="ignore")

    topk = topk_words(text, request.k)

    return {"rfc": request.rfc, "k": request.k, "topk": topk}
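The counting step inside topk_words is plain Python, so you can sanity-check it outside Tensorlake. Here is a standalone sketch of the same logic using collections.Counter (the stopword set is abbreviated for brevity; `topk` is an illustrative name, not the deployed function):

```python
import re
from collections import Counter

EXCLUDE = {"the", "and", "of", "to", "a", "is"}  # abbreviated stopword list

def topk(text: str, k: int):
    # Count alphabetic tokens of length >= 2, skipping stopwords.
    tokens = (t for t in re.findall(r"[A-Za-z]{2,}", text.lower())
              if t not in EXCLUDE)
    return Counter(tokens).most_common(k)

print(topk("The spec defines the protocol and the spec wins.", 2))
# [('spec', 2), ('defines', 1)]
```

Counter.most_common sorts by count descending and, like the sorted() call in topk_words, breaks ties by first occurrence.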
On Lambda or Vercel, both functions would normally live in the same container. That means they share CPU, memory, dependencies, and storage, even if their workloads have nothing in common. If one task needs lots of CPU and another barely uses any, you’re stuck provisioning for the worst case. Costs go up, and performance tuning goes sideways.

To fix that, you’d split them into separate services. But now you have to wire them together using queues, retries, state passing, and whatever glue logic keeps them in sync. That’s operational overhead, not application logic.

Tensorlake removes all of that. You define your functions together in one application, but each function can declare its own compute, storage, and dependency image. Tensorlake runs them in separate containers, scales them independently, and handles all communication between them.
In the example above:
  • The functions topk_words and rfc_topk run in separate containers with their own dependencies and resource allocations.
  • Data flows between them automatically — no RPC calls, no queues, no persistence layer, no orchestration code.
Tensorlake also provides durability across function calls. If topk_words fails mid-pipeline, the request doesn’t start over; rfc_topk resumes from the last successful step.
By doing this, Tensorlake replaces:
  • Queues: Internal function-to-function communication is managed for you. Traffic spikes are buffered by Tensorlake.
  • Durable Execution: State is persisted across steps, so failures don’t reset the whole workflow/application.
  • Kubernetes/Terraform: Tensorlake manages the containers, autoscaling, and resource allocation.
You write Python functions. Tensorlake turns them into a distributed, durable application.

Next Steps

Here are some of the next things to learn about: