- Deploying the Indexify Server
- Building and deploying container images capable of running your workflow's functions.
- Deploying your graphs to the Indexify Server using the SDK.
- Bare Metal and VMs
- Docker Compose
- Kubernetes (or any other container orchestrator)
## Bare Metal

### Download Indexify Server and Python SDK
- The server can be downloaded from here.
- Install the Python SDK using pip.
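The SDK installation is a single pip command (assuming the package is published on PyPI as `indexify`, matching the project name):

```shell
# Install the Indexify Python SDK from PyPI
# (package name `indexify` is assumed from the project name)
pip install indexify
```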
### Start Server
Start the server on one machine. Read the configuration reference to understand how to customize the server, including using blob stores for storing function outputs. By default, the server stores state and function outputs in the `indexify_storage`
folder. Deleting that folder will require redeploying the graphs.
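A minimal invocation might look like the following. The binary name and the `--config-path` flag are assumptions; consult the configuration reference for the exact options supported by your server version:

```shell
# Start the Indexify server with a custom configuration file
# (--config-path is an assumed flag name; see the configuration reference)
./indexify-server --config-path config.yaml
```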
We have a replicated mode for the server, based on the Raft consensus protocol. It's not public yet because
we are still figuring out how to make it easy for developers to configure, operate, and use.
If you are interested in using it, please reach out to us.
### Start Executor
Start as many executors as you want, on different machines. The executor accepts the following flags:

- `server-addr` - the address of the Indexify server to connect to. Default: `127.0.0.1:8900`.
- `image-version` - the version of the image exposed by this executor. Default: `1`. This identifies the image version run by this executor, which is used for function placement.
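Putting the two flags above together, starting an executor against a remote server might look like this. The `indexify-cli executor` invocation and the server address are illustrative assumptions; the flag names follow the text above:

```shell
# Start an executor that connects to a remote Indexify server
# (command shape is an assumption; flags follow the documented defaults)
indexify-cli executor --server-addr 192.168.1.10:8900 --image-version 1
```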
## Docker Compose
You can spin up the server and executor using Docker Compose to deploy and run in a production-like environment. Copy the docker-compose.yaml file from here. Change the `replicas`
field for the executor in the compose file to add more executors (i.e. parallelism) to the workflow.
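For example, the executor service's `replicas` field might look like the fragment below. The service name and surrounding structure are hypothetical; adapt them to the docker-compose.yaml you copied:

```yaml
services:
  executor:
    # Increase replicas to run more executors in parallel
    deploy:
      replicas: 3
```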
This uses a default executor container based on Debian with a vanilla Python installation.
We generally provide docker compose files for local testing of every example project in the repository.
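Once you have a compose file, bringing the stack up is standard Docker Compose usage:

```shell
# Start the server and executors in the background
docker compose up -d

# Tear everything down when finished
docker compose down
```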