Best practices for using workers - Python / Docker

Hi there,

I have a Python (Flask) app where users can upload an image.
My app then does some image processing, which can take a couple of seconds and uses a fair amount of memory (500 MB to 1.5 GB). All of this works well, but I also have basically zero traffic at the moment :slight_smile:

Now I want to prepare it for scaling and move the whole image-processing part into a worker.

Since I am not very familiar with Python or Docker, I would love to hear about best practices, especially how to do this on Fly.io.

I read about “Running Multiple Processes Inside A Fly.io App”, but I am unsure how this would work in Python.

In general I want to achieve something like this:

worker.addToQueue(payload)  # starts another independent Python program

onWorkerFinished(result)

Thank you for any help!

I think running multiple processes is what you’re looking for. You basically want to define the web app and the worker in the [processes] section and expose only the web process in the [http_service] section of your fly.toml, like so:

[processes]
app = "gunicorn app:app --log-level 'info'"  # or whichever run command you prefer to use
worker = "celery -A appname worker -l INFO"  # assuming you're using Celery for your worker

# ... some other fly.toml config ...

[http_service]
  processes = ["app"]

As for your worker, you might want to look into a task queue like Celery to handle the image processing. Here’s an example of a fly.toml file in a Python app if it’s helpful → Preview: multi process apps (get your workers here!) - #13 by lmzbonack
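
On the Flask side, enqueueing from the upload route is essentially the `worker.addToQueue(payload)` pattern you sketched. There’s no built-in `onWorkerFinished` callback; the usual approach is to return a task id and poll for the result. A rough sketch (route names are made up, and in practice you’d pass a file reference rather than the raw request body):

```python
# app.py -- the Flask app that gunicorn serves via "app:app"
from flask import Flask, jsonify, request

from appname import process_image  # the task defined next to the Celery instance

app = Flask(__name__)

@app.route("/upload", methods=["POST"])
def upload():
    # Enqueue the heavy work instead of doing it in the request handler;
    # delay() returns immediately with an AsyncResult.
    result = process_image.delay(request.get_json())
    return jsonify({"task_id": result.id}), 202

@app.route("/result/<task_id>")
def get_result(task_id):
    # Clients poll here until the worker has finished the task.
    async_result = process_image.AsyncResult(task_id)
    if async_result.ready():
        return jsonify(async_result.get())
    return jsonify({"status": "pending"}), 202
```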

Thank you, that points me in the right direction!
