I have a Python (Flask) app where users can upload an image.
My app then does some image processing, which can take a couple of seconds and uses a fair amount of memory (500 MB to 1.5 GB). All of this works well, but I also have basically zero traffic at the moment.
Now I want to take care of scaling and move the whole image-processing part into a worker.
Since I am not very familiar with Python or Docker, I would love to hear about best practices, especially how to do this on Fly.io.
I read about “Running Multiple Processes Inside A Fly.io App”, but I am unsure how this would work in Python.
In general, I want to achieve something like this:
worker.addToQueue(payload)  # starts another independent Python program
onWorkerFinished(result)    # called with the result once the worker is done
I think running multiple processes is what you’re looking for. You basically want to define the web app and the worker in the [processes] section, and expose only the web process in the [http_service] section of your fly.toml, like so:
[processes]
app = "gunicorn app:app --log-level 'info'"   # or whichever run command you prefer
worker = "celery -A appname worker -l INFO"   # assuming you're using Celery for your worker

# ...some other fly.toml config...

[http_service]
processes = ["app"]
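On the Python side, the two processes can only talk to each other through a message broker, since each process group runs on its own machine on Fly.io, so you’ll need something like Redis (e.g. Upstash Redis) that both can reach. Here is a minimal sketch of how the Flask app could hand work to the Celery worker; the module name appname, the task process_image, the save_upload helper, and the Redis URLs are placeholders for illustration, not anything Fly.io or Celery provides:

# appname.py -- shared Celery application; the worker process runs
# "celery -A appname worker -l INFO" against this module.
from celery import Celery

celery = Celery(
    "appname",
    broker="redis://my-redis.internal:6379/0",   # placeholder broker URL
    backend="redis://my-redis.internal:6379/1",  # optional, lets you poll results
)

@celery.task
def process_image(image_key):
    # Download the image referenced by image_key, run your existing
    # processing code, and store the result somewhere the web app can read it.
    return {"image_key": image_key, "status": "done"}


# app.py -- Flask web process, started by "gunicorn app:app".
from flask import Flask, jsonify, request

from appname import process_image

app = Flask(__name__)

def save_upload(file_storage):
    # Placeholder: the web and worker processes run on separate machines,
    # so store the upload in shared object storage (e.g. S3/Tigris) and
    # return a key the worker can use to fetch it.
    return f"uploads/{file_storage.filename}"

@app.route("/upload", methods=["POST"])
def upload():
    image_key = save_upload(request.files["image"])
    job = process_image.delay(image_key)          # your worker.addToQueue(payload)
    return jsonify({"task_id": job.id}), 202      # respond immediately

@app.route("/status/<task_id>")
def status(task_id):
    # There is no push-style onWorkerFinished callback across machines;
    # the usual pattern is to poll the task state from the result backend.
    result = process_image.AsyncResult(task_id)
    return jsonify({"state": result.state})

With that in place, deploying creates separate machines for the app and worker process groups, and you can size them independently, e.g. give the worker machine more RAM for the image processing.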