Hey!
I am deploying a Django app with Celery tasks on a separate machine (worker).
Right now, my worker is a shared-cpu-1x with 1GB of RAM (each task uses approx. 800 MB), because we handle files, OCR, and other heavy processing.
If I scale the VM to shared-cpu-2x with 1GB of RAM, the Celery worker scales to concurrency=2. Does that mean 1GB per task, or 1GB shared between the two tasks?
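For context, Celery defaults its worker concurrency to the number of CPUs it sees, which is why the worker jumps to concurrency=2 on a 2x VM. A sketch of pinning it explicitly instead (the module name `app` is a placeholder, and the memory threshold is just an example value):

```shell
# Pin the worker to a single task at a time instead of letting Celery
# default to one process per CPU, and recycle any child process whose
# resident memory grows past ~500 MB (the flag takes KiB).
celery -A app worker --concurrency=1 --max-memory-per-child=512000
```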
I already scaled up from shared-cpu-1x@1024MB to shared-cpu-2x@1024MB, going through shared-cpu-2x@512MB along the way.
The Grafana memory graphs show 1024 MB, but the tasks are not crashing, even when processing two tasks concurrently:
- shared-cpu-1x@1024MB
- shared-cpu-2x@512MB
- shared-cpu-2x@1024MB
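The reason this surprises me is simple arithmetic: if the 1024 MB were shared between both worker processes, two tasks peaking at once should exceed it. A back-of-envelope check (the 800 MB figure is my own estimate from profiling, not a Fly.io number):

```python
# Would two concurrent tasks fit in 1024 MB if the RAM were shared?
ram_mb = 1024        # VM memory on shared-cpu-2x@1024MB
per_task_mb = 800    # approx. peak memory per OCR/file task (my estimate)
concurrency = 2      # Celery defaults to one worker process per CPU

needed_if_shared = per_task_mb * concurrency
print(needed_if_shared)               # 1600
print(needed_if_shared > ram_mb)      # True: 1600 MB > 1024 MB
```

So if the memory were truly shared, I would expect OOM kills when both tasks peak together, yet nothing is crashing.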
The pricing is:
- shared-cpu-2x@1024MB: $7.64/mo
- shared-cpu-4x@1024MB: $7.78/mo

So if the 1024 MB is not shared, why is the price increase so small?
Also, I don’t understand the graphs. Is the “Total” for the worker (yellow) or the web (green) process?
If nothing I said makes any sense, please tell me. Thanks!