Pricing for shared CPUs and a Celery worker

Hey!

I am deploying a Django app with Celery tasks on a separate machine (the worker).

My worker is currently a shared-cpu-1x with 1GB of RAM (each task uses approx. 800MB) because we handle files, OCR, and other heavy work.

If I scale the VM to shared-cpu-2x with 1GB of RAM, the Celery worker scales to concurrency=2. Does that mean 1GB for each task, or 1GB shared between the two?
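To make the question concrete, here is a minimal sketch of the kind of Celery worker configuration involved; the app name, broker URL, and exact limits are placeholders rather than our real settings:

```python
# Hypothetical sketch: app name, broker URL, and limits are placeholders.
from celery import Celery

app = Celery("worker", broker="redis://localhost:6379/0")

app.conf.update(
    # Pin the pool size instead of letting Celery default to the CPU count;
    # with concurrency=2, two ~800MB tasks can run at the same time, so the
    # VM needs enough RAM for both together.
    worker_concurrency=2,
    # Recycle a child process once it has used roughly 800MB (value in
    # kilobytes), which helps contain leaky OCR jobs.
    worker_max_memory_per_child=800_000,
    # Hand out one task at a time so a long OCR job isn't prefetched
    # behind another on an already busy process.
    worker_prefetch_multiplier=1,
)
```

Pinning worker_concurrency also avoids the worker silently changing its pool size whenever the VM's CPU count changes.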

I already scaled up from shared-cpu-1x@1024MB to shared-cpu-2x@1024MB, going through shared-cpu-2x@524MB along the way.

The Grafana memory-usage charts show 1024 MB, but the tasks are not crashing even when processing two tasks concurrently, on any of these configurations:

  1. shared-cpu-1x@1024MB
  2. shared-cpu-2x@524MB
  3. shared-cpu-2x@1024MB

Pricing is:

  1. shared-cpu-2x@1024MB: $7.64/mo
  2. shared-cpu-4x@1024MB: $7.78/mo

So if the 1024 MB is not shared, why is the pricing increase so low?

Also, I don’t understand the graphs. Is the “Total” for the worker (yellow) process or the web (green) process?

If nothing I said makes any sense, please tell me. Thanks!

Hi @Caco

To clarify, VMs don’t share memory.

So if you have two shared-cpu-1x@1024MB VMs, you have 2048MB of memory allocated in total, 1024MB for each VM.

The Grafana chart is a little confusing when it says “Total”. It really should say “Maximum”, because it’s not aggregating the total memory allocated across your VMs; it’s simply showing the highest allocated memory of any of the VMs.

Ok, perfect.

So do two shared-cpu-1x@1024MB VMs equal a shared-cpu-2x@1024MB?

What’s the difference compared to two machines of shared-cpu-1x@1024MB?

Sorry for the confusion.

Each VM is separate, so two shared-cpu-1x@1024MB VMs do not equal one shared-cpu-2x@1024MB VM: the former is two machines with 1024MB each (2048MB in total), while the latter is a single machine with two shared vCPUs and 1024MB in total.

