Out of memory while running pg_dump - hobby tier

Hi !

I’m a lucky owner of a hobby tier.

I’ve got a Postgres database application, and I would like to pg_dump it to run some tests locally before going “live” on my Fly instance.

Unfortunately, it seems that my database is now too big to be "pg_dump"ed without causing an out-of-memory kill on the pg instance, due to the hobby tier’s memory limits.

What I do:
→ flyctl proxy 15432:5432 -a my-postgres
→ pg_dump -h localhost -p 15432 -U user_postgres db_postgres

and then…
On my prompt: pg_dump: error: error reading large object 1555059: server closed the connection unexpectedly
On the Fly console: Out of memory: Killed process 3912 (postgres)

Can you help me?
I’m not a Postgres guru, and I think my dump attempt is pretty naive: is there a way to perform a more effective dump without exceeding the hobby tier limits?

Is there another way to download my data locally to perform my tests?

Thanks a lot :slight_smile:

Hey, pg_dump probably keeps a bunch of data in memory while it’s copying it over. One way I can think of to work around it is to temporarily scale memory to something like 1GB with fly scale memory 1024 --app <your-postgres>, then scale back when you’re done.

The time pg_dump takes shouldn’t push your usage beyond the free tier, as long as you don’t forget to scale back when you’re done.
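For reference, the whole flow might look something like this. The app and database names come from the original post; the 256 MB scale-back value is an assumption about the hobby tier default (check fly scale show for your actual baseline), and dumping to custom format (-Fc) is an extra suggestion so the output is compressed as it streams:

```shell
# Assumption: hobby tier baseline is 256 MB — verify with: flyctl scale show
# Temporarily give the Postgres VM more memory so pg_dump can finish
flyctl scale memory 1024 --app my-postgres

# In one terminal: forward the remote Postgres to localhost:15432
flyctl proxy 15432:5432 -a my-postgres

# In another terminal: dump to a local file
# (-Fc = custom format, compressed, restorable with pg_restore)
pg_dump -h localhost -p 15432 -U user_postgres -Fc -f db_postgres.dump db_postgres

# Scale back down once the dump completes, so you stay within the free tier
flyctl scale memory 256 --app my-postgres
```

This is just a sketch of the advice above, not something I’ve tested against your exact setup.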

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.