Docker image size limit raised from 2GB to 8GB

Previously, deploying a Docker image over 2GB in size would throw an error. You’re now able to deploy images up to roughly 8GB.

We expect this to be particularly useful to those of you doing machine learning work or running developer environments on Fly. We’re interested in hearing about your use cases! Does this increase let you run something you couldn’t run before? Should we raise the limit even higher?


Have been keeping an eye on using Fly Machines to augment our game server orchestration - the larger image size limit is helpful even just to know that if our server images grow in the future, we won’t hit a bottleneck at that layer. It also gets closer to parity with PlayFab servers’ 10GB asset limit.

Last time I took a stab at adding Fly Machines to our matchmaker-based orchestration, I believe I hit a snag configuring machine sizes. That was last year, before the Machines improvements / public API addition, so I’m excited to give it another go!


I still run into issues: a layer push fails at approximately 4GB. I’m doing some ML stuff.

Can you re-run it with `LOG_LEVEL=debug` and share the output?
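For reference, a debug re-run might look like the following (a sketch: `my-app` is a placeholder app name, and capturing to a file with `tee` is just a convenience for sharing the log):

```shell
# Re-run the failing deploy with flyctl's verbose client logging enabled.
# LOG_LEVEL=debug makes flyctl log its internal activity, including the
# registry layer pushes that are failing, and tee saves a copy to share.
LOG_LEVEL=debug flyctl deploy --app my-app 2>&1 | tee deploy-debug.log
```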

I agree: it is not possible to push a layer above 4.3GB or so. The push simply fails repeatedly around that point. A layer built from `pip install autogluon.tabular[all]` should pull in enough cursed dependencies to trigger this issue.
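Until the per-layer limit is addressed, one possible workaround is to split the install across multiple `RUN` steps so no single layer carries the full multi-gigabyte payload. This is only a sketch, and it assumes Docker is installed, that the `python:3.11-slim` base image is suitable, and that pre-installing the heaviest dependency (here assumed to be torch) in its own step actually moves most of the bulk into a separate layer; the image tag is a placeholder:

```shell
# Build an image whose dependency install is split across layers, so
# each layer push stays under the ~4GB point where pushes start failing.
# (Sketch only: the package grouping and resulting layer sizes are assumptions.)
docker build -t autogluon-test - <<'EOF'
FROM python:3.11-slim
# Install the heaviest dependency first, in its own layer...
RUN pip install --no-cache-dir torch
# ...so the autogluon layer no longer contains torch's wheels.
RUN pip install --no-cache-dir "autogluon.tabular[all]"
EOF
```

Whether this helps depends on how the total size actually splits across the resulting layers, but it's a cheap experiment to run before waiting on a limit change.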