Is it possible to get (easy?) stale-while-revalidate caching when using Remix?
I’ve got the cache headers being sent from the app, but there doesn’t appear to be any caching in effect.
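For context, the headers are being set in the route module roughly like this (simplified; the import paths depend on your Remix version, and the durations and example.com fetch are placeholders):

```ts
// app/routes/index.tsx -- illustrative route, not the real one
import type { HeadersFunction, LoaderFunction } from "@remix-run/node";
import { json } from "@remix-run/node";

// Serve fresh for 60s, then serve stale for up to 5 minutes while
// the response is revalidated in the background.
export const headers: HeadersFunction = () => ({
  "Cache-Control": "public, s-maxage=60, stale-while-revalidate=300",
});

export const loader: LoaderFunction = async () => {
  // Placeholder for the slow upstream call that motivates the caching.
  const res = await fetch("https://example.com/api/posts");
  return json(await res.json());
};
```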
I’m trying to move over from Vercel so I can deploy to multiple regions, since Vercel currently only deploys Remix as a serverless function in a single location. But I need effective caching in place, as my initial page response times are too slow due to network data fetching.
I am also migrating from Vercel and facing this issue. Did you find out if it’s possible, @michaelward82?
Fly doesn’t run a caching CDN in front of apps the way Vercel does. But there are ways to do this yourself. They involve some work, but have the upside of giving you more control over the cache.
One way is to run nginx alongside your app; nginx supports stale-while-revalidate. Here’s an example of how to configure nginx for caching: GitHub - fly-apps/nginx-cluster: A horizontally scalable NGINX caching cluster. And an example of how to run multiple processes in your VM: Running Multiple Processes Inside A Fly.io App. You could also run nginx as a separate app and forward requests to your Remix app.
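As a rough sketch of the relevant directives (not a complete config; the port, upstream address, and cache path are placeholders), the stale-serving behavior comes from `proxy_cache_use_stale` and `proxy_cache_background_update`:

```nginx
# Inside the http block: cache responses from the Remix app and serve
# stale entries while a background revalidation is in flight.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=remix_cache:10m
                 max_size=1g inactive=60m;

server {
  listen 8080;

  location / {
    proxy_pass http://localhost:3000;   # your Remix app
    proxy_cache remix_cache;

    # Serve stale content on errors and while the entry is being refreshed,
    # and do the refresh in the background instead of blocking the request.
    proxy_cache_use_stale error timeout updating;
    proxy_cache_background_update on;
    proxy_cache_lock on;
  }
}
```

Newer nginx versions (1.11.10+) also honor a stale-while-revalidate value in the upstream Cache-Control header, so the headers your app already sends can drive the timings.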
For this use case - caching the results of slow data fetches - we like to see people build the behavior directly into the app with a simple in-memory cache. This makes sense since your app is already deployed in the regions you select.
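As a very rough sketch of that approach (the key, TTL, and fetcher are placeholders, not a recommended library):

```ts
// A tiny in-memory stale-while-revalidate cache for use in a Remix loader.
// It lives for the life of the server process, so it resets on deploy/restart.
type Entry<T> = { value: T; storedAt: number };

const cache = new Map<string, Entry<unknown>>();

export async function cachedFetch<T>(
  key: string,
  fetcher: () => Promise<T>,
  maxAgeMs = 60_000, // serve fresh for 1 minute
): Promise<T> {
  const entry = cache.get(key) as Entry<T> | undefined;
  const now = Date.now();

  if (entry) {
    if (now - entry.storedAt > maxAgeMs) {
      // Stale: refresh in the background, but return the old value right away.
      fetcher()
        .then((value) => cache.set(key, { value, storedAt: Date.now() }))
        .catch(() => {}); // keep serving stale data if the refresh fails
    }
    return entry.value;
  }

  // Cold cache: the first request pays the full cost of the fetch.
  const value = await fetcher();
  cache.set(key, { value, storedAt: now });
  return value;
}
```

A loader could then wrap its slow call, e.g. `cachedFetch("posts", () => fetch(url).then((r) => r.json()))`, so only the first request per instance (or the first after a deploy) pays the full fetch cost.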
See how @kentcdodds did this for his blog.
There he uses a mix of Redis and an in-memory cache. Using a single cache per region would work the same way as it does on Vercel, which does not share cache content across regions.
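To be clear, the snippet below is not Kent’s actual code, just an assumed sketch of the layered idea: check the per-process memory cache first, fall back to Redis in the same region, and only then hit the slow origin (assumes ioredis and a REDIS_URL env var):

```ts
// Layered cache sketch: per-process memory in front of a per-region Redis.
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");
const memory = new Map<string, string>();

export async function getCached(
  key: string,
  fetcher: () => Promise<string>,
  ttlSeconds = 300,
): Promise<string> {
  // 1. Fastest: in-memory cache (cleared on every deploy/restart).
  const hit = memory.get(key);
  if (hit !== undefined) return hit;

  // 2. Next: Redis, shared by the instances that point at it.
  const fromRedis = await redis.get(key);
  if (fromRedis !== null) {
    memory.set(key, fromRedis);
    return fromRedis;
  }

  // 3. Last resort: the slow origin fetch; populate both layers.
  const value = await fetcher();
  memory.set(key, value);
  await redis.set(key, value, "EX", ttlSeconds);
  return value;
}
```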
Also, the Vercel cache invalidates on deploy, which would be the default behavior on Fly if you stored the cache in memory. If you attached a persistent disk to each VM, your cache could survive across deployments, but it would be harder to expire entries manually.
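If you went the volume route, a file-backed cache could look roughly like this (the /data mount path and JSON-per-entry layout are assumptions for illustration, not a Fly requirement):

```ts
// Disk-backed cache sketch: entries survive deploys because they live on a
// persistent volume (mounted at /data in this example) rather than in memory.
import * as fs from "fs/promises";
import * as path from "path";
import * as crypto from "crypto";

const CACHE_DIR = "/data/cache"; // assumed volume mount path

function fileFor(key: string): string {
  // Hash the key so arbitrary strings become safe file names.
  const hash = crypto.createHash("sha256").update(key).digest("hex");
  return path.join(CACHE_DIR, `${hash}.json`);
}

export async function readCache<T>(key: string, maxAgeMs: number): Promise<T | null> {
  try {
    const raw = await fs.readFile(fileFor(key), "utf8");
    const { storedAt, value } = JSON.parse(raw) as { storedAt: number; value: T };
    return Date.now() - storedAt <= maxAgeMs ? value : null;
  } catch {
    return null; // missing or unreadable entry counts as a miss
  }
}

export async function writeCache<T>(key: string, value: T): Promise<void> {
  await fs.mkdir(CACHE_DIR, { recursive: true });
  await fs.writeFile(fileFor(key), JSON.stringify({ storedAt: Date.now(), value }));
}
```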