Hi,
We are using fly.io to host a Next.js app. Our app is quite simple: it prerenders around 30 HTML pages, and every 2 hours, or when the data changes, the pages are invalidated and regenerated.
We would like to make this scale to 0, since there is not much traffic, especially in the middle of the night. But it seems that when the app is scaled down to 0, ISR stops working.
When the app goes down to 0, all regenerated data is discarded and stale data is served on the first request. A solution might be to put the static assets on a volume, but that doesn't seem straightforward. Is there a fly.io way to do this? Thanks
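For reference, this is roughly our scale-to-zero setup in fly.toml (the port and values here are illustrative, not our exact config):

[http_service]
  internal_port = 3000
  force_https = true
  auto_stop_machines = true
  auto_start_machines = true
  min_machines_running = 0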
@rubys I have Postgres. That seems like quite a nice idea. I am guessing that I have to override the cache handler to use a DB connection.
One idea I had was to use flat-cache (npm), but I see that there may be some issues once I have two instances, since the first layer of caching will be in memory (just like the Next.js default) and the instances may desynchronize. Am I right?
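Something like this is what I have in mind: a rough sketch of a Postgres-backed cache handler, assuming a recent Next.js (14+, where cacheHandler and cacheMaxMemorySize are available) and the pg client; the isr_cache table name and layout are made up for illustration. Setting cacheMaxMemorySize to 0 would also disable the default in-memory layer, which avoids the desync between instances:

// next.config.js
module.exports = {
  cacheHandler: require.resolve('./cache-handler.js'),
  cacheMaxMemorySize: 0, // disable the default in-memory cache layer
};

// cache-handler.js -- sketch only; isr_cache is a hypothetical table
const { Pool } = require('pg');
const pool = new Pool(); // reads DATABASE_URL / PG* environment variables

module.exports = class CacheHandler {
  async get(key) {
    const { rows } = await pool.query(
      'SELECT value FROM isr_cache WHERE key = $1',
      [key]
    );
    return rows.length ? JSON.parse(rows[0].value) : null;
  }

  async set(key, data, ctx) {
    const entry = JSON.stringify({
      value: data,
      lastModified: Date.now(),
      tags: ctx.tags,
    });
    await pool.query(
      'INSERT INTO isr_cache (key, value) VALUES ($1, $2) ' +
      'ON CONFLICT (key) DO UPDATE SET value = EXCLUDED.value',
      [key, entry]
    );
  }

  async revalidateTag(tags) {
    // Next.js may pass a single tag or an array of tags
    for (const tag of [tags].flat()) {
      await pool.query(
        "DELETE FROM isr_cache WHERE (value::jsonb -> 'tags') ? $1",
        [tag]
      );
    }
  }
};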
Tigris S3 is easy to use, and has a generous free allowance. Getting started is as simple as:
fly storage create
And connecting is as simple as:
const AWS = require('aws-sdk');
// fly storage create sets the AWS_* secrets on the app; with the v2 SDK
// the Tigris endpoint may need to be passed explicitly:
const s3 = new AWS.S3({ endpoint: process.env.AWS_ENDPOINT_URL_S3 });
As to your question, depending on your use case I wouldn’t bother with multiple levels of caching (at least initially) or even recomputing data in the middle of the night, but just saving results as they are generated and reusing any previous result if it is less than 2 hours old.
With the help of Copilot, here is an example application using Express; adapt as required.
const express = require('express');
const AWS = require('aws-sdk');

const app = express();
const port = 3000;

// Configure the AWS SDK; with the v2 SDK the Tigris endpoint may need to be
// passed explicitly (fly storage create sets AWS_ENDPOINT_URL_S3 and friends)
const s3 = new AWS.S3({ endpoint: process.env.AWS_ENDPOINT_URL_S3 });
const bucketName = process.env.BUCKET_NAME;

if (!bucketName) {
  console.error('Error: BUCKET_NAME environment variable is not set.');
  process.exit(1);
}

// Middleware to check the cache in S3
async function checkCache(req, res, next) {
  const cacheKey = req.originalUrl;

  try {
    const data = await s3.getObject({ Bucket: bucketName, Key: cacheKey }).promise();
    const cachedData = JSON.parse(data.Body.toString());

    // Check whether the cache entry has expired
    const now = new Date();
    const cacheTime = new Date(cachedData.timestamp);
    const cacheDuration = 2 * 60 * 60 * 1000; // 2 hours in milliseconds

    if (now - cacheTime < cacheDuration) {
      console.log('Cache hit');
      res.send(cachedData.data);
    } else {
      console.log('Cache expired');
      next();
    }
  } catch (err) {
    if (err.code === 'NoSuchKey') {
      console.log('Cache miss');
      next();
    } else {
      console.error('Error checking cache:', err);
      res.status(500).send('Internal Server Error');
    }
  }
}

// Route with caching
app.get('/data', checkCache, async (req, res) => {
  const data = { message: 'Hello, world!' }; // Replace with your actual data fetching logic
  const cacheKey = req.originalUrl;
  const cacheData = {
    timestamp: new Date().toISOString(),
    data: data
  };

  try {
    await s3.putObject({
      Bucket: bucketName,
      Key: cacheKey,
      Body: JSON.stringify(cacheData),
      ContentType: 'application/json'
    }).promise();
    console.log('Data cached');
  } catch (err) {
    console.error('Error caching data:', err);
  }

  res.send(data);
});

app.listen(port, () => {
  console.log(`Server is running on http://localhost:${port}`);
});
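To try it locally (assuming BUCKET_NAME and the AWS credentials/endpoint are set in the environment):

curl http://localhost:3000/data

The first request should log "Cache miss" and write the object to the bucket; repeating it within 2 hours should log "Cache hit" and serve the stored copy.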