Hi,
I would like to delete and seed my MongoDB database, and I have a script that does exactly this. So far I have not figured out how to schedule a cron job on the Fly.io machine. I thought I would need to include the necessary commands to run the script in the Dockerfile, but unfortunately it does not work.
Here is the Dockerfile:
# syntax = docker/dockerfile:1
ARG NODE_VERSION=20.11.1
FROM node:${NODE_VERSION}-slim as base
LABEL fly_launch_runtime="Node.js"
WORKDIR /app
ENV NODE_ENV="production"
RUN apt-get update && apt-get install -y --no-install-recommends cron && rm -rf /var/lib/apt/lists/*
FROM base as build
RUN apt-get update -qq && \
    apt-get install --no-install-recommends -y build-essential node-gyp pkg-config python-is-python3
COPY --link package-lock.json package.json ./
RUN npm ci
COPY --link . .
FROM base
COPY --from=build /app /app
# Copy the cleanup script and register it as a cron job (every 5 minutes)
COPY databaseCleanUp.js /app/database.js
RUN echo "*/5 * * * * node /app/database.js >> /var/log/cron.log 2>&1" > /etc/cron.d/database_schedule
RUN chmod 0644 /etc/cron.d/database_schedule && crontab /etc/cron.d/database_schedule
EXPOSE 3000
# Start the cron daemon in the background, then the app in the foreground
CMD cron && npm run start
If you are using a cloud service, the problem won't disappear by switching providers: all modern cloud services are usage-based, i.e. the machine only runs while it is actually being used.
That is why, for certain use cases, you can usually set up always-on instances, which do not shut down and keep running 24/7.
I know there is functionality like that on AWS; I am not sure about Fly.io. Maybe it is this.
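On Fly.io, I believe the equivalent lives in fly.toml rather than in the Dockerfile. A rough sketch, assuming the standard [http_service] section that fly launch generates (the exact option names and values here are my assumption, so please check the current Fly.io docs):

# fly.toml (sketch, not verified against the latest docs)
[http_service]
  internal_port = 3000          # matches the EXPOSE 3000 in the Dockerfile
  auto_stop_machines = false    # don't stop the machine when traffic drops off
  auto_start_machines = true
  min_machines_running = 1      # keep at least one machine up 24/7

If the machine never scales to zero, a cron daemon started inside the container would actually get a chance to fire.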
If not, you could still have something access the machine at regular intervals, which would keep it running 24/7.
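For example, a plain cron entry on any box that is already always on (a home server, a CI runner, etc.) could ping the app every few minutes; the URL and the /health endpoint below are just placeholders for whatever cheap route your app exposes:

# hypothetical keep-alive job: request the app every 5 minutes so it never idles out
*/5 * * * * curl -fsS https://your-app.fly.dev/health > /dev/null 2>&1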