My Fly subdomain is getting crawled

I have a Fly-hosted website:

The web domain I have purchased and pointed towards the site is

I have a robots.txt file in my code that allows Google to crawl my site when I am in production. My issue is that the robots are crawling both my Fly-provided domain and my custom domain.

How do I stop the Fly-provided address appearing in Google and encourage only my own domain to be used?

Do I need to build some custom logic into robots.txt, or is there a more accepted or standardised way to do this? Thanks in advance!
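One common approach is to serve a different robots.txt depending on which hostname the request arrived on: disallow everything on the Fly-provided domain, allow crawling on the custom domain. A minimal sketch in Python, where `myapp.fly.dev` and `example.com` are placeholder hostnames standing in for the redacted domains above:

```python
# Serve a host-dependent robots.txt: disallow-all on the Fly subdomain,
# allow-all on the canonical (purchased) domain.
# "myapp.fly.dev" and "example.com" are placeholders, not the real domains.

FLY_HOST = "myapp.fly.dev"      # placeholder for the Fly-provided domain
CANONICAL_HOST = "example.com"  # placeholder for the purchased domain

ALLOW_ALL = "User-agent: *\nAllow: /\n"
DISALLOW_ALL = "User-agent: *\nDisallow: /\n"

def robots_txt_for(host: str) -> str:
    """Return the robots.txt body appropriate for the request's Host header."""
    if host.lower() == FLY_HOST:
        return DISALLOW_ALL
    return ALLOW_ALL
```

Note that a Disallow rule alone only stops future crawling; it does not remove URLs that are already indexed. A 301 redirect to the canonical host (or a `<link rel="canonical">` tag pointing at it) is generally the more reliable way to consolidate indexing onto one domain.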

I have now set up a redirect at the root layer of my app via middleware, but the question above still stands. Can I use fly.toml to achieve the same effect at the server level?
