Wondering if something changed with a recent update? I was following this guide: Scaling Large Language Models to zero with Ollama · The Fly Blog to set up Ollama. I was able to deploy the actual Ollama server and create an ephemeral machine to connect to it.
The steps say I should be able to run ollama commands at that point, but they are returning "no such host" errors.
Am I missing something? Do I need to open ports to get the tutorial working?
