How to connect to Google Cloud SQL through a WireGuard VPN?

It’s not running, and the logs are empty!
That’s why I assumed there might be some restriction on running background processes on fly.io (no IRC bouncers ;))

But if I log into the VM, run the command by hand, and then log out, the process stays up 🤷

So I moved the lines that start the application and the Cloud SQL proxy into the entrypoint script, and the effect is roughly the same: the proxy doesn’t start, and the logs are empty.

# Set up the Cloud SQL Auth Proxy (-f makes curl fail on an HTTP error
# instead of silently saving an error page as the binary)
RUN curl -fsSL https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -o /usr/local/bin/cloud_sql_proxy
RUN chmod +x /usr/local/bin/cloud_sql_proxy

ADD deploy/entrypoint.prd.sh /usr/local/bin/entrypoint.prd.sh

USER nobody

ENTRYPOINT ["/usr/local/bin/entrypoint.prd.sh"]
#!/bin/sh

cd /app

# Start the proxy in the background, then run the app in the foreground
/usr/local/bin/cloud_sql_proxy \
  -instances=obfuscated-cloud-project-uri=tcp:127.0.0.1:5433 \
  -credential_file=obfuscated-credentials-file.json >>/var/log/cloudsql.log 2>&1 &

doppler run -c prd -- /app/bin/server

Now that I think about it, I’m leaning towards running the Cloud SQL Proxy in a separate VM, because this script is pretty naive and is asking for trouble in production.
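If it stays on a single VM for now, a slightly hardened version of that entrypoint can wait for the proxy port and fail loudly if the proxy dies. This is only a sketch: the paths, port, and flags are the ones from the script above, while the `/tmp` log location and the `nc`-based port check are my assumptions (it is written to `/tmp` here so the snippet can be tried outside the image; in the Dockerfile it would be ADDed to `/usr/local/bin` as before).

```shell
# Write a hardened entrypoint to disk (sketch; placeholder names from the post).
cat > /tmp/entrypoint.prd.sh <<'EOF'
#!/bin/sh
set -eu

cd /app

# Log somewhere the runtime user can actually write (with USER nobody,
# /var/log is usually root-owned and not writable).
LOG=/tmp/cloudsql.log

/usr/local/bin/cloud_sql_proxy \
  -instances=obfuscated-cloud-project-uri=tcp:127.0.0.1:5433 \
  -credential_file=obfuscated-credentials-file.json >>"$LOG" 2>&1 &
PROXY_PID=$!

# Wait up to 30s for the proxy to listen; bail out early if it already died
# (e.g. bad flags or unreadable credentials), dumping its log for debugging.
i=0
until nc -z 127.0.0.1 5433 2>/dev/null; do
  kill -0 "$PROXY_PID" 2>/dev/null || { echo "proxy exited early:"; cat "$LOG"; exit 1; }
  i=$((i + 1))
  [ "$i" -ge 30 ] && { echo "proxy never came up:"; cat "$LOG"; exit 1; }
  sleep 1
done

# exec so the server replaces the shell and receives signals directly.
exec doppler run -c prd -- /app/bin/server
EOF
chmod +x /tmp/entrypoint.prd.sh
```

The `exec` at the end also means the app, not the wrapper shell, is what fly restarts signals reach.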

Background processes are not an issue on Fly. Look at this tutorial on how to run Tailscale in your Fly VM: Tailscale on Fly.io · Tailscale
Now set up something similar for your cloud_sql_proxy:
add it to the Docker image, replace the start script of your Docker image, and kick off cloud_sql_proxy before starting your real app. Pay close attention to getting the paths etc. right.
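For reference, the Docker-side wiring for that pattern is the same shape as the Dockerfile earlier in the thread: copy a start script in, make it executable, and make it the entrypoint (names and paths below are placeholders, not from the Tailscale tutorial):

```dockerfile
# Wrap the real app behind a start script
COPY deploy/start.sh /usr/local/bin/start.sh
RUN chmod +x /usr/local/bin/start.sh

# The script launches cloud_sql_proxy in the background, then execs the app
ENTRYPOINT ["/usr/local/bin/start.sh"]
```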

I think I found the issue

2022-08-29T11:21:49.244 app[e476a066] lhr [info] Error: UnhandledIoError(Os { code: 13, kind: PermissionDenied, message: "Permission denied" })
2022-08-29T11:21:49.245 app[e476a066] lhr [info] [ 0.117802] Kernel panic - not syncing: Attempted to kill init! exitcode=0x00000100
2022-08-29T11:21:49.246 app[e476a066] lhr [info] [ 0.118987] CPU: 0 PID: 1 Comm: init Not tainted 5.12.2 #1
2022-08-29T11:21:49.247 app[e476a066] lhr [info] [ 0.119927] Call Trace:
2022-08-29T11:21:49.247 app[e476a066] lhr [info] [ 0.120347] show_stack+0x52/0x58
2022-08-29T11:21:49.248 app[e476a066] lhr [info] [ 0.120903] dump_stack+0x6b/0x86
2022-08-29T11:21:49.248 app[e476a066] lhr [info] [ 0.121471] panic+0xfb/0x2bc
2022-08-29T11:21:49.249 app[e476a066] lhr [info] [ 0.121990] do_exit.cold+0x60/0xb0
2022-08-29T11:21:49.250 app[e476a066] lhr [info] [ 0.122590] do_group_exit+0x3b/0xb0
2022-08-29T11:21:49.250 app[e476a066] lhr [info] [ 0.123212] __x64_sys_exit_group+0x18/0x20
2022-08-29T11:21:49.251 app[e476a066] lhr [info] [ 0.123874] do_syscall_64+0x38/0x50
2022-08-29T11:21:49.252 app[e476a066] lhr [info] [ 0.124459] entry_SYSCALL_64_after_hwframe+0x44/0xae
2022-08-29T11:21:49.252 app[e476a066] lhr [info] [ 0.125274] RIP: 0033:0x7f67fa735eb9
2022-08-29T11:21:49.255 app[e476a066] lhr [info] [ 0.125832] Code: eb ef 48 8b 76 28 e9 a5 03 00 00 64 48 8b 04 25 00 00 00 00 48 8b b0 b0 00 00 00 e9 af ff ff ff 48 63 ff b8 e7 00 00 00 0f 05 <ba> 3c 00 00 00 48 89 d0 0f 05 eb f9 66 2e 0f 1f 84 00 00 00 00 00
2022-08-29T11:21:49.256 app[e476a066] lhr [info] [ 0.128605] RSP: 002b:00007ffcab0a8618 EFLAGS: 00000246 ORIG_RAX: 00000000000000e7
2022-08-29T11:21:49.257 app[e476a066] lhr [info] [ 0.129459] RAX: ffffffffffffffda RBX: 00007f67fa59b180 RCX: 00007f67fa735eb9
2022-08-29T11:21:49.257 app[e476a066] lhr [info] [ 0.130172] RDX: 0000000000000000 RSI: 0000000000000000 RDI: 0000000000000001
2022-08-29T11:21:49.258 app[e476a066] lhr [info] [ 0.130935] RBP: 0000000000000001 R08: 00007f67fa809c08 R09: 0000000000000000
2022-08-29T11:21:49.259 app[e476a066] lhr [info] [ 0.131779] R10: 0000000000000000 R11: 0000000000000246 R12: 00007ffcab0a8678
2022-08-29T11:21:49.260 app[e476a066] lhr [info] [ 0.132587] R13: 00007ffcab0a8688 R14: 0000000000000000 R15: 0000000000000000
2022-08-29T11:21:49.260 app[e476a066] lhr [info] [ 0.133806] Kernel Offset: disabled
2022-08-29T11:21:49.261 app[e476a066] lhr [info] [ 0.133806] Rebooting in 1 seconds..

This might be thrown by cloud_sql_proxy.
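One guess worth checking, given `USER nobody` in the Dockerfile and `code: 13` (`EACCES`) in that log: whether the entrypoint and everything it touches are actually accessible to an unprivileged user. A diagnostic sketch along these lines (the path is the one from the Dockerfile; this is a hypothesis, not a confirmed root cause):

```shell
# check_perms: print likely permission problems for a given entrypoint path.
check_perms() {
  f=$1
  if [ -e "$f" ]; then
    ls -l "$f"                       # the execute bit should be set, e.g. -rwxr-xr-x
    [ -x "$f" ] || echo "missing +x on $f"
  else
    echo "$f not found (run this inside the VM, e.g. via 'fly ssh console')"
  fi
  # With USER nobody, the proxy's log destination is often the real problem:
  if [ -w /var/log ]; then
    echo "/var/log writable by this user"
  else
    echo "/var/log NOT writable by this user"
  fi
}

check_perms /usr/local/bin/entrypoint.prd.sh
```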

Update: I’ve figured it out. There was a series of overlapping mistakes that I made along the way, but now everything is working like a charm 🙂
