About rate limiting

Since Fly controls the networking layer, scaling, etc., are you planning to add rate limiting features in the future?

We’re not planning rate limiting features, per se. However, I know of a few people who’ve deployed rate limiting with nginx or Envoy on Fly and it works very well. They were mostly using it to throttle API access, I believe.
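
Roughly, the nginx version of that looks something like this — the /api/ path, upstream address, and numbers here are just for illustration, not from a real deployment:

limit_req_zone $binary_remote_addr zone=api:10m rate=5r/s;

server {
  listen 8080;

  location /api/ {
    # allow short bursts, reject sustained excess with a 429 instead of the default 503
    limit_req zone=api burst=10 nodelay;
    limit_req_status 429;
    proxy_pass http://127.0.0.1:3000;  # whatever your API listens on
  }
}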

We definitely plan to have example/demo apps that implement rate limiting though! What are you trying to solve with it?

Nothing special, just regular throttling for an API and a public-facing application. 🙂

Obviously I’ll implement it in the application, but I was wondering what would happen with scaling on Fly in case of a DDoS or something.

Oh I see what you’re asking.

We have a bunch of network-level DDoS protection in place. Our general feeling is that app-level DDoS protection that involves captchas kind of sucks, and most apps just need some of what nginx offers (if they need anything at all).

That said, our policy is to not charge for usage resulting from an attack. You can limit how much your app scales as well, but we’ll waive charges if things go bonkers even without that.
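
For example, the concurrency section of fly.toml caps how many connections each instance will accept, which also feeds scaling decisions; the numbers below are just placeholders:

# fly.toml — illustrative values
[[services]]
  internal_port = 8080
  protocol = "tcp"

  [services.concurrency]
    type = "connections"
    hard_limit = 25  # the proxy stops sending new connections past this
    soft_limit = 20  # the load-balancing / scaling signal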

Could you please share examples of that, if any? I’d be interested to learn how to do that.

Here’s an example using nginx’s ngx_http_limit_req_module; note that I’ve set the limit so low (10r/s) that the initial request isn’t able to load all of the fonts, scripts, and images.

# syntax = docker/dockerfile:1

FROM nginx:latest

COPY <<-"EOF" /etc/nginx/conf.d/default.conf
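# a 10 MB shared-memory zone keyed by client IP, capped at 10 requests/second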
limit_req_zone $binary_remote_addr zone=mylimit:10m rate=10r/s;

server {
  listen 8080;
  listen [::]:8080;
  server_name localhost;

  location / {
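    # enforce the zone here; with no burst set, excess requests get a 503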
    limit_req zone=mylimit;
    proxy_ssl_server_name on;
    proxy_pass https://fly.io/;
  }

  error_page 500 502 503 504 /50x.html;
  location = /50x.html {
    root /usr/share/nginx/html;
  }
}
EOF

EXPOSE 8080
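
If you want the fonts, scripts, and images to load while still throttling, limit_req also takes burst and nodelay parameters; swapping the location block for something like this (20 is an arbitrary number) lets short bursts through and only rejects sustained excess:

  location / {
    # allow up to 20 extra requests; nodelay serves them immediately instead of pacing them
    limit_req zone=mylimit burst=20 nodelay;
    proxy_ssl_server_name on;
    proxy_pass https://fly.io/;
  }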