Trying to figure out how best to configure the http_service.concurrency setting.
Right now I have type = "connections" (although I'm changing this to "requests" since these are web servers), with a soft limit of 100 and a hard limit of 200.
Here is the config:
[http_service]
internal_port = 4000
force_https = true
auto_stop_machines = true
auto_start_machines = true
min_machines_running = 2
processes = ["app"]
[http_service.concurrency]
type = "requests"
hard_limit = 200
soft_limit = 100
So I would assume that if my app stays consistently below 100 concurrent requests per machine, it would eventually scale down to 2 machines. I have 6 total, but instead I'm consistently seeing ~4 running. When I look at the concurrency graph, I don't see any one machine ever going over 100. Shouldn't that mean my app would scale down to 2 machines and stay there until I start hitting the limits?
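To make my expectation concrete, here's a rough sketch of the scale-down logic I have in my head. This is purely my mental model, not fly-proxy's actual algorithm (the function name and the ceiling-division heuristic are my own invention):

```python
def expected_running(machine_loads, soft_limit, min_machines):
    """Given per-machine concurrent request counts, return how many
    machines I'd *expect* to stay running (hypothetical model)."""
    total = sum(machine_loads)
    # Enough machines so each one sits at or below the soft limit
    # (ceiling division), ...
    needed = -(-total // soft_limit)
    # ... but never fewer than min_machines_running.
    return max(needed, min_machines)

# Six machines, every one well under the soft limit of 100:
loads = [40, 35, 50, 30, 20, 25]  # total = 200
print(expected_running(loads, soft_limit=100, min_machines=2))  # → 2
```

By that reasoning the total load of 200 fits on 2 machines at the soft limit, so I'd expect 2 running, not ~4. Is the proxy using a different signal to decide when to stop machines?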
Thanks!
