To hibernate or not to hibernate?


So we are running an application where a node is actively maintaining several thousand websocket (cowboy) connections. Memory use is higher than I want, so I’m thinking of putting the connection-handling processes that are not too busy into hibernation. Unfortunately the client sends a ping every 45 seconds, which will wake these processes up again.

My question is: how expensive is taking processes in and out of hibernation? There could be a few hundred at a time. Does anyone have experience with how latency is impacted? Gathering data locally is a pain due to certain legal obligations.



If you have CPU to spare and are willing to trade it for some memory savings, it’s an acceptable solution. The overhead depends on the kind of work your processes do. It comes mainly from the forced full-sweep GC that runs when the process enters hibernation, and rarely exceeds 5-7% CPU. Taking a process out of hibernation is cheap; I would not expect a significant latency regression.
The actual impact can only be measured in your environment; it depends on the host machine’s CPU more than on anything else.
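For reference, in Cowboy 2.x a websocket handler requests hibernation by appending `hibernate` to the tuple returned from its callbacks, so you can hibernate selectively (e.g. only after quiet frames like the client’s ping). A minimal sketch — the module name and the choice of which frames trigger hibernation are illustrative, not from the thread:

```erlang
-module(ws_handler).
-behaviour(cowboy_websocket).

-export([init/2, websocket_init/1, websocket_handle/2, websocket_info/2]).

init(Req, State) ->
    {cowboy_websocket, Req, State}.

websocket_init(State) ->
    {[], State}.

%% Cowboy replies to pings with a pong automatically; the frame is still
%% delivered here, so we use it as the cue to hibernate. Each hibernate
%% forces a full-sweep GC, which is where the CPU cost comes from.
websocket_handle(ping, State) ->
    {[], State, hibernate};
websocket_handle({ping, _Payload}, State) ->
    {[], State, hibernate};
websocket_handle({text, Msg}, State) ->
    {[{text, Msg}], State};
websocket_handle(_Frame, State) ->
    {[], State}.

websocket_info(_Info, State) ->
    {[], State}.
```

With a 45-second ping interval this hibernates each connection roughly once per ping cycle, so the trade-off above (GC cost on entry, cheap wake-up) applies a few hundred times per interval across the node.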