Do you have any good optimization suggestions for memory usage?

Hi, this is the memory usage my system reports via recon_alloc, and the allocator usage rate is really low.
Do you have any good optimization suggestions?

Memory used by erlang processes: 873621304 Bytes (833 MB)
Memory allocated by erlang processes: 875520192 Bytes (835 MB)
Memory allocated by ETS: 663091648 Bytes (632 MB)
Memory total allocated: 2368366272 Bytes (2259 MB)
Memory by erlang vm: 5013745664 Bytes (4781 MB)
Memory pool usage: 0.454267819836503

Machine environment: CentOS 7, OTP 25
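For reference, figures like the above presumably come from calls along these lines (a sketch using the recon library's recon_alloc module; the exact invocation in the original system is not shown):

```erlang
%% Memory actively used by Erlang terms:
recon_alloc:memory(used).
%% Memory the VM's allocators currently hold from the OS:
recon_alloc:memory(allocated).
%% Ratio of used to allocated memory -- a low value (e.g. ~0.45)
%% means the allocators hold far more memory than terms occupy:
recon_alloc:memory(usage).
```

A low usage ratio can indicate allocator fragmentation rather than an outright leak, which changes what kind of fix is appropriate.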


It really depends on what the structure of the data actually is, but common ones that come to mind for me:

  • Replace list-strings with utf8-binaries, which are typically much more compact: a list needs a cons cell per character to point to the next node, whereas a binary stores the bytes contiguously.
  • Swap lists for tuples for similar reasons to the above, where applicable. If you have fixed-size lists, consider whether tuples may be a better fit.
  • Replace constant strings with atoms (e.g. if you are tagging data with strings rather than atoms), since you essentially get simple and cheap interning (sharing) by using atoms instead.
  • Reduce the creation of garbage or duplicated data by sharing values as much as possible.
  • Investigate compressing your ETS tables, either by using the compressed option when creating the table, or by applying a compression function to the data yourself before insertion.
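The last point can be sketched as follows (table names and the payload are made up for illustration); ets:info(Tab, memory) reports the table's size in words, so the two figures can be compared directly:

```erlang
%% Two tables with identical contents, one plain and one compressed.
Plain = ets:new(plain_tab, [set]),
Comp  = ets:new(comp_tab,  [set, compressed]),
Row   = {key1, binary:copy(<<"some repetitive payload ">>, 100)},
true  = ets:insert(Plain, Row),
true  = ets:insert(Comp, Row),
%% Size of each table in words; the compressed one should be smaller
%% for large, compressible terms.
{ets:info(Plain, memory), ets:info(Comp, memory)}.
```

Note that compressed trades CPU for memory: every read decompresses the stored term, so it is most attractive for large, rarely-read rows.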

Thank you, your answer is very helpful. I have been digging into this question recently, and I am not yet sure which part of the system the problem is in. Is there any tool available to help troubleshoot it?

Not too sure about memory profilers - that’s probably something that can just be searched for online - but the erts_debug module is handy for comparing the sizes of terms. See the size and flat_size functions.
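As a quick illustration of those functions (a sketch, comparing a list-string with an equivalent binary): erts_debug:size/1 counts shared subterms once, while erts_debug:flat_size/1 counts every copy, i.e. the size the term would have after being fully copied, as when sent in a message.

```erlang
%% 100 characters as a list-string vs. as a binary.
ListStr = lists:duplicate(100, $a),
BinStr  = binary:copy(<<"a">>, 100),
%% Sizes in words; the list pays roughly two words per character,
%% the binary a small header plus the contiguous byte data.
{erts_debug:size(ListStr), erts_debug:size(BinStr)}.
```

Measuring a few representative terms this way is usually enough to tell which of the suggestions above will pay off before restructuring anything.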