Our server application has a number of active processes, each generating events. Every process has its own logbook. For this we wrote our own solution based on keeping files open and handling them with an OTP gen_server. It all works.
But I am trying to find an out-of-the-box Erlang solution. Is there one?
I tried to find something in the manuals, but it is still not clear to me how to do logging per process.
The functionality the application requires:
– rotating the logbook by time or by file size
– starting when the process starts and stopping when the process goes down
– archiving logs if needed
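For context, a hand-rolled version of the gen_server approach described above might look like the sketch below. All module and function names here are illustrative, not taken from the actual application:

```erlang
%% Minimal per-process logbook server: one gen_server owns one open file.
%% This is a sketch of the "keep the file open" approach, not production code.
-module(logbook).
-behaviour(gen_server).
-export([start_link/1, event/2]).
-export([init/1, handle_call/3, handle_cast/2, terminate/2]).

start_link(Path) ->
    gen_server:start_link(?MODULE, Path, []).

event(Pid, Event) ->
    gen_server:cast(Pid, {event, Event}).

init(Path) ->
    %% The file stays open for the lifetime of the owning process.
    {ok, Fd} = file:open(Path, [append]),
    {ok, Fd}.

handle_cast({event, Event}, Fd) ->
    ok = io:format(Fd, "~p.~n", [Event]),
    {noreply, Fd}.

handle_call(_Req, _From, Fd) ->
    {reply, ok, Fd}.

terminate(_Reason, Fd) ->
    %% Closing on termination gives "stop logging when the process goes down".
    file:close(Fd).
```

Rotation and archiving would still have to be implemented by hand on top of this, which is exactly the part an out-of-the-box solution would cover.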
The disk_log module provides an API for managing logs of sequential terms written to disk. It supports wrap logs, which let you choose how many files to keep and of what maximum size. The log files may be in the efficient internal format or in an external (i.e. plain text) format.
In your use case you would open/1 a log when your process starts and then periodically log/2 (internal format) or blog/2 (external format) to it.
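A minimal sketch of that, assuming a size-based wrap log (the log name, file path, and sizes here are arbitrary):

```erlang
%% Open a wrap log: at most 5 files of up to 1 MB each.
%% disk_log rotates between them automatically (size-based rotation).
{ok, proc_log} = disk_log:open([{name, proc_log},
                                {file, "proc_log"},
                                {type, wrap},
                                {size, {1024*1024, 5}}]),

%% Internal format (the default): log Erlang terms directly.
ok = disk_log:log(proc_log, {event, self(), erlang:timestamp()}),

%% For plain-text output you would open with {format, external}
%% and use blog/2 with iodata instead of log/2.

ok = disk_log:close(proc_log).
```

Terms written in internal format can be read back with disk_log:chunk/2; external-format logs are just bytes and are readable by ordinary tools.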
I ran into trouble with disk_log, though. If one of the started disk_logs crashes, all other open disk_logs on that node crash with it; they all go down together. Do you know of any way to keep one crashing log from taking down the rest?
A process is created to handle a disk_log when it is (first) opened, and the process that opens it becomes an owner. If an owner terminates, the disk_log is closed. You can instead open a disk_log as an anonymous user rather than an owner by using the linkto option.
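A sketch of opening a log anonymously with linkto (log name and path are arbitrary):

```erlang
%% Open the log without becoming an owner: the disk_log process is not
%% linked to the caller, so the caller crashing does not close the log.
{ok, shared_log} = disk_log:open([{name, shared_log},
                                  {file, "shared_log"},
                                  {type, wrap},
                                  {size, {1024*1024, 5}},
                                  {linkto, none}]),
ok = disk_log:log(shared_log, some_event),
%% With no owner, nothing closes the log automatically:
%% an anonymous user must close it explicitly.
ok = disk_log:close(shared_log).
```

The trade-off is that with {linkto, none} you are responsible for closing the log yourself; an abandoned anonymous log stays open until the node stops.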