I am ramping up on VerneMQ, and I would like to know whether there is guidance on exporting or routing logs going through VerneMQ to a cold storage technology in Azure or AWS. Some sort of connector, maybe? I can't seem to find any guidance on that.
VerneMQ logs to files, to the console, or to a Syslog server. There's no dedicated exporter for Azure or AWS.
To centralize logs, I think Grafana Loki looks very promising. It runs a collector on the local node to forward file-based logs.
I’m sure there’s more options.
Thanks, afa, for the info.
I was thinking more about routing the actual messages received by the broker to storage, as opposed to VerneMQ's own logs.
Please excuse me if the question sounds vague/unclear. I'm still trying to ramp up on VerneMQ and MQTT in general.
Oh, okay. No problem.
You have two general approaches to this. First, you grab a message in a plugin hook and do whatever you want with it inside the VerneMQ plugin (store it to a message DB, archive it, etc.).
This includes plugins that forward messages, based on topic mappings, towards streaming components (like AWS Kinesis, Kafka, or Pulsar).
Some of those plugins exist, but I'm talking more about the general approach: you can extend VerneMQ by developing your own forward-style or message-archive-style extensions.
(Here's an example: redclawtech/vernemq_kinesis on GitHub, a VerneMQ plugin that aggregates and sends MQTT messages to AWS Kinesis.)
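A variation on the plugin-hook approach that avoids writing Erlang is the vmq_webhooks plugin, which POSTs hook data as JSON to an HTTP endpoint you register. A minimal sketch of the endpoint-side logic for an `on_publish` hook (the field names and the base64-encoded payload follow vmq_webhooks' defaults as I understand them; the `archive` function is a hypothetical placeholder for your cold-storage write):

```python
import base64


def handle_on_publish(body: dict) -> dict:
    """Handle the JSON body of a vmq_webhooks on_publish call.

    By default, vmq_webhooks base64-encodes the MQTT payload in the
    JSON it POSTs. We decode it, hand it to an archive function, and
    return {"result": "ok"} so VerneMQ proceeds with delivery.
    """
    payload = base64.b64decode(body["payload"])
    archive(body["topic"], payload)
    return {"result": "ok"}


def archive(topic: str, payload: bytes) -> None:
    # Hypothetical placeholder: replace with an S3 / Azure Blob upload
    # (e.g. boto3 put_object or azure-storage-blob upload_blob).
    print(f"archiving {len(payload)} bytes from {topic}")
```

You'd wrap `handle_on_publish` in whatever HTTP framework you prefer; the important part is decoding the payload and answering with a result VerneMQ accepts.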
Second, you can grab messages via an MQTT client application that subscribes to the chosen topics and then processes/stores them. Some components even ship their own MQTT connectors (Apache NiFi would be an example).
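The subscriber approach can be sketched with the paho-mqtt client library. Everything here is an assumption for illustration: the `sensors/#` topic filter, the broker address, and the JSON-lines file standing in for real cold storage (you'd swap the file write for an S3/Blob upload):

```python
import json
from datetime import datetime, timezone


def archive_message(topic: str, payload: bytes, out) -> None:
    """Append one received MQTT message as a JSON line to a stream."""
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "topic": topic,
        "payload": payload.decode("utf-8", errors="replace"),
    }
    out.write(json.dumps(record) + "\n")


def on_message(client, userdata, msg):
    # paho-mqtt message callback: userdata carries the open archive stream.
    archive_message(msg.topic, msg.payload, userdata)


if __name__ == "__main__":
    # Requires: pip install paho-mqtt
    # (1.x-style constructor; paho-mqtt 2.x needs a CallbackAPIVersion argument)
    import paho.mqtt.client as mqtt

    out = open("archive.jsonl", "a")       # stand-in for a cold-storage writer
    client = mqtt.Client(userdata=out)
    client.on_message = on_message
    client.connect("localhost", 1883)      # your VerneMQ listener
    client.subscribe("sensors/#", qos=1)   # hypothetical topic filter
    client.loop_forever()
```

The nice thing about this approach is that it needs no broker-side changes at all; the archiver is just another MQTT client.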
EDIT: I forgot a third option: bridging between MQTT servers. See "How to ingest MQTT data from VerneMQ into your Data Lake using IoT Core" on DEV Community.
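The bridge is configured in vernemq.conf via the vmq_bridge plugin. A rough sketch (the bridge name `br0`, the remote endpoint, and the topic mapping are all hypothetical; check the VerneMQ bridge documentation for the exact mapping syntax before relying on it):

```
# enable the bridge plugin
plugins.vmq_bridge = on

# outgoing TCP bridge named "br0" to a remote MQTT broker
# (e.g. a cloud IoT endpoint)
vmq_bridge.tcp.br0 = remote.example.com:1883

# forward everything below sensors/ to the remote side at QoS 1
vmq_bridge.tcp.br0.topic.1 = sensors/# out 1
```

For TLS-protected cloud endpoints there is an equivalent `vmq_bridge.ssl.*` family of settings.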
I really appreciate it afa. Thanks for the tips.