I needed this for an AI-agent project I'm working on, so I thought it might be useful to the rest of the world.
Guanco is an Erlang client for Ollama that provides various functionalities such as generating completions, chat completions, showing model information, and generating embeddings.
Features
Chat Completion: Generate AI-powered chat completions using various models.
Model Info: Fetch information about a specific model.
Embeddings: Generate embeddings for input text.
Configurable: Ollama API URL, host, and port can be customized via sys.config.
Worker Pooling: Efficient resource management using poolboy.
Thank you so much for sharing this library! It's truly a valuable addition, and the Erlang community could definitely benefit from more tools like this. While reviewing it, I came up with a few suggestions that might be useful. Some of them may not fit your plans, but I hope you find them helpful nonetheless. Here they are:
Add Base CI
Setting up a base CI pipeline to run common tests is crucial. It would be great to cover at least the last three OTP versions, if possible. You can use the official Erlang Docker image to build containers as part of the CI process.
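A minimal GitHub Actions sketch of this idea (assuming GitHub is the host; the workflow name, OTP versions, and steps are illustrative and should be adjusted as new OTP releases ship):

```yaml
# .github/workflows/ci.yml -- illustrative CI matrix over the last three OTP versions
name: CI
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        otp: ['25', '26', '27']
    container:
      image: erlang:${{ matrix.otp }}   # official Erlang Docker image
    steps:
      - uses: actions/checkout@v4
      - run: rebar3 compile
      - run: rebar3 eunit
```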
Publish on Hex.pm
Publishing your library on hex.pm would make it more accessible. This can be done manually or automated with a CI pipeline to publish on each tagged release.
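For the manual route, a sketch using the rebar3_hex plugin (the plugin is real; whether it fits your release flow is up to you):

```erlang
%% rebar.config -- add the hex publishing plugin
{plugins, [rebar3_hex]}.
```

Then authenticate once with `rebar3 hex user auth` and run `rebar3 hex publish` for each release; a CI job can run the same publish step on tagged commits.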
Adopt Semantic Versioning
Following Semantic Versioning (SemVer) ensures clarity in your releases. Rebar3 supports SemVer and can manage your vsn automatically without manual changes. You can find more information in the Rebar3 documentation.
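As a sketch, the `vsn` field in `.app.src` can be set to an atom instead of a hard-coded string so rebar3 derives it from git tags (the description and application list below are illustrative placeholders):

```erlang
%% src/guanco.app.src -- version resolved from the latest git tag
{application, guanco,
 [{description, "Erlang client for Ollama"},
  {vsn, semver},                    %% e.g. tag v0.2.1 -> "0.2.1"; {vsn, git} is another option
  {applications, [kernel, stdlib]}
 ]}.
```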
Create Tags and Releases
Adding tags and releases to your repository helps users understand version history. You can automate release generation using GitHub. More information is available here and here.
Move meck to Profiles
Since the meck library is only used for testing, it makes sense to move it to a separate profile in your rebar.config. Check out this guide for more details.
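A minimal sketch of what that profile looks like in rebar.config (meck is then only fetched for `rebar3 eunit` / `rebar3 ct`, not for downstream users of the library):

```erlang
%% rebar.config -- test-only dependencies
{profiles, [
    {test, [
        {deps, [meck]}
    ]}
]}.
```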
Consider OTP 27’s Native JSON Module
While using jiffy is a great approach, starting with OTP 27, a native JSON module is available. To maintain compatibility with OTP 25, 26, and 27, you could implement conditional logic using predefined macros. This strategy is common in many popular libraries. More details here.
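A sketch of that conditional logic, using the `?OTP_RELEASE` macro (available since OTP 21) to pick a codec at compile time; the wrapper module name is hypothetical:

```erlang
%% guanco_json.erl -- hypothetical codec wrapper, selected at compile time
-module(guanco_json).
-export([encode/1, decode/1]).

-if(?OTP_RELEASE >= 27).
%% OTP 27+ ships the native `json` module
encode(Term) -> iolist_to_binary(json:encode(Term)).
decode(Bin)  -> json:decode(Bin).
-else.
%% older releases fall back to jiffy
encode(Term) -> jiffy:encode(Term).
decode(Bin)  -> jiffy:decode(Bin, [return_maps]).
-endif.
```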
Add jiffy to .app.src
The jiffy dependency is missing in the .app.src file. This could cause issues in projects where jiffy isn’t already initialized. Adding it ensures smooth integration.
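Concretely, that means listing jiffy under `applications` so it is started with the app (the other fields below are illustrative):

```erlang
%% src/guanco.app.src -- jiffy declared as a runtime dependency
{application, guanco,
 [{description, "Erlang client for Ollama"},
  {vsn, "0.1.0"},
  {applications, [kernel, stdlib, jiffy]}   %% jiffy added here
 ]}.
```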
Explore the gun Library
While hackney is an excellent library, for more flexible request/response handling, consider using gun. It’s powerful and potentially faster but requires tuning. If you’re not experienced with gun, sticking with hackney might be a safer choice.
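For a feel of the difference, a hypothetical sketch of a chat request via gun (host, port, and path match Ollama's defaults; the function name is illustrative and error handling is elided):

```erlang
%% Hypothetical sketch: POSTing a chat request to Ollama with gun
chat(Body) ->
    {ok, ConnPid} = gun:open("localhost", 11434),
    {ok, _Protocol} = gun:await_up(ConnPid),
    StreamRef = gun:post(ConnPid, "/api/chat",
                         [{<<"content-type">>, <<"application/json">>}],
                         Body),
    {response, nofin, 200, _Headers} = gun:await(ConnPid, StreamRef),
    {ok, RespBody} = gun:await_body(ConnPid, StreamRef),
    gun:close(ConnPid),
    RespBody.
```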
Consider Alternatives to poolboy
Since poolboy hasn’t been updated in over four years, you might explore other maintained libraries like worker_pool from the Inaka project. This ensures compatibility with the latest OTP versions.
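A rough migration sketch with worker_pool (wpool); the pool name, worker module, and request shape are placeholders, not the library's actual API:

```erlang
%% Hypothetical sketch: replacing poolboy with worker_pool (wpool)
start() ->
    {ok, _Pid} = wpool:start_pool(guanco_pool,
                                  [{workers, 10},
                                   {worker, {guanco_worker, []}}]).

%% Route a request through the pool instead of poolboy:transaction/3:
ask(Prompt) ->
    wpool:call(guanco_pool, {generate, Prompt}).
```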
Replace io:format with logger
Avoid using io:format for debugging, as it can overload the system under heavy traffic. Instead, use the built-in Logger for better performance and flexibility.
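A before/after sketch (module and function names are illustrative); the `?LOG_DEBUG` macro comes from kernel's logger header and is filtered by the configured log level instead of always writing to stdout:

```erlang
-module(guanco_example).                           %% illustrative module
-include_lib("kernel/include/logger.hrl").
-export([handle_response/1]).

handle_response(Resp) ->
    %% io:format("got response: ~p~n", [Resp]),    %% before
    ?LOG_DEBUG("got response: ~p", [Resp]),        %% after
    Resp.
```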
Add Benchmarks
Including benchmarks to demonstrate how the library performs under load would be a great addition. You could use tools like Tsung or create custom benchmarks.
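As a starting point before reaching for Tsung, a micro-benchmark can be sketched with `timer:tc/1`; the `guanco_worker:chat_completion/2` call below is an assumed placeholder for whatever the library's real API is:

```erlang
%% Hypothetical micro-benchmark: average/max latency over N calls
bench(N) ->
    Times = [element(1, timer:tc(fun() ->
                guanco_worker:chat_completion(mistral, #{prompt => <<"hi">>})
            end)) || _ <- lists:seq(1, N)],
    #{avg_us => lists:sum(Times) div N,
      max_us => lists:max(Times)}.
```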
Write a Blog Post
Consider writing a blog post showcasing real-world use cases for your library. This could include a demo project (even a mock one) that highlights its capabilities and potential applications.
If you need any help with these suggestions, feel free to reach out to me.
@vkatsuba thanks a lot for this wonderful set of instructions! I will follow those and try to apply them. I’ll keep the community posted on the progress here.