erlang_python embeds the Python interpreter in the BEAM VM using dirty NIFs. It lets you call Python functions, evaluate expressions, and stream from generators without blocking Erlang schedulers. It works with any Python code, but it was designed with ML/AI workloads in mind, e.g. embeddings, inference, and data processing.
Basic usage:
{ok, 4.0} = py:call(math, sqrt, [16]).
{ok, Json} = py:call(json, dumps, [#{foo => bar}]).
{ok, 45} = py:eval(<<"sum(range(10))">>).
From Elixir:
{:ok, 4.0} = :py.call(:math, :sqrt, [16])
{:ok, json} = :py.call(:json, :dumps, [%{foo: "bar"}])
{:ok, 45} = :py.eval("sum(range(10))")
Async, streaming, and parallel execution:
% async calls
Ref = py:call_async(slow_module, compute, [Data]),
{ok, Result} = py:await(Ref).
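For context, the Python side of that async call could be any ordinary module. A hypothetical slow_module.py (the module and function names mirror the call above; the workload itself is invented for illustration):

```python
# slow_module.py - hypothetical module matching
# py:call_async(slow_module, compute, [Data]).
# The CPU-bound workload below is made up for illustration.
import hashlib

def compute(data):
    # Simulate a slow, CPU-bound job: repeatedly hash the input.
    digest = str(data).encode()
    for _ in range(100_000):
        digest = hashlib.sha256(digest).digest()
    return digest.hex()
```

Because the call runs on a dirty scheduler thread, the Erlang process that called py:call_async stays responsive until py:await collects the result.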
% iterate over generators - useful for LLM token streaming
{ok, Chunks} = py:stream(mymodule, generate, []).
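The streamed function can be any Python generator. A minimal hypothetical mymodule.py (names chosen only to match the call above), shaped like an LLM token stream:

```python
# mymodule.py - hypothetical generator for the streaming example.
# Each yielded value arrives on the Erlang side as one chunk.
def generate():
    for token in ["Hello", ", ", "world", "!"]:
        yield token
```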
% native asyncio support
Ref = py:async_call(aiohttp, get, [Url]),
{ok, Response} = py:async_await(Ref).
% parallel execution using sub-interpreters (Python 3.12+)
{ok, Results} = py:parallel([
    {numpy, dot, [A, B]},
    {numpy, dot, [C, D]}
]).
The library handles GIL contention depending on your Python version:
- Python < 3.12: multiple executor threads sharing the GIL
- Python 3.12+: sub-interpreters, each with its own GIL
- Python 3.13t: free-threaded build, no GIL
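For reference, a Python process can report which of those three regimes it is running under roughly like this (sys._is_gil_enabled() exists only on CPython 3.13+; this sketch is not part of the library's API):

```python
import sys

def gil_regime():
    # sys._is_gil_enabled() was added in CPython 3.13; on free-threaded
    # ("3.13t") builds it returns False when the GIL is disabled.
    check = getattr(sys, "_is_gil_enabled", None)
    if check is not None and not check():
        return "free-threaded, no GIL"
    if sys.version_info >= (3, 12):
        return "per-interpreter GIL (sub-interpreters)"
    return "single shared GIL"
```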
Erlang/Elixir functions can be registered and called from Python:
py:register_function(my_func, fun(Args) -> do_stuff(Args) end).
import erlang
result = erlang.call('my_func', arg1, arg2)
Virtual environment support is included for dependency isolation.
Requirements: OTP 27+, Python 3.11+, CMake 3.18+
Platforms: Linux, macOS, FreeBSD
Just released erlang_python 1.0.0.
Hex: erlang_python
Docs: erlang_python v1.0.0
Repo:
Feedback welcome.