Favorite Ways To Parse JSON Responses?

Title says it all. If you’re querying Postgres via epgsql, how are you handling responses? My Postgres function returns JSON; however, as we know, it’s nested basically as result.rows[0].function_name…

I’ve been doing something like this but it just FEELS ugly.

case Result of
    {ok, _Cols, []} ->
        {error, not_found};
    {ok, Cols, [Row|_]} ->
        Values = tuple_to_list(Row),
        case length(Cols) =:= length(Values) of
            true ->
                RowMap = maps:from_list(lists:zipwith(
                    fun({column, Name, _, _, _, _, _, _, _}, V) -> {Name, V};
                       (Name, V) when is_binary(Name) -> {Name, V};
                       (Name, V) -> {Name, V}
                    end,
                    Cols, Values))...

Has to be a cleaner way?

  1. It’s a bit hard to guess what you mean by JSON here: based on your snippet it looks more like a tuple than actual JSON, which you’re then converting to a list and iterating over.
  2. The case length(Cols) =:= length(Values) of check seems redundant. Even if some of the Cols have no values, there will usually be defaults like null, undefined, or something similar, depending on your schema. If for some reason you really need to compare them, size/1 (from the erlang module in ERTS) would be a better fit than converting the tuple to a list and calling length/1.
  3. You can simplify things significantly by pattern matching directly on the tuple inside the case block instead of converting it to a list and zipping (see the sketch after this list).
  4. As mentioned earlier, it would be very helpful if you could share the raw data you’re getting back from epgsql, so the community can provide a cleaner and more tailored solution.
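To make point 3 concrete, here is a minimal sketch. It assumes your function returns a single json/jsonb column, so each row comes back as a one-element tuple holding the JSON as a binary, and it uses jsx for decoding; the function name, Conn and Id are placeholders, not names from your post:

    %% Minimal sketch: single json/jsonb result column, decoded with jsx.
    %% "my_function", Conn and Id are placeholders for your own names.
    fetch(Conn, Id) ->
        case epgsql:equery(Conn, "SELECT my_function($1)", [Id]) of
            {ok, _Cols, []} ->
                {error, not_found};
            {ok, _Cols, [{Json} | _]} when is_binary(Json) ->
                %% The row is matched as a one-element tuple, so there is no
                %% tuple_to_list/zip pass before decoding.
                {ok, jsx:decode(Json, [return_maps])};
            {error, Reason} ->
                {error, Reason}
        end.

If your epgsql connection happens to be configured with a JSON codec, the column may already arrive decoded, in which case the jsx call and the is_binary guard aren’t needed.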

In other words, your example is doing a lot of unnecessary iterations, which could eventually affect the performance of your application under heavy load.
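And if you do need a generic name-to-value map for a multi-column result, a single zip over the column names covers it in one pass. A rough sketch, assuming the epgsql header is on your include path so the #column{} record can be used instead of matching nine tuple elements by hand:

    -include_lib("epgsql/include/epgsql.hrl").

    %% Rough sketch: build #{ColumnName => Value} for one row without an
    %% explicit length check. lists:zip/2 crashes if the lengths ever differ,
    %% which is the failure you want, since epgsql returns exactly one value
    %% per column descriptor.
    row_to_map(Cols, Row) ->
        Names = [Name || #column{name = Name} <- Cols],
        maps:from_list(lists:zip(Names, tuple_to_list(Row))).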
