Use cases for duplicate_bag tables?

I have lately been discussing ets table types with a colleague, and we found that we couldn’t come up with any use cases for the duplicate_bag type, not even any far-fetched contrived ones.

The stumbling block is always the fact that, while one can insert any number of identical objects into such a table, it is not possible to delete or update just a single one of them: it’s always all or nothing.
And while it might be possible to dream up something remotely useful involving traversal, there is no defined order to the objects, either.
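For illustration, here is a minimal shell sketch of that all-or-nothing behaviour (table name and object contents made up): in a duplicate_bag table, delete_object/2 removes every stored copy of the given object, not just one.

1> T = ets:new(x, [duplicate_bag]).
2> ets:insert(T, {key, a}).
3> ets:insert(T, {key, a}).
4> ets:delete_object(T, {key, a}).
5> ets:lookup(T, key).
[]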

So, much as we tried, we couldn’t find any case where duplicate_bag tables would be a fitting solution.

phoenix_pubsub does use duplicate_bag, as discussed in “The Road to 2 Million Websocket Connections in Phoenix” on the Phoenix blog.

Hm, that is interesting, thanks :smile:

However, reading the surrounding material, it sounds more like a hack that happened to be possible rather than a typical use case?

Well, insertion order is preserved at lookup :grin::

1> T = ets:new(x,[duplicate_bag]).
2> ets:insert(T, {key, 1}).
3> ets:insert(T, {key, 2}).
4> ets:insert(T, {key, 1}).
5> ets:insert(T, {key, 3}).
6> ets:lookup(T, key).
[{key,1},{key,2},{key,1},{key,3}]

Don’t know if that makes the duplicate_bag more useful.

Indeed it does, a little :slight_smile: Does the same apply to take/2? From quickly trying it out, it looks like it does, but the documentation says nothing along those lines, so I’m not sure.

A problem I see with insertion order being preserved only in lookup (and maybe take) is that you never know how many objects you might get back. Traversal with match/1,3 and select/1,3 would alleviate this by limiting the number of results returned at a time, but alas, for those, insertion order among objects with the same key seems not to be preserved :woman_shrugging:
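For reference, a quick shell experiment (object contents made up) suggesting that take/2 returns the objects in insertion order before removing them; since the documentation doesn’t promise this, treat it as an observation, not a contract:

1> T = ets:new(x, [duplicate_bag]).
2> ets:insert(T, [{key, 1}, {key, 2}, {key, 1}]).
3> ets:take(T, key).
[{key,1},{key,2},{key,1}]
4> ets:lookup(T, key).
[]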

@sverker On a slightly different track… In the article that @LostKobrakai linked, they were able to boost performance by going for a duplicate_bag instead of a bag table, which was possible only because the surrounding application was guaranteed not to insert any duplicates in the first place. I assume that performance boost came about because, unlike with bag tables, no checks for duplicate objects are needed in a duplicate_bag table?
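If anyone wants to poke at that assumption themselves, here is a rough micro-benchmark sketch (table name, object count, and key distribution all made up; the idea is that bag must compare against existing objects under a key on insert, while duplicate_bag should not):

1> Bench = fun(Type) ->
       T = ets:new(x, [Type]),
       {Time, _} = timer:tc(fun() ->
           [ets:insert(T, {K rem 100, K}) || K <- lists:seq(1, 100000)]
       end),
       ets:delete(T),
       Time
   end.
2> Bench(bag).
3> Bench(duplicate_bag).

The returned values are microseconds per run; obviously this is only a sanity check, not a rigorous benchmark.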
