yeah, this was a rabbit hole for the last half hour for me

gob's decoder has a pretty severe memory overhead. the encoder can be reused, but the decoder rebuilds all its internal type maps every time, and if you feed it a second message that carries the same type definition (which is what you get from a fresh encoder per message), it errors out with "duplicate type received". wtaf?
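for reference, a minimal repro of the pattern i ran into (assuming a fresh Encoder per message, which is what you get when every write builds its own serialized blob):

```go
package main

import (
	"bytes"
	"encoding/gob"
	"fmt"
)

type Event struct {
	ID   string
	Kind int
}

func main() {
	// each fresh Encoder re-sends the type descriptor for Event,
	// so concatenating their outputs into one stream...
	var stream bytes.Buffer
	for i := 0; i < 2; i++ {
		enc := gob.NewEncoder(&stream)
		_ = enc.Encode(Event{ID: fmt.Sprint(i), Kind: 1})
	}

	// ...makes a single reused Decoder choke on the second message,
	// because it sees the same type definition twice
	dec := gob.NewDecoder(&stream)
	var ev Event
	fmt.Println(dec.Decode(&ev)) // <nil>
	fmt.Println(dec.Decode(&ev)) // gob: duplicate type received
}
```

the fix is either one long-lived Encoder/Decoder pair over the whole stream, or a fresh Decoder per message, and the latter is exactly the allocation overhead i'm complaining about.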

nice catch anyway. i guess this is not an optimization, and neither would using bytes.Buffer as it is, especially since i can see an easy size precalculation that would never require a reallocation (just sum the lengths of the segments of the json)
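roughly what i have in mind. buildJSON and the segments are hypothetical stand-ins here, but Grow really does guarantee space for that many bytes without another allocation:

```go
package main

import (
	"bytes"
	"fmt"
)

// buildJSON stitches precomputed json segments together with
// exactly one buffer allocation
func buildJSON(segments [][]byte) []byte {
	// sum the segment lengths up front
	size := 0
	for _, s := range segments {
		size += len(s)
	}

	var buf bytes.Buffer
	buf.Grow(size) // one allocation, no reallocation on the writes below
	for _, s := range segments {
		buf.Write(s)
	}
	return buf.Bytes()
}

func main() {
	out := buildJSON([][]byte{
		[]byte(`{"id":`),
		[]byte(`"abc"`),
		[]byte(`}`),
	})
	fmt.Println(string(out))
}
```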

and that reminds me that badger's internal binary encoder is actually protobuf
