fiatjaf was very daring to create a custom binary encoding for events here:

https://github.com/nbd-wtf/go-nostr/tree/master/binary

first of all, it's completely hand-written: all of the offsets are placed manually, which is very error prone

secondly, as the readme says, it was never intended to be a wire format

thirdly, its real purpose is storing binary-encoded events in a runtime-embedded key/value store like Badger or LMDB for a Go app

so, you gotta wonder how much fiatjaf knows about Go if he has never heard of this:

https://pkg.go.dev/encoding/gob

and the article linked at the top of that documentation explains the logic behind it:

https://go.dev/blog/gob

Gob is a binary encoding for Go applications, and it is the fastest binary encoder you can use for Go. its one limitation is that software on the other end can't understand it, because the format is optimized specifically for Go's type system and for efficient encoding and decoding of the data
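to make the point concrete, here is a minimal sketch of a gob round trip. the `Event` struct and its field names here are illustrative stand-ins, not the actual go-nostr type:

```go
package main

import (
	"bytes"
	"encoding/gob"
	"fmt"
)

// Event is a hypothetical stand-in for a nostr event structure.
type Event struct {
	ID        string
	PubKey    string
	CreatedAt int64
	Kind      int
	Tags      [][]string
	Content   string
	Sig       string
}

// EncodeEvent serializes an Event to gob bytes.
func EncodeEvent(ev *Event) ([]byte, error) {
	var buf bytes.Buffer
	if err := gob.NewEncoder(&buf).Encode(ev); err != nil {
		return nil, err
	}
	return buf.Bytes(), nil
}

// DecodeEvent deserializes gob bytes back into an Event.
func DecodeEvent(b []byte) (*Event, error) {
	var ev Event
	if err := gob.NewDecoder(bytes.NewReader(b)).Decode(&ev); err != nil {
		return nil, err
	}
	return &ev, nil
}

func main() {
	ev := &Event{ID: "abc", Kind: 1, Tags: [][]string{{"e", "def"}}, Content: "hello"}
	b, err := EncodeEvent(ev)
	if err != nil {
		panic(err)
	}
	out, err := DecodeEvent(b)
	if err != nil {
		panic(err)
	}
	fmt.Println(out.Kind, out.Content)
}
```

no offsets, no lengths, no manual anything: the encoder derives all of that from the type itself.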

as it says in this article:

the decision to use JSON for nostr's protocol was probably a bad idea for efficiency

IMHO it should have used protobuf encoding, like LND and bazillions of shitcoins and cloud databases do

the JSON format of nostr is ambiguous in several places, requiring read-ahead scanning to discover the subtypes of COUNT and AUTH messages
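a sketch of what that read-ahead looks like in practice, under the assumption that an AUTH envelope carries either a challenge string (relay to client) or a signed event object (client to relay), so the label alone is not enough to dispatch on:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// ClassifyEnvelope illustrates the read-ahead problem: a nostr message is a
// JSON array whose first element is a label, but for AUTH you must also peek
// at the JSON type of the second element to know which subtype you have.
func ClassifyEnvelope(raw []byte) (string, error) {
	var parts []json.RawMessage
	if err := json.Unmarshal(raw, &parts); err != nil {
		return "", err
	}
	if len(parts) == 0 {
		return "", fmt.Errorf("empty envelope")
	}
	var label string
	if err := json.Unmarshal(parts[0], &label); err != nil {
		return "", err
	}
	if label == "AUTH" && len(parts) > 1 {
		// read ahead: a string payload is a challenge, an object is an event
		switch p := bytes.TrimSpace(parts[1]); p[0] {
		case '"':
			return "AUTH-challenge", nil
		case '{':
			return "AUTH-event", nil
		}
	}
	return label, nil
}

func main() {
	fmt.Println(ClassifyEnvelope([]byte(`["AUTH","some-challenge"]`)))
	fmt.Println(ClassifyEnvelope([]byte(`["AUTH",{"id":"...","sig":"..."}]`)))
}
```

with a self-describing envelope, none of that peeking would be necessary.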

if it had to be JSON, then why is the "envelope" not JSON-RPC 2.0, like Bitcoin's?

if i'm writing an application in Go, and it's storing data in a key/value store locally, and it's not going to be accessed by any other application, let alone from any other language, what is the logic of using anything other than Gob?

i have written custom binary encoders on two occasions already, and in both cases i simplified and automated the TLV-style formatting of the data so that each field had a length prefix. but actually, for fixed formats, you can get away with a lot less labor by the computer, especially by shifting the fixed-size parts of a structure to the top and, if possible, leaving only one variable-length field at the end
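the fixed-layout trick can be sketched like this. the `Record` type and field names are made up for illustration; the point is that with all fixed-size fields at the front and one variable field at the end, no length prefixes are needed, because the payload is simply whatever remains in the buffer:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// Record is a hypothetical structure laid out for fixed-offset encoding:
// fixed-size fields first, one variable-length field last.
type Record struct {
	Timestamp uint64 // 8 bytes at offset 0
	Kind      uint16 // 2 bytes at offset 8
	Payload   []byte // variable, occupies the rest of the buffer
}

// Marshal writes the fixed fields at known offsets, then the payload.
func (r *Record) Marshal() []byte {
	b := make([]byte, 10+len(r.Payload))
	binary.BigEndian.PutUint64(b[0:8], r.Timestamp)
	binary.BigEndian.PutUint16(b[8:10], r.Kind)
	copy(b[10:], r.Payload)
	return b
}

// Unmarshal reads the fixed offsets; the payload is the remainder,
// so no length prefix is ever written or read.
func (r *Record) Unmarshal(b []byte) error {
	if len(b) < 10 {
		return fmt.Errorf("short buffer: %d bytes", len(b))
	}
	r.Timestamp = binary.BigEndian.Uint64(b[0:8])
	r.Kind = binary.BigEndian.Uint16(b[8:10])
	r.Payload = append([]byte(nil), b[10:]...)
	return nil
}

func main() {
	in := Record{Timestamp: 1700000000, Kind: 1, Payload: []byte("hello")}
	var out Record
	if err := out.Unmarshal(in.Marshal()); err != nil {
		panic(err)
	}
	fmt.Println(out.Kind, string(out.Payload))
}
```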

obviously this isn't fully possible with events, because of the tags as well as the content field, and for efficiency in taking this binary data and rapidly converting it to JSON, it makes more sense to store the fields in the order they will be written into the JSON encoding buffer

but even this is moot, really, because for simplicity and fewer things to go wrong, i'm just flat-out changing the binary marshal and unmarshal functions to be wrappers around a Gob encode and decode. then i can call my also-optimized JSON encoders (which use an intermediate key/value collection type), which will be a lot less work on the part of the computer, because JSON encoding in Go normally requires reflection, and that is patently unnecessary for one single simple document type in every case
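a minimal sketch of such a wrapper, again with a made-up `Event` type. one genuine gotcha worth noting: gob itself honors `encoding.BinaryMarshaler`, so encoding the type directly from inside its own `MarshalBinary` would recurse forever; a method-free alias type sidesteps that:

```go
package main

import (
	"bytes"
	"encoding/gob"
	"fmt"
)

// Event is a hypothetical event type; field names are illustrative.
type Event struct {
	Kind    int
	Content string
	Tags    [][]string
}

// plainEvent has Event's fields but none of its methods, so gob uses its
// default struct encoding instead of calling MarshalBinary recursively.
type plainEvent Event

// MarshalBinary makes the binary codec a thin wrapper around gob,
// satisfying encoding.BinaryMarshaler.
func (ev *Event) MarshalBinary() ([]byte, error) {
	var buf bytes.Buffer
	err := gob.NewEncoder(&buf).Encode((*plainEvent)(ev))
	return buf.Bytes(), err
}

// UnmarshalBinary is the matching gob-backed decoder.
func (ev *Event) UnmarshalBinary(data []byte) error {
	return gob.NewDecoder(bytes.NewReader(data)).Decode((*plainEvent)(ev))
}

func main() {
	in := Event{Kind: 1, Content: "hello", Tags: [][]string{{"t", "test"}}}
	b, err := in.MarshalBinary()
	if err != nil {
		panic(err)
	}
	var out Event
	if err := out.UnmarshalBinary(b); err != nil {
		panic(err)
	}
	fmt.Println(out.Kind, out.Content)
}
```

every hand-counted offset in the original codec collapses into those two one-line method bodies.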

just kinda funny to see the comments and the way the code was written. having gone from zero to pretty much mastering the writing of binary codecs myself, i can tell he's never done this before and doesn't really know what he's doing, and did it out of ignorance: unless it was going to go on the wire, what was the purpose? to enable binary formatting for network-connected SQL databases? you'd just use JSON for those too. why complicate things?

gob is good
