You can skip gzip and JSON decode using this format, and it's smaller. That's a lot of CPU savings.
Nah.. the wire is gzip-compressed by almost every relay these days. The gain for comms is minimal.
I don't think it is smaller. Did you test it? A 300-person contact list goes from 23kb of minified JSON down to 12kb gzipped, and those have the lowest sparsity. Any text note compresses like crazy.
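(A rough sketch of how a size check like that can be reproduced, assuming the flate2 crate; the file name is just a placeholder for whatever event you dump:)

use flate2::{write::GzEncoder, Compression};
use std::io::Write;

fn gzipped_len(json: &str) -> std::io::Result<usize> {
    let mut enc = GzEncoder::new(Vec::new(), Compression::default());
    enc.write_all(json.as_bytes())?;
    Ok(enc.finish()?.len())
}

fn main() -> std::io::Result<()> {
    // contact_list.json: a minified kind-3 event dumped to disk (placeholder name)
    let minified = std::fs::read_to_string("contact_list.json")?;
    println!("minified json: {} bytes", minified.len());
    println!("gzipped:       {} bytes", gzipped_len(&minified)?);
    Ok(())
}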
Yes, I tested it:
36746 (packed) vs 39742 (gzipped JSON). zstd beats it at 35530, but then you still have to un-zstd and JSON decode..
I'll try to see how much faster it is with benchmarks.
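(A rough harness for the baseline path, un-zstd + JSON decode, assuming the zstd and serde_json crates; the packed side would be timed the same way against its own parser:)

use std::time::Instant;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // notes.json.zst: a zstd-compressed JSON event dump (placeholder name)
    let compressed = std::fs::read("notes.json.zst")?;
    let start = Instant::now();
    for _ in 0..1_000 {
        let raw = zstd::decode_all(&compressed[..])?;
        let _event: serde_json::Value = serde_json::from_slice(&raw)?;
    }
    println!("un-zstd + json decode x1000: {:?}", start.elapsed());
    Ok(())
}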
Yeah, those numbers make sense. I don't know, I feel like we're beating a dead horse when we try to beat compression algorithms.
Another nice thing is that this format ensures the id is at the start, meaning it's very easy to reject parsing/verifying the entire thing when checking if you already have it:
for field in packed_note {
    // the id is the first field, so it can be peeked without parsing the rest
    if let ParsedField::Id(id) = field {
        if cache.contains(id) {
            break; // already have this note: skip the rest of parsing/verifying
        }
    }
}
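(To make the shape of that concrete, a self-contained sketch; ParsedField, parse_fields, and the cache type are made-up names standing in for the real packed-format parser:)

use std::collections::HashSet;

// Illustrative only: the real format defines its own field iterator.
enum ParsedField<'a> {
    Id(&'a [u8; 32]),
    // ...other fields follow the id in wire order
}

fn parse_fields(_bytes: &[u8]) -> impl Iterator<Item = ParsedField<'_>> {
    // placeholder: a real decoder would walk the packed byte layout here
    std::iter::empty()
}

fn ingest(packed_note: &[u8], cache: &HashSet<[u8; 32]>) {
    for field in parse_fields(packed_note) {
        if let ParsedField::Id(id) = field {
            if cache.contains(id) {
                return; // duplicate: bail out before decoding or verifying anything else
            }
        }
        // ...otherwise keep parsing/verifying the remaining fields
    }
}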
I had to really hack this into my JSON parser in nostrdb to get it working.
It's not bad: nostr:note1t3cnac2fqr4wcl6a8eve63q3577e77p8rgrgpak8mn77py4dhnfqxwrnyc