I _hope_ you are joking.

One of the reasons Nostr is fast and easy to host is exactly because relays do not need to do that. If relays had to run multi-pass checks (a, and b, and c) over every event on both receive and send, throughput would drop substantially. Not a big problem on small relays - but think of Primal's main relay. That would completely change their infra requirements XD

Kinds are, by far, the fastest way to do this. Strings are a usable second - but if you start parsing out an entire object and then checking all its details, you are increasing complexity, both in implementation and in the actual instructions needed to get it done. Maybe I am thinking a little too low level - but a simple kind == 1 is basically a three-instruction check (load left, load right, compare-and-branch)... and you can guess what it looks like if you have to unpack the object first and then crawl all the tags (let alone check whether the required ones even exist in the first place). simdjson aside, JSON parsing isn't fast.


Discussion

i'm not joking.

kind definitions are nebulous: some subprotocols use a single kind plus tags (e.g. kind 1), while others spread one subprotocol across multiple kinds.

indexing all tags is not complicated, that's what hash functions are for.

i was building a db engine that could do it, and it wasn't that complicated. kinds, however, are not self-describing. neither are single-letter tags. it's my biggest disagreement with the design of nostr, aside from the use of JSON instead of an email-style, line-structured, sentinel-based format. the reasons i prefer that type of encoding are a) it's also plain text, b) it's self-describing, and c) it's cheaper to parse - json's english-style structuring nearly doubles the processing requirement.

also, fuck primal. their days of big reach are coming to an end with KYC all the things. i don't think that KYC friendly normies will like the content and probably all kinds of bluehairs will complain to google about it and get their listing chopped despite all the bending over and presenting of the fuck hole that they do.

if primal forks off the main nostr network GOOD

also, you probably didn't know this because you haven't actually written a relay, but you have to unpack the whole event anyway. it's not Cap'n Proto or FlatBuffers-style read-on-demand - and even if it were, i don't think that really changes anything: to store an event in a database you have to decode the entire event regardless. single-letter tags have to be indexed by spec already; i just think, why not also index all the tags? it's really not that complicated to implement. databases use hash functions for exactly this reason - to enable indexing words.

further, all the relays that use RDBs and NoSQL DBs are automatically indexing all those tags whether you like it or not.

i write database engines from scratch on top of a KV store, and it's not really that difficult, and tailoring the implementation greatly improves performance. i don't think i'm exaggerating when i say that the #orly database engine is the fastest in all of nostr. it uses the storage engine behind a moderately popular graph database called dgraph, and you can't do that shit without highly optimized iteration and log storage strategies.

also, my custom nostr-specific JSON unmarshal function is nearly as fast as a binary decoder for an optimized binary encoding - only about 20% slower. it's also the fastest json event decoder in all of nostr land.

btw, https://orly.dev uses SIMD for hex encoding and SHA256 hashes, this is probably part of the reason why it's so much faster.

the json decoder state machine i wrote could probably translate quite well into a pure SIMD implementation. that would be absurdly fast.

one of the things i love about it is that it uses goto to encode the decoder's state, instead of additionally using the stack to store state (which requires reading several objects on and off it all the time). the state is simply wherever the PC is. this is how the best state machines work. i'm pretty sure the Go lexical analyser uses this technique.