I'm hesitant about doing it for kind 1 because the context is so wide: lots of notes collected over lots of relays. I don't think it would really work, because for the user there are so many unrelated notes. But imagine a pool of relays focused on some content:

nostr:nevent1qvzqqqqqqypzphzv6zrv6l89kxpj4h60m5fpz2ycsrfv0c54hjcwdpxqrt8wwlqxqyghwumn8ghj7mn0wd68ytnhd9hx2tcprdmhxue69uhhg6r9vehhyetnwshxummnw3erztnrdakj7qpqcu2498m77mq80cj5tuh3l8rnw97e8vx54alva35pzmr207lt4xuqqz5htc

Even if you group across multiple relays, a user is choosing to connect to related content. If a user is reading related articles that have been embedded, then you (the service provider) or the client can construct some sort of HNSW search and return a subnetwork of notes within some distance of the current note.
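The idea above can be sketched as a single-layer navigable graph with greedy search. Real HNSW adds a hierarchy of layers and a more careful insertion procedure, but the core walk is the same. Everything below (names, parameters) is illustrative, not taken from the demo:

```python
import math
import random

def cosine_dist(a, b):
    # 1 - cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def build_graph(vectors, m=4):
    # Connect each vector to its m nearest neighbours.
    # (Brute force for clarity; real HNSW builds this incrementally.)
    graph = {}
    for i, v in enumerate(vectors):
        dists = sorted(
            (cosine_dist(v, w), j) for j, w in enumerate(vectors) if j != i
        )
        graph[i] = [j for _, j in dists[:m]]
    return graph

def greedy_search(vectors, graph, query, entry=0):
    # Walk to whichever neighbour is closest to the query,
    # stopping when no neighbour improves on the current node.
    current = entry
    current_dist = cosine_dist(vectors[current], query)
    improved = True
    while improved:
        improved = False
        for nb in graph[current]:
            d = cosine_dist(vectors[nb], query)
            if d < current_dist:
                current, current_dist = nb, d
                improved = True
    return current, current_dist
```

The "subnetwork close to your note" would then just be the neighbourhood of the returned node in this graph, expanded out to some distance threshold.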


Discussion

I suppose it would depend on how good binary vector embedding actually is. 😂

It's a compressed representation, so you lose some resolution to search through, and the space of topics that can be covered in a feed of kind 1 notes over some length of time is huge. Less so with a themed community. So by going with something focused you can only improve performance.
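A minimal sketch of what that compression looks like, assuming the common sign-bit scheme (each float dimension becomes one bit, and search uses Hamming distance). The demo may binarize differently; this is just to make the resolution loss concrete:

```python
def binarize(vec):
    # Keep only the sign of each dimension: 1 bit per float.
    # Everything about magnitude is thrown away here.
    return [1 if x > 0 else 0 for x in vec]

def hamming(a, b):
    # Number of differing bits between two bit lists
    return sum(x != y for x, y in zip(a, b))

def pack_bits(bits):
    # Pack a bit list into a single int so XOR + popcount is cheap
    n = 0
    for b in bits:
        n = (n << 1) | b
    return n

def hamming_packed(x, y):
    # Hamming distance on packed ints via XOR and bit counting
    return bin(x ^ y).count("1")
```

Two float vectors that differ only in magnitude along some dimensions collapse to the same bit pattern, which is exactly the resolution you give up; over a narrow, themed corpus that collision matters less than over the whole space of kind 1 notes.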

Dropping a demo in case you'd want to play around with it:

https://github.com/limina1/nostr-binary-embedding-demo

Good point