Okay, how about this idea: relays chirp at each other at intervals, saying, "hey, I got these events, did you get them?" Concretely, it would work by taking all events from a given npub for the past 3 hours, hashing them, comparing that digest with the same hashed output from your own relay, and only going forward with a bigger exchange if there's a mismatch. So your own relay would do that for each npub you follow every three hours or so.
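A minimal sketch of that digest comparison, assuming events are identified by their hex ids (the function name and the 3-hour window are illustrative, not any real relay API):

```python
import hashlib

def window_digest(event_ids):
    """Hash a sorted set of event ids into one comparable digest.

    event_ids: iterable of hex event-id strings for one npub
    over the sync window (e.g. the past 3 hours). Sorting makes
    the digest independent of the order events arrived in.
    """
    h = hashlib.sha256()
    for eid in sorted(event_ids):
        h.update(bytes.fromhex(eid))
    return h.hexdigest()

# Two relays compare digests; only a mismatch triggers the bigger exchange.
ours = window_digest(["ab" * 32, "cd" * 32])
theirs = window_digest(["ab" * 32, "cd" * 32, "ef" * 32])
needs_full_sync = ours != theirs
```

Matching digests mean the relays can skip the exchange entirely; a mismatch only tells them the sets differ, so they'd still have to transfer ids (or events) to find which ones are missing.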
This might be where relays start having to specialize - I see devs pushing for us to set up relays for specific functions, but it doesn't seem necessary right now.
There is a protocol for syncing distributed databases called negentropy that does something similar to what you're describing. And although your solution works on paper, there are trade-offs:

1. Relays become bloated, since every relay needs a copy of every event, making it harder for anyone to run a relay.
2. Queries for events get slower as a result, since there are now so many more events to search through to find the right ones.
3. Relays can no longer serve small, isolated communities that know nothing about the rest of the nostr network; instead they all serve one giant public square, removing some flexibility from the network.