Good Morning Nostroids,

Today I'm working on the batching of requests in more-speech. Rather than pull several thousand events at a time, I'm walking backwards in time by batches of 100 events, waiting for them to be processed, and then requesting the next batch.

This allows new events to come in and be displayed in approximate real time while old events continue to trickle in.
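The backward walk described above can be sketched like this. This is a Python illustration with a made-up relay interface, not more-speech's actual (Clojure) code; `FakeRelay`, `request`, and the event shape are all assumptions:

```python
BATCH_SIZE = 100  # batch size from the post

class FakeRelay:
    """Stand-in for a real relay connection (illustration only).
    Holds events newest-first and answers "until"/"limit" queries."""
    def __init__(self, events):
        self.events = sorted(events, key=lambda e: e["created_at"], reverse=True)

    def request(self, filt):
        matching = [e for e in self.events if e["created_at"] <= filt["until"]]
        return matching[:filt["limit"]]

def walk_backwards(relay, start_time):
    """Walk back in time one batch at a time, yielding each batch so
    new (live) events can be displayed between requests."""
    until = start_time
    while True:
        batch = relay.request({"until": until, "limit": BATCH_SIZE})
        if not batch:
            break  # nothing older; backfill is done
        yield batch
        # next window ends just before the oldest event we've seen
        until = min(e["created_at"] for e in batch) - 1
```

Because `walk_backwards` is a generator, the caller can interleave live events between batches, which is what keeps the display responsive while old events trickle in.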

This is fun!


Discussion

That's utterly awesome...!🙏😁💜😆

I've been trying to get anyone working on Amethyst to do that... So far without success.🥲 I prefer Amethyst overall, but it bogs down and seizes up when I access "popular people's" accounts...

It's pretty easy to do. NIP-01 says that if you use the "limit":n argument then the _latest_ n events are sent. That's perfect. Once you get that batch, find the minimum created_at, set that as the next "until", and back the "since" up.
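That bookkeeping step might look like this. A minimal sketch, assuming a fixed window size; `next_window` and `window_seconds` are illustrative names, not more-speech's actual code:

```python
def next_window(batch, window_seconds=3600):
    """Given a batch (the latest-n events for the current window),
    compute the next REQ's "until"/"since" pair by backing up past
    the oldest event seen."""
    oldest = min(e["created_at"] for e in batch)
    until = oldest - 1              # next window ends before the oldest event
    since = until - window_seconds  # back the "since" up by the window size
    return {"since": since, "until": until}
```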

You have to coordinate with the EOSE messages, and I've found it wise to send a CLOSE before the next REQ.
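A sketch of that coordination, using the NIP-01 wire format (["REQ", ...], ["EVENT", ...], ["EOSE", ...], ["CLOSE", ...]). The `send`/`recv` callables stand in for a websocket connection; they and `paginate` are assumptions, not more-speech's API:

```python
import json

def paginate(send, recv, filters):
    """For each filter: send a REQ, collect events until EOSE arrives
    for our subscription, then send CLOSE before the next REQ."""
    sub = "backfill"  # arbitrary subscription id
    for filt in filters:
        send(json.dumps(["REQ", sub, filt]))
        batch = []
        while True:
            msg = json.loads(recv())
            if msg[0] == "EVENT" and msg[1] == sub:
                batch.append(msg[2])
            elif msg[0] == "EOSE" and msg[1] == sub:
                break  # relay says stored events are exhausted
        send(json.dumps(["CLOSE", sub]))  # close before the next REQ
        yield batch
```

Waiting for EOSE is what tells you the batch is complete; closing the subscription first avoids the relay streaming live events into a subscription you're about to reuse.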

There are a few tricky issues, like empty batches and bunches of events with the same created_at time. But if you are careful you can inch your way back in time without too much trouble.
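One way to handle the created_at ties: reuse the oldest timestamp as the next "until" (rather than oldest - 1, which could skip tied events that didn't fit in the batch) and keep a set of seen ids to drop the re-delivered overlap. This is a hypothetical helper, not more-speech code, and it still needs a guard for the pathological case where a whole batch shares one timestamp:

```python
def advance_until(batch, seen_ids):
    """Dedupe a batch against previously seen event ids and return
    the next "until" value, or None when the (deduped) batch is
    empty and the walk should stop or widen its window."""
    fresh = [e for e in batch if e["id"] not in seen_ids]
    seen_ids.update(e["id"] for e in fresh)
    if not fresh:
        return None, fresh
    # reuse the oldest timestamp so tied events aren't skipped;
    # seen_ids absorbs the duplicates this re-delivers
    return min(e["created_at"] for e in fresh), fresh
```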