Wow. nostr:nostr:npub12262qa4uhw7u8gdwlgmntqtv7aye8vdcmvszkqwgs0zchel6mz7s6cgrkj you weren't kidding. My connection is slow as heck, but that was nearly instantaneous publishing.

WTF dark magic is this? 😂

Discussion

Or was it you, nostr:npub1qdjn8j4gwgmkj3k5un775nq6q3q7mguv5tvajstmkdsqdja2havq03fqm7 ?

You did something with HTTP2, right?

But, wow. That's the fastest publishing I've ever seen on Nostr. Amazing.

Yeah, that was set up, but it only helps initial loads without a cache. Deno supports server push, so nginx is supposed to accelerate those, but I haven't confirmed it's working.
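
One quick way to at least check the negotiation part, if anyone's curious. Just a sketch assuming a reasonably recent Deno run with --allow-net, and the hostname is a placeholder; it only confirms that h2 gets negotiated via ALPN, not that push is actually firing:

```ts
// Sketch: ask the TLS endpoint which protocol it negotiates via ALPN.
// Placeholder hostname; run with --allow-net.
const conn = await Deno.connectTls({
  hostname: "relay.example.com", // placeholder, not the real host
  port: 443,
  alpnProtocols: ["h2", "http/1.1"],
});

const { alpnProtocol } = await conn.handshake();
console.log(
  alpnProtocol === "h2"
    ? "HTTP/2 negotiated"
    : `fell back to ${alpnProtocol ?? "no ALPN"}`,
);
conn.close();
```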

Aargh, nostr:npub1636uujeewag8zv8593lcvdrwlymgqre6uax4anuq3y5qehqey05sl8qpl4 I programmed in a "nostr:nostr:" bug, sorry. 🙈

It also sends the event off to the event cannon before returning an OK.
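
Roughly this shape, in case it helps picture it. Names like `EventCannon` are made up for the sketch, not actual relay code:

```ts
// Sketch of the write path: durable save, then fire-and-forget fan-out,
// then the NIP-01 OK back to the publisher. Types and names are stand-ins.
interface NostrEvent {
  id: string;
  pubkey: string;
  created_at: number;
  kind: number;
  tags: string[][];
  content: string;
  sig: string;
}

interface EventStore {
  save(event: NostrEvent): Promise<void>;
}

interface EventCannon {
  fire(event: NostrEvent): void; // fan-out to subscribers, not awaited
}

async function handleEvent(
  ws: WebSocket,
  event: NostrEvent,
  store: EventStore,
  cannon: EventCannon,
) {
  await store.save(event); // durable write first
  cannon.fire(event); // kicked off before the ack, never awaited
  ws.send(JSON.stringify(["OK", event.id, true, ""])); // NIP-01 OK
}
```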

Geez.

Coming soon: a new optimized query executor and cursors. Cursors can double throughput and reduce latency on paginated queries.
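
For anyone wondering what the cursors buy you, here's a toy keyset-style sketch (not the real executor): each page resumes from an explicit position instead of re-running the whole filter with a shrinking `until`, so the store never re-serves rows it already walked:

```ts
// Toy keyset cursor over an in-memory, newest-first array. A real executor
// would seek an index here; the ids and timestamps are fabricated.
type Ev = { id: string; created_at: number };
type Cursor = { created_at: number; id: string } | null;

const stored: Ev[] = Array.from({ length: 2000 }, (_, i) => ({
  id: i.toString(16).padStart(64, "0"),
  created_at: 1_700_000_000 - i, // unique, already sorted newest-first
}));

function queryPage(cursor: Cursor, limit: number): { events: Ev[]; next: Cursor } {
  let start = 0;
  if (cursor !== null) {
    const c = cursor; // non-null alias for the callback below
    // Resume strictly after the last row served (tiebreak on id).
    start = stored.findIndex((e) =>
      e.created_at < c.created_at ||
      (e.created_at === c.created_at && e.id > c.id)
    );
  }
  const events = start < 0 ? [] : stored.slice(start, start + limit);
  const last = events[events.length - 1];
  return { events, next: events.length === limit && last ? { ...last } : null };
}

// Client side: keep asking until the relay says there is no next cursor.
let cursor: Cursor = null;
let total = 0;
do {
  const page = queryPage(cursor, 500);
  total += page.events.length;
  cursor = page.next;
} while (cursor !== null);
console.log(`fetched ${total} events`); // fetched 2000 events
```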

nostr:npub1wqfzz2p880wq0tumuae9lfwyhs8uz35xd0kr34zrvrwyh3kvrzuskcqsyn wanted to talk to you about all that. I don't really understand it, but he's revamping all of the connections and paging and... Le Search Bar.

*wiggles eyebrows*

Oh yeah and also we'll want to get partial responses. Like, just event IDs and certain tags and stuff.
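
Nothing like that exists in the filters today as far as I know, so this is purely a shape sketch of what "partial" could mean; the names are hypothetical and the trimming happens client-side here:

```ts
// Hypothetical projection: keep only the event id plus the tag kinds the
// view actually needs, instead of shipping whole signed events around.
interface NostrEvent {
  id: string;
  pubkey: string;
  created_at: number;
  kind: number;
  tags: string[][];
  content: string;
  sig: string;
}

type PartialEvent = { id: string; tags: string[][] };

function project(ev: NostrEvent, wantedTags: string[]): PartialEvent {
  return {
    id: ev.id,
    // e.g. wantedTags = ["e", "p"] for a thread view
    tags: ev.tags.filter(([name]) => wantedTags.includes(name)),
  };
}

// Usage, once you have a list of full events:
// const slim = events.map((ev) => project(ev, ["e", "p"]));
```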

It's time to start leveraging some cool backend capabilities.

nostr:npub1wqfzz2p880wq0tumuae9lfwyhs8uz35xd0kr34zrvrwyh3kvrzuskcqsyn and nostr:npub1qdjn8j4gwgmkj3k5un775nq6q3q7mguv5tvajstmkdsqdja2havq03fqm7 have been tweaking stuff, moving some of it server-side, and rawdogging the WebSockets, so that's probably why there's so little friction. I couldn't tell that it had gotten faster anywhere else, since the clients are so slow, but that one spot is like a peek into the near future. Seamless.

Turns out that if you trim down to just one or two WebSocket connections, rather than the half-dozen a lot of apps open with most user configs, the relays are natively really fast.
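
Something like this, conceptually; not anyone's actual client code, and the relay URL is a placeholder. One shared socket per relay, every query multiplexed over it with its own subscription id (EVENT/EOSE handling omitted):

```ts
// Sketch: reuse one WebSocket per relay and multiplex REQs over it,
// instead of opening a fresh connection for every query.
const sockets = new Map<string, WebSocket>();
let subCounter = 0;

function relaySocket(url: string): WebSocket {
  let ws = sockets.get(url);
  if (!ws || ws.readyState > WebSocket.OPEN) {
    ws = new WebSocket(url); // (re)connect only when missing or closed
    sockets.set(url, ws);
  }
  return ws;
}

function subscribe(url: string, filter: Record<string, unknown>): string {
  const ws = relaySocket(url);
  const subId = `sub-${subCounter++}`;
  const req = JSON.stringify(["REQ", subId, filter]);
  if (ws.readyState === WebSocket.OPEN) {
    ws.send(req);
  } else {
    // Queue behind the handshake if the shared socket isn't open yet.
    ws.addEventListener("open", () => ws.send(req), { once: true });
  }
  return subId;
}

// Two queries, one connection (placeholder relay URL):
// subscribe("wss://relay.example.com", { kinds: [1], limit: 20 });
// subscribe("wss://relay.example.com", { kinds: [0], authors: ["<hex pubkey>"] });
```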