I'm totally new to programming for the browser but I'm starting to appreciate the challenge (actually quite a lot more difficult than systems level programming in many ways).

The main gotcha I'm hitting is that the browser is built around a single-threaded event loop. It alternates between running computation and updating the page. When a page feels clunky and slow, it's (probably) because the computation part is taking a long time and blocking all the other updates waiting to be rendered on the page.
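Here's a small illustration of that blocking (not from the project, just a sketch that runs in any JS environment): a timer scheduled for 0 ms still can't fire until the synchronous work ahead of it finishes.

```javascript
// A long synchronous task blocks the event loop: the timer below is
// scheduled for 0 ms but can't fire until the loop below finishes.
setTimeout(() => console.log("timer fired"), 0);

const start = Date.now();
let total = 0;
for (let i = 0; i < 1e8; i++) total += i; // busy work on the main thread
console.log(`blocked for ${Date.now() - start} ms before the timer ran`);
// "timer fired" only logs *after* this, despite its 0 ms delay.
```

In a browser, paints and input handling queue up behind that loop exactly like the timer does.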

But it turns out there are these things called web workers, which allow computation to be done "on the side" and not block anything.
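The basic shape looks roughly like this (file name and computation are made up for the sketch; the heavy work is kept in a plain function so it can run and be tested outside the worker too):

```javascript
// heavy-worker.js (hypothetical) — a minimal Web Worker sketch.
function sumOfSquares(numbers) {
  return numbers.reduce((total, n) => total + n * n, 0);
}

// Inside a worker, `self` is the worker's global scope. Guarded so the
// file is also loadable outside a worker (e.g. for testing).
if (typeof importScripts === "function") {
  self.onmessage = (event) => {
    // Runs off the main thread; the page stays responsive meanwhile.
    self.postMessage(sumOfSquares(event.data));
  };
}

// On the page (main thread) you'd wire it up roughly like this:
//   const worker = new Worker("heavy-worker.js");
//   worker.onmessage = (e) => console.log("result:", e.data);
//   worker.postMessage([1, 2, 3]);   // eventually logs "result: 14"
```

The catch is that workers can't touch the DOM, so results have to come back to the main thread via messages.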

If I can manage to get all the relay/subscription and event computation done within web workers, this should make the user experience much smoother and generally improve the performance of the nostrocket client.

I'm experimenting here if anyone wants to follow the progress: https://github.com/nostrocket/buttr


Discussion

💯 I have a similar issue with slidestr.net which needs to extract all image urls from notes - have tried to avoid web workers for now, but it’s the best solution I suppose.

It really depends on the situation. Browser performance has a lot of facets.

I recommend using a profiler like the one built into Chrome. It’ll tell you what’s taking all the time.

Generally, in my experience, it’s not computation that’s the issue. More likely memory. If your code is making lots of small objects, iterating them and deleting, these operations tend to dominate the non-DOM time.
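To illustrate the point (hypothetical functions, not anyone's real code) — the two versions below compute the same thing, but the first allocates a throwaway object per iteration, which is the kind of churn that shows up as GC time in the profiler:

```javascript
// Allocation churn: a new short-lived object on every iteration.
function distanceChurn(points) {
  let total = 0;
  for (const p of points) {
    const v = { x: p.x * 2, y: p.y * 2 }; // allocated, used once, discarded
    total += Math.hypot(v.x, v.y);
  }
  return total;
}

// Same result with no per-iteration allocation.
function distanceLean(points) {
  let total = 0;
  for (const p of points) {
    total += Math.hypot(p.x * 2, p.y * 2);
  }
  return total;
}
```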

Using the profiler has saved me so many times.

Also you're right about memory: if you create too many objects you can trigger the garbage collector to run, and that always takes a hit on performance

Web workers are a good solution when you have to process a lot of data. But there is a hidden cost to using them: it takes some time to serialize the data to send it to and from the worker

So in some cases they can be slower than running the work on the main thread
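That cost is visible directly: `postMessage` copies its payload with the structured clone algorithm, which is also exposed as the global `structuredClone()`, so you can time the copy on its own (a sketch, not project code):

```javascript
// postMessage serializes via structured clone; for a large payload the
// copy happens on the sending thread and takes real time.
const big = { samples: new Float64Array(1_000_000) }; // ~8 MB of data

const t0 = performance.now();
const copy = structuredClone(big);
const t1 = performance.now();
console.log(`cloning ~8 MB took ${(t1 - t0).toFixed(1)} ms`);

// For ArrayBuffer-backed data the copy can be skipped entirely by
// *transferring* ownership instead (zero-copy), e.g.:
//   worker.postMessage(big, [big.samples.buffer]);
```

Transferring detaches the buffer on the sending side, so it's a move rather than a copy.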