The desktop case:
Modern Firefox supports >1000 simultaneous connections, but Chrome (v81) was limited to 256. My current Gossip is connected to 28 relays, though that sometimes spikes to 50 or 60. If I followed a lot more people, and those people were very divergent in their relay choices, maybe I would break 100. If it ever came to be that we couldn't open enough connections to simultaneously pull from all the relays I wanted something from, then I would have to queue and batch those requests, which is a bit more complicated but doable. I've always known this was going to be fine on desktop.
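The queue-and-batch fallback could be sketched like this; a minimal sketch, not Gossip's actual code, and `MAX_CONNECTIONS`, `fetch_from_relay`, and the relay URLs are all hypothetical:

```python
import asyncio

MAX_CONNECTIONS = 256  # e.g. the old Chrome limit

async def fetch_from_relay(url: str) -> list:
    """Hypothetical stand-in for opening a websocket to `url` and pulling events."""
    await asyncio.sleep(0)  # placeholder for real network I/O
    return [f"event from {url}"]

async def fetch_all(relay_urls: list[str], limit: int = MAX_CONNECTIONS) -> list:
    """Pull from every relay, but never hold more than `limit` connections open.
    Requests beyond the limit simply queue on the semaphore until a slot frees."""
    sem = asyncio.Semaphore(limit)

    async def bounded_fetch(url: str) -> list:
        async with sem:
            return await fetch_from_relay(url)

    batches = await asyncio.gather(*(bounded_fetch(u) for u in relay_urls))
    return [ev for batch in batches for ev in batch]

# e.g.: asyncio.run(fetch_all([f"wss://relay{i}.example" for i in range(300)]))
```

The semaphore does the queueing for free: all 300 tasks are created up front, but only `limit` of them hold a connection slot at any moment.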
The mobile phone case:
As much as it pains me to say it 😉 I completely agree with Vitor. I think local relays make sense for low-powered devices as long as you can trust them. I wouldn't call it centralization, since you have many such services to choose from, and I can't think how any one such service could capture the market (except by being bundled with a client! in which case the FTC should break up that monopoly 😱)
Test this. Pick 50 relays and do a filter to download everything into Gossip. Then compare the amount of data/second with the data limits of your connection. On phones I can only get to maybe 30% of the available bandwidth. Maybe on desktops it will be clearer.
I can get ~100% of the bandwidth as a single download, but not if I split it across many connections.
I presume you are counting the event bytes, and not counting the per-connection overheads. But still, 70% of the bytes going to connection overheads seems high enough to be worth some investigation. So long as each relay is returning at least 20 events or so, it should be better than this. If each relay is only returning 1 event, then this number makes sense, though.
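A rough back-of-the-envelope shows why events-per-connection dominates. Both byte counts below are assumed order-of-magnitude figures, not measurements (a TLS + websocket handshake on the order of a few KB, a typical small text event a few hundred bytes):

```python
# Crude per-connection cost model; every constant is an assumption.
HANDSHAKE_BYTES = 5_000   # assumed TLS + websocket upgrade cost
EVENT_BYTES = 400         # assumed size of a typical small text event

def overhead_fraction(events_per_relay: int) -> float:
    """Fraction of bytes on the wire spent on connection overhead."""
    payload = events_per_relay * EVENT_BYTES
    return HANDSHAKE_BYTES / (HANDSHAKE_BYTES + payload)

for n in (1, 20, 100):
    print(f"{n:3d} events/relay -> {overhead_fraction(n):.0%} overhead")
```

Under these assumptions, 1 event per relay puts overhead above 90%, while 20 events per relay pulls it under 40%, which is why a 70% overhead figure points at connections that each returned only a handful of events.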