That's exactly what I'm planning next. First, I'll remove relay sets and put all saved relays into a single set to simplify relay management and allow viewing others' saved relays. After that, I'll focus on improving the following feed.
Do you have any suggestions for the following feed? Most people still use a few large relays for writing, so it seems difficult to avoid sending requests with a large number of pubkeys to some relays 🤔
maybe what would help is if you kept a census of the number of events found per npub as a time series, broken into day segments or something; then you could prioritise fetching the ones that post most frequently
this is an obvious subject matter for a "DVM" - gonna pencil this one into my mental notes for possible index and search functions - time-series data of events found in the index, so you can prioritise the pubkeys most likely to have new content, is an easy solution to make smoother on the back end too
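The census idea above could be sketched roughly like this (a hypothetical in-memory structure, not any existing DVM or client code): count events per pubkey per day, then rank pubkeys by their recent posting volume to decide fetch order.

```python
from collections import defaultdict
from datetime import datetime, timezone

class EventCensus:
    """Hypothetical sketch: per-pubkey daily event counts as a time series."""

    def __init__(self):
        # pubkey -> day (proleptic ordinal) -> number of events seen that day
        self.counts = defaultdict(lambda: defaultdict(int))

    def record(self, pubkey: str, created_at: int) -> None:
        # Bucket the event into a UTC day segment.
        day = datetime.fromtimestamp(created_at, tz=timezone.utc).toordinal()
        self.counts[pubkey][day] += 1

    def prioritised(self, window_days: int = 7) -> list[str]:
        # Pubkeys that posted most in the recent window come first,
        # so the feed fetches the likeliest sources of new content early.
        today = datetime.now(tz=timezone.utc).toordinal()

        def recent_total(pubkey: str) -> int:
            return sum(
                n for day, n in self.counts[pubkey].items()
                if today - day < window_days
            )

        return sorted(self.counts, key=recent_total, reverse=True)
```

A fetcher could then issue its REQ subscriptions in `prioritised()` order, batching the frequent posters first and deferring (or sampling) the quiet ones.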
If someone doesn't post frequently, their posts might be harder for others to discover with this strategy 🤔
it's just for getting the majority of the new stuff on the feed, you could always sort it by "first seen" and put the rarer stuff at the top
incidentally, sorting in reverse chronological order of the event timestamp opens up timestamp abuse; it's better to mark events with a seen date and sort by that
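The point about timestamp abuse can be illustrated with a minimal sketch (hypothetical names, not any client's actual store): record the local time an event was first seen and order the feed by that, so an author-supplied `created_at` forged far into the future can't pin an event to the top.

```python
import time

class SeenIndex:
    """Sketch: order the feed by local first-seen time, not claimed timestamps."""

    def __init__(self):
        self.first_seen: dict[str, float] = {}

    def note(self, event_id: str) -> None:
        # Only the first sighting counts; re-broadcasts don't bump an event.
        self.first_seen.setdefault(event_id, time.time())

    def feed_order(self) -> list[str]:
        # Newest-seen first, regardless of what created_at claims.
        return sorted(self.first_seen, key=self.first_seen.get, reverse=True)
```

This also gives a natural hook for the earlier suggestion: rarely-seen authors surface when their events actually arrive, not when their timestamps say they were written.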
hmmm
Thank you for your suggestion, I will keep it in mind
I don't see a problem with sending requests with a large number of pubkeys to the big relays if those people are on the big relays, as long as you're fetching the other pubkeys from the smaller relays they publish to.