The more important thing is to first distinguish personalization algos from general lists with different sorting criteria
Then come to the point that even though current media's algos are annoying, with a lot of ads and attention catchers, you don't really want the absence of algos and just seeing your followers: you still want to see new content in order to expand your network, your views, your interests
And only after those steps do you have to figure out how you want to connect different algos to your feed, who will implement it and how, and what the base cost of such a feature is (which I doubt anyone has tried to implement in production before)
Sounds great until you start implementing this and find out you can't simply solve it on the client side, and solving it on the server side is, to put it mildly, unclear
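To make "connect different algos to your feed" a bit more concrete, here is a minimal TypeScript sketch of what a pluggable feed algo could look like on the client side. All names here are hypothetical, not from any existing client or NIP; it only illustrates the distinction between a plain sorting criterion and a personalization algo.

```typescript
// Hypothetical sketch: a feed algorithm ranks candidate notes it is given,
// it does not source them.

interface Note {
  id: string;
  pubkey: string;
  created_at: number; // unix seconds
  content: string;
}

interface FeedAlgorithm {
  name: string;
  rank(candidates: Note[], viewerPubkey: string): Note[];
}

// A "general list with a sorting criterion": no personalization at all.
const newestFirst: FeedAlgorithm = {
  name: "newest-first",
  rank: (candidates) =>
    [...candidates].sort((a, b) => b.created_at - a.created_at),
};

// A "personalization algo" would need per-viewer signals (follows, past
// interactions, topic weights) plus a candidate set far beyond what the
// client already has cached.
function buildFeed(algo: FeedAlgorithm, candidates: Note[], viewer: string): Note[] {
  return algo.rank(candidates, viewer);
}
```

The ranking step itself is cheap; the expensive and unclear part is sourcing the candidate set and the viewer signals, which is exactly why this can't be solved purely on the client.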
Define bad. Car crashes, violence and murders are not bad in the context of the internet: they are a reflection of the real world.
Doing any business with rusia is a much better indicator of stupidity than thinking about what bitcoin is backed by
Aren't resort countries with access to beautiful and warm seas/oceans supposed to see prices increase each time the season starts, especially after joining Schengen and experiencing such an influx of tourists?
That's an international issue that hundreds of millions of people will be addressing for the next decades, if not centuries.
The endless stream of green substance is a symbol of how dollars and life energy rapidly run away from more and more generations (existing and future) of minions serving this dwarf, don't you know? It's common knowledge today
Yeah, that's shady and requires a jailbroken device, but it's possible
Reproducible builds are a pain in the ass but the only solution to that
It has no use case until you have a team rejecting anything non-JS
The centralized part of nostr should definitely evolve too; the main question is how to properly integrate it into the decentralized design
There is a problem with any type of aggregation at the current stage. Any relay (even one making the best attempt to collect all the data from all other relays) will never have a complete picture of events. And that's rather good for the network.
However, aggregation in this case becomes a source of truth that is flawed by default. Of course, it may be used for some estimates, but it locks you to a specific relay instance
On the other hand, a client aggregating all the events from different relays (say, regarding zaps to a specific note) is very inefficient and slow, but it is able to find the "source of truth" on its own
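Roughly what I mean by client-side aggregation, as a hedged sketch: query several relays over the plain NIP-01 wire protocol for zap receipts (kind 9735 per NIP-57) referencing one note, and deduplicate by event id so a receipt seen on multiple relays counts once. Relay URLs are placeholders, it assumes a runtime with a global WebSocket (browser or recent Node), and it skips signature verification.

```typescript
// Sketch: aggregate zap receipts for one note across several relays.
type NostrEvent = { id: string; kind: number; tags: string[][]; content: string };

function fetchZapsFromRelay(url: string, noteId: string, timeoutMs = 5000): Promise<NostrEvent[]> {
  return new Promise((resolve) => {
    const events: NostrEvent[] = [];
    const ws = new WebSocket(url);
    const subId = "zaps-" + Math.random().toString(36).slice(2);
    const done = () => { try { ws.close(); } catch {} resolve(events); };
    const timer = setTimeout(done, timeoutMs);

    ws.onopen = () =>
      // NIP-01 subscription: zap receipts (kind 9735) tagging this note.
      ws.send(JSON.stringify(["REQ", subId, { kinds: [9735], "#e": [noteId] }]));
    ws.onmessage = (msg) => {
      const [type, , payload] = JSON.parse(msg.data.toString());
      if (type === "EVENT") events.push(payload as NostrEvent);
      if (type === "EOSE") { clearTimeout(timer); done(); } // stored events delivered
    };
    ws.onerror = () => { clearTimeout(timer); done(); };
  });
}

async function aggregateZaps(relays: string[], noteId: string): Promise<NostrEvent[]> {
  const perRelay = await Promise.all(relays.map((r) => fetchZapsFromRelay(r, noteId)));
  const byId = new Map<string, NostrEvent>();
  for (const ev of perRelay.flat()) byId.set(ev.id, ev); // dedupe across relays
  return [...byId.values()];
}
```

This is exactly the inefficiency I mean: N open sockets and N full queries per note, yet no single relay in the list had the complete set on its own.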
Wow, great then, I'll revisit it. Is it on by default or somewhere in the settings?
*but I decided
It's been on my todo list from day one; I decided to postpone it once I found out that not a single popular client implements authentication yet
Looks very interesting!
But I can't find an answer anywhere to the most important thing, in my opinion: how are providers supposed to compete?
Let's say a provider receives a new NIP-90 event. Should it send some "portfolio" (probably adjusted to the request) back and wait for approval? Should it generate the image for free (maybe with a watermark) beforehand and wait for a payment to provide the full image?
Also, are users and providers supposed to be connected to the same relays at this point? Have you thought about how to approach this?
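To show what I mean by the two options, here is a rough sketch in TypeScript object literals. The kinds and tags follow my reading of NIP-90 (job requests in the 5000-5999 range, with 5100 commonly listed for text-to-image, kind 7000 for job feedback, matching 6xxx for results); the "portfolio" tag, the URLs, and the placeholders are purely hypothetical, nothing in the NIP specifies them.

```typescript
// Customer publishes a job request with a bid (tag names assumed from NIP-90).
const jobRequest = {
  kind: 5100, // text-to-image job request (assumed kind)
  tags: [
    ["i", "a lighthouse at sunset, oil painting", "text"],
    ["bid", "50000"],        // customer's max price in millisats
    ["output", "image/png"],
  ],
  content: "",
};

// Option A: provider answers with kind-7000 feedback quoting its price and
// pointing at previous work, then waits for the customer to accept/pay.
const offer = {
  kind: 7000,
  tags: [
    ["e", "<job request id>"],
    ["p", "<customer pubkey>"],
    ["status", "payment-required"],
    ["amount", "30000", "<bolt11 invoice>"],
    ["portfolio", "https://example.com/previous-work"], // hypothetical tag
  ],
  content: "I can do this in ~30s, here is earlier output in this style",
};

// Option B: provider does the work speculatively and publishes a watermarked
// preview as the result, withholding the full image until payment.
const previewResult = {
  kind: 6100, // result kind matching the 5100 request (assumed)
  tags: [
    ["e", "<job request id>"],
    ["p", "<customer pubkey>"],
    ["amount", "30000", "<bolt11 invoice>"],
  ],
  content: "https://example.com/preview-watermarked.png", // hypothetical URL
};
```

Either way the question stands: the customer only sees these competing events if it shares relays with the providers, or if something aggregates them.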
There are a lot of better things today, tbh. Disarming rusia and breaking it into pieces, making dozens of occupied nations free, is one of the examples

