https://relay.guide is where I'm trying to maintain a searchable list of relays. Next up is being able to manage your preferred relays right from here.
Just did some tweaking to my algo for measuring usage. Here's the top ten relays by zap activity.
It's crude: it takes a relay's latest 100 zap receipt events, measures the time between the earliest and the latest one, and ranks relays by that span (a smaller span means 100 zaps arrived faster, i.e. more activity).
If a relay doesn't have 100 zap receipts, it doesn't get a "usage" ranking.
Doing this in python is pretty terrible so I'm sure it's buggy, and I'm open to suggestions on ways to do this better. It was at least a fun exercise in how bad I am at async operations in python.
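For anyone curious, here's roughly the ranking logic as a pure function, separated out from the async fetching. This is my own sketch of the algorithm described above, not the actual code; the function and variable names are made up:

```python
def rank_relays_by_zap_activity(receipts_by_relay):
    """Rank relays by how quickly their latest 100 zap receipts arrived.

    receipts_by_relay: dict mapping relay URL -> list of zap-receipt
    created_at timestamps (unix seconds), as fetched from each relay.
    Relays with fewer than 100 receipts get no "usage" ranking.
    """
    spans = {}
    for relay, timestamps in receipts_by_relay.items():
        if len(timestamps) < 100:
            continue  # not enough data: no usage ranking for this relay
        latest_100 = sorted(timestamps)[-100:]
        # time between the earliest and latest of the last 100 receipts
        spans[relay] = latest_100[-1] - latest_100[0]
    # smaller span = 100 zaps in less time = more active, so sort ascending
    return sorted(spans, key=spans.get)
```

The fetching side would gather kind-9735 events (zap receipts, per NIP-57) from each relay concurrently and feed the timestamps into this; keeping the ranking pure at least makes it easy to test without touching the network.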

Here’s this week’s summary! nostr:note1wjx57nxkkpsm5dztnalkpwdtuh9kgeu0h38ma8qhf8ud6rj8z4tqye9qt4
Agreed however the curation happens will have to evolve with scale.
Agreed, but is every relay operator going to spend the hours a day (at least) to do that work?
Forcing this onto individual relay operators seems like something that will drive centralisation more than giving relay operators the ability to subscribe to blocklists maintained by third parties whose job it is to create and maintain them.
At first there will only be a few providers but people are very opinionated about how they want their content filtered (if at all) which will drive more competition and therefore decentralization around these block list providers.
I revisited the series recently and had the same thought, including all the statist pushback.
nostr:npub1qny3tkh0acurzla8x3zy4nhrjz5zd8l9sy9jys09umwng00manysew95gx inspired me to dig up my original paperback. Check out that headline quote. 2009 was a different era.
nostr:note174fj0q3l9wguengcq9vg98mpmpcpru59fj3vqrdv9erzr2qfz0rs64l3gj
I love this series. Such an imaginative story.
nostr:note1mhl4hly9ta77p2rhypsnf8numvwgg7jfm7e5fe4zpwetfpdy7rgqy8hpap
This validates what we're currently seeing with many people saying it can't happen. It's just slowly then all at once.
That’s a good question. Based on my research about the fediverse (the only close approximation to Nostr) whoever manages the servers that host and distribute content are legally liable in a variety of ways.
I linked this at the top of my proposal because it’s helpful context: https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer?ref=gregwhite.blog
The western world generally adheres to this regime of who is responsible for the distribution of illegal content. China is far less permissive, so I’m not sure if there’s any point in trying to satisfy their legal demands.
That’s an interesting strategy and I believe in your right to contribute to the community in that way. I don’t know if that’ll scale when we have millions of active users, but I hope it does.
I wanna double down on this point. I don’t wanna mandate anything or garner support for any mandates.
I wanna build a solution that I think will help and if no one adopts it then the proof will be in the pudding and it will become clear it wasn’t the right solution.
I’m predicting a future where relay operators come under threat from law enforcement and we will be scrambling for ways to continue operating under that scrutiny.
Nostr cannot scale if it remains a niche offering that can only operate in jurisdictions unreachable by the US and China.
I don’t pretend to know the right answer, but I wanna make progress on an idea and start the conversation.
Thanks for engaging so deeply. I truly respect your opinions, your feedback, and what you do for the community.
We definitely outsource that to police officers at the scale of a city or a region. But if you have a community (a church, a group of friends, etc) and they’re doing something that’s harmful to the group or going to get the group in trouble with the law, and the community doesn’t want any part in it… it’s the responsibility of the community to protect itself.
The motivation of this is to help users and communities to protect themselves from people *they* determine are bad actors in their space (their feed, their DMs, their relays for example).
This isn’t about nostr-wide moderation / censorship. That’s impossible and against the ethos of Nostr. This proposal is about giving people the tools to protect themselves from bad actors.
And I’m still not sure I’m communicating this well: I’m not trying to define what bad actors are. But I know that governments do define them, and they will go after relay operators that aren’t blocking content the governments want blocked.
Nostr relays running in very permissive jurisdictions will have the luxury of not needing to do content moderation. And luckily the internet is still fairly open, so you can connect to any relay you like from your Nostr clients. So if you want to use relays that don’t do any moderation, that’s your right!
If you want to use a relay that operates in the US, then that relay operator needs tools to make sure they can stay within the boundaries of the law. I hope that Nostr chips away at the power of states so they give up on censorship.
But what would help relay operators prevent copyrighted material from spreading via their relays (even if they disagree with those laws) will also be a useful tool to prevent the spread of child porn and other content that a vast majority of Nostr users will agree has no place in their community. Let’s give each person and each community the tools to curate their own domain.
I wanna support your right to choose relays that don’t moderate at all!
As for kids: I was thinking about allow lists instead of block lists. I also wrote about it a while back:
Should avoid some of what you see with YouTube but also not throw them to the wolves.
https://gregwhite.blog/how-to-safely-open-social-media-to-children/
Stopping the bad content is a matter of education, and every human on earth working to protect the people in their life from exploitation. It’s a generations-long effort.
but protecting Nostr can help in that effort and preserve the freedom-maximising effects Nostr can have on the world.
Couple things:
1. Agreed there is no way to stop the content. That’s not the point of the proposal; this is about helping users get it off their feeds at scale, using lists that they choose (or choosing not to use any at all).
2. I really think there’s an order to this. Users should do this primarily. Relays may have to because of their legal liability and they should do it as little as possible. And I hope clients never have to use blocklists. I believe the primary responsibility is on users to choose the kind of content they want (and want filtered out). But relays will also be forced to content moderate at some point.
I just haven’t heard a proposed solution that would help relay operators avoid getting taken down. Nostr cannot be mainstream unless we have many relays in many jurisdictions. But forcing a user-only moderation strategy will have consequences at scale.
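To make the user-chosen part concrete, here's a hypothetical sketch of what client-side filtering against chosen blocklists could look like. The names and structure are purely illustrative, not any existing NIP or implementation:

```python
def filter_feed(events, chosen_blocklists):
    """Drop events authored by any pubkey on a blocklist the user chose.

    events: list of Nostr-style event dicts with a "pubkey" field.
    chosen_blocklists: iterable of sets of pubkeys the user subscribed
    to; an empty list means the user opted out of all filtering.
    """
    # union of every list the user chose; empty input yields an empty set
    blocked = set().union(*chosen_blocklists)
    return [e for e in events if e["pubkey"] not in blocked]
```

The key property is that the client does nothing unless the user picks lists: pass no blocklists and the feed comes through untouched, which is the "choose not to use any" case above.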
You will. By deciding which if any blocklists you want to use to filter your feed.
I agree that legal content that isn’t abusively spammy shouldn’t be blocked. UNLESS the user decides for themselves to mute it.
This proposal is all about helping the user do that muting when there are tens of thousands of abusive spam bots instead of the handful we have now.
It’s not about solving the social problem, because I agree that requires education, and probably generations of it.
This is about allowing our community to isolate bad actors so they don’t poison our well.