Greg
1c52ebc82654e443f92501b7d0ca659e78b75fddcb9c5a65f168ec945698c92a
I like to learn things and build things | bitcoin enthusiast | energy production maximalist | abundance advocate

Just did some tweaking to my algo for measuring usage. Here's the top ten relays by zap activity.

It's crude: it takes the latest 100 zap receipt events, measures the time between the newest and the oldest, and ranks relays by that span (the shorter the span, the more zap activity).

If there aren't 100 zap receipts, they don't get a ranking for "usage".

Doing this in Python is pretty terrible, so I'm sure it's buggy, and I'm open to suggestions on ways to do this better. It was at least a fun exercise in how bad I am at async operations in Python.
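For the curious, the core of it looks roughly like this. This is a simplified sketch, not my actual script: the relay URLs are just examples, and I'm assuming the `websockets` library for the connections.

```python
import asyncio
import json

import websockets  # pip install websockets

async def zap_timespan(relay_url, limit=100):
    """Seconds between the newest and oldest of the latest `limit`
    zap receipts (NIP-57 kind 9735) on a relay; None if fewer exist."""
    timestamps = []
    async with websockets.connect(relay_url) as ws:
        # NIP-01 REQ: ask the relay for its most recent zap receipt events
        await ws.send(json.dumps(["REQ", "zaps", {"kinds": [9735], "limit": limit}]))
        while True:
            msg = json.loads(await asyncio.wait_for(ws.recv(), timeout=10))
            if msg[0] == "EVENT":
                timestamps.append(msg[2]["created_at"])
            elif msg[0] == "EOSE":  # relay is done sending stored events
                break
    if len(timestamps) < limit:
        return None  # not enough zap receipts to earn a "usage" ranking
    return max(timestamps) - min(timestamps)

async def rank_relays(relay_urls):
    spans = await asyncio.gather(
        *(zap_timespan(url) for url in relay_urls), return_exceptions=True)
    ranked = [(url, s) for url, s in zip(relay_urls, spans) if isinstance(s, int)]
    # Smaller span = 100 zaps in less time = more zap activity
    return sorted(ranked, key=lambda pair: pair[1])

relays = ["wss://relay.damus.io", "wss://nos.lol"]  # example relay URLs
for url, span in asyncio.run(rank_relays(relays)):
    print(f"{url}: latest 100 zap receipts span {span} seconds")
```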

Here’s this week’s summary! nostr:note1wjx57nxkkpsm5dztnalkpwdtuh9kgeu0h38ma8qhf8ud6rj8z4tqye9qt4

Agreed, but is every relay operator going to spend the hours a day (at least) to do that work?

Forcing this onto individual relay operators seems more likely to drive centralization than giving relay operators the ability to subscribe to blocklists maintained by third parties whose job it is to create and maintain them.

At first there will only be a few providers, but people are very opinionated about how they want their content filtered (if at all), which will drive more competition, and therefore decentralization, among these blocklist providers.
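To make that concrete: assuming a provider publishes its list as a NIP-51-style mute list (kind 10000, which is just an assumption on my part about the format), a relay operator could subscribe by fetching the provider's latest list and rejecting events from the pubkeys it names. A rough sketch, with a placeholder provider pubkey and an example relay URL:

```python
import asyncio
import json

import websockets  # pip install websockets

# Placeholder: hex pubkey of a hypothetical third-party list provider
LIST_PROVIDER = "<provider-pubkey-hex>"

async def fetch_blocklist(relay_url, provider_pubkey):
    """Fetch the provider's NIP-51 mute list (kind 10000) and return
    the set of pubkeys it blocks (its 'p' tags)."""
    blocked = set()
    async with websockets.connect(relay_url) as ws:
        await ws.send(json.dumps(
            ["REQ", "blk", {"kinds": [10000], "authors": [provider_pubkey], "limit": 1}]))
        while True:
            msg = json.loads(await asyncio.wait_for(ws.recv(), timeout=10))
            if msg[0] == "EVENT":
                blocked.update(tag[1] for tag in msg[2]["tags"] if tag[0] == "p")
            elif msg[0] == "EOSE":  # latest stored list has arrived
                break
    return blocked

def accept_event(event, blocked):
    """Relay-side write policy: reject events from blocked pubkeys."""
    return event["pubkey"] not in blocked

blocked = asyncio.run(fetch_blocklist("wss://relay.damus.io", LIST_PROVIDER))
```

Since kind 10000 is a replaceable event, fetching with limit 1 always returns the provider's latest version, so the relay just re-fetches periodically.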

I revisited the series recently and had the same thought, including all the statist pushback.

Replying to Shawn

nostr:npub1qny3tkh0acurzla8x3zy4nhrjz5zd8l9sy9jys09umwng00manysew95gx inspired me to dig up my original paperback. Check out that headline quote. 2009 was a different era. nostr:note174fj0q3l9wguengcq9vg98mpmpcpru59fj3vqrdv9erzr2qfz0rs64l3gj

I love this series. Such an imaginative story.

That’s a good question. Based on my research about the fediverse (the only close approximation to Nostr), whoever manages the servers that host and distribute content is legally liable in a variety of ways.

I linked this at the top of my proposal because it’s helpful context: https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer?ref=gregwhite.blog

The western world generally adheres to this regime of who is responsible for the distribution of illegal content. China is far less permissive, so I’m not sure there’s any point in trying to satisfy their legal demands.

That’s an interesting strategy and I believe in your right to contribute to the community in that way. I don’t know if that’ll scale when we have millions of active users, but I hope it does.

I wanna double down on this point. I don’t wanna mandate anything or garner support for any mandates.

I wanna build a solution that I think will help and if no one adopts it then the proof will be in the pudding and it will become clear it wasn’t the right solution.

I’m predicting a future where relay operators come under threat from law enforcement and we will be scrambling for ways to continue operating under that scrutiny.

Nostr cannot scale if it remains a niche offering that can only operate in jurisdictions unreachable by the US and China.

I don’t pretend to know the right answer, but I wanna make progress on an idea and start the conversation.

Thanks for engaging so deeply. I truly respect your opinions, your feedback, and what you do for the community.

We definitely outsource that to police officers at the scale of a city or a region. But if you have a community (a church, a group of friends, etc.) and someone in it is doing something that’s harmful to the group or going to get the group in trouble with the law, and the community doesn’t want any part in it, it’s the responsibility of the community to protect itself.

The motivation here is to help users and communities protect themselves from people *they* determine are bad actors in their space (their feed, their DMs, their relays, for example).

This isn’t about nostr-wide moderation / censorship. That’s impossible and against the ethos of Nostr. This proposal is about giving people the tools to protect themselves from bad actors.

And I’m still not sure I’m communicating this well: I’m not trying to define what bad actors are. But I know that governments do, and they will go after relay operators that aren’t blocking content the governments want blocked.

Nostr relays running in very permissive jurisdictions will have the luxury of not needing to do content moderation. And luckily the internet is still fairly open, so you can connect to any relay you like from your Nostr clients. If you want to use relays that don’t do any moderation, that’s your right!

If you want to use a relay that operates in the US, then that relay operator needs tools to make sure they can stay within the boundaries of the law. I hope that Nostr chips away at the power of states so they give up on censorship.

But whatever helps relay operators prevent copyrighted material from spreading via their relays (even if they disagree with those laws) will also be a useful tool for preventing the spread of child porn and other content that a vast majority of Nostr users will agree has no place in their community. Let’s give each person and each community the tools to curate their own domain.

Replying to Wonteet Zebugs

I hope we never get censorship or any form of blocking at the relay level. I will be more than happy to pay for relays (wherever they may be) that DO NOT censor or block any content. All blocking, muting, and censorship should be done at the client level.

If the task becomes too heavy for a client-side app, we could have beefier "personal" private relays that do that for us, and our client-side app gets its data only from that personal relay (which would then rebroadcast our notes).

Same goes for email: I choose to use servers that do not block any spam. I can do that for myself in my email client. I don't want someone else deciding for me.

All moderation/censorship lists should be private lists and I would stay away from any list from any of the current major social media players. If I wanted that censorship, I'd be on those platforms.

I would simply recommend that it be easy to share private lists between nostr users.

Kid-friendly moderation scares me if it's done by the likes of YouTube. They actually *target* kids. I've seen it in action.

Small kids probably shouldn't be on social media, period. But if one wants them to be on Nostr, I would like to have a personal, private whitelist of people they can follow (family and family friends), with the client app limiting what they can see to only what those few people share. Even then, I really don't think young developing minds should be putting out their every thought for all to see, for all eternity, with no possibility of taking any of it back.

There might be a case for only allowing a watch-only login (npub, no private key) for young kids. It might be a useful education tool which would provide different (and timely) topics of conversation with our kids.

Just my two sats.

I wanna support your right to choose relays that don’t moderate at all!

As for kids: I was thinking about allow lists instead of blocklists. I also wrote about it a while back.

That should avoid some of what you see with YouTube while also not throwing them to the wolves.

https://gregwhite.blog/how-to-safely-open-social-media-to-children/

Couple things:

1. Agreed, there is no way to stop the content. That’s not the point of the proposal; this is about helping users get it off their feeds at scale, using lists that they choose (or choosing not to use any).

2. I really think there’s an order to this. The primary responsibility is on users to choose the kind of content they want (and want filtered out). Relays may have to moderate because of their legal liability, and they should do it as little as possible. And I hope clients never have to use blocklists. But relays will be forced to do content moderation at some point.

I just haven’t heard a proposed solution that will help relay operators avoid getting taken down. Nostr cannot be mainstream unless we have many relays in many jurisdictions, but forcing a user-only moderation strategy will have consequences at scale.

You will, by deciding which blocklists (if any) you want to use to filter your feed.

It’s not about solving the social problem; I agree that requires education, and probably generations of it.

This is about allowing our community to isolate bad actors so they don’t poison our well.