Or have I misunderstood your words or do you wish to defend freedom with censorship? Content that you think is illegal others may find legal.


Discussion

I’m always specifically speaking about child sexual abuse material, child sexual exploitation material, AI-generated child sexual abuse material, and the sale of humans for labor or sex using force, fraud, or coercion. All of these are violations of the NAP and human rights violations globally.

These issues specifically put the builders and others in an extremely vulnerable position. When you fully understand what is at stake and the scale of the threat, you look for ways to protect the right to speak freely.

No government wants anything like Nostr. Expect any and all attacks at their disposal. That’s just the governments.

This is an adult conversation about the reality that the world has evil parts and there are solutions available that don’t silence anyone’s voice across the protocol while protecting their opportunity to have a voice.

Very few are able to have this type of conversation. Anyone who can’t have this specific discussion either doesn’t understand (no shade, I don’t know everything), is disingenuous, or is larping because they want freedom without the responsibility of protecting it.

The beautiful part about Nostr is that if you want Wild West clients there is nothing stopping you from that.

I haven’t spoken to a creator of a client yet that isn’t concerned and aggressively looking for solutions. They have skin in the game. So that’s why there is a conversation about possible solutions.

As you said here...

"The beautiful part about Nostr is that if you want Wild West clients there is nothing stopping you from that."

There is no way to stop this type of content.

Censoring, in my opinion, is a futile effort. Education is the way.

For the record, there are already clients that do moderation today: ZBD and Primal.

A couple of things:

1. Agreed, there is no way to stop the content. That’s not the point of the proposal; this is about helping users get it off their feeds at scale, using lists that they choose (or choose not to use at all).

2. I really think there’s an order to this. Users should do this primarily; relays may have to because of their legal liability, and they should do it as little as possible; and I hope clients never have to use blocklists. I believe the primary responsibility is on users to choose the kind of content they want (and want filtered out), but relays will also be forced to moderate content at some point.
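The opt-in, user-chosen-list idea in point 1 could be sketched roughly like this (a minimal sketch; the dict-based event shape and the list format are illustrative assumptions, not any real Nostr library’s API):

```python
# Minimal sketch of opt-in, client-side block lists. The event shape and
# list format are illustrative assumptions, not a real API.

def filter_feed(events, chosen_lists):
    """Drop events whose author appears on any block list the user opted into.

    events: iterable of dicts with a "pubkey" field (the author)
    chosen_lists: iterable of sets of blocked pubkeys; an empty iterable
    means the user opted out of filtering entirely.
    """
    blocked = set().union(*chosen_lists) if chosen_lists else set()
    return [e for e in events if e["pubkey"] not in blocked]

feed = [
    {"pubkey": "npub_alice", "content": "gm"},
    {"pubkey": "npub_spammer", "content": "spam"},
]
print(filter_feed(feed, [{"npub_spammer"}]))  # only npub_alice's note remains
print(filter_feed(feed, []))                  # no lists chosen: nothing filtered
```

The key property is that the filter lives entirely in the client and is off by default, so nothing is removed from the protocol itself.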

I just haven’t heard a proposed solution that will help relay operators avoid getting taken down. Nostr cannot be mainstream unless we have many relays in many jurisdictions, and forcing a user-only moderation strategy will have consequences at scale.

Stopping the bad content is a matter of education, and every human on earth working to protect the people in their life from exploitation. It’s a generations-long effort.

But protecting Nostr can help in that effort and preserve the freedom-maximising effects Nostr can have on the world.

Okay, looks like we've come to an agreement that it's not possible to stop any kind of content in #nostr

If the intention is a cleaner feed for new users who don’t want to, or for whatever reason can’t, be their own algorithm, we have to look at filter.nostr.wine. I use it to access global; it brings me notes from the people I follow and the people they follow, a great service that keeps my global feed clean. I always recommend this service to help those who are arriving. What we need is more options for this type of service. Not blocks, simply because blocking won’t work.
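The follows-of-follows filtering described above can be approximated with a simple two-hop walk of the follow graph (a sketch under the assumption that the graph is available as a plain dict; a real service like filter.nostr.wine would presumably build it from contact lists on relays):

```python
def allowed_pubkeys(me, follows):
    """Return the set of pubkeys one or two follow-hops away from `me`.

    follows: dict mapping a pubkey to the set of pubkeys it follows.
    """
    first_hop = follows.get(me, set())
    allowed = set(first_hop)
    for pk in first_hop:
        allowed |= follows.get(pk, set())  # second hop: who my follows follow
    return allowed

def clean_global(events, me, follows):
    """Keep only notes whose authors are within two hops of the user."""
    ok = allowed_pubkeys(me, follows)
    return [e for e in events if e["pubkey"] in ok]

follows = {"me": {"alice"}, "alice": {"bob"}}
events = [{"pubkey": "alice"}, {"pubkey": "bob"}, {"pubkey": "stranger"}]
print(clean_global(events, "me", follows))  # alice and bob pass; stranger does not
```

Because the filter is relative to each user’s own follow graph, no single party decides what is visible globally.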

There's one thing that intrigued me, why do you think that at some point relays will be forced to moderate content?

child porn.

That's a good reason, but if we haven't been able to eliminate this type of content on centralized platforms even today, I don't believe we'll be able to on a decentralized protocol. But yeah, I wish my relays didn't feed me that content. And I want transparency: I want to know what content is censored, and a way to check it.

here’s the thing…

even if child sex trafficking weren’t a multi-billion-dollar industry (which makes me want to go on a Rambo outing), the FBI would plant child porn on the network at some point, or “terrorist communications,” to justify whatever they wanted to do.

Nostr is a sitting duck until it solves for these things. Unfortunately, short sighted soy devs who also want to censor the world for their fefes, make up the majority of the “moderation” crowd. So thinking through how to deal with this in a new way has to happen.

Both of the easy ways of dealing with this, censorship or removal of anonymity, are things the corrupt State wants. And as soon as the door is open, the Feds will come in.

The problem comes from dealing with it using centralized power. So the question becomes: how can it be dealt with at the individual level without changing relays’ simplicity, or making all clients censorship tools of a corrupt state and of groups of crying soy devs?

How does one censor content without censoring content?

It’s a real question…

I think we see the problem the same way. I’m open to more ideas!

we are all looking for ideas.

it concerns me that most solutions aren’t simple, and the temptation toward hero-position-seeking behavior around this is strong.

I’ve always been partial to Slashdot’s mod point system and you can surf at whatever level you want.

I’m not talking about optional levels of engagement. If that were the discussion, I wouldn’t be engaging in this conversation.

My concern is purely about illegal content, and how that could be used by the censorship industrial complex to trojan horse nostr.

The best way to mitigate that is to come up with a system that handles that problem without offering the Feds an attack surface they can use in other ways. Which is no small feat.

It just dawned on me that keeping media delivery separate is the key.

Clients could incorporate external media delivery services like nostr.build, and those services would bear the brunt of dealing with illegal content. If people then used external links to bypass media delivery systems, that is outside the control of Nostr and outside of client developers’ responsibility. That is the purview of the Feds.

If then, a media delivery service got weird and started censoring for reasons outside of illegality, they could be replaced, or bypassed by posting external links, preventing full centralized censorship.

If a client starts moving past this, and centralizes censorship for arbitrary reasons, they can be replaced.
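The separation described above could look something like this inside a client: notes carry only URLs, and the client resolves each URL against a replaceable list of known media hosts (the host list and routing function are hypothetical; nostr.build is named only because it’s mentioned above):

```python
from urllib.parse import urlparse

# Hypothetical, replaceable list of media services the client integrates
# with. If one starts censoring for reasons outside illegality, swap it out.
PREFERRED_HOSTS = {"nostr.build"}

def route_media(url, preferred=PREFERRED_HOSTS):
    """Decide who is responsible for a linked piece of media.

    Known media services bear the brunt of handling illegal content;
    arbitrary external links are outside the client's purview.
    """
    host = urlparse(url).netloc
    if host in preferred:
        return ("media_service", url)
    return ("external_link", url)

print(route_media("https://nostr.build/i/abc.jpg"))  # handled by the media service
print(route_media("https://example.com/x.jpg"))      # external: not the client's job
```

Because the routing table is just client configuration, no single media host ever becomes a mandatory chokepoint.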

That could do it 🤔 but then how would they deal with it?

That would really be up to them.

They could use reporting, AI, and even go so far as to tap into the GIFCT database…

That’s a good question. Based on my research about the fediverse (the only close approximation to Nostr) whoever manages the servers that host and distribute content are legally liable in a variety of ways.

I linked this at the top of my proposal because it’s helpful context: https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer?ref=gregwhite.blog

The Western world generally adheres to this regime of who is responsible for the distribution of illegal content. China is far less permissive, so I’m not sure there’s any point in trying to satisfy their legal demands.

That’s an interesting strategy and I believe in your right to contribute to the community in that way. I don’t know if that’ll scale when we have millions of active users, but I hope it does.

I wanna double down on this point. I don’t wanna mandate anything or garner support for any mandates.

I wanna build a solution that I think will help and if no one adopts it then the proof will be in the pudding and it will become clear it wasn’t the right solution.

I’m predicting a future where relay operators come under threat from law enforcement and we will be scrambling for ways to continue operating under that scrutiny.

Nostr cannot scale if it remains a niche offering that can only operate in jurisdictions unreachable by the US and China.

I don’t pretend to know the right answer, but I wanna make progress on an idea and start the conversation.

Thanks for engaging so deeply. I truly respect your opinions, your feedback, and what you do for the community.

That's the magic of #nostr we have different views for the same problem and that's ok, here we can debate freely. 🫂

I actually deeply agree with this. Unfortunately, education and awareness don’t always stop bad things from happening in the world. Greg is making suggestions for when the human rights violations are in process or have already occurred.

You down to work on a program to educate and raise awareness among youth about internet safety? That’s the wave that I’m on because I don’t work in tech. 💜

I would be happy to help on a project like this. I'm a dev and I can help you with anything you need in that regard.