I totally respect this position.
How many npubs a day would you be willing to mute? I’m just asking: would you be willing to spend an hour a day or more muting bots and illegal content? Just curious.
Currently I mute about three per day, as needed, but I would do more. Freedom is more important to me, and my country is moving toward an authoritarian stance, so the need to get unfiltered information trumps my personal preferences.
You didn’t answer my question, how many a day would you be willing to mute?
Example: Would you spend 2 hrs a day doing it to see one human, non-illegal post about an actual news article from an independent journalist about Trump?
I said I'd do more. I have yet to mute anything but bots, as I primarily use paid relays.
Yes I'd mute whatever I needed. I'm not going to hand control to any entity
But you won’t need to by design, since man over machine makes the decision a man’s privilege and responsibility.
Entities only provide a temporal causal fact of the unknown.
Ok, so if some of the suggestions above, or something similar, isn’t implemented, you hand over control of Nostr to bad actors at some point. You will lose that freedom.
Once you have freedom online or offline, you have to keep it. You have to defend it.
You are talking about getting freedom but not keeping it. A few mutes a day is not defending or protecting freedom on a large scale.
The conversation above is about protecting it and keeping it on a large scale.
You won’t be forced or coerced into defending freedom here. It will become unusable for you personally if you opt out of different types of defense and protections when it grows. That’s the conversation that we are having. “We have freedom, how do we protect it.”
Or have I misunderstood your words? Do you wish to defend freedom with censorship? Content that you think is illegal, others may find legal.
I’m always specifically speaking about child sexual abuse material, child sexual exploitation material, AI-generated child sexual abuse material, and the sale of humans for labor or sex using force, fraud, or coercion. All violations of the NAP and human rights violations globally.
These issues specifically put the builders and others in an extremely vulnerable position. When you fully understand what is at stake and the scale of the threat, you look for ways to protect the right to speak freely.
No government wants anything like Nostr. Expect any and all attacks at their disposal. That’s just the governments.
This is an adult conversation about the reality that the world has evil parts and there are solutions available that don’t silence anyone’s voice across the protocol while protecting their opportunity to have a voice.
Very few are able to have this type of conversation. Anyone who can’t have this specific discussion either doesn’t understand (no shade, I don’t know everything either), is disingenuous, or is LARPing because they want freedom without the responsibility of protecting it.
The beautiful part about Nostr is that if you want Wild West clients there is nothing stopping you from that.
I haven’t spoken to a creator of a client yet that isn’t concerned and aggressively looking for solutions. They have skin in the game. So that’s why there is a conversation about possible solutions.
As you said here...
"The beautiful part about Nostr is that if you want Wild West clients there is nothing stopping you from that."
There is no way to stop this type of content.
Censoring, in my opinion, is a futile effort. Education is the way.
Incidentally, there are already clients that do moderation today: ZBD and Primal.
A couple of things:
1. Agreed there is no way to stop the content. That’s not the point of the proposal, this is about helping users to get it off their feed at scale, using lists that they choose (or choose not to use any)
2. I really think there’s an order to this. Users should do this primarily; relays may have to because of their legal liability, and they should do it as little as possible; and I hope clients never have to use blocklists. I believe the primary responsibility is on users to choose the kind of content they want (and want filtered out). But relays will also be forced to moderate content at some point.
I just haven’t heard a proposed solution that will help relay operators avoid getting taken down. Nostr cannot be mainstream unless we have many relays in many jurisdictions, but forcing a user-only moderation strategy will have consequences at scale.
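The user-first model described above can be sketched in a few lines: the client pulls in whichever blocklists the user has opted into (zero or more) and filters events locally, so relays still carry everything. This is only an illustrative Python sketch; the function names and npub strings are hypothetical, not a real Nostr client API.

```python
# Illustrative sketch of opt-in, client-side blocklist filtering.
# All names here are hypothetical, not a real Nostr client API.

def load_chosen_blocklists(list_ids, fetch_list):
    """Union the muted pubkeys from every list the user opted into."""
    muted = set()
    for list_id in list_ids:
        muted |= set(fetch_list(list_id))
    return muted

def filter_feed(events, muted_pubkeys):
    """Drop events locally; relays still carry everything."""
    return [e for e in events if e["pubkey"] not in muted_pubkeys]

# Example: a user who subscribes to one community-maintained list.
lists = {"spam-bots": ["npub_bot1", "npub_bot2"]}
muted = load_chosen_blocklists(["spam-bots"], lambda i: lists[i])
feed = [{"pubkey": "npub_alice", "content": "gm"},
        {"pubkey": "npub_bot1", "content": "buy now"}]
print(filter_feed(feed, muted))  # only npub_alice's note remains
```

The point of the design is in the first argument: a user who passes an empty list of list IDs gets a completely unfiltered feed, which is the "choose not to use any" case.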
Stopping the bad content is a matter of education, and every human on earth working to protect the people in their life from exploitation. It’s a generations-long effort.
but protecting Nostr can help in that effort and preserve the freedom-maximising effects Nostr can have on the world.
Okay, looks like we've come to an agreement that it's not possible to stop any kind of content in #nostr
If the intention is to help new users who don't want to be their own algorithm, or for whatever reason can't, get a cleaner feed, we should look at filter.nostr.wine. I use it to access global; it brings me notes from people I follow and people they follow. It's a great service that keeps my global feed clean, and I always recommend it to those who are arriving. What we need is more options for this type of service. Not blocks, simply because blocks won't work.
There's one thing that intrigued me, why do you think that at some point relays will be forced to moderate content?
child porn.
That's a good reason, but if we haven't been able to eliminate this type of content on centralized platforms even today, I believe we won't be able to on a decentralized protocol. But yeah, I wish my relays didn't feed me that content. Still, I want transparency: I want to know what content is censored and a way to check.
here’s the thing…
even if child sex trafficking weren’t a multi-billion-dollar industry (which makes me want to go on a Rambo outing), the FBI would plant child porn on the network at some point. Or “terrorist communications” to justify whatever they wanted to do.
Nostr is a sitting duck until it solves for these things. Unfortunately, short-sighted soy devs who also want to censor the world for their feefees make up the majority of the “moderation” crowd. So thinking through how to deal with this in a new way has to happen.
The easy ways of dealing with this, censorship or removal of anonymity, are exactly what the corrupt State wants. And as soon as the door is open, the Feds will come in.
The problem comes from dealing with it using centralized power. So the question becomes: how can it be dealt with at the individual level without changing relays’ simplicity, or making all clients censorship tools of a corrupt state and groups of crying soy devs?
How does one censor content without censoring content?
It’s a real question…
I think we see the problem the same way. I'm open to more ideas!
we are all looking for ideas.
it concerns me that most solutions aren’t simple. And the temptation for hero-position-seeking behavior around this is strong.
I’ve always been partial to Slashdot’s mod point system and you can surf at whatever level you want.
I’m not talking about optional levels of engagement. if that were the discussion, i wouldn’t be engaging in this conversation.
My concern is purely about illegal content, and how that could be used by the censorship industrial complex to trojan horse nostr.
The best way to mitigate that is to come up with a system that handles that problem without offering the Feds an attack surface they can use in other ways. Which is no small feat.
It just dawned on me that keeping media delivery separate is the key.
Clients could incorporate external media delivery services like nostr.build, and those services would bear the brunt of dealing with illegal content. If people then used external links to bypass media delivery systems, that would be outside the control of Nostr and outside client developers' responsibility. That is the purview of the Feds.
If then, a media delivery service got weird and started censoring for reasons outside of illegality, they could be replaced, or bypassed by posting external links, preventing full centralized censorship.
If a client starts moving past this, and centralizes censorship for arbitrary reasons, they can be replaced.
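The separation described above boils down to one swappable indirection in the client. Here is a toy Python sketch of that idea; everything in it is hypothetical (no real nostr.build API is assumed, and `media.example` is a placeholder).

```python
# Toy sketch of "keep media delivery separate": the client routes
# media through a pluggable delivery service, which bears the
# moderation burden and can itself be swapped or bypassed.
# All URLs and names are placeholders, not a real API.

DEFAULT_SERVICE = "https://media.example/fetch?url="

def resolve_media(url, service=DEFAULT_SERVICE):
    """Route a media URL through the chosen delivery service.
    Passing service=None bypasses it entirely (a plain external
    link, outside the client's responsibility)."""
    if service is None:
        return url
    return service + url

# Routed through the delivery service:
print(resolve_media("https://pics.example/cat.jpg"))
# -> https://media.example/fetch?url=https://pics.example/cat.jpg

# Bypassed via a direct external link:
print(resolve_media("https://pics.example/cat.jpg", service=None))
# -> https://pics.example/cat.jpg
```

Because `service` is just a parameter, a delivery service that starts censoring for arbitrary reasons can be replaced with another value, or dropped with `None`, which is the replaceability property the argument above depends on.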
That’s a good question. Based on my research about the fediverse (the only close approximation to Nostr), whoever manages the servers that host and distribute content is legally liable in a variety of ways.
I linked this at the top of my proposal because it’s helpful context: https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer?ref=gregwhite.blog
The Western world generally adheres to this regime of who is responsible for the distribution of illegal content. China is far less permissive, so I’m not sure if there’s any point in trying to satisfy their legal demands.
That’s an interesting strategy and I believe in your right to contribute to the community in that way. I don’t know if that’ll scale when we have millions of active users, but I hope it does.
I wanna double down on this point. I don’t wanna mandate anything or garner support for any mandates.
I wanna build a solution that I think will help and if no one adopts it then the proof will be in the pudding and it will become clear it wasn’t the right solution.
I’m predicting a future where relay operators come under threat from law enforcement and we will be scrambling for ways to continue operating under that scrutiny.
Nostr cannot scale if it remains a niche offering that can only operate in jurisdictions unreachable by the US and China.
I don’t pretend to know the right answer, but I wanna make progress on an idea and start the conversation.
Thanks for engaging so deeply. I truly respect your opinions, your feedback, and what you do for the community.
That's the magic of #nostr we have different views for the same problem and that's ok, here we can debate freely. 🫂
I actually deeply agree with this. Unfortunately education and awareness doesn’t always stop bad things from happening in the world. Greg is making suggestions for when the human rights violations are in process or have already occurred.
You down to work on a program to educate and raise awareness to the youth about internet safety? That’s the wave that I’m on because I don’t work in tech. 💜
I would be happy to help on a project like this. I'm a dev and I can help you with anything you need in that regard.
You sound like you want to be the arbiter of truth. Who will be deciding what is to be seen?
You will. By deciding which if any blocklists you want to use to filter your feed.
Sorry I was trying to reply to eliza. I think we agreed on client side moderation. Sounds like she is advocating for more than this
Nah.
I’m an an-cap, so I have these types of conversations regularly. Child abusers and those who violate the NAP wouldn’t be allowed in my community.
Like adults we would discuss solutions to keep the community safe.
You can have your community in ancapistan and do whatever. Your community would not be my problem.
Now put that online and that’s the conversation that I’m having.
That’s freedom. You choose what you prefer and I’ll choose what I prefer. Don’t use a client that has any safeguards etc if that’s what you prefer. If that’s your personal preference this conversation has nothing to do with you.
I wish you the best of luck with your version of freedom. Truly. 💜
You replied to my response to a dev. Don't tell me that a conversation you chimed in on has nothing to do with me. This is the part that makes me think you're just here to try and gain a position of power. Authoritarians make me sick.