Why do you think Nostr relays and Damus have to comply with moderation requirements?


Discussion

Compliance aside, labels for explicit content would be nice to have in general.

Different accounts, blur, and circle of trust fix this, no?

We can’t micromanage content and if we allow the AI bots to do it, we’re creating a pretty awful world for ourselves as human beings.

Yeah, that’s all great, but if I’m scrolling through global while in public, click a blurred image, and a pair of tits pops up on my screen, it’s pretty embarrassing.

bookmark for later

we’ve all been there, it’s awkward, so I don’t do it

We've seen that bots can be trained in different ways. I could see trusting a bot to have a voice in how content is moderated, and then having humans spot-check the bots and deal with problems that get escalated. Bots are really the only way to do this at the scale that will be needed.

Everyone has their list of things they'd like filtered or warned about. CSAM is usually the one most people agree on. Explicit adult content isn't on my list. Which is kinda my point - it's about individual preferences.
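That "individual preferences" model is roughly what Nostr's content-warning tag (NIP-36) enables: authors label their own events, and each client decides what to blur based on the user's own list. A minimal sketch, assuming NIP-01-style event dicts (the example events and the blur-list are hypothetical):

```python
# Sketch of client-side filtering based on NIP-36 "content-warning" tags.
# Each user keeps their own blur list; nothing is enforced server-side.

def warning_reason(event):
    """Return the content-warning reason if the event carries one, else None."""
    for tag in event.get("tags", []):
        if tag and tag[0] == "content-warning":
            return tag[1] if len(tag) > 1 else ""
    return None

def should_blur(event, blurred_reasons):
    """Blur only the categories this particular user opted to blur."""
    reason = warning_reason(event)
    if reason is None:
        return False
    # An empty reason means "flagged, but uncategorized" -- blur to be safe.
    return reason == "" or reason in blurred_reasons

# Hypothetical events for illustration:
explicit = {"kind": 1, "content": "...", "tags": [["content-warning", "nudity"]]}
plain = {"kind": 1, "content": "gm", "tags": []}

print(should_blur(explicit, {"nudity"}))  # True
print(should_blur(plain, {"nudity"}))     # False
```

The point is that the label travels with the event, while the decision stays on the user's device.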

Legally? I guess at some point. Especially in the EU they are cracking down HARD on social media. It's insanity around here

Why at some point? Clients are more like browsers and not social media sites, right?

Doesn't matter what it technically is. If it looks like social media, labels itself like social media and is used like social media, then it's gonna be regulated the same way.

Don't get me wrong. I don't think it should be regulated. I'm just concerned that if nostr grows to a certain size, the same bullshit rules they are trying to enforce on other social media will also apply

If you see regulation as inevitable, where do you see this all ending up?

Personally, Nostr gives me hope. If we can come up with an individualistic, culturally neutral, bottom-up approach to content moderation and show that it can work, it's a framework that could be put into law. Other aspects of the tech world could be brought into the picture as well. For example, the IETF's Privacy Pass could be used to validate device-based content preferences and pass them on to sites, which would allow parents to "moderate" what happens on their kids' phones.

I really love that Nostr exists and we can have these discussions and possibly set an example of how things should be done. It's not something corporate America would ever manage to do properly.

Same here.

Also, with the possibility to pick which clients we use and relays we connect to, we can decide in which social circles we want to be.

Wanna have Twitter 2.0? There are gonna be apps/relays for that

Maybe something filtered for teenagers? Sure thing

How about no filtering at all? There is surely gonna be someone who is going to host such a relay

Everyone can make nostr what they want it to be
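The "pick your own slice of the network" idea above falls straight out of NIP-01 subscriptions: the same client can send different REQ filters to different relays. A sketch, where the subscription IDs and the author key are purely illustrative:

```python
import json

# Sketch: per NIP-01, a client subscribes with ["REQ", <sub_id>, <filter>].
# Different relays can be given different filters for different "feeds".

def req_message(sub_id, **filters):
    """Build a NIP-01 REQ message as a JSON string."""
    return json.dumps(["REQ", sub_id, filters])

# A curated feed: only text notes (kind 1) from a follow list
# (hypothetical pubkey shown).
curated = req_message("feed", kinds=[1], authors=["abc123"], limit=50)

# A firehose from an unfiltered relay: all recent text notes, no author filter.
firehose = req_message("global", kinds=[1], limit=50)

print(curated)
print(firehose)
```

Which message goes to which relay is entirely the user's (or client's) choice, which is where the "Twitter 2.0 vs. no filtering at all" split actually lives.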

In Utah and Arkansas social media sites with >100,000 users from that state now have to do 3rd party age verification for ALL users from that state.

There are all sorts of questions about how that might or might not apply to Nostr. But it's just one of many things that can trip Nostr up legally.

(For the record I think age verification is a horrible idea on many levels. But the laws exist, so you can't ignore them.)

Just to be clear I'm not saying relays or Damus or whatever have to do moderation on every possible category that someone somewhere might want to moderate. Only that there are certain minimal levels of moderation that are required by law.

Relays need to comply with the laws where their server is located and where the owner has a business presence (or where they're a resident/citizen if it's owned by an individual).

Damus and other iOS apps have to comply with Apple's rules - which are based primarily on US/EU law. That might change somewhat when Apple introduces support for other app stores. But given that it's the EU mandating that change I can't imagine those app stores will be legal free-for-alls.

And I'd add that anyone who runs a web client on their domain has to comply with certain laws as well, or they also risk problems. They can be sued, they can be charged with crimes, their domain can be seized.

It'll be interesting to see how it all plays out legally. I expect you're going to hear the Client apps/applications say "that's not my data - I got it from a bunch of relays - in fact most of those relays were chosen by the user - not by us". Meanwhile the relays will say "I didn't present the information to the user. I gave it to the Client app/application - it was the Client's job to figure out what to present and what to ignore." There are some really interesting legal questions presented by Nostr…

You don't want to be the guy who gets sued or prosecuted first. Lawyers are expensive. Let someone else be the guinea pig.

Damus is better thought of as a web browser than a website.

Relay defaults might come under that scrutiny, sure, but ultimately it will be relays that have to answer those legal questions.

(Nudge nudge, run your relay over Tor, for goodness' sake)
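For the curious, exposing a relay as a Tor onion service is mostly a torrc change. A minimal sketch, assuming the relay listens locally on port 8080 (the directory path and ports are illustrative):

```text
# Illustrative torrc fragment: publish a local Nostr relay as an onion service.
HiddenServiceDir /var/lib/tor/nostr-relay/
HiddenServicePort 80 127.0.0.1:8080
```

After restarting Tor, the generated hostname file in HiddenServiceDir holds the .onion address, and clients connecting over Tor would use ws://<onion-address>/ as the relay URL.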