What’s acceptable is subjective, but generally not entirely individual. We tend to listen to our peers a lot to learn how we feel about things.

I was talking to a friend who at the time was a member of parliament for the Pirate Party in Iceland. He said, jokingly, that he wanted to pass a law forcing Facebook to allow full nudity for users in Iceland, and to ban anything to do with guns or violence, including Hollywood material that wasn’t real.

This wasn’t serious. It was a thought experiment. Why did Icelandic users, in Iceland, talking to other people in Iceland, have to conform to the content standards of a California corporation?

If Germany can pass a law banning Nazi symbols and companies all over the world enforce it upon users in Germany, why shouldn’t other countries do something similar? What about forcing companies to allow content they’d otherwise block?

Anyway, on a more practical level, for nostr, I think users should be able to self-declare content warnings for individual posts or their entire feed, using a tagging system (I believe Tumblr does this). So if I decide I don’t agree with Icelandic social norms and don’t want to see nudity or sexual content, I can filter on those tags. But the Icelandic nostriches (is that what we’re called?) might not self-declare warnings, so we clearly need to be able to label other people’s posts or feeds as well.

I think apps should then only trust those labels based on the social graph. They shouldn’t act on a random person’s content warning, but if it’s somebody you follow, you probably trust them enough to use their warnings.
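A minimal sketch of that idea in Python, assuming the `["content-warning", <reason>]` tag convention Nostr clients commonly use for sensitive content; the function names, event shape, and "muted topics" concept here are illustrative, not any client’s actual API:

```python
# Hypothetical sketch: hide/blur events based on content warnings,
# trusting a warning only if the pubkey that declared it is someone
# the viewer follows (the social-graph trust rule described above).

def content_warnings(event: dict) -> list[str]:
    """Collect reasons from any content-warning tags on an event."""
    return [tag[1] for tag in event.get("tags", [])
            if tag and tag[0] == "content-warning" and len(tag) > 1]

def should_blur(event: dict, follows: set[str], muted_topics: set[str]) -> bool:
    """Blur only when a followed pubkey declared a warning that
    matches one of the viewer's muted topics; ignore strangers."""
    if event.get("pubkey") not in follows:
        return False  # a random person's label is not trusted
    return any(w in muted_topics for w in content_warnings(event))

# A note self-labeled "nudity" by someone the viewer follows:
event = {"pubkey": "alice", "tags": [["content-warning", "nudity"]]}
print(should_blur(event, follows={"alice"}, muted_topics={"nudity"}))  # True
```

The same check works for third-party label events: the `pubkey` would then be the labeler rather than the author, and the follow-list test still decides whether the label is honored.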


Discussion

Thank you for your thorough response; as your example shows, it gets complicated. How am I supposed to delete sexual photos on Valentine’s Day? Even if some consider them ‘vulgar’, others may think they’re funny, or art… Obviously I am currently constrained by the laws of the USA, which give us a lot more flexibility than Facebook or Twitter, but still constrain us in some ways…

We are having a session at #nostrica to discuss all of this. Ultimately I agree that it should be based on community feedback, but my platform is a bit different in that there is no context behind the images, and they are coming from all over the world… Is it an underwear model, or someone being exploited? Is someone supporting this swastika, talking about how terrible it is, or showing a traditional symbol that’s been around long before Hitler…

Once we expand nostr.build beyond AWS and US law, it will get even more interesting. I absolutely do not want nostr.build to turn into some porn/shock photo/borderline illegal thing… I just want it to be free from ads and manipulation…

I've been working in the adult industry for nearly 15 years. I'd love to work with you to figure out how to deal with adult content. @npub1wnwwcv0a8wx0m9stck34ajlwhzuua68ts8mw3kjvspn42dcfyjxs4n95l8 (and probably other Nostr clients) are implementing your service with no warning that adult content isn't allowed on your servers. So it behooves everyone to figure this out quickly.

Porn stars are a huge potential market for Nostr. They constantly have problems with Twitter, Instagram, etc. and desperately need a censorship-free alternative. Built-in tipping is a huge plus for them. As you probably know - many of them have HUGE followings. If they start migrating to Nostr their fans will follow. And all their followers are just regular folks who will then use Nostr for other purposes as well. But the porn stars won't come if they can't easily upload images and videos…

Whatever I can do to help, let me know.

Thank you, oddly enough, this is actually my biggest issue. Not hate, not guns, not controversial ads, but legit porn. My initial solution will be to add an age warning and separate it from the view-all feed…

Would love to chat more about your ideas, DM sent.

it’s better to *not* encourage or support the proliferation of porn or the porn industry. despite the fact that many people rely on it to make a living, it is simply not a good thing. it is actually harmful for everyone involved.

I don’t want to encourage porn, and I absolutely don’t want nostr.build to be known as any kind of porn site. I have been deleting most porn, but there are a ton of artistic images that I can’t just delete; I have to do something…

i totally feel for you. it is tough with the boundary cases & taking on all this responsibility 🙏. 🖖🤙

If you need names of porn stars with PhDs who would tell you that you're wrong, I can provide them. There was a group of them who just yesterday got booted from a panel at the University of Ohio when they started refuting the exact point of view you just expressed.

Porn is not inherently evil or harmful. For example there's no exploitation of women in gay porn. And quite a few people in porn find it rewarding and affirming. I have personal friends who are in porn. My knowledge is first hand.

That said, yes, as in any workplace there are problems. But corporate America is often harmful as well.

I agree. Going back over a decade there have been multiple attempts to classify various types of content certain people didn't like. The problem with most of them was they were based on specific cultural norms. "Inappropriate for someone under 18 years of age" means very different things in Finland and Saudi Arabia. The classification systems that were possibly capable of being culturally neutral were too complicated to implement.

This isn't a new problem, and what history has taught us is that there's no simple solution. Germany requires age verification for all adult content (but only for .de domains). The UK tried to figure out age verification and failed (multiple times). Louisiana just passed age verification requirements - but there are major privacy implications to the law. Now France is talking about it, and the list goes on and on.

The solution I'd like to see is for the same IETF draft standard that Apple (and soon Google) use for Private Access Tokens (which confirm the user is human) to also be used to say whether parental controls are in place on the device (and possibly what types of parental controls: nudity, sex, violence, etc.). Then websites could filter their content based on that data. The same IETF standard could be used to verify age if the states set up "mediator" services (a term defined in the standard). But parents would need to do minimal parenting to make sure the parental controls are in place.

https://www.ietf.org/archive/id/draft-private-access-tokens-01.html

I don't know how any of the anti-porn / age verification laws are going to deal with Nostr. It's a type of chaos and ambiguity they're not prepared to deal with. I can just see some clueless politician saying "We need to subpoena the CEO of Nostr to appear in front of our committee for questioning!" And then being completely confused that something so big has no corporate structure.

So true. Although a classic problem, I believe WE now have the ability to re-think all of this. It really applies to everyone also, especially as decentralization grows. nostr.build may get accused of spreading certain info, but how about Google, AWS, etc that empower nostr.build?!? Lol

Rethink, yes, but there's much to learn from the history of what's gone before.

The law has different rules for different types of adult material. If you're commercially publishing the material, there's one set of standards (documentation of age and consent). If you're an amateur uploading for personal reasons (e.g. dating) there's another set of rules. And if you're hosting random user-generated content there's yet another set of rules.

Ultimately @npub1nxy4qpqnld6kmpphjykvx2lqwvxmuxluddwjamm4nc29ds3elyzsm5avr7 , as the host, bears the legal responsibility for the content. They're going to get VERY familiar with DMCA take-down requests. The proposed changes to Section 230 will be very important to them.

Sorry for talking about you in the third person. For some reason I thought I was responding to @npub1wmr34t36fy03m8hvgl96zl3znndyzyaqhwmwdtshwmtkg03fetaqhjg240

Uggh…

I can recommend some very good first amendment lawyers, if you feel you need advice…

Corey Silverstein - http://porn.law

Larry Walters - https://www.firstamendment.com

But you're really more in the position of someone like Twitter or Facebook - a host that doesn't editorially approve what goes on their server. You wouldn't meet the 1/3rd threshold that some of the recent laws have put in place at which point age verification is necessary.

For the porn content - quite a bit of it can be legal. (Provided Congress doesn't change Section 230, and the GOP doesn't redefine "obscenity"). If you're not interested in hosting it, I am. And I'm sure others in the adult industry would also be interested.

I found the classification system that was most advanced in its day: ICRA…

https://web.archive.org/web/20080622002259/http://www.icra.org/vocabulary/

It was discontinued and the website taken down because not enough sites adopted it. Luckily ChatGPT remembered the name and pointed me to archive.org to find it.

If we could get feeds with "sensitive" content to classify themselves by even the broad categories in ICRA, that would be huge. They are:

- Nudity

- Sexual Material

- Violence

- Coarse Language

- Potentially Harmful Activities

There are many items under each… For example you can specify "Erotica" under Sexual Material to indicate your content is only soft-core.

And then they have the idea of "Context" which is also really useful in some cases…

- Artistic

- Educational

- Medical

- Sports

- News

IMHO, that's where any classification system should start. Just pick up what was best from years ago and build on it. This time maybe it can be implemented more simply.
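As a rough illustration of how simple that starting point could be, here is a sketch that encodes the broad ICRA categories and contexts listed above as tag-style label pairs. The category and context names come from the archived ICRA vocabulary; the tag format and function names are hypothetical, not an existing standard:

```python
# Illustrative sketch: self-classification using the broad ICRA
# categories and contexts, validated against a fixed vocabulary.

ICRA_CATEGORIES = {"nudity", "sexual-material", "violence",
                   "coarse-language", "potentially-harmful-activities"}
ICRA_CONTEXTS = {"artistic", "educational", "medical", "sports", "news"}

def classify(categories: set[str], contexts: set[str]) -> list[list[str]]:
    """Build tag-style labels, rejecting anything outside the vocabulary."""
    unknown = (categories - ICRA_CATEGORIES) | (contexts - ICRA_CONTEXTS)
    if unknown:
        raise ValueError(f"unknown labels: {sorted(unknown)}")
    return ([["label", c] for c in sorted(categories)] +
            [["context", c] for c in sorted(contexts)])

# A feed with artistic nudity declares:
print(classify({"nudity"}, {"artistic"}))
# [['label', 'nudity'], ['context', 'artistic']]
```

Finer-grained items (e.g. "Erotica" under Sexual Material) could extend the vocabulary the same way, with each broad category mapping to a set of narrower terms.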