So true. Although a classic problem, I believe WE now have the ability to re-think all of this. It really applies to everyone too, especially as decentralization grows. nostr.build may get accused of spreading certain info, but how about Google, AWS, etc. that empower nostr.build?!? Lol
I agree. Going back over a decade there have been multiple attempts to classify various types of content certain people didn't like. The problem with most of them was they were based on specific cultural norms. "Inappropriate for someone under 18 years of age" means very different things in Finland and Saudi Arabia. The classification systems that were possibly capable of being culturally neutral were too complicated to implement.
This isn't a new problem, and what history has taught us is that there's no simple solution. Germany requires age verification for all adult content (but only for .de domains). The UK tried to figure out age verification and failed (multiple times). Louisiana just passed age verification requirements - but there are major privacy implications to the law. Now France is talking about it, and the list goes on and on.
The solution I'd like to see is for the same IETF draft standard that Apple (and soon Google) use for Private Access Tokens (which confirm the user is human) to also be used to say whether parental controls are in place on the device (and possibly what types of parental controls - nudity, sex, violence, etc.). Then websites could filter their content based on that data. The same IETF standard could be used to verify age if the states set up "mediator" services (a term defined in the standard). But parents would need to do minimal parenting to make sure the parental controls are in place.
https://www.ietf.org/archive/id/draft-private-access-tokens-01.html
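To make the idea concrete, here's a toy sketch of that flow. This is purely illustrative and NOT the draft's actual mechanism: the real standard uses RSA blind signatures and an HTTP "PrivateToken" auth scheme so tokens are unlinkable; the `issue_token`/`verify_token` functions, the HMAC stand-in, and the `blocked_categories` claim names below are all my invention just to show the trust relationships (attester signs a claim about the device, website verifies it and filters accordingly).

```python
# Hypothetical simulation of a Private-Access-Token-style attestation flow.
# Real tokens use blind signatures for unlinkability; HMAC here is only a
# stand-in to keep the sketch short and runnable.
import hmac, hashlib, json

ATTESTER_KEY = b"demo-attester-secret"  # stands in for the attester's signing key

def issue_token(parental_controls: dict) -> dict:
    """Attester (e.g. the device/OS vendor) signs a claim about the device."""
    claims = json.dumps(parental_controls, sort_keys=True).encode()
    sig = hmac.new(ATTESTER_KEY, claims, hashlib.sha256).hexdigest()
    return {"claims": parental_controls, "sig": sig}

def verify_token(token: dict) -> bool:
    """Origin (the website) checks the attestation before serving content."""
    claims = json.dumps(token["claims"], sort_keys=True).encode()
    expected = hmac.new(ATTESTER_KEY, claims, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

def may_serve(token: dict, content_rating: str) -> bool:
    """Serve content only if its rating isn't blocked by the device's controls."""
    if not verify_token(token):
        return False  # no valid attestation -> treat as most restrictive
    blocked = token["claims"].get("blocked_categories", [])
    return content_rating not in blocked

token = issue_token({"parental_controls": True,
                     "blocked_categories": ["nudity", "violence"]})
print(may_serve(token, "nudity"))  # blocked by the device's parental controls
print(may_serve(token, "news"))    # allowed
```

The key property the real draft adds on top of this shape is that the website learns only "controls are in place", not who the user is - the attester and the origin can't collude to link the token back to a person.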
I don't know how any of the anti-porn / age verification laws are going to deal with Nostr. It's a type of chaos and ambiguity they're not prepared to deal with. I can just see some clueless politician saying "We need to subpoena the CEO of Nostr to appear in front of our committee for questioning!" And then being completely confused that something so big has no corporate structure.
Rethink, yes, but there's much to learn from the history of what's gone before.
The law has different rules for different types of adult material. If you're commercially publishing the material, there's one set of standards (documentation of age and consent). If you're an amateur uploading for personal reasons (e.g. dating) there's another set of rules. And if you're hosting random user-generated content there's yet another set of rules.
Ultimately @npub1nxy4qpqnld6kmpphjykvx2lqwvxmuxluddwjamm4nc29ds3elyzsm5avr7 , as the host, bears the legal responsibility for the content. They're going to get VERY familiar with DMCA take-down requests. The proposed changes to Section 230 will be very important to them.
Sorry for talking about you in the third person. For some reason I thought I was responding to @npub1wmr34t36fy03m8hvgl96zl3znndyzyaqhwmwdtshwmtkg03fetaqhjg240
Uggh…
I can recommend some very good first amendment lawyers, if you feel you need advice…
Corey Silverstein - http://porn.law
Larry Walters - https://www.firstamendment.com
But you're really more in the position of someone like Twitter or Facebook - a host that doesn't editorially approve what goes on their server. You wouldn't meet the one-third adult-content threshold that some of the recent laws use as the trigger for mandatory age verification.
For the porn content - quite a bit of it can be legal. (Provided Congress doesn't change Section 230, and the GOP doesn't redefine "obscenity"). If you're not interested in hosting it, I am. And I'm sure others in the adult industry would also be interested.