I agree. Going back over a decade, there have been multiple attempts to classify various types of content certain people didn't like. The problem with most of them was that they were based on specific cultural norms. "Inappropriate for someone under 18 years of age" means very different things in Finland and Saudi Arabia. The classification systems that might have been culturally neutral were too complicated to implement.
This isn't a new problem, and what history has taught us is that there's no simple solution. Germany requires age verification for all adult content (but only for .de domains). The UK tried to figure out age verification and failed (multiple times). Louisiana just passed age verification requirements - but the law has major privacy implications. Now France is talking about it, and the list goes on and on.
The solution I'd like to see is for the same IETF draft standard that Apple (and soon Google) use for Private Access Tokens (which confirm the user is human) to also signal whether parental controls are in place on the device (and possibly what types of parental controls - nudity, sex, violence, etc.). Websites could then filter their content based on that data. The same IETF standard could be used to verify age if the states set up "mediator" services (a term defined in the standard). But parents would need to do minimal parenting to make sure the parental controls are in place.
https://www.ietf.org/archive/id/draft-private-access-tokens-01.html
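To make the idea concrete, here's a rough sketch of what the website side might look like. The draft defines the `PrivateToken` HTTP auth scheme and the token framing; the parental-controls bitmask at the end is NOT in the draft - it's my hypothetical extension, with made-up flag names, and a real deployment would verify the issuer's signature before trusting any of it:

```python
import base64

# Hypothetical parental-control flags -- these are NOT defined in the
# draft; they illustrate the proposed extension above.
FILTER_NUDITY = 0x01
FILTER_SEX = 0x02
FILTER_VIOLENCE = 0x04

def parse_private_token(header: str) -> bytes:
    """Extract the base64url-encoded token from an Authorization
    header using the PrivateToken scheme named in the draft."""
    scheme, _, param = header.partition(" ")
    if scheme != "PrivateToken" or not param.startswith("token="):
        raise ValueError("not a PrivateToken header")
    b64 = param[len("token="):]
    # base64url, re-padded (tokens are typically sent unpadded)
    return base64.urlsafe_b64decode(b64 + "=" * (-len(b64) % 4))

def active_filters(token: bytes) -> list[str]:
    """Decode a hypothetical trailing bitmask byte into filter names.
    Signature verification against the issuer's key is omitted here."""
    flags = token[-1]
    names = []
    if flags & FILTER_NUDITY:
        names.append("nudity")
    if flags & FILTER_SEX:
        names.append("sex")
    if flags & FILTER_VIOLENCE:
        names.append("violence")
    return sorted(names)

# Fake token whose last byte claims "nudity + violence" filtering is on
raw = b"\x00" * 8 + bytes([FILTER_NUDITY | FILTER_VIOLENCE])
fake = base64.urlsafe_b64encode(raw).rstrip(b"=").decode()
header = f"PrivateToken token={fake}"
print(active_filters(parse_private_token(header)))  # ['nudity', 'violence']
```

The nice property is that the site never learns who the user is - only that an issuer it trusts attests the device has (say) nudity filtering enabled.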
I don't know how any of the anti-porn / age verification laws are going to deal with Nostr. It's a type of chaos and ambiguity they're not prepared for. I can just see some clueless politician saying "We need to subpoena the CEO of Nostr to appear in front of our committee for questioning!" and then being completely confused that something so big has no corporate structure.