Everyone has _something_ they don't want to see - ergo, reporting. (CSAM is on most people's list).

For starters, there are legal issues with some content. Whether you agree with the laws or not, some of us just don't want the hassle of being dragged into court. I run a forum site that had a pretty active section discussing escorts in London. When FOSTA/SESTA came out, even though the discussions weren't about US-based escorts, I and my server were US-based, so I had to close that section of the site. I didn't want to, but I had to anyway. The users on my forum are pretty protective of the site: when they see potentially illegal content, they report it. Not because they agree with the law, but because reporting it protects something they care about (the site).

Then there are cultural norms. Even if a user is personally OK with some content, their culture may dictate that it isn't acceptable in certain situations (e.g. when they're at work). So content moderation is needed there too.

One of the pending PRs suggests turning Kind 1984 reports into "labels" to make it less about narcing on someone. I like the change in focus that would bring: you then make use of whichever labels you find most useful.
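
To make the distinction concrete, here's a minimal sketch of the two event shapes involved. Kind 1984 is the existing NIP-56 report kind; the label event's kind number and tag names are my assumptions about how a label-style approach could look, not something taken from the PR itself.

```typescript
// Minimal sketch (not a real client API) contrasting reports with labels.
// Kind 1984 is the NIP-56 report kind; the label event below is illustrative
// only -- its exact kind and tag layout are assumptions.

interface NostrEvent {
  kind: number;
  content: string;
  tags: string[][];
  created_at: number;
  // pubkey, id, and sig would be filled in when the event is signed.
}

// Today: a kind 1984 report, i.e. "this note breaks a rule".
const report: NostrEvent = {
  kind: 1984,
  content: "nudity with no content warning",
  tags: [
    ["e", "<id of the reported note>", "nudity"], // NIP-56 report type
    ["p", "<pubkey of its author>"],
  ],
  created_at: Math.floor(Date.now() / 1000),
};

// Label-style alternative: "I label this note nsfw". Clients subscribe to
// labels from labelers they trust and decide what to do with them
// (blur, hide, ignore). Kind number and tag names here are placeholders.
const label: NostrEvent = {
  kind: 1985,
  content: "",
  tags: [
    ["L", "content-warning"], // label namespace (illustrative)
    ["l", "nsfw"],            // the label itself
    ["e", "<id of the labeled note>"],
  ],
  created_at: Math.floor(Date.now() / 1000),
};

console.log(report.kind, label.kind);
```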

So there are many reasons for content moderation besides "prudes will be prudes"…

Discussion

thank you for the well-written and thoughtful response :)

That's not what OP was saying though.

Yes of course we need a way to report CP.

But an adult who wants to post nudez on their account and tags those posts accordingly (nsfw) should be able to do so.

Those who don't want to see it can choose not to follow that account. I imagine a feature we'll have soon is keyword filtering for our feeds; it's obvious functionality that the bird app already has, and I remember using a Tumblr plugin for the same thing.

So if you post tits and tag them #boobstr and #nsfw, someone with those tags filtered wouldn't even see them.
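
As a minimal sketch of what that client-side filter could look like, assuming hashtags ride along as standard "t" tags on the note; the names here (`mutedTags`, `isHidden`) are illustrative, not any real client's API:

```typescript
// Hypothetical client-side hashtag mute filter. Assumes hashtags are
// carried as "t" tags on the note; all names here are illustrative.

interface Note {
  content: string;
  tags: string[][]; // e.g. [["t", "boobstr"], ["t", "nsfw"]]
}

const mutedTags = new Set(["boobstr", "nsfw"]);

function isHidden(note: Note): boolean {
  // Hide the note if any of its hashtags is on the mute list.
  return note.tags.some(
    ([name, value]) => name === "t" && !!value && mutedTags.has(value.toLowerCase())
  );
}

// This note never reaches a feed with those tags muted.
const note: Note = {
  content: "late night post #boobstr #nsfw",
  tags: [["t", "boobstr"], ["t", "nsfw"]],
};
console.log(isHidden(note)); // true
```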