We are OK with legal adult content, but as a media hosting service we face two big issues:

- Legal liability when hosting adult content (what is legal in which countries, actual ages, verification, acts, etc.), and

- CSAM filtering; it is a lot easier to keep that material off of nostr.build's servers entirely.

I am not sure whether our removal and reporting actually helps the children (we hope so), but the bottom line is that we don't want that content on nostr.build.

Discussion

Legal liability is inherent to hosting, so that's a valid concern.

That said, there is and will be a whole lot of "illegal" content that's political. Will you censor that too? Or only "adult" content? Asking just for everybody's education.

CSAM filtering does not work, and can't work, unless you cast a net that also filters non-CSAM content. And in any case it will not save you from "liability" if someone decides you are a target. It's just a tool and an excuse to put hosting agents like you (or Nostr relays, for that matter) in jeopardy so they comply with political filtering, which is what State actors actually care about.
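For what it's worth, the over-blocking trade-off is easy to see in how these filters are typically built: a perceptual hash of each upload is compared against a database of hashes of known material, with a distance threshold. A minimal sketch in Python, assuming the Pillow and imagehash libraries; the blocklist entry and threshold are hypothetical, for illustration only:

```python
# Minimal sketch of perceptual-hash matching, the technique underlying most
# industrial image filters (PhotoDNA and similar systems build on this idea).
# Requires: pip install Pillow imagehash
from PIL import Image
import imagehash

# Hypothetical set of perceptual hashes of known flagged images.
BLOCKLIST = {imagehash.hex_to_hash("d1d1d1b1b1919111")}

# Hamming-distance threshold: 0 matches only exact hashes; larger values
# also catch re-encoded or resized copies, plus unrelated look-alikes.
MAX_DISTANCE = 8

def is_blocked(path: str) -> bool:
    """True if the image is within MAX_DISTANCE of any blocklisted hash."""
    h = imagehash.phash(Image.open(path))
    return any(h - bad <= MAX_DISTANCE for bad in BLOCKLIST)
```

The threshold is exactly where the net-casting happens: set it to zero and trivially re-encoded copies slip through; widen it and visually similar but unrelated images get swept up too.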

We would only filter obvious adult pornographic material. We'll never touch political ads or anything else, and we proudly host content that others won't, as long as it's legal. Our #1 target is CSAM.

Is this considered child porn? Genuinely asking, because from my view it's definitely 💯 supportive of child porn. Since it's #AI, does it count? Some say yes and others say no.

Ideological differences are at play here, right?

https://nostrcheck.me/media/2aadfb8ac7d43aca6d164ed99248147910048269601ff60d4463c4d5b3abfdcd/ae04868f12b315ffd2b055b50ec9c5e9925b3c291ad6ce26296d3e836fbfcab4.webp