Per our usual arrangement, it’s nuanced.

AI-generated CSAM can be created from a real child’s image, and unfortunately it is being used as a form of sextortion. One source is below; others are readily available via an internet search.

https://fox8.com/news/how-artificial-intelligence-is-being-used-for-sextortion/

The federal law 18 U.S.C. § 1466A penalizes the creation, distribution, or possession of graphic portrayals of a minor engaged in sexual conduct.

https://www.govinfo.gov/app/details/USCODE-2011-title18/USCODE-2011-title18-partI-chap71-sec1466A


Discussion

The key word in the federal law is “portrayals.”

:sigh:

Humans suck.

Sextorters should be arrested and charged with extortion. But that would require actual police work.

Surprised Photoshop/GIMP aren't already held to political ransom like this; surely they're even easier to use for this sort of thing, unless you're trying to do it on a truly industrial scale.

It’s a nightmare that will take a while to figure out. Innovation will be the solution; it won’t come from governments. I have hope in innovation, and I see the potential for societal change.

Unfortunately, I don’t believe law enforcement has the resources to properly tackle any aspect of online child exploitation, whether synthetic or real, on a global scale.

Perhaps they should redirect some resources from narrative control to actual policing work.

The budgets involved are gargantuan; it can't be that hard to find people with journalism-major infiltration skills.

But they'd need to want to...