Replying to rabble

Research - Managing CSAM (Child Sexual Abuse Material) on decentralized social media protocols.

https://purl.stanford.edu/vb515nd6874

I thought this paper was worth a read. It goes into how child sexual abuse material is discovered and shared on social media. It mostly discusses the fediverse; nostr is mentioned only to say the authors are aware of it and that it has different issues than the fediverse. We will have to deal with this; relay operators and media hosts in particular are potentially liable. Nostr.build already has a content moderation team; others will need to do something or risk losing their domain name, as has happened to some mastodon servers.

Apparently this is a paper on managing child sexual abuse material on decentralized social media protocols.

It mostly discusses the Fediverse, and Nostr is only mentioned as having different issues, but relay operators and people hosting media will need to take some kind of action.

nostr:nevent1qqs28y0thztgptz7udkqluvgvs4ctneerxxg0uj7uq5lxqwsfugnm3spzamhxue69uhkv6tvw3jhytnwdaehgu3wwa5kuegzypmvwx4w8fy378v7a3ruhgt7y2wd5sgn5zamde4wzamdwep798905qcyqqqqqqgynaav2
