Replying to Constant

Hello Nostr, if you are in a great mood just skip this post; it's depressing.

So I had not encountered it before, but yesterday I crossed paths with Child Sexual Abuse Material on Nostr. In my regular internet usage over the years I have rarely come across this stuff, though I guess if I were to look for it I would find it eventually.

That is to say, the status quo is that it does exist, but most people, most of the time, won't have to deal with it. I think it is important to realize that the world is not perfect as it is when reflecting on these matters in the context of Nostr.

It goes without saying, but just to be clear: yes, I think we should all learn how to tie nooses and identify adequate oak trees.

However marginalized CSAM is, some people want governments to go above and beyond to combat it. The prime example currently is the ‘Chat Control’ regulation proposed in the EU, which wants to install Big Brother client-side on your phone to scan every single thing you do and flag any suspicious behavior/content before it gets encrypted. However understandable the motivation might be, even advocacy groups and agencies dealing with the CSAM problem are against this type of thing, if only because they are already swamped with the work of processing material as it is; opening the floodgates with false positives won't help anything and will probably make the situation worse. That is aside from the obvious objections to forcibly installing Big Brother on people's hardware, of course.

Back to Nostr. On the one hand we have the end-user, who does not want to be confronted by this material. From this perspective, CSAM is just one of the many things a user might want to filter out, along with other material that might not be illegal per se but is just NSFW, etc. Whatever means we find to do this, failure by those mechanisms is bad and unwanted, but not a direct systemic risk to Nostr; like I mentioned in the beginning, it is not impossible to accidentally come across this type of stuff on the internet today as is, and the whole world is still using it.

But it does become a systemic issue from the relay perspective. Here it is not some incidental bad experience that can be clicked away. It is a crime to host this type of material, which brings in the risk of prosecution for ‘simply running a relay’ that some asshole decided to nuke with CSAM or other illegal material.

But here my optimism comes in. Nostr is pro-censorship: the theory is that every relay can moderate to its heart's content, because users are ultimately always able to route around such obstacles (very much like ‘the internet’ itself). This means that relays should be able to adjust their policies and methods of moderation to their capacity to deal with unwanted content and their risk appetite. From a locked-down, whitelist-only relay on one side of the spectrum, all the way to an open relay with heavy, sophisticated analytics for assessment and filtering, and everything in between: while this won't deliver a perfect solution in all cases, it will remove the dark cloud of systemic risk to the protocol/network, because we are able to sufficiently marginalize the phenomenon.
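To make the relay end of that spectrum concrete: below is a minimal sketch of a write-policy plugin in the style strfry uses (one JSON event per line on stdin, one accept/reject decision per line on stdout). The whitelist set and the kind 1063 heuristic are illustrative assumptions, not a statement of how any real relay is configured.

```typescript
// Minimal sketch of a strfry-style write-policy plugin: one JSON object
// per line on stdin, one accept/reject decision per line on stdout.
// The whitelist and the kind 1063 heuristic are illustrative assumptions.
import * as readline from "readline";

// Hypothetical allow-list of hex pubkeys for a locked-down relay.
const WHITELIST = new Set<string>([
  // "hex pubkey...",
]);

const rl = readline.createInterface({ input: process.stdin });

rl.on("line", (line) => {
  const req = JSON.parse(line);
  const ev = req.event;

  let action = "accept";
  let msg = "";

  // Locked-down mode: only accept events from whitelisted pubkeys.
  if (WHITELIST.size > 0 && !WHITELIST.has(ev.pubkey)) {
    action = "reject";
    msg = "pubkey not on whitelist";
  }

  // Refuse file-metadata events whose url tag is not an http(s) URL,
  // e.g. raw base64 payloads smuggled into a kind 1063 event.
  if (ev.kind === 1063) {
    const url = ev.tags.find((t: string[]) => t[0] === "url")?.[1] ?? "";
    if (!/^https?:\/\//.test(url)) {
      action = "reject";
      msg = "kind 1063 url tag is not an http(s) URL";
    }
  }

  console.log(JSON.stringify({ id: ev.id, action, msg }));
});
```

A locked-down relay keeps the whitelist; an open relay would drop it and hang heavier analysis behind the same interface. Different policies, same mechanism.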

On a last note: when talking about filtering/assessing this content, it gets complicated really quickly. You can imagine some AI performing such a task, or using lists of known content to filter; however you want to do it, you first run into the question of how you construct that stuff in the first place; it requires gathering such content and human eyes looking at it. And then, subsequently, you have produced tooling that can be flipped around and used as a search engine to seek out such material instead of filtering it away. So yeah, there are no graceful, perfect solutions I am afraid.
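For illustration, the "lists of known content" approach usually boils down to comparing a fingerprint of incoming media against a database of known-bad hashes. A minimal sketch, assuming a hypothetical blocklist file of SHA-256 hashes (real systems use perceptual hashes that survive re-encoding, which an exact hash does not):

```typescript
// Minimal sketch of hash-list filtering: compare the SHA-256 of incoming
// media against a set of known-bad hashes. Real systems use perceptual
// hashes that survive re-encoding; an exact hash does not.
// The blocklist file and its format are illustrative assumptions.
import { createHash } from "crypto";
import { readFileSync } from "fs";

// Hypothetical blocklist: one lowercase hex SHA-256 per line.
const blocklist = new Set(
  readFileSync("known-bad-hashes.txt", "utf8")
    .split("\n")
    .map((l) => l.trim().toLowerCase())
    .filter((l) => l.length === 64)
);

export function isKnownBad(fileBytes: Buffer): boolean {
  const digest = createHash("sha256").update(fileBytes).digest("hex");
  return blocklist.has(digest);
}
```

And that is exactly the flip-around problem from above: invert isKnownBad and the same table works as a lookup oracle for seeking out the material instead of filtering it.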

Well, there is one, of course…

https://cdn.satellite.earth/a92bdd80dbd45e00636a9db615061eef168c3164a0e1bfa1abfb0784e74cd24e.mp3

I know it's totally irrelevant, but how did you get to this message? What hosting service was it? What kind of npub was it? Filtering out images of people you don't follow is a smart move; even alt text seems like a smart move that way.
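That "only render media from people you follow" idea might look something like this sketch; the types and placeholder behavior are simplified assumptions for illustration:

```typescript
// Sketch of the "only render media from follows" heuristic: inline images
// from unknown pubkeys are replaced by a click-to-reveal placeholder
// instead of being auto-rendered. Types are simplified for illustration.
interface Note {
  pubkey: string;
  content: string;
}

const IMAGE_URL = /https?:\/\/\S+\.(png|jpe?g|gif|webp)/gi;

function renderContent(note: Note, follows: Set<string>): string {
  if (follows.has(note.pubkey)) return note.content; // trusted: render as-is
  // Untrusted: strip auto-loading image URLs, show a placeholder instead.
  return note.content.replace(IMAGE_URL, "[image hidden - tap to load]");
}
```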


Discussion

Unfortunately I can reliably reproduce the steps to get to the CSAM I came across, but I won't share them. It was not something that popped up in my feed; it came up as the result of a search query (for something completely unrelated to the CSAM, obviously).

In this case they were kind 1063 events, hosted on one of the bigger relays that a lot of people use.

Normally a kind 1063 event contains a URL, and the content (in this case a picture) is hosted somewhere else. Here it was not a URL but the raw file in base64 encoding, which the client is then supposed to decode into a WebP image (though this is not part of the NIP-94 spec).

How clients handle this varies. I happened to use one at the time that is able to handle this stuff, so it displayed the picture directly. Most other clients I have tried don't, and just show the raw base64 string (luckily, in this case) without transforming it into a WebP picture, or a download button that does nothing (because there is no actual URL there).
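A defensive client can go a step further than just ignoring the base64 case: only fetch when the url tag is a real http(s) URL, and verify the bytes against the x tag (the file's SHA-256, per NIP-94) before rendering anything. A minimal sketch with a simplified event shape:

```typescript
// Sketch of defensive client handling for kind 1063 (NIP-94 file metadata):
// only fetch the file if the url tag is a real http(s) URL, and verify the
// bytes against the x tag (sha256 of the file) before rendering. Never
// decode inline base64 smuggled into the tag. Event shape is simplified.
import { createHash } from "crypto";

interface Event1063 {
  kind: number;
  tags: string[][];
}

function tag(ev: Event1063, name: string): string | undefined {
  return ev.tags.find((t) => t[0] === name)?.[1];
}

export async function fetchVerifiedFile(ev: Event1063): Promise<Buffer | null> {
  if (ev.kind !== 1063) return null;
  const url = tag(ev, "url");
  const expected = tag(ev, "x"); // sha256 of the file, per NIP-94

  // Reject anything that is not a plain http(s) URL, e.g. data: URIs
  // or raw base64 strings like the ones described above.
  if (!url || !/^https?:\/\//.test(url)) return null;

  const bytes = Buffer.from(await (await fetch(url)).arrayBuffer());
  const digest = createHash("sha256").update(bytes).digest("hex");
  return expected && digest === expected.toLowerCase() ? bytes : null;
}
```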

This is why I disabled global search on https://advancednostrsearch.vercel.app

What relays were they coming from?

Was the npub posting it NIP-05 verified?