s3x_jay
667205eb525aa4a794859b2bd2bdd16e64ff57fd600880500fc53cdbf476439e
I'm the guy behind https://s3x.social and a bunch of other sites. For ~15 years I've focused on building & running #gay sites. I'm also one of the Community Ambassadors at https://XBiz.net - the #porn industry's leading B2B discussion forum. #LGBT #NYC #Harlem
Replying to s3x_jay

I’m thinking Kind 1985, since that’s what comes right after a Kind 1984.
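A Kind 1985 moderation event could look something like this. This is a minimal sketch only: the tag layout mirrors the Kind 1984 report format, and nothing here is an approved spec.

```python
# Hypothetical shape for the proposed Kind 1985 moderation event,
# mirroring the Kind 1984 (NIP-56) report tags. Assumption, not spec.
moderation_event = {
    "kind": 1985,
    "pubkey": "<moderator-pubkey>",
    "tags": [
        ["e", "<reported-event-id>", "spam"],  # what is labeled, and why
        ["p", "<author-pubkey>"],              # whose content it is
    ],
    "content": "Repeatedly posting the same link",
}
```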

Read this… https://s3x.social/nostr-content-moderation It’s how #[3]​ & I see things working. And #[4]​ (who works for Rabble) pointed me to an awesome research paper that parallels what I laid out.

Are you on Git? Have you seen the issue I raised and the PR Rabble & I started?

Ignore my “are you on Git?” question. Duh. We’ve had that conversation.

Guess what I’m saying is you might want to future-proof your spec so it won’t be disruptive when it happens. Right now you’re mixing things that apply to all sizes (the sky tag) with things that apply to a single size (dims, hash).

One suggestion…

Nostr doesn't really have the idea of srcset like HTML's img element does. AFAIK NIP-94 doesn't really have the ability to do it either because it's one hash per event and each size of an image would have a separate hash.

But what you're doing could be modified to support the idea of srcset - so it lists all the sizes available and then the client picks the one that's appropriate for the UI in the moment. It could reduce bandwidth usage by the client when multiple sizes are available.
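To make the srcset idea concrete, here's a minimal sketch of how a client might pick among size variants. The "variant" tag name and its url/dims/hash layout are hypothetical, not part of any current NIP:

```python
# Sketch of an srcset-style extension to a NIP-94-like file event.
# The "variant" tag and its url/dims/hash fields are hypothetical.
event_tags = [
    ["url", "https://cdn.example.com/pic-1920.jpg"],  # default/full size
    ["variant", "https://cdn.example.com/pic-480.jpg", "480x320", "<hash480>"],
    ["variant", "https://cdn.example.com/pic-960.jpg", "960x640", "<hash960>"],
    ["variant", "https://cdn.example.com/pic-1920.jpg", "1920x1280", "<hash1920>"],
]

def pick_variant(tags, target_width):
    """Return the URL of the smallest variant at least target_width wide,
    falling back to the largest one available."""
    variants = []
    for tag in tags:
        if tag[0] == "variant":
            url, dims = tag[1], tag[2]
            width = int(dims.split("x")[0])
            variants.append((width, url))
    variants.sort()
    for width, url in variants:
        if width >= target_width:
            return url
    return variants[-1][1]  # nothing big enough: use the largest
```

A phone UI asking for a 600px-wide slot would get the 960px file instead of wasting bandwidth on the 1920px one.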

You can read Rabble's & my approach to content moderation here…

https://s3x.social/nostr-content-moderation

We have a pull request proposing "NIP-69" as a first step in that process, and a more general issue open on GitHub discussing the topic. (The links for both are in the page linked above.)

The basic vision is "bottom up" content moderation where communities can moderate what people in their community see via interconnecting webs of trust relationships. The end user will pick the people/organizations/bots they trust to have a voice in moderating their feed, and only reports by those people will be factored in when the user's feed is filtered by a client app.

So for example, I could mark "exosome" as someone to block and he will then go on the block list of everyone who has designated me as trusted moderator of their feed. And he could go on the block list for relays that trust me as a moderator.
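That propagation logic can be sketched in a few lines. The data shapes here are illustrative, not a wire format:

```python
# Minimal sketch of "bottom up" moderation: a user's effective block
# list is the union of blocks published by the moderators they trust.
reports = [
    {"moderator": "s3x_jay", "action": "block", "target": "exosome"},
    {"moderator": "s3x_jay", "action": "warn", "target": "spammer123"},
    {"moderator": "randomguy", "action": "block", "target": "alice"},
]

def effective_blocklist(trusted_moderators, reports):
    """Only block reports from moderators this user trusts count."""
    return {
        r["target"]
        for r in reports
        if r["moderator"] in trusted_moderators and r["action"] == "block"
    }
```

A user (or a relay) that trusts only "s3x_jay" ends up blocking "exosome" but not "alice", and the "warn" report stays out of the block list.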

These discussions are good, I just thought of how the whole model can be extended…

Everyone is talking about how the client apps will eventually support algorithms that filter users' feeds - pushing some content up in the feed and pushing other content down. This model can factor into that… The same events that are used to block content could be used to promote content if they had the ability to endorse as well as suggest blocks and warnings.

For example, if a fellow Nazi really liked "exosome" he could endorse him, and people who trust him would see more of exosome's posts. And to extend things further, a user could have reverse trust lists that basically say "do the opposite of whatever this person says to do". The possibilities are actually rather interesting.
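That extension can be sketched as a simple scoring function where a reverse-trusted moderator's labels are inverted. All names and data shapes are hypothetical:

```python
# Sketch of extending block/warn labels with endorsements, plus
# "reverse trust" ("do the opposite of whatever this person says").
labels = [
    ("mod_a", "endorse", "exosome"),
    ("mod_b", "block", "exosome"),
]

def feed_score(target, labels, trust):
    """trust maps moderator -> +1 (trusted) or -1 (reverse-trusted).
    Positive scores push content up the feed, negative push it down."""
    score = 0
    for moderator, action, tgt in labels:
        if tgt != target or moderator not in trust:
            continue
        delta = 1 if action == "endorse" else -1  # block/warn count against
        score += delta * trust[moderator]
    return score
```

Reverse-trusting mod_a turns his endorsement of "exosome" into a downvote, while mod_b's block counts against "exosome" as usual.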

My #[0]​ account just got its first pseudo death threat here on Nostr. #[1]​ #[2]​ #[3]​ #[4]​ #[5]​ Nostr needs to take content moderation seriously! “NIP-69” that #[6]​ & I proposed is the first step in that process. Let’s get it approved…

Replying to rabble

I shared this on the fediverse, Bluesky, and Twitter to get a sense of what folks think from other communities. In particular I got feedback from one person who really knows the problem of content moderation, trust, and safety.

Yoel Roth, who ran Twitter's trust and safety team until Elon Musk took over, said:

> “The taxonomy here is great. And broadly, this approach seems like the most viable option for the future of moderation (to me anyway). Feels like the missing bit is commercial: moderation has to get funded somewhere, and a B2C, paid service seems most likely to be user-focused. Make moderation the product, and get people used to paying for it.” - https://macaw.social/@yoyoel/110272952171641211

I think he’s right, we need to get the commercial model right for paying for the moderation work. It could be a crowd sourced thing, a company, a subscription service, etc… Lightning helps a ton here, because we can do easy fast cheap payments! Users can then choose which moderation service they want to use, or choose to not use any at all.

A group could even come together and set up a moderation service that was compliant with local laws. In the most extreme example, a company which provided a moderation service in China which was compliant with Chinese social media laws. If you installed a mobile app in China, it used that moderation service, locked in.

Another could be a church group which provided moderation services which met the cultural values of the group. That one doesn't involve the state, so it wouldn't be locked to regions, but it would still be very valuable to its users.

Perhaps there could be a nostr kids moderation service for users who are under 18?

Anyway, we need to find ways to fund and pay for these services. Especially since we can’t just take a cut of advertising revenue to cover the costs.

And as it’s open source and a permissionless network, what one app or user does isn’t imposed on others.

I see a huge role for NGOs in moderating Nostr. I can see a lot of organizations wanting a role if Nostr really becomes a leading social media platform. SPLC, ASACP, FSC, ACLU to name a few on the liberal side, but also their conservative counterparts. They could do fundraising to give them the money to have a voice.

Where I see corporate involvement is in “AI-ish” bots that do a “decent” job detecting certain types of content. Those won’t be perfect but they’ll cover a lot of ground quickly.

In related news… One of our challenges, which I discovered today, is that nostream (and presumably other relays) won't accept Kind 1984 events from non-paid users even though non-paid users can read from the relay. That needs to be fixed ASAP - it's a huge legal problem for the relay operators. I would go as far as saying it should be added to NIP-56 — "paid relay operators must accept Kind 1984 events from any user if they relate to content on their relay".
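For context, a Kind 1984 report per NIP-56 looks roughly like the dict below, and the policy argued for above can be sketched as a small acceptance check. The relay-side helpers (relay_has_event, is_paid_user) are hypothetical, not nostream's actual API:

```python
# A NIP-56 report event (Kind 1984): "e" tags name the offending event
# and a report type, "p" tags name the offending author.
report = {
    "kind": 1984,
    "pubkey": "<reporter-pubkey>",
    "tags": [
        ["e", "<offending-event-id>", "illegal"],
        ["p", "<offending-author-pubkey>"],
    ],
    "content": "Reported for immediate review",
}

def should_accept(event, relay_has_event, is_paid_user):
    """Proposed relay policy: accept reports from anyone, paid or not,
    as long as they reference content the relay actually hosts."""
    if is_paid_user:
        return True
    if event["kind"] != 1984:
        return False
    referenced = [t[1] for t in event["tags"] if t[0] == "e"]
    return any(relay_has_event(eid) for eid in referenced)
```

Non-paid users still can't post notes, but their reports about hosted content get through, which is the legal-exposure point made above.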

#[4]

I use NOS occasionally. Planetary is the Scuttlebutt app, no?

Can you explain to someone who’s relatively new to crypto what that does / what it means?

Some of the people who've seen what #[0] & I have been proposing in "NIP-69" seem to think the objective is censorship. So today I sat down and wrote out the bigger "vision" of where I'd like to see content moderation go on Nostr. Feel free to give it a read:

https://s3x.social/nostr-content-moderation

Just realize it's a first draft and needs work. But the point I hope I get across is that I want to see something that's individual and "bottom up". To me censorship is always top down since at the core of censorship is some authority flexing their power and enforcing their idea of what's good and bad - overriding your idea of good and bad.

Instead I want to see a cacophony of voices with individuals choosing which voices in that chaos they want to listen to for filtering their feeds. (Or they could choose to listen to none and see it all.)

But systems have to be put in place to make that a reality. It won't happen by accident.

And yes, the government will always force a certain level of censorship on us. But there are ways around that. For example our relay can't have anything related to escorting on it thanks to FOSTA/SESTA (horrible law), but people who need to do posts related to escorting could use #[1]'s relay. And that's the whole point with Nostr - it's censorship-resistant, not censorship-proof. Nothing is censorship-proof…

Today I posted on an adult webmaster board saying I have a different, more positive perspective on crypto thanks to Nostr. One of the OGs warned me about talking positively about crypto… Apparently there's a huge amount of animosity towards everything crypto in porn, despite the fact that the porn industry pays ~10-12% in fees and lives under constant threat of Visa & Mastercard refusing to work with them - at which point their business would completely fold… [Not to mention all the problems with banks - which are enormous.]

Here's how the conversation went (edited to make it flow):

Him: that sounds good, except 95% of consumers won’t use crypto

Me: Nostr will change that. It was designed by Bitcoiners with the intention of “orange pilling” users (their term for “onboarding” them - getting them to use crypto in daily life).

Him: so, bitcoin can be as low as it wants - if consumers don't use it, it's not a large solution in the "alternative billing" niche…

Me: if you get a customer onto Nostr, they'll soon have a crypto wallet, and then they can pay you in crypto with basically no fees.

Him: that’s the barrier.  getting them into Nostr and setting up a wallet.  huge bottleneck for volume. i think your best bet for development is focusing on nostr…  stay out of billing completely.  it’s a shark pit and will burn you alive. the sheer amount of vitriol of all the people that have tried alternative billing “solutions” that burned them is horrific, and they will attack anyone coming in with pretty much anything, regardless how good it is.  Thin ice to tread

Me: Let’s put it this way, the majority of people who sign up for Nostr will eventually enable Lightning payments because it enhances their experience of using Nostr. It’s then up to site owners to figure out if they want to take advantage of that new opportunity or not. Nothing is stopping them from continuing to do things the same way.

Him: just giving you the heads up to the pitfalls in this realm.  Thar Be Dragons! I can think of at least 6 people on here that will eat you alive on this topic.

Me: I’m not forcing anyone to do anything. And BTW, Lightning payments weren’t even a thing before 2019. They are a new, streamlined, “off-chain” way of processing crypto. So people may have tried crypto payments, but they probably didn’t try Lightning payments.

Him: just be careful. these OGs do shoot the messenger

Me: It’s like trying Pernod and deciding you never want to drink alcohol ever again. I’ll be like “sure… you do you, but this Manhattan is delicious!”

Him: The ones that will attack are [insert names here] and these guys are old school porn mob and don’t mince words

Me: Oh, and don’t forget all the billing companies!

Him: the billing companies will remain mute to see which way the wind blows, in case they think there is an opportunity for them to add crypto to their offerings

[a bit more discussion, then…]

Him: Do as you like, but just giving you fair warning you are about to step on a landmine

And check out #[5] 😈

Sadly, that one post is all I can see of his profile (he must be posting to a relay that filter.nostr.wine doesn't pull from…)

Right. I really despise FOSTA/SESTA but I shut down a whole section of my forum site that discussed experiences with hiring escorts when it passed because with backpage.com being taken down and the Feds raiding and arresting the folks from RentBoy.com, it wasn’t something that me, a small webmaster, could ignore.

I don’t have the patience to explain these things to people who refuse to listen to those of us who’ve been there, done that and survived.

“Decades” would put you back to 2003. You used the past tense - so before that.

Somehow those numbers don’t really work. Was it a BBS?

I’ve run a forum site with rather extreme sexual content for 13 years. Keeping my legal head down is part of the game.

If you’re volunteering to get sued or charged instead of me - please… go right ahead…

When it comes to getting hauled into court intent matters. I can document a track record of working to prevent the dissemination of CSAM and other illegal content.

Rabble and I proposed “NIP-69” as a first step to get workable, bottom-up content moderation on Nostr.

Just today I somewhat publicly challenged the ASACP (the porn industry’s organization that fights CSAM) to get involved in the content moderation questions related to Nostr. I asked them to make statements supporting your approach to NIP-94 and opposing NIP-95.

I’m doing my part. My intent is clear.