keep nostr weird

*he asks nervously*

Soyjack.PNG: Noooo! I'll block your files from my relays

Gigachad.PNG: OK. Here, store these kind 1 notes with my file secretly encoded in the first character of every sentence.
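The acrostic trick in the meme could be sketched like this. This is a toy illustration, not any real NIP or nostr tooling; the word table and the `encode`/`decode` helpers are made up for the example. Each nibble of the payload picks one filler sentence, and the sentence's first letter carries the data.

```python
# Toy sketch of the acrostic steganography joked about above: hide a
# file's bytes in the first character of every sentence of an otherwise
# ordinary kind 1 text note. Word table and helper names are invented.

# One word per hex nibble (0-15); first letters A..P are all distinct.
WORDS = ["Apples", "Bitcoin", "Cats", "Dogs", "Eggs", "Fees",
         "Geese", "Hats", "Ice", "Jam", "Kites", "Lemons",
         "Mice", "Notes", "Owls", "Pies"]
FIRST_TO_NIBBLE = {w[0]: i for i, w in enumerate(WORDS)}

def encode(payload: bytes) -> str:
    """Turn each nibble of the payload into one bland sentence."""
    sentences = []
    for byte in payload:
        for nibble in (byte >> 4, byte & 0x0F):
            sentences.append(WORDS[nibble] + " are great.")
    return " ".join(sentences)

def decode(note: str) -> bytes:
    """Recover the payload from the first character of each sentence."""
    firsts = [s.strip()[0] for s in note.split(".") if s.strip()]
    nibbles = [FIRST_TO_NIBBLE[c] for c in firsts]
    return bytes((hi << 4) | lo
                 for hi, lo in zip(nibbles[::2], nibbles[1::2]))
```

Round-tripping a small file through `encode` and `decode` gets the bytes back exactly, while the relay operator only sees a note full of filler sentences (at a cost of roughly 17 characters of note per nibble of payload).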

haha, yeah I certainly spend less time on it now. That's why I included the "even a badly run one".

On a different note, I think most of the management advice lay people were recommending (rebalancing, frequent fee changes, etc.) was counterproductive.

Replying to rabble

I shared this on the fediverse, bluesky, and twitter to get a sense of what folks think in other communities. In particular, I got feedback from one person who really knows the problem of content moderation, trust, and safety.

Yoel Roth, who ran twitter’s trust and safety team up until Elon Musk took over, said:

> “The taxonomy here is great. And broadly, this approach seems like the most viable option for the future of moderation (to me anyway). Feels like the missing bit is commercial: moderation has to get funded somewhere, and a B2C, paid service seems most likely to be user-focused. Make moderation the product, and get people used to paying for it.” - https://macaw.social/@yoyoel/110272952171641211

I think he’s right: we need to get the commercial model for funding the moderation work right. It could be a crowdsourced thing, a company, a subscription service, etc… Lightning helps a ton here, because we can do easy, fast, cheap payments! Users can then choose which moderation service they want to use, or choose not to use any at all.

A group could even come together and set up a moderation service that was compliant with local laws. In the most extreme example, a company could provide a moderation service in China that complied with Chinese social media laws. If you installed a mobile app in China, it would use that moderation service, locked in.

Another could be a church group providing moderation services that meet the cultural values of the group. That one doesn’t involve the state, so it wouldn’t be locked to a region, but it would still be very valuable to its users.

Perhaps there could be a nostr kids moderation service for users who are under 18?

Anyway, we need to find ways to fund and pay for these services. Especially since we can’t just take a cut of advertising revenue to cover the costs.

And as it’s open source and a permissionless network, what one app or user does isn’t imposed on others.

Lol

Uber's value add is their basic reputation system. Easy to do in a centralized way. Not so much in a decentralized way.

A "routing" node, even a poorly managed one, is good for privacy since routing provides some plausible deniability against your channel counterparties.

Also good if you live on bitcoin. Can buy $50 bitrefill cards without checking the fee market.

Amethyst is pretty damn great. Even has multiple accounts!

bro, sorry to be blunt but your “idea” is stupid. clients aren’t going to implement a new note kind just so relays can spam their users with ads. if anything clients need to be better at removing spam.

I think there’s lots of good ideas and if you took a second to read through my timeline that would be obvious, but here’s a couple:

- “uncle jim” relays like wss://anon.computer where some tech savvy person whitelists their friends and family

- dropbox model: free but you get bugged over dm when your “storage space” is running out and asked to upgrade

- the pay to post relays we already have