Replying to Vitor Pamplona

Actually, I think the reporting function in Amethyst is pretty good. It does everything it can to steer people toward blocking instead of reporting, which I think was a good call by nostr:npub1max2lm5977tkj4zc28djq25g2muzmjgh2jqf83mq7vy539hfs7eqgec4et at the time.

> Are you so sure this is the right way for users to engage in content filtering? By not being able to engage in it, but being dependent on someone else to have the same opinion as you?

I am not sure which filter you are talking about, but if it is a Report, it does let people engage with it (it shows the "Show Anyway" button). If it is simple dumb spam, the idea is for the filter to be temporary, only until those duplicates disappear. It could certainly be better, but somebody needs to spend time developing it.
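
For anyone curious how this kind of report-based hiding can work on the client side, here is a minimal sketch. The types and names (`Report`, `FeedFilter`, etc.) are hypothetical, not Amethyst's actual code; the point is only that reported notes get hidden behind an override rather than deleted.

```kotlin
// Hypothetical types; not Amethyst's real classes.
data class Note(val id: String, val authorPubKey: String)

// A NIP-56-style report: someone flagged this note with a reason.
data class Report(val reporterPubKey: String, val reportedNoteId: String, val type: String)

// How a feed item should be rendered.
sealed class Visibility {
    object Show : Visibility()
    // Hidden by default, but the user can tap "Show Anyway".
    data class HiddenWithOverride(val reason: String) : Visibility()
}

class FeedFilter(
    private val follows: Set<String>,                        // pubkeys the user follows
    private val reportsByNote: Map<String, List<Report>>,    // reports indexed by note id
    private val showAnywayOverrides: MutableSet<String> = mutableSetOf()
) {
    fun visibilityOf(note: Note): Visibility {
        // The user already tapped "Show Anyway" for this note.
        if (note.id in showAnywayOverrides) return Visibility.Show

        // Only reports from people the user follows count.
        val relevantReports = reportsByNote[note.id]
            .orEmpty()
            .filter { it.reporterPubKey in follows }

        return if (relevantReports.isEmpty()) {
            Visibility.Show
        } else {
            Visibility.HiddenWithOverride(
                reason = "Reported as ${relevantReports.first().type} by someone you follow"
            )
        }
    }

    // Called when the user taps "Show Anyway".
    fun showAnyway(note: Note) {
        showAnywayOverrides += note.id
    }
}
```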

> Instead, focus rather on a simplified content boosting approach, instead of content filtering.

We have more tools to boost content than to block it, so we are focusing on content boosting. But filtering is a necessity: no one wants dick pics or other non-family-friendly content when setting up their kids' accounts.

The majority of these features were developed because users didn't feel comfortable onboarding their own friends to the network. That feeling is changing, but it's taking us a while to give them the confidence (and the tools) to onboard others.

I definitely get where you're coming from with this, and the intention is right.

But, playing devil's advocate: aside from children, what if my friends and family want to see the dick pics you mention? I don't, but I can hit block. It's not up to me or anyone else to decide what others see or partake in. If they choose to delegate that to me, great! I see a real use case for the ability to subscribe to another person's mute list in the future. But I personally don't feel the default should be to remove content without user intervention. Perhaps onboarding could one day include a way to subscribe to community moderation from the start, as long as users are aware they are entering a filtered and curated experience.
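
As a thought experiment, subscribing to another person's mute list could be fairly simple at the protocol level. A minimal sketch, assuming a NIP-51-style mute list (kind 10000 with public "p" tags for muted pubkeys) and simplified stand-in types rather than a real Nostr library:

```kotlin
// Simplified stand-in for a Nostr event; not a real library type.
data class NostrEvent(
    val kind: Int,
    val pubkey: String,
    val tags: List<List<String>>
)

// Extract muted pubkeys from a NIP-51 mute list (kind 10000).
// Only public entries ("p" tags) are considered here; private,
// encrypted entries in the content field are ignored for simplicity.
fun mutedPubkeys(event: NostrEvent): Set<String> {
    require(event.kind == 10000) { "Not a mute list event" }
    return event.tags
        .filter { it.size >= 2 && it[0] == "p" }
        .map { it[1] }
        .toSet()
}

// Merge your own mutes with the lists of curators you chose to follow.
// Opting in is explicit: nothing is hidden unless the user subscribed.
fun effectiveMutes(
    ownMutes: Set<String>,
    subscribedLists: List<NostrEvent>
): Set<String> =
    ownMutes + subscribedLists.flatMap { mutedPubkeys(it) }

fun shouldHide(authorPubkey: String, mutes: Set<String>): Boolean =
    authorPubkey in mutes
```

The key design point would be that subscribing is an explicit, visible choice, which matches the "aware they are entering a filtered and curated experience" caveat above.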

Discussion

If you report something, they can always still see what you have marked. You are not deciding for them; you are adding information to their feed in the same way you do with likes, zaps, or boosts.
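
In protocol terms, a report really is just another event attached to the note, much like a reaction or a zap. A rough sketch of a NIP-56-style report (kind 1984), using simplified stand-in types rather than a real signing library:

```kotlin
// Simplified stand-in; a real client would sign and publish this via its Nostr stack.
data class UnsignedEvent(
    val kind: Int,
    val content: String,
    val tags: List<List<String>>
)

// Build a NIP-56-style report (kind 1984) for a note.
// Like a like or a zap, it adds information next to the note;
// it does not remove the note from anyone's relay or feed.
fun buildReport(
    reportedNoteId: String,
    reportedAuthorPubkey: String,
    reportType: String,          // e.g. "spam", "nudity", "impersonation"
    comment: String = ""
): UnsignedEvent =
    UnsignedEvent(
        kind = 1984,
        content = comment,
        tags = listOf(
            listOf("e", reportedNoteId, reportType),
            listOf("p", reportedAuthorPubkey)
        )
    )
```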

That's a solid approach, but I've still encountered individuals on my blocked list whom I wouldn't have blocked and have never interacted with. I don't think that was the intention behind the filtering you set up, but humans are a wildcard.

It's tricky: there are a lot of accounts I follow and interact with that people close to me would never want to deal with, but they may also want to see things I block. I am mainly concerned with end users being blind to what's happening, or with individuals ending up in a personal echo chamber because they don't realize their actions are getting them filtered.