Actually, I think the reporting function is pretty good in Amethyst. It does everything to get people to Block instead of reporting, which I think is a good call by nostr:npub1max2lm5977tkj4zc28djq25g2muzmjgh2jqf83mq7vy539hfs7eqgec4et at the time.

> Are you so sure this is the right way for users to engage in content filtering? By not being able to engage in it, but being dependent on someone else to have the same opinion as you?

I am not sure which filter you are talking about, but if it is a Report it does allow people to engage with it (it shows the "Show Anyway" button). If it is simple dumb spam, then the idea is to be temporary until those duplicates disappear. It can be better for sure, but somebody needs to spend time developing it.

> Instead, focus rather on a simplified content boosting approach, instead of content filtering.

We have more tools to boost content than to block it, so we are focusing on content boosting. But filtering is a necessity. No one wants the dick pics or non-family-friendly content when setting up their kids' accounts.

The majority of these features were developed because users didn't feel comfortable onboarding their own friends in the network. That feeling is changing but it's taking us a while to give them the confidence (and tools) to onboard others.


Discussion

"... when setting up their kid's accounts... "

Wow... if Amethyst is an app for kids, you should say so. There is not going to be enough filtering possible for loving parents to allow their kids to use Nostr. Parents might as well drop the kids off in Times Square and tell them to find their way home by dark.

I can't help but wonder if you really intend Amethyst to be used by children, or if you are just responding with the most emotional thing you can think of to win an argument.

Nuf said. I have uninstalled Amethyst and will try Iris for a while. Plebstr looks great, but nobody should use it, or anything closed source. My kids are on see-n-say and will not be using Amethyst either.

I never intended it to be for children (the Play Store listing even limits it to adults only). But users don't care what my intentions are; they are trying anyway. So I have to be mindful of that type of use.

You cannot be responsible for stupid parents. Thank you for publishing Amethyst open source.

I am not, but I am making an app for everyone. So, at some point, we will need to address the issue head-on and let parents create safe spaces for kids.

I definitely get where you're coming from with this, and the intention is right.

But, devil's advocate... aside from children, what if my friends and family want to see the dick pics you mention? I don't, but I can hit Block. It's not up to me or anyone else to decide what others see or partake in. If they choose to delegate that to me, great! I see a real use case for the ability to subscribe to another's mute list in the future. But I personally don't feel the default should be to remove content without user intervention. Perhaps part of onboarding in the future could include a way to subscribe to community moderation from the start, as long as users are aware they are entering a filtered and curated experience.

If you report, they can always see what you have marked. You are not deciding for them but adding information to their feed in the same way you do with likes, zaps, boosts or reports.
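For readers curious about the mechanics: on Nostr, a report is just another signed event that clients may surface however they like, which is what makes it "adding information" rather than deciding for anyone. Below is a minimal sketch of building one, assuming the NIP-56 conventions (kind 1984, a report type carried in the `p`/`e` tags); the helper name and placeholder keys are illustrative, and signing is omitted.

```python
import json
import time

def build_report_event(reporter_pubkey, reported_pubkey, report_type, note_id=None, comment=""):
    """Sketch of an unsigned NIP-56 report event (kind 1984).

    report_type is one of the NIP-56 labels, e.g. "nudity", "spam",
    "illegal", "impersonation", "profanity", "malware", or "other".
    """
    tags = [["p", reported_pubkey, report_type]]
    if note_id:
        # Report a specific note rather than the whole profile.
        tags.append(["e", note_id, report_type])
    return {
        "kind": 1984,
        "pubkey": reporter_pubkey,
        "created_at": int(time.time()),
        "tags": tags,
        "content": comment,  # optional free-text reason shown to other clients
    }

event = build_report_event("reporter-pubkey-hex", "reported-pubkey-hex", "spam")
print(json.dumps(event, indent=2))
```

A client like Amethyst can then count reports from people you follow and decide per-user whether to hide, label, or ignore the target, with the "Show Anyway" button as the escape hatch.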

That's a solid approach, but I've still encountered individuals in my blocked list whom I wouldn't have blocked and had never interacted with. I don't think that was the intention behind the filtering you set up, but humans are a wildcard.

It's tricky: there are a lot of accounts I follow and interact with that people close to me would never want to deal with, but they also may want to see things I block. I am mainly concerned with end users being blind to what's happening, or with individuals who don't realize their actions are getting them filtered ending up in a personal echo chamber.