Super important. I think there are actually two separate problems that easily get conflated here. The first is "how to design an app" (from a product-design point of view), where the main thing you're optimizing for is user experience: "is this enjoyable?", "is this useful?" Nobody can use, or wants to use, an app that is completely overrun by spam. Apps that solve the first problem may reach a large enough scale that they run into the second, which is something like "how to design a government", or "how to avoid being subsumed by the government". At that point it's no longer clear what you should optimize for, because people have different values. The problem space expands beyond tech into politics, and big tech companies are forced into a fragile position at that interface, unable to effectively solve either problem.

Discussion

I think the solution to the problem you mention is to let people decide for themselves (and to let parents set standards for their children).

There are a lot of age-verification laws being passed right now. IMHO that’s completely the wrong direction.

Instead, there should be device preferences that can be locked down by parents and validated using the IETF’s Privacy Pass. Apps and websites could then filter based on those personal preferences, not on what some government says.
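A rough sketch of how that flow might look on the server side. Here `validate_privacy_pass_token` is only a stand-in for a real Privacy Pass redemption check (which the IETF specs define), and the maturity levels and field names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class PreferenceClaim:
    """A device-level preference, locked down by a parent, sent with a request."""
    max_maturity: str   # e.g. "child", "teen", "adult" -- hypothetical levels
    token: bytes        # a Privacy Pass token attesting to the claim

def validate_privacy_pass_token(token: bytes) -> bool:
    # Stand-in: a real server would redeem the token with the issuer per the
    # IETF Privacy Pass protocol, learning only that the claim is valid --
    # not who the user is.
    return token == b"valid-demo-token"

MATURITY_ORDER = ["child", "teen", "adult"]

def allowed(content_rating: str, claim: PreferenceClaim) -> bool:
    """Filter by the device's own preference, not a government mandate."""
    if not validate_privacy_pass_token(claim.token):
        return False  # unverifiable claim: fail closed
    return MATURITY_ORDER.index(content_rating) <= MATURITY_ORDER.index(claim.max_maturity)
```

The point of the token is that the site learns "this device asked for teen-level filtering" without learning anything about who the user is.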

In a scenario like that, an app could have a thousand different content-filtering standards for a thousand different users. But it needs to understand the content being presented, and beyond that it needs to see the content through the eyes of the user. A common vocabulary for moderation is one piece. “Trust lists” that let the user specify who they want making calls in gray areas are another piece of the puzzle.
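A minimal sketch of that per-user filtering idea, assuming a shared label vocabulary; the label names, data shapes, and the `gray_area_calls` input are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Content:
    author: str
    labels: set  # labels from a common moderation vocabulary, e.g. {"satire"}

@dataclass
class UserPrefs:
    blocked_labels: set      # labels this user never wants to see
    trusted_moderators: set  # "trust list": who may make calls in gray areas

def visible(content: Content, prefs: UserPrefs, gray_area_calls: dict) -> bool:
    """Decide visibility through the eyes of one user.

    gray_area_calls maps moderator -> labels they flagged on this content;
    only flags from moderators on the user's own trust list count.
    """
    # Hard filter: any label the user blocked hides the content outright.
    if content.labels & prefs.blocked_labels:
        return False
    # Gray areas: defer to the moderators this user has chosen to trust.
    for moderator, flagged in gray_area_calls.items():
        if moderator in prefs.trusted_moderators and flagged & prefs.blocked_labels:
            return False
    return True
```

The same post can then be visible to one user and hidden from another, because the filter runs against each user's own preferences and trust list rather than a single site-wide standard.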