The significant workforce reductions at X heavily impacted the Trust and Safety (T&S) teams responsible for content moderation.

With a reduction in human moderation staff, the platform has become more reliant on automated systems. The layoffs changed manual reviews from a core part of the moderation process to a barely functional system reserved for rare exceptions. For the average user, the appeals process is now an automated loop that is difficult or impossible to escalate to a human.

Consequently, it is now harder than ever to obtain a nuanced manual review or to get a suspension reverted.

Permissioned social media is a broken construct that'll always be susceptible to this BS. That's why we #Nostr

#rawyakihonne


Discussion

sounds right. I don't think Twitter ever had sufficient human moderation staff, and excessive automated bans and the lack of "due process" have always been an issue. maybe it got worse since Elon took leadership.

yes, this is one of the main reasons we should move to permissionless platforms, and Nostr is very promising. as for Yakihonne, I tried it but had to give up (I kept getting 99+ ghost notifications).

They keep releasing updates, it seems fine now, just fixed a couple of problems yesterday.