
X Is Considering Some Moderation Tactics After AI Snafu

January 30, 2024

While Elon Musk has insisted that crowd-sourced input is the way to curb harmful content, last week showed the approach still has holes. AI-generated images of Taylor Swift being sexually assaulted by NFL fans grabbed the spotlight on X, racking up over 27 million views and 260,000 likes before the account that originated the post got the boot.

Needless to say, Taylor was not happy about this, and there are reports she will take legal action against the creator of the content and X.

While X couldn't contain the content's runaway spread, even after the suspension, the platform has since blocked all "Taylor Swift" searches in the app.

The content violated X's Sensitive Media policy, which means it would have been removed at any rate, but some say the platform's reliance on crowd-sourced Community Notes isn't the best way to handle material like this. Many feel it's time for X to bring in its own in-house moderators.

X has already taken a step in that direction, announcing a 100-person content moderation center in Texas that will focus primarily on child sexual abuse content but be "tasked with managing other elements as well."

Is this an admission by X that Community Notes isn't all it's cracked up to be? It also raises the question of whether the new center goes against X's "freedom of speech, not reach" approach, which rests on the idea that the community should decide what's suitable for the platform. After all, X has always maintained that there shouldn't be a "central arbiter of moderation decisions" like there was in the Twitter days.

Is Elon rethinking that approach?

While Community Notes somewhat addresses this, it's clear that larger-scale work is needed to stop the rampant spread of harmful content. From the start, way back when Elon cut 80% of the staff, it seemed unrealistic to believe the platform could maintain the capacity to police itself, hence the reliance on Community Notes, which clearly isn't working. Just ask Taylor Swift and her lawyers.

