X To Open a New Content Moderation Center in Austin

Ivana Shteriova February 16, 2024
X (formerly Twitter) plans to open a new content moderation center in Austin, Texas. The “Trust and Safety center of excellence” will primarily focus on stopping child sexual exploitation (CSE) materials on the platform.

Aside from CSE materials, the new safety center will also address other issues like hate speech and posts promoting violence.

The company’s head of business operations, Joe Benarroch, revealed X’s plans to hire 100 full-time employees at the new location but did not share any information on when the center will start operating.

Speaking to Bloomberg, Benarroch said, “X does not have a line of business focused on children, but it’s important that we make these investments to keep stopping offenders from using our platform for any distribution or engagement with CSE content.”

The announcement comes just days ahead of X CEO Linda Yaccarino’s appearance before the Senate Judiciary Committee, along with CEOs from other major tech companies including Meta, TikTok, and Snapchat.

X has attracted widespread criticism over its safety efforts under Elon Musk’s leadership. The business mogul took over the company in 2022 and laid off 80% of its employees, including the majority of Twitter’s content moderators.

Since acquiring Twitter, Musk, who describes himself as a “free speech absolutist,” has taken steps to turn the platform into a “free speech bastion.” He reinstated previously suspended accounts of controversial figures, announced that users would no longer be able to block other users, and rolled back certain policies related to misinformation.

X came under fire during the Hamas-Israel conflict over Musk’s antisemitic tweet and accusations that the platform had become fertile ground for misinformation and hate speech related to the war. At that time, the EU launched a formal investigation into the company’s content moderation practices.

Previously, the Center for Countering Digital Hate (CCDH), an online hate speech watchdog, reported that the platform failed to take action against 99 of 100 hate speech posts it had flagged. Musk responded with legal action against the nonprofit, accusing it of fabricating flawed results.

While the decision to establish a new content moderation team of 100 employees seems like a move in the right direction for X, it’s worth noting that before Musk, the platform had around 1,500 employees in charge of tracking abuse and enforcing misinformation policies.
