In a significant shift from its previous practice, X, formerly known as Twitter, has unveiled its first transparency report since being acquired by billionaire Elon Musk in October 2022. The report, which covers the first half of 2024 and spans just 15 pages, marks a departure from the more detailed biannual publications released prior to Musk's takeover. The document describes a moderation approach that blends machine learning with human review, signaling a new era in the platform's governance.

The report disclosed that during this period, X users flagged 224 million instances of potentially violative content or accounts. Of these, 5.3 million accounts were suspended, and notably, more than half of the suspensions (2.8 million) were for violations related to child sexual exploitation. These figures contrast sharply with the platform's earlier statistics: in the latter half of 2021 alone, nearly 11.6 million accounts were reported, roughly half of them for alleged hateful content, leading to action against 4.3 million accounts and the suspension of 1.3 million.

X also significantly increased its reporting of child exploitation to the National Center for Missing and Exploited Children (NCMEC), submitting 370,588 reports in the first six months of 2024, a substantial rise from prior years, when figures typically ranged from the tens of thousands to the low hundreds of thousands.

Despite receiving nearly 67 million complaints about hateful conduct within the same timeframe, X acted against only 2,361 accounts for such violations—a dramatic decrease compared to the one million suspensions for hate speech noted in late 2021. This reduction coincides with policy changes at X that have narrowed what constitutes actionable hate speech on the platform.


Moreover, X reported taking action—either through removal or labeling—on approximately 10.7 million posts for various rule violations throughout early 2024. The majority of these interventions were automated—a reflection of tech-driven moderation efforts gaining prominence on the platform.

The company also navigated numerous government inquiries and demands over this period, fielding more than 18,000 requests for user information and complying with just over half (52%). Additionally, it received upwards of 72,000 government requests for content removal from jurisdictions around the world and complied with 70% of them, a higher compliance rate than the platform reported in late 2021.

This transparency report not only underscores how X has evolved under Musk’s leadership but also raises questions about how shifts in moderation policies are impacting both user safety and expression on what remains one of the most influential social media platforms globally.

