
Elon Musk’s 'X' sues California over censorship law

 September 10, 2023

Elon Musk's X Corp is taking legal action against California over a content moderation law.

Musk's platform has initiated a lawsuit against the Golden State, contesting a content moderation law that company officials deem restrictive. The move follows Musk's purchase of Twitter last October, which brought significant changes to the platform's content management, as NASDAQ reported.

Musk's bold move after Twitter acquisition

Upon acquiring Twitter for a staggering $44 billion, Musk took swift action to transform the company.

A staunch supporter of free speech, he eliminated several positions previously dedicated to content moderation.

Musk also reinstated some prominent user accounts that the platform's previous management had banned.

This drastic change in policy raised many eyebrows. Notably, organizations like the Anti-Defamation League (ADL) and the Center for Countering Digital Hate have claimed that a spike in hate speech has since occurred on the platform.

This surge allegedly targeted communities such as Jews, Blacks, gay men, and transgender people in the wake of Musk's management takeover.

Musk is not just the man behind X Corp. As the world's wealthiest individual, he also heads the electric car manufacturer Tesla and the ambitious space enterprise, SpaceX.

California's stance on content moderation

California has been assertive in its approach towards content moderation on social media platforms.

AB 587, the law at issue in the case, requires major social media companies with annual revenues exceeding $100 million to produce biannual reports.

These reports are expected to detail the firms' content regulation strategies, shedding light on objectionable posts and the methods used to tackle them.

The legislation also obliges companies to share their terms of service. Non-compliance can result in significant fines, amounting to $15,000 per day for each breach.

Gavin Newsom, the Democratic governor of California, signed the law last September.

He expressed his commitment to preventing social media platforms from being used to disseminate hate and misinformation.

Impact on X's revenue and brand safety

Following his acquisition of Twitter, Musk faced criticism for his staffing and content choices.

This Monday, he attributed a drastic 60% plunge in U.S. advertising revenues to his detractors, including groups like the ADL.

A.J. Brown, who served as X's head of brand safety and ad quality until his resignation in June, shared his insights in a recent interview.

He said he believed that Musk's policy shifts, which prioritize obscuring harmful posts over removing them, complicated efforts to assure advertisers of the platform's safety.

The office of California Attorney General Rob Bonta has yet to present a formal response. However, officials have indicated their intention to address the issue in court.

Case details and future implications

The case, X Corp v Bonta, was filed in the U.S. District Court for the Eastern District of California under reference No. 23-at-00903.

Many industry experts are closely monitoring the developments.

The outcome could set a precedent for other states and major corporations, potentially influencing future legislation and content moderation strategies across the country.


  • Elon Musk's X Corp opposes California's content moderation law, leading to a lawsuit.
  • After acquiring Twitter, Musk reinstated banned accounts and reduced content moderation roles.
  • Hate speech instances reportedly increased in the wake of these changes.
  • California's AB 587 law requires greater transparency from major social media companies.
  • Non-compliance with the law could result in substantial fines.
  • X Corp's advertising revenue has suffered a significant hit, which Musk attributes to critics of his policy changes.
  • The outcome of the case could have broader implications for content moderation in the U.S.