In March 2026, a veteran journalist discovered that years of work and audience-building had vanished overnight when Instagram and Threads removed both accounts without warning. The notice cited child-safety rules, a category that has repeatedly swept up benign material, from family photographs to public-interest reporting, and the appeal route offered little practical recourse. Meta has acknowledged errors in comparable cases. The broader debate over platform moderation, meanwhile, has only intensified since the company replaced third-party fact-checking with Community Notes in January 2025 and then rolled that system out more widely across Facebook, Instagram and Threads.

What makes the episode more than a personal grievance is the scale of the system behind it. According to the European Commission, the first harmonised transparency reports under the Digital Services Act began appearing in March 2026, giving regulators and researchers a clearer view of content removal rates and user appeals. Those reports build on disclosure rules that took effect in July 2025 and were designed to make moderation practices easier to compare across platforms.

The European Union has also shown it is willing to punish failures. In December 2025, the European Commission fined X €120 million for transparency breaches under the Digital Services Act, citing misleading verification practices, weaknesses in its advertising repository and limited access for researchers to public data. The penalty underscored how far the regulatory mood has moved from hands-off tolerance to demands for auditable procedures and clearer explanations.

That is exactly the gap the article addresses: the mismatch between the power platforms exercise and the absence of due process behind it. Apple’s DSA transparency report for the first half of 2025 offers one illustration of what formal reporting can look like when companies are forced to disclose notices, orders and moderation actions in a structured way. Such disclosures do not solve the political problem of online speech, but they do make enforcement easier to inspect.

The argument also reflects the direction of travel in the wider platform world. Meta’s move towards Community Notes was presented as a response to complaints about biased fact-checking, while X has leaned further into crowdsourced moderation and algorithmic systems. Yet the central criticism remains unchanged: decisions can be swift, but the reasons are often opaque, appeals are weak and the error rates stay hidden from the public.

Seen in that light, the call for a basic procedural floor is not radical. It is a recognition that social media now performs a public function, even though it is privately owned. The platforms may not be governments or utilities, but their decisions shape visibility, reputation and participation at a scale that affects journalism, politics and everyday life. If they want the authority that comes with that role, the article argues, they will increasingly have to accept the obligations that go with it.

Source: Noah Wire Services