Australia’s eSafety Commissioner has lost a critical legal fight: the Administrative Appeals Tribunal (AAT) ruled that hundreds of “informal communications” used to request content removal must cease because they circumvent the formal notice-and-appeal process established under the Online Safety Act 2021. The Tribunal found that by issuing informal “complaint alerts” rather than formal takedown notices, eSafety had been avoiding judicial review and denying content creators due process. The decision has since been upheld on appeal, effectively outlawing these informal takedown requests, though the Commissioner is now pursuing a further appeal of the AAT’s ruling.
Sources: Epoch Times, Lexology
Key Takeaways
– The AAT has determined that “informal” takedown alerts issued by the eSafety Commissioner amount to statutory removal notices — and are therefore reviewable under the Online Safety Act.
– eSafety’s use of complaint alerts reportedly numbered in the hundreds annually, while formal removal notices were rare — raising serious concerns about transparency and due process for online users.
– This ruling reinforces the legal requirement that content removal must follow formal statutory processes, protecting free speech and ensuring accountability for regulators.
In-Depth
The recent decision by Australia’s Administrative Appeals Tribunal marks a significant check on the power of the eSafety Commissioner, an agency that has increasingly shaped the tone and reach of online content regulation. At the center of the dispute was the Commissioner’s longstanding practice of issuing “informal” complaint alerts — less formal than statutory removal notices but nonetheless often prompting social media platforms to remove content. Critics argued these informal alerts allowed the Commissioner to bypass the formal mechanisms under the Online Safety Act, thereby avoiding judicial review and procedural safeguards.
In its ruling, the Tribunal rejected that distinction. It held that when the Commissioner requests removal of a post and the platform complies, that request constitutes a “removal notice,” regardless of how it was labeled. That triggers the Act’s legislative safeguards, including the right to appeal — protections that informal alerts entirely sidestepped. As a result, the hundreds of informal communications issued over the past year were effectively declared unlawful.
Legal analysts have long warned that the use of complaint alerts undermines transparency. In one notable case before the Tribunal, a complainant proceeding under a pseudonym argued that the informal flagging system lacked a statutory basis and deprived users of due process. In handing down its decision, the Tribunal explicitly criticized the Commissioner’s “groupthink” approach and the lack of documentation and formal delegation within the removal-request process.
The implications of the ruling are broad. First, platforms such as X (formerly Twitter), Telegram, and others will now be less inclined to comply with takedown requests absent a formal notice — reducing overreach and the risk of arbitrary censorship. Second, online content creators and everyday users have greater protection: any removal request must now explicitly meet statutory requirements, ensuring accountability and due process. Critics of the ruling — including those within eSafety — argue it makes it harder to swiftly remove harmful content. But from a civil-liberties standpoint, it reasserts an essential principle: regulatory power must rest on transparent, legally grounded authority.
Given this outcome, the eSafety Commissioner has appealed. So while the ruling is currently a win for free expression and procedural fairness, the final outcome remains uncertain. Still, the Tribunal’s decision sets a strong precedent: content removal cannot rely on shadowy, informal mechanisms; it must follow the written law.