The European Commission (EC) has issued preliminary findings that Meta Platforms (owner of Facebook and Instagram) and TikTok are in breach of key transparency obligations under the Digital Services Act (DSA). According to the EC, both companies failed to grant researchers adequate access to platform data, and Meta is additionally accused of deploying confusing, “dark-pattern” interfaces that make it excessively difficult for users to report illegal content such as child sexual abuse material and terrorist content. The investigations began in 2024, and if the preliminary findings are confirmed, the platforms face fines of up to 6% of their annual global revenue. Meta contests the findings and says it has already made significant changes to comply. TikTok cites conflicts with the EU’s privacy regulation (GDPR) as part of its defence.
Sources: The Guardian, AP News
Key Takeaways
– The EC’s preliminary finding centres on the DSA’s transparency obligations — namely, providing researchers access to platform data and offering users clear, accessible complaint/reporting mechanisms.
– Meta is singled out for allegedly using interface designs that confuse or discourage user reporting of illegal content, undermining the “notice and action” principle embedded in the DSA.
– Both companies face severe potential consequences (fines of up to 6% of global revenue) but still have the opportunity to respond or remedy the issues before a final ruling.
In-Depth
In a significant regulatory development, Brussels has turned its attention to two of the biggest players in social media—Meta and TikTok—asserting that both may have breached the European Union’s Digital Services Act (DSA). The DSA was introduced to modernize the rules governing digital platforms, especially large ones, by demanding transparency, user protections, and accountability. One critical pillar of that regime is that platforms must enable independent researchers to access data that helps assess risks and harms arising from their operations—especially in areas such as youth exposure to harmful content, the spread of terrorist material, or other systemic risks to public safety and health.
According to the EC, Meta and TikTok are falling short of that obligation. Researchers have reportedly faced burdensome procedures to obtain data, or the data provided is incomplete, inconsistent or delayed. In addition, Meta is accused of designing the user experience in a way that makes it difficult for users to flag illegal content or to challenge moderation decisions. These so-called “dark patterns” (deceptive, confusing interface choices) may discourage user reporting and hamper the effective operation of content moderation mechanisms. If platforms fail to satisfy the DSA’s “notice and action” framework—where users must have a straightforward, usable path to report, appeal and seek redress—then the spirit of the regulation is undermined.
From Meta’s perspective, the company disputes that it violated the DSA and points to reforms it says it has already implemented (for example, improved reporting options, enhanced appeal mechanisms and data-access tools) within its EU operations. TikTok, meanwhile, raises a complication: the requirement to provide broad data access to external researchers may conflict with the EU’s General Data Protection Regulation (GDPR), which places strict limits on handling personal data—especially for minors. Therefore, TikTok is urging regulators to clarify how the DSA’s transparency obligations and GDPR’s privacy protections should be reconciled.
For regulators, this is a test of whether Europe can effectively enforce the DSA and hold Big Tech accountable—moving from drafting ambitious rules to actual oversight and enforcement.
For platforms, the risk is not just reputational but financial: fines of up to 6% of global annual revenue can amount to billions of dollars for firms of Meta’s or TikTok’s scale. A finding here could set a precedent and shape how platforms globally handle data access, moderation transparency and user rights.
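To put that ceiling in concrete terms, here is a minimal back-of-the-envelope sketch in Python. The 6% rate comes from the DSA itself; the revenue figure is an assumption used purely for illustration (Meta’s reported full-year 2024 revenue of roughly $164.5 billion) and is not part of the EC’s findings.

```python
# Rough illustration of the DSA's maximum-fine ceiling (6% of global annual revenue).
# The revenue figure below is an assumption for illustration, not an EC calculation.

DSA_MAX_FINE_RATE = 0.06  # DSA ceiling: 6% of global annual turnover

def max_dsa_fine(global_annual_revenue_usd: float) -> float:
    """Return the theoretical maximum DSA fine for a given global annual revenue."""
    return global_annual_revenue_usd * DSA_MAX_FINE_RATE

# Example: Meta's reported full-year 2024 revenue, roughly $164.5 billion (assumed figure).
meta_revenue = 164.5e9
print(f"Maximum possible fine: ${max_dsa_fine(meta_revenue) / 1e9:.1f} billion")
# -> Maximum possible fine: $9.9 billion
```

Any actual penalty would of course depend on the final decision and how the Commission weighs the severity and duration of the breach; the point of the sketch is simply that the ceiling sits in the billions for companies of this size.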
For users and society, the outcome affects how visible and robust content-reporting mechanisms are, how much independent research can expose risks (such as algorithmic bias, harmful content exposure or addictive design) and whether these mega-platforms behave in ways that respect user safety and rights—not just commercial priorities.
A key takeaway from the regulators is that transparency isn’t optional; it is now baked into EU digital law. Platforms operating in Europe must anticipate that regulators expect not only technical compliance but effective, meaningful mechanisms that support scrutiny, reporting, redress and independent oversight. From a conservative viewpoint, this reflects the belief that powerful platforms should not operate behind hidden algorithms, opaque moderation, or commercial incentives that answer to no one. The digital ecosystem increasingly regards openness and traceability not as burdensome regulation but as essential infrastructure for trust, the rule of law and user empowerment.
The coming steps are important: the EC’s findings are preliminary, meaning Meta and TikTok have the chance to respond, take remedial action, and engage in consultations before any final binding decision is made (which could include formal findings and fines). Their responses will signal how seriously they take compliance and how much effort they are willing to invest in reshaping their EU operations to satisfy the DSA’s demands. For anyone tracking platform regulation, digital policy, or the intersection of Big Tech and user rights, this episode could mark a milestone moment.
In sum: the enforcement spotlight has shifted from drafting lofty tech regulations to executing them. Meta and TikTok are front and center. The question now is not whether regulation exists, but whether it actually works.

