The European Commission finds that Meta and TikTok fall short of the DSA over lack of transparency and obstacles to data access for research
Europe had been signalling for months that the Digital Services Act (DSA) was not an empty gesture. The ongoing investigation against Meta and TikTok makes this clear: Brussels believes that both platforms have fallen short of key obligations on transparency and data access.
This is not about poorly laid-out forms; it is about whether researchers, regulators and users can genuinely audit how content is moderated, how algorithms shape it and what happens when a minor stumbles across harmful material. The DSA was designed so that those answers would not depend on anyone’s goodwill. If the European Commission now says that this access is opaque or onerous, the problem stops being technical and becomes structural.
What does the DSA mean by “transparency”?
The DSA requires very large platforms to offer independent researchers mechanisms for accessing public data, to maintain simple reporting channels for illegal content, and to explain systemic risks: disinformation, protection of minors, electoral integrity, opaque advertising. Transparency is not uploading a quarterly PDF; it means making decisions traceable and analyses replicable. If an academic cannot draw representative samples, or needs weeks to clear administrative barriers, transparency becomes an empty promise. Hence the Commission’s focus on convoluted procedures, incomplete responses and design patterns that discourage users from exercising their rights.
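To make “replicable analysis” concrete, here is a minimal sketch of what a reproducible research query could look like: the exact query is fingerprinted and the sample is drawn deterministically, so another team can redo the same study. Everything here, from the endpoint to the field names, is a hypothetical assumption for illustration, not the DSA’s or any platform’s actual interface.

```python
# Minimal sketch (all names hypothetical): what a "replicable" research query
# could look like. The endpoint, parameters and fields are illustrative
# assumptions, not any real platform API or a schema defined by the DSA.
import hashlib
import json
import random

query = {
    "endpoint": "https://platform.example/research/v1/posts",  # hypothetical
    "topic": "election-ads",
    "date_from": "2024-05-01",
    "date_to": "2024-05-31",
    "sample_size": 1_000,
    "sample_seed": 42,   # fixed seed: another team can draw the same sample
}

# A fingerprint of the exact query, so a study can cite precisely what was asked.
query_hash = hashlib.sha256(json.dumps(query, sort_keys=True).encode()).hexdigest()

def draw_sample(candidate_ids: list[str], size: int, seed: int) -> list[str]:
    """Deterministic sample: the same inputs and seed always give the same subset."""
    rng = random.Random(seed)
    return rng.sample(candidate_ids, min(size, len(candidate_ids)))

# Stand-in for the IDs a real endpoint would return for this query window.
candidate_ids = [f"post-{i}" for i in range(10_000)]
sample = draw_sample(candidate_ids, query["sample_size"], query["sample_seed"])

print(query_hash[:16], len(sample), sample[:3])
```

The design point is that reproducibility comes from pinning the query and the sampling method, not from trusting a screenshot or a one-off export.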
The sore point: data access and design that pushes in the wrong direction
The most serious reading is twofold. On one side, insufficient or late access to the data needed to evaluate real risks. On the other, interfaces and flows that, according to the Commission, include dark patterns: paths that nudge the user toward less protective options, or that turn what should be a one-click report into a labyrinth.
If reporting a fraudulent seller or asking for an explanation of an algorithmic recommendation takes ten screens, the function exists on paper but does not work in practice. The DSA sought the opposite: protections that are usable, and researchers with the raw material to measure impacts.
What is at stake in the short term: fines, changes and precedents
The procedure is not closed, but the DSA provides for fines of up to 6% of global turnover if the violations are confirmed. Beyond the headline figure, what matters is the operational precedent: if Brussels forces the platforms to rework reporting forms, simplify access and widen data windows, other services will follow the same path out of pure compliance economics.
For Meta and TikTok, the cost is not just a fine: it is re-engineering processes and governance so that “we comply” withstands external audit. And for the rest of the sector the warning is clear: make transparency useful, or face a case file of your own.
Neither moral crusade nor culture war: it is about the capacity to audit
It is worth setting trench mentalities aside. The DSA does not dictate what to think or which content is “good”; it requires the capacity to audit risks and clear mechanisms to mitigate them. On electoral integrity, for example, the debate is not about “banning” messages, but about knowing how they are amplified, who pays for campaigns and with what targeting. On minors, the question is not “removing” the Internet, but measuring exposure, adjusting recommendations and disabling aggressive practices. The Commission maintains that, today, these measurements are not reliable because data is lacking or the obstacles are excessive. That is the heart of the case.
What should change if the EC tightens the screws (and how to measure it)
If this case ends in formal obligations, we will see movement on three fronts.
- Standardized data access, with formats and deadlines that allow replicable studies without case-by-case negotiation.
- Simplified, visible and traceable reporting channels: what was reported, what was done and why.
- Pragmatic algorithmic transparency: not opening the source code, but exposing enough signals (inputs, objectives, controls) to evaluate systemic risks.
The metric will not be a blog post; it will be whether external researchers can reproduce results and whether an ordinary user can understand and exercise their options without a manual. A rough sketch of what that traceability could look like follows.
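As an illustration of the second front, here is a minimal sketch of a traceable report record, assuming hypothetical field names and statuses that nothing in the DSA actually prescribes: each report carries what was reported, what was done and why, in a form an external auditor could check.

```python
# Hypothetical sketch of a traceable content-report record. Field names and
# statuses are illustrative assumptions, not a schema defined by the DSA.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json
import uuid

@dataclass
class ReportRecord:
    content_id: str                 # what was reported
    reason: str                     # reporter's stated reason
    report_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    reported_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    decision: str = "pending"       # what was done: e.g. "removed", "kept", "restricted"
    decision_basis: str = ""        # why: the policy clause or legal ground invoked
    decided_at: str | None = None

    def decide(self, decision: str, basis: str) -> None:
        """Record the outcome and its justification, timestamped for audit."""
        self.decision = decision
        self.decision_basis = basis
        self.decided_at = datetime.now(timezone.utc).isoformat()

if __name__ == "__main__":
    report = ReportRecord(content_id="post-123", reason="suspected counterfeit seller")
    report.decide("removed", "marketplace policy 4.2 / illegal-content notice")
    # The full trail can be exported as-is for an external audit.
    print(json.dumps(asdict(report), indent=2))
```

The specific fields matter less than the property they illustrate: the justification travels with the decision, so “what was done and why” can be verified from outside rather than asserted in a report.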
How does this fit into the European regulatory map?
The DSA does not live in isolation. It coexists with the DMA (market power), with digital advertising rules, with personal data protection and with guidelines on AI and synthetic content. The underlying message is coherent: if your platform conditions the public space, you must accept proportionate auditing. For Europe it is also a geopolitical bet: setting standards that then spread from region to region by a domino effect. If Meta and TikTok reconfigure their processes for the EU, they rarely maintain two completely different systems in other markets for long.
The noise around the case will pass; what will remain is whether anything changed in practice. It will show if reporting problematic content stops being an obstacle course, if research data stops arriving in dribs and drabs, and if algorithmic explanations stop being vague formulas. That is the DSA’s bar: less declaration, more verification. If the platforms clear it, they will gain trust and, with it, legitimacy for the business model itself. If not, Brussels already has the tool and the case to tighten further. In either scenario, the time for “on demand” transparency appears to be over.
