On May 15, 2025, the EU Commission officially informed TikTok that the platform could be in breach of key obligations under the Digital Services Act (DSA). The focus is on the so-called advertising repository. This is a database that TikTok must provide in order to deliver transparent information about advertising on the platform.
EU investigates TikTok's advertising repository
Such a repository plays a central role in protecting the digital public sphere. It enables researchers, supervisory authorities and civil society to systematically analyze advertising content and identify targeted campaigns aimed at political influence, disinformation or targeting vulnerable groups, for example. This is particularly important in the context of elections, hybrid threats or manipulative information operations.
The Digital Services Act (DSA) obliges very large online platforms (so-called VLOPs) such as TikTok to provide a transparent and freely accessible advertising repository. This repository must be designed so that it can be comprehensively searched and analyzed. It should not only document which ads have been placed, but also provide detailed information on:
- the specific content of each advertisement,
- who financed the advertisement (name of the payer),
- which target groups the ad was intended to reach,
- over what period and with what reach the ad was displayed.
Only by providing this information in a structured, machine-readable and accessible form can transparency be effectively guaranteed in the digital advertising system.
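As a rough illustration of what "structured and machine-readable" means in practice, the record below sketches the fields listed above as a serializable data structure. All field names and values are hypothetical assumptions for illustration; the DSA prescribes the information, not a concrete schema, and this is not TikTok's actual data model.

```python
from dataclasses import dataclass, asdict
from datetime import date
import json


@dataclass
class AdRepositoryRecord:
    """Hypothetical record covering the information Art. 39 DSA requires."""
    ad_id: str
    content: str               # the specific content of the advertisement
    payer_name: str            # who financed the advertisement
    target_groups: list[str]   # main parameters used to address target groups
    first_shown: date          # start of the display period
    last_shown: date           # end of the display period
    total_reach: int           # aggregate number of recipients reached

    def to_json(self) -> str:
        """Serialize the record in a structured, machine-readable form."""
        d = asdict(self)
        d["first_shown"] = self.first_shown.isoformat()
        d["last_shown"] = self.last_shown.isoformat()
        return json.dumps(d)


# Illustrative example entry (all values invented):
record = AdRepositoryRecord(
    ad_id="ad-0001",
    content="Example campaign creative",
    payer_name="Example GmbH",
    target_groups=["18-24", "DE"],
    first_shown=date(2025, 3, 1),
    last_shown=date(2025, 3, 31),
    total_reach=120_000,
)
print(record.to_json())
```

A repository built from records like this could be exported, queried and analyzed by researchers or supervisory authorities without manual screen-scraping, which is the practical point of the machine-readability requirement.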
What accusations is the EU Commission making against TikTok?
As part of its investigation, the EU Commission has identified several serious shortcomings in connection with TikTok's advertising repository. In its view, these violate the requirements of the Digital Services Act (DSA). These findings are based on an in-depth analysis that included internal TikTok documents, practical tests of the repository functionalities and discussions with independent experts.
The main point of criticism is that TikTok does not adequately fulfill its obligation to provide transparent and complete information about advertisements. For example, the repository often lacks essential information such as the exact content of the ad, the target group, the time of publication and the identity of the paying party. However, according to Art. 39 DSA, this information is mandatory in order to enable the traceability and control of online advertising.
Furthermore, the Commission criticizes that the user-friendliness and functionality of the repository are considerably limited. The tool does not allow a comprehensive and efficient search of the stored advertisements, and the filter options are inadequate, which makes systematic evaluation by third parties - such as academia, the media or civil society - considerably more difficult. This impairs the repository's function as a central instrument of digital transparency.
Last but not least, the technical implementation is also inadequate. According to the Commission, the platform tools used to provide the data in the repository are not state of the art and are sometimes difficult to access. This contradicts the requirement that such systems must be machine-readable, reliable and openly usable.
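The kind of systematic search and filtering the Commission finds lacking can be sketched in a few lines. The entries, field names and filter parameters below are invented for illustration and do not correspond to any real TikTok API; the point is only that researchers need to be able to narrow a repository down by criteria such as payer or display period.

```python
from datetime import date

# Hypothetical repository entries (field names are illustrative only):
ads = [
    {"payer": "Example GmbH", "shown_from": date(2025, 3, 1), "shown_to": date(2025, 3, 31)},
    {"payer": "Other Ltd", "shown_from": date(2025, 1, 10), "shown_to": date(2025, 1, 20)},
]


def search(ads, payer=None, active_on=None):
    """Filter ads by payer and/or by whether they were displayed on a given date."""
    results = []
    for ad in ads:
        if payer is not None and ad["payer"] != payer:
            continue
        if active_on is not None and not (ad["shown_from"] <= active_on <= ad["shown_to"]):
            continue
        results.append(ad)
    return results


# E.g. all ads financed by a given payer, or all ads running during an election week:
print(len(search(ads, payer="Example GmbH")))       # 1
print(search(ads, active_on=date(2025, 1, 15)))     # the "Other Ltd" entry
```

If such queries cannot be run comprehensively and reliably, third parties cannot perform the systematic evaluations the DSA is meant to enable, regardless of whether a repository formally exists.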
From the Commission's point of view, it is therefore clear that TikTok's current repository does not meet the requirements of the DSA either formally or in practice. The shortcomings are not only of a technical nature, but also jeopardize the overarching objectives of the law, in particular the integrity of public communication and protection against manipulative influences in the digital space.
How the proceedings against TikTok will continue
In the further course of the proceedings, TikTok will first have the opportunity to present its point of view and exercise its right to be heard. In concrete terms, this means that the company can inspect the EU Commission's complete investigation file. This includes all relevant documents, assessments and evidence on which the Commission based its preliminary findings. In addition, TikTok can comment in writing within a set period and submit its own arguments, clarifications or counter-evidence. This procedure is in line with the procedural guarantees enshrined in the DSA, which are intended to ensure a fair and transparent administrative procedure.
The European Board for Digital Services is consulted in parallel to this defense process. This board consists of representatives of the competent national supervisory authorities of the EU Member States and supports the Commission with technical assessments and recommendations, among other things. The involvement of the Board serves to harmonize enforcement practice and ensure uniform application of the DSA throughout the EU.
If TikTok is unable to refute the allegations and if the Commission maintains its assessment at the end of the proceedings, the company could face significant sanctions. These include, in particular, the imposition of a fine of up to 6 % of annual worldwide turnover. This measure is permitted under Art. 74 DSA and is intended to serve as an effective deterrent. In addition, the Commission can order an extended monitoring period. Within this period, TikTok must submit regular progress reports and demonstrate that the deficiencies in question have been effectively remedied. Finally, the DSA also provides for the possibility of imposing so-called periodic penalty payments. These are intended to encourage platforms to comply with their obligations in the event of persistent or repeated non-compliance.
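To make the scale of the possible sanction concrete, the cap under Art. 74 DSA is simply 6 % of annual worldwide turnover. The turnover figure below is purely hypothetical and is not TikTok's actual revenue; the snippet only illustrates the arithmetic.

```python
def max_dsa_fine(annual_worldwide_turnover_eur: float, rate: float = 0.06) -> float:
    """Upper bound of a DSA fine: up to 6% of annual worldwide turnover (Art. 74 DSA)."""
    return annual_worldwide_turnover_eur * rate


# Purely illustrative turnover figure, not an actual company's revenue:
turnover = 20_000_000_000  # EUR 20 billion, hypothetical
print(f"Maximum fine: EUR {max_dsa_fine(turnover):,.0f}")  # Maximum fine: EUR 1,200,000,000
```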
TikTok under increased scrutiny in the EU
The current development is not an isolated case, but is part of a series of regulatory measures taken by the EU Commission against TikTok. Back in February 2024, the Commission opened a formal investigation into a number of potential breaches of the Digital Services Act (DSA). In addition to the advertising transparency now in focus, the Commission also investigated other structural problem areas within the platform.
A central point of criticism was the design of TikTok's algorithmic systems. These are suspected of creating so-called "rabbit hole" effects, i.e. the phenomenon whereby algorithmically curated content steers users into increasingly one-sided or extreme information bubbles. For young user groups in particular, this can increase the risk of problematic usage patterns and the spread of extremist or manipulative content.
The effectiveness of the age verification measures implemented by TikTok is also under scrutiny. The Commission expressed doubts as to whether the platform can reliably distinguish between underage and adult users and whether it provides them with age-appropriate content. This raises the question of whether the right to protection of minors is being upheld, which is a key concern of the DSA.
Another focus of the investigations concerns access to platform data for independent research. According to the transparency requirements of the DSA, very large online platforms are obliged to grant accredited researchers access to certain data if this is required to analyze systemic risks. TikTok is suspected of providing this access only to a limited extent or not at all.
There is also another procedure that the Commission initiated in December 2024. This focuses on risk management with regard to democratic processes, in particular elections, and the protection of civil society discourse. Here, too, the Commission is investigating whether TikTok has taken sufficient measures to effectively prevent manipulation, disinformation or other forms of election interference.
Reading tip: TikTok must pay 530 million euros fine for data transfer to China
EU holds platform operators accountable
The TikTok case shows that the Digital Services Act (DSA) is now entering the phase of concrete application and enforcement. The EU Commission is using its newly acquired powers to uncover systemic weaknesses in digital platforms and hold their operators to account. It should be emphasized that the DSA's regulatory approach is not limited to traditional data protection issues, but goes far beyond this. For example, the transparency of advertising practices, the protection of democratic processes and the traceability of algorithmic systems are also part of the supervisory review.
Platform operators must adapt to the fact that formal guidelines alone are no longer sufficient. What is needed are technically functioning, publicly accessible and user-friendly implementation tools that also meet the legal requirements in practice. Repositories that provide relevant information only incompletely or inaccessibly can be considered a violation - even if they are technically available.
Companies should adapt their internal compliance structures, introduce new control processes and regularly evaluate whether their systems meet the expectations of the supervisory authorities. This also includes viewing cooperation with research and civil society not as a risk, but as a necessary requirement for responsible platform management.
In practice, interdisciplinary know-how is therefore required in addition to legal expertise in order to meet the regulatory requirements with technological solutions. The DSA has significantly raised the standard for digital responsibility in Europe. Companies are well advised to actively shape this change instead of only reacting in the event of a complaint.
Source: Communication from the EU Commission of May 15, 2025 on the TikTok infringement proceedings