Aristotelis Zervos
Aristotelis Zervos, Editorial Director at 2B Advice, combines legal and journalistic expertise in data protection, IT compliance and AI regulation.
The planned Digital Fairness Act (DFA) of the European Union is intended to create a harmonized legal basis for prohibiting manipulative design methods in digital environments (so-called dark patterns). The aim is to protect consumers from deception, undue influence and psychological manipulation, while at the same time giving companies clear, uniform requirements. The DFA fits into the existing EU regulatory architecture and complements legal acts such as the Digital Services Act (DSA), the Unfair Commercial Practices Directive (UCPD) and the Consumer Rights Directive.
Concept and manifestations of dark patterns
Dark patterns are design patterns in user interfaces that are specifically aimed at steering user decisions or making them more difficult. They appear in various forms, such as:
- "Roach motel": simple registration, but deliberately difficult deregistration or cancellation.
- Fake urgency: countdown timers or "only 2 left" notices with no real basis.
- Confirm shaming: wording that puts the user under emotional pressure ("No, I don't want to save money").
- Hidden costs: additional fees that become visible only in the last order step.
- Nagging: repeated, intrusive pop-ups designed to force an action.
These practices are not only problematic from an ethical point of view. They can also violate existing consumer protection and data protection regulations.
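The "hidden costs" pattern above lends itself to a concrete check. The following is a minimal illustrative sketch, not anything prescribed by the DFA: it assumes a simple checkout model in which an advertised price is compared with the sum of all line items at the final order step, so that mandatory fees surfacing only at checkout show up as a positive delta. All names (`hidden_cost_delta`, `checkout_lines`) are our own invention.

```python
# Illustrative sketch only: a minimal check for the "hidden costs" pattern.
# The data model and function name are hypothetical, not taken from the DFA.

def hidden_cost_delta(advertised_price: float, checkout_lines: dict[str, float]) -> float:
    """Difference between the final checkout total and the advertised price.

    A positive delta means mandatory fees appeared only at the last order
    step -- the classic "hidden costs" dark pattern.
    """
    total = sum(checkout_lines.values())
    return round(total - advertised_price, 2)

# Example: a 29.99 EUR offer that becomes 36.98 EUR at the last order step.
delta = hidden_cost_delta(29.99, {"item": 29.99, "service fee": 4.99, "processing": 2.00})
```

A delta of zero, as with a fully transparent price display, would indicate that no mandatory costs were held back until checkout.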
Regulatory background to the Digital Fairness Act
Regulation at EU level to date has been spread across various legal acts and is therefore inconsistent:
- The UCPD prohibits misleading or aggressive commercial practices, but contains no explicit definition of dark patterns.
- The DSA addresses manipulative designs primarily on very large online platforms, without covering smaller providers.
- The GDPR applies only selectively, for example to the design of valid consent, and does not cover all use cases.
In its "Fitness Check" of October 2024, the EU Commission found that consumers are still inadequately protected against manipulative design elements despite the existing rules. The Digital Fairness Act is intended to close these gaps and create a uniform legal basis across the EU, with clear definitions and bans on certain design patterns ensuring greater consumer protection in the digital space.
Contents of the Digital Fairness Act
The DFA provides for a comprehensive catalog of measures to systematically prevent manipulative design practices in digital environments and to establish a uniform standard in the internal market.
Key elements are a precise and binding EU-wide definition of the term "dark patterns" and their categorization according to severity, functionality and potential impact on the decision-making behavior of users. This classification should make it easier for authorities to identify and sanction infringements more quickly.
Certain particularly harmful practices, such as the deliberate complication of termination processes, the concealment or misrepresentation of costs, the creation of artificial urgency or the emotional pressure to make certain decisions, are expressly prohibited.
Providers of digital services must design their user interfaces so that contracts, consents and other legally relevant declarations can be concluded clearly, unambiguously and without hidden hurdles. This includes the obligation to make terminating a contract just as simple as concluding it, for example through a clearly visible "one-click cancellation" function.
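The requirement that cancellation be as simple as conclusion can be approximated with a symmetry check on flow lengths. This is a rough heuristic of our own, not a legal test, and all flow names below are illustrative:

```python
# Rough heuristic, not a legal test: cancelling should take no more
# interaction steps than concluding the contract did.

def cancellation_is_symmetric(signup_steps: list[str], cancel_steps: list[str]) -> bool:
    """True if the cancellation flow is at most as long as the signup flow."""
    return len(cancel_steps) <= len(signup_steps)

# Illustrative flows (our own examples, not from the draft act):
signup = ["enter email", "choose plan", "confirm order"]
one_click_cancel = ["click cancel", "confirm"]
roach_motel = ["open settings", "find hidden link", "call hotline",
               "state reason", "decline retention offer", "confirm"]
```

A flow like `roach_motel`, which is twice as long as signup, would fail such a check, while `one_click_cancel` would pass it.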
The draft pays particular attention to the protection of minors: Providers whose services are aimed at this target group or typically reach them must implement additional protective measures to prevent manipulative incentives and targeted influence.
In addition, the DFA prescribes detailed documentation and verification obligations with which companies must demonstrate how their design elements comply with the legal requirements. This evidence is intended to enable the supervisory authorities to efficiently monitor and enforce the regulations.
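One way such documentation duties could be met in practice is with a lightweight design-decision log. The following is a minimal sketch under our own assumptions; the draft does not prescribe any particular format, and all field names are hypothetical:

```python
from dataclasses import dataclass, field, asdict
from datetime import date

# Hypothetical record structure for documenting UX design decisions;
# the DFA draft does not prescribe any particular format.

@dataclass
class DesignDecision:
    component: str          # e.g. "subscription cancellation flow"
    decision: str           # what was changed and why
    dark_pattern_risk: str  # risk category assessed (e.g. "roach motel")
    mitigated: bool         # whether the identified risk was addressed
    decided_on: date = field(default_factory=date.today)

    def to_audit_entry(self) -> dict:
        """Serialize the decision as a plain dict for an audit trail."""
        entry = asdict(self)
        entry["decided_on"] = entry["decided_on"].isoformat()
        return entry

entry = DesignDecision(
    component="subscription cancellation flow",
    decision="replaced multi-step retention dialog with one-click cancellation",
    dark_pattern_risk="roach motel",
    mitigated=True,
).to_audit_entry()
```

Keeping such entries per design change would give a supervisory authority a reviewable trail of how each risk was assessed and mitigated.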
Process status and timetable
The legislative procedure is currently at the consultation stage: on July 17, 2025, the EU Commission launched a public consultation, which runs until October 9, 2025. The final legislative proposal is expected to be presented in the third quarter of 2026, followed by a transitional period to allow companies to adapt their digital offerings.
Reading tip: Addressees of the EU Data Act - who is affected and what obligations apply?
Digital Fairness Act and implications for companies
For companies, the Digital Fairness Act means a significant tightening of the requirements for the design of user interfaces and marketing processes. They will need to bring existing online presences into compliance, involve design and marketing departments closely in legal assessments, document adherence to the new requirements and, where necessary, fundamentally rework existing conversion optimization strategies. Infringements could be punished with considerable fines which, as under the GDPR, could be based on annual turnover.
The Digital Fairness Act is therefore a significant step in European consumer and market regulation. It is intended not only to strengthen the protection of users against unfair digital practices, but also to create a level playing field for companies. It is advisable for providers of digital services to keep a close eye on developments and start reviewing the legal and design aspects of their digital products now in order to be prepared for the upcoming requirements at an early stage.
Practical tips: How companies are preparing for the Digital Fairness Act
1. Check UX and UI design for dark pattern risks
Carry out a comprehensive audit of your websites, apps and online stores. Identify design elements that could be classified as manipulative patterns, such as complicated termination processes, misleading urgency notices or hidden costs.
2. Integrate interdisciplinary compliance teams
Involve the legal department, data protection officers, UX designers, marketing and IT together in the review. This ensures that legal requirements and user-friendliness are reconciled.
3. Use clear, understandable language
Review all copy in ordering, registration and termination processes. Avoid emotional pressure, ambiguous wording and technical jargon.
4. Simplify termination processes
Set up a "one-click cancellation" function or an equally simple way to terminate contracts and subscriptions.
5. Prepare verification documentation
Document changes to UX designs and processes. Record how decisions were made and risks were assessed, so that in the event of audits or complaints you can demonstrate that no dark patterns are in use.
6. Special protective measures for minors
Check whether your products or services are aimed at minors and implement additional protection mechanisms to prevent targeted manipulation.
7. Adapt to the new legal situation at an early stage
Use the current consultation phase to provide feedback to the EU Commission. Plan resources for the implementation of the expected changes.
Aristotelis Zervos is Editorial Director at 2B Advice, a lawyer and journalist with profound expertise in data protection, GDPR, IT compliance and AI governance. He regularly publishes in-depth articles on AI regulation, GDPR compliance and risk management. You can find out more about him on his author profile page.