The European Commission is reviewing its tech regulations to assess how effectively they ensure “digital fairness.” Existing rules include the GDPR, the DSA, and the DMA, along with the recently adopted AI Act. The review raises concerns that a Digital Fairness Act could follow, built on vague definitions of practices such as “dark patterns.” The EU should assess the impact of existing laws before introducing new regulations that would add unnecessary complexity and overlap.
Over the past several years, the European Commission has enacted a series of significant tech regulations, most notably the General Data Protection Regulation (GDPR) in 2016, followed by the Digital Services Act (DSA) and Digital Markets Act (DMA) in 2022, and the AI Act in 2024. On October 3, 2024, the Commission initiated a “fitness check” of EU consumer law to evaluate how effectively these rules promote “digital fairness.” The outcome may pave the way for a Digital Fairness Act, but there is a pressing need to pause and assess the impact of existing rules before adding further layers of regulation.
While the United States remains a leader in technological innovation, Europe has positioned itself as a frontrunner in technological regulation. This posture stems largely from Europe’s precautionary approach, motivated by concern over technology’s potential negative effects on society. Advocates characterize this approach as “values-based,” contrasting it with the U.S. model’s emphasis on innovation and growth. Those are values that, according to the Commission’s European Competitiveness Report of September 9, 2024, have diminished under Europe’s regulatory landscape.
The proposed Digital Fairness Act is expected to target a range of controversial practices in online commerce, including personalized pricing, targeted advertising, influencer marketing, and the use of AI chatbots. A recurring theme in discussions of these practices is the concept of “dark patterns”: design elements intended to steer users into actions that benefit service providers but may run counter to users’ own interests.
The ambiguity surrounding the definition of dark patterns is cause for concern, since a broad definition could sweep in many standard marketing tactics that do not harm consumers. Marketing aimed at specific demographics or upselling products, for example, are long-standing practices that do not inherently exploit consumers. Conflating such tactics with genuinely harmful ones, such as misleading price changes introduced at checkout, could produce regulation that is both ineffective and counterproductive.
As the European Union grapples with an already crowded field of tech regulations, rushing to establish new ones would be unwise at this stage. With the DSA, DMA, and AI Act still finding their footing, and the GDPR itself relatively recent, regulators should adopt a cautious stance. A thorough review of how existing regulations affect consumer welfare is essential before layering further rules onto the tech industry.
Original Source: itif.org