Opinion | The Absurdity of Static Public Policy in Dynamic, Digital ‘Dark Patterns’
Exponentially evolving digital spheres are ill-suited to rigid policymaking and irregular implementation. Digital public policy in this space needs to be agile and must have access to platform data for compliance and monitoring if it is to succeed

We are all victims of dark patterns every single time we enter the digital domain. Every stroke of the keypad is monitored, managed and massaged by digital platforms to manipulate us into compliant behaviour. We rarely realise the ubiquity of this manipulation, but neither do policymakers.

There are moments when the path of public policymaking veers towards the absurd, particularly when charting courses in areas as rapidly changing as digital platforms. Static policymaking coupled with intermittent implementation simply fails in this swiftly evolving arena. ‘Dark patterns’, the clever and manipulative design techniques platforms use to skew user behaviour in their own favour, underscore the urgency of facing this issue.

Guidelines have been introduced to counter this manipulation, which is perceived as harmful to consumer interests. Nevertheless, how efficacious is this traditional approach in a fast-paced digital environment? Take a typical dark pattern that projects artificial scarcity of a product. Even after the guidelines came out, the prohibition is still not being followed because compliance cannot be monitored. The most consequential dark pattern in the booking space was algorithmic pricing, which raises the price each time you search for the same ticket, or even when you refresh the booking screen. It has not even been addressed or prohibited, yet it is the biggest challenge for consumers because it manipulates pricing itself.
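For readers who want the mechanism made concrete, here is a minimal, purely hypothetical sketch of how refresh-driven price inflation of this kind could work server-side. The names, numbers and logic are illustrative assumptions, not any platform's actual code:

```python
# Hypothetical sketch of a refresh-driven pricing dark pattern.
# All identifiers and figures below are illustrative assumptions.

BASE_FARE = 4500.0          # base fare in rupees
BUMP_PER_SEARCH = 0.03      # 3% bump for every repeat search
MAX_MULTIPLIER = 1.5        # cap the inflation at 50%

search_counts: dict[str, int] = {}  # per-user repeat-search tracker

def quoted_fare(user_id: str, route: str) -> float:
    """Return a fare that creeps up as the same user keeps searching."""
    key = f"{user_id}:{route}"
    search_counts[key] = search_counts.get(key, 0) + 1
    repeats = search_counts[key] - 1
    multiplier = min(1 + repeats * BUMP_PER_SEARCH, MAX_MULTIPLIER)
    return round(BASE_FARE * multiplier, 2)

# The same query, refreshed three times, yields three rising quotes:
for _ in range(3):
    print(quoted_fare("user42", "DEL-BOM"))  # 4500.0, 4635.0, 4770.0
```

The point of the sketch is that the manipulation is invisible to any one user: each quote looks like ordinary market pricing unless someone can compare quotes across sessions.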

To challenge manipulation, India’s Department of Consumer Affairs (DCA) launched a preliminary response on June 28, 2023. The department commenced a line of communication with prominent online platforms across the country, counselling against the use of dark patterns. An official from the government shared, “Our initial approach, on June 13, softly warned e-commerce businesses to steer clear of using dark patterns. Yet, on June 28, we adopted a sterner tone, issuing a clear warning of consequences should these practices persist.”

This series of developments firmly underscores that within a rapidly changing digital sphere, the route of rigid policymaking and irregular implementation is ill-suited. A more precise, explicit, and flexible approach is now essential in the interests of consumer protection in the digital age.

This ‘gentle, then sterner’ approach to warnings about anti-consumer behaviour is frequently observed in the domain of digital platforms. There is a prevalent assumption that these platforms solely bear the torch of innovation and thereby contribute significantly to economic output, even as mounting research begins to challenge this notion.

To tackle this issue more effectively, the DCA constituted a 17-member task force in August 2023 with a mandate to draft guidelines within two months. The task force featured representatives from well-known bodies such as the software industry association Nasscom, the Advertising Standards Council of India (ASCI) and the National Law University, in addition to venture capital funds and leading e-commerce platform operators such as Google, Flipkart, Amazon and MakeMyTrip.

A closer look at the composition of this ‘scientific-expert’ task force reveals an intriguing skew. It predominantly comprises platform companies and an industry association representing those same entities, alongside a solitary academician with a legal specialisation. Remarkably, there are no representatives of ordinary consumers or independent experts, a significant lacuna in achieving an optimal policy balance.

The absence of diverse representation in policy task forces is not an isolated occurrence but typifies agency capture—either by design or inadvertently. Policymakers often mistakenly perceive industry representatives as the sole experts, leading to an oversight in including other stakeholders. This phenomenon is particularly prevalent in the sphere of platforms and emerging technologies, where the sector itself frequently becomes a self-regulated expert, defining the bounds of regulation for others, including consumers.

What is now evident is not merely an isolated instance of an industry or its association monopolising expertise; it has become a commonplace norm in policymaking. The draft regulations, prepared jointly by the department and ASCI, were opened to public scrutiny for 25 days. However, there is a conspicuous lack of information about the public’s or consumers’ contributions to these drafts.

The policymaking journey from draft to the final set of guidelines, released on December 7, 2023, displayed no changes or additions. This status quo indicates one of two possibilities: either no feedback was received from consumers or consumer organisations, or the received inputs were not deemed relevant or meaningful enough to be incorporated into the final guidelines.

If the former is the case, it reflects poorly on Indian consumer organisations; if the latter, it is an even more disheartening critique of the policymakers in charge of a consumer authority, who failed to integrate even a single amendment into the final guidelines.

Such circumstances suggest a fracture in the policymaking process regarding consumer rights. It appears to largely exclude citizens from contributing to policy that impacts them daily. Digital platforms, given their pervasive influence on numerous daily decisions, should not be regulated by the industry’s own voice.

Dependence on industry representatives for expertise and scientific knowledge while crafting regulations in emerging technology domains can lead to a deficit of trust. Policymakers inherently play the role of the legislature: public representatives expected to resonate more closely with the voices of consumers and citizens. This procedure, however, seemed to be bypassed during the notification of the guidelines, which were neither presented before Parliament nor subjected to a proper consumer consultation process.

This approach not only diminishes the credibility of regulators but also compromises the very effectiveness of the regulations. Regrettably, this form of agency capture is a frequent occurrence in the platform world, with industry insiders even deeming it a standard operating principle.

A lenient or “soft” approach towards platforms isn’t unique to Indian regulatory bodies—it’s a universal concern among regulators due to the significant political clout these platforms carry in the digital world.

Known for pouring billions into customer acquisition, platforms aim to scale rapidly before competitors or regulations can even begin to pose a threat. Consequently, during this phase of expansion, they employ their extensive toolkit of strategies to persuade policymakers that their growth positively impacts the economy and the broader ecosystem.

This soft approach is evident in the final guidelines, which ostentatiously “prohibit dark patterns”. The mechanisms for enforcing such a prohibition, however, remain unclear. Crucial aspects, such as the means to detect or prevent these dark patterns and who would be held accountable for such practices on a platform, are left ambiguous. Furthermore, the guidelines do not specify which entity within a platform company is responsible for managing or curbing dark patterns, nor any penalty or consequence for failing to do so.

The loose structure of guidelines aimed at preventing consumer-centric malpractices leaves too much room for interpretation. This could have negative consequences, like failing to curb dark patterns or placing the onus on consumer courts, where individuals must grapple with powerful platform companies to uphold their rights. Alternatively, consumers might report violations to the Consumer Protection Authority, but the resulting actions and effectiveness in influencing platform behaviour remain uncertain.

Digital platforms are incredibly versatile and adaptive—much like a swift river that both shapes and conforms to its surroundings. As technologies evolve, the ‘better’ platforms constantly pivot and reinvent themselves to accommodate shifting customer needs, becoming adept at staying ahead of regulations and working around narrowly defined legal frameworks. This applies to dark patterns too: new ones have been rolled out while prohibited ones have not been fully done away with.

The issue arises when policymaking is static and implementation episodic, particularly when the only data source is the platform itself, the very party potentially in violation. This is akin to expecting the offender to willingly supply incriminating evidence against themselves, a notion contrary to natural behaviour. Moreover, it allows platforms to claim that policy decisions are not based on data even as they restrict data access for researchers and policymakers alike.
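Monitoring does not have to depend on the platform's goodwill, though. As a rough illustration, a regulator or researcher could in principle run independent probes against public-facing pages; the sketch below assumes a hypothetical quote endpoint (the URL and response fields are invented for illustration) and flags price drift across identical repeated queries:

```python
# Hypothetical external compliance probe: repeat the same query and
# flag price drift that could indicate refresh-driven inflation.
# The endpoint URL and response fields are illustrative assumptions.
import json
import time
import urllib.request

ENDPOINT = "https://example-booking-site.test/api/quote?route=DEL-BOM"

def fetch_quote() -> float:
    with urllib.request.urlopen(ENDPOINT) as resp:
        return float(json.load(resp)["fare"])

def probe(repeats: int = 5, pause_s: float = 2.0) -> None:
    quotes = []
    for _ in range(repeats):
        quotes.append(fetch_quote())
        time.sleep(pause_s)
    drift = (quotes[-1] - quotes[0]) / quotes[0]
    if drift > 0.01:  # more than 1% rise on identical repeated queries
        print(f"Possible dark pattern: fare drifted {drift:.1%}: {quotes}")
    else:
        print(f"No suspicious drift detected: {quotes}")

if __name__ == "__main__":
    probe()
```

Even a crude probe like this shows why data access matters: without it, regulators are left arguing from anecdote while platforms claim the policy is not evidence-based.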

The current regulatory and policy framework offers limited avenues for developing a dynamic, inclusive policymaking approach, especially concerning digital platforms. This represents a major shortcoming given the swift, adaptive nature of these platforms.

To prepare for impending regulation of emerging technologies, including AI, public policy programmes would do well to incorporate adaptive thinking into their curriculums and research agendas. This would ready us for a future where managing rapidly evolving digital landscapes is the norm rather than the exception. Policymakers also need to insist on access to data from all platforms if they are to regulate them.

To promote balance and inclusivity in policy design, policymakers must incorporate a more diverse range of representation. This will ensure that industry-driven perspectives do not overshadow consumer interests and independent expert insights. Policymakers also need to move away from static policymaking in the case of platforms and look at dynamic interventions. These dynamic interventions are what CIPP is working on, and its research will show how they can be implemented.

K Yatish Rajawat is a public policy researcher and works at the Gurgaon-based think and do tank Centre for Innovation in Public Policy (CIPP). CIPP has been working on emerging tech policy issues. Views expressed in the above piece are personal and solely that of the author. They do not necessarily reflect News18’s views.
