India Cracks Down on Digital Deception: Guidelines Target Dark Patterns in E-Commerce

Jenny Li

On November 30, 2023, India's Central Consumer Protection Authority (CCPA) notified the Guidelines for Prevention and Regulation of Dark Patterns, 2023 (hereafter, the "Guidelines") in the official gazette, following an open consultation conducted by the Ministry of Consumer Affairs.
Effective from the date of notification, the Guidelines define "dark patterns" as "any practices or deceptive design patterns using UI/UX (user interface/user experience) interactions on any platform; designed to mislead or trick users to do something they originally did not intend or want to do; by subverting or impairing the consumer autonomy, decision making or choice; amounting to misleading advertisement or unfair trade practice or violation of consumer rights." In other words, dark patterns trick users into unintended actions such as purchase decisions, and amount to misleading advertising, unfair trade practices, or violations of consumer rights. The definition also covers practices such as displaying false popularity of products or services to manipulate user decisions.
The Guidelines apply to all platforms systematically offering goods or services in India, as well as to advertisers and sellers, and state that no person, including any platform, may engage in dark patterns. The Guidelines specify the following 13 types of dark patterns that deceive consumers:
  1. False urgency: Falsely creating a sense of urgency or scarcity to mislead users into making an immediate purchase.
Illustration:
Presenting false data on high demand without appropriate context. For instance, “Only 2 rooms left! 30 others are looking at this right now.”
  2. Basket sneaking: Adding items to the cart without the user's consent, so that the amount payable exceeds the amount for the products or services the user actually selected (a compliant-checkout sketch follows this list).
Illustration:
Automatic addition of paid ancillary services with a pre-ticked box or otherwise to the cart when a consumer is purchasing a product(s) and/or service(s).
  3. Confirm shaming: Using a phrase, video, audio, or other means to create a sense of fear, shame, ridicule, or guilt in users, so as to nudge them into acting in a certain way.
Illustration:
A platform for booking flight tickets using the phrase “I will stay unsecured”, when a user does not include insurance in their cart.
  4. Forced action: Forcing users to buy additional goods, sign up for unrelated services, or share personal information in order to buy or subscribe to what they originally intended.
Illustration:
Prohibiting a user from continuing to use a product or service for the consideration originally paid and contracted for, unless they upgrade at a higher rate or fee.
  5. Subscription trap: Making cancellation of a paid subscription impossible or unduly complex, or forcing users to provide payment details for a free subscription.
  6. Interface interference: Manipulating the user interface to highlight certain information while obscuring other relevant information, misdirecting users from the action they intend to take.
Illustration:
Designing a light-colored option for selecting "No" in response to a pop-up asking a user whether they wish to make a purchase, concealing the cancellation symbol in a tiny font, or changing the meaning of key symbols to mean the opposite.
  7. Bait and switch: Advertising a particular outcome based on the user's action but deceptively serving a different one.
Illustration:
A seller offers a good-quality product at a low price, but when the consumer is about to pay, the seller states that the product is no longer available and instead offers a similar-looking but more expensive product.
  8. Drip pricing: A deceptive pricing practice in which elements of the price are not revealed upfront or are disclosed surreptitiously.
Illustration:
A consumer is booking a flight; the online platform shows the price as X on the checkout page, but when payment is made, the platform charges the consumer a higher price Y.
  9. Disguised advertisements: Advertisements masked as user-generated content, news articles, or other content so that they do not appear to be advertisements.
  10. Nagging: Overloading users with requests, information, options, or interruptions unrelated to their intended transaction, disrupting the intended purchase.
Illustrations:
a. Websites repeatedly asking a user to download their app
b. Platforms asking users to provide their phone numbers for supposed security purposes
c. Constant requests to turn on notifications with no option to say "no"
  11. Trick questions: Deliberately using confusing or vague language, such as double negatives, to mislead or misdirect users.
  12. SaaS billing: Generating and collecting recurring payments from consumers in a software-as-a-service (SaaS) model as surreptitiously as possible, for example by exploiting recurring-subscription loops.
  13. Rogue malware: Using ransomware or scareware to mislead users into believing there is a virus on their device and convincing them to pay for a fake malware-removal tool.
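Several of these patterns, notably basket sneaking and drip pricing, come down to checkout design. The TypeScript fragment below is a minimal sketch of a compliant alternative; all type and function names are hypothetical and not drawn from the Guidelines. It charges ancillary items only when the user has explicitly opted in, and folds taxes and fees into the total shown at checkout so the displayed amount matches the amount charged.

```typescript
// Hypothetical types and names for illustration only; not prescribed by the Guidelines.
interface LineItem {
  name: string;
  unitPrice: number;   // price in the smallest currency unit (e.g. paise)
  quantity: number;
  optIn: boolean;      // ancillary items must be explicitly ticked by the user
}

interface CheckoutSummary {
  items: LineItem[];
  taxes: number;
  deliveryFee: number;
  total: number;
}

// Builds a checkout summary that (a) excludes any item the user has not explicitly
// opted into (no basket sneaking) and (b) includes taxes and fees in the total shown
// at checkout, so the charged amount matches the displayed one (no drip pricing).
function buildCheckoutSummary(
  items: LineItem[],
  taxes: number,
  deliveryFee: number
): CheckoutSummary {
  const consented = items.filter((item) => item.optIn);
  const subtotal = consented.reduce(
    (sum, item) => sum + item.unitPrice * item.quantity,
    0
  );
  return {
    items: consented,
    taxes,
    deliveryFee,
    total: subtotal + taxes + deliveryFee,
  };
}

// Example: travel insurance defaults to optIn: false and is charged only if the user ticks it.
const summary = buildCheckoutSummary(
  [
    { name: "Flight DEL-BOM", unitPrice: 450000, quantity: 1, optIn: true },
    { name: "Travel insurance", unitPrice: 19900, quantity: 1, optIn: false },
  ],
  81000, // taxes, disclosed upfront
  0      // delivery fee
);
console.log(summary.total); // 531000 paise: insurance excluded, taxes already included
```

The key design choice is that opt-in consent is an explicit attribute of every line item, so an add-on the user has not ticked can never reach the payable total.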
Rohit Kumar Singh, Secretary of India's Department of Consumer Affairs, stated that with the rapid growth of e-commerce, more and more online platforms are using dark patterns to mislead or manipulate consumers' purchasing choices and behavior. He added that the newly issued Guidelines aim to give all stakeholders, including buyers, sellers, marketplaces, and regulators, clarity about what constitutes an unacceptable unfair trade practice.
The introduction of India's dark patterns Guidelines marks a pivotal shift towards greater transparency and fairness in digital interactions. To align with the Guidelines and foster a more ethical digital landscape, corporations need to undertake specific, actionable measures:
  1. Implement Transparent Pricing Strategies: Ensure all costs, including those traditionally hidden or drip-priced, are clearly stated upfront. Transparency in pricing not only aligns with the Guidelines but also builds customer trust.
  2. Revise Advertising Practices: Scrutinize advertising methods to eliminate any form of misleading content. Advertisements should be straightforward, honest, and provide a clear depiction of products and services.
  3. Design Ethical User Interfaces: Review and redesign user interfaces and experiences to avoid manipulative patterns. This includes clear opt-out options, straightforward language, and honest messaging.
  4. Develop and Enforce Compliance Policies: Create or update internal policies to explicitly prohibit dark patterns in all digital interactions. Regular training sessions should be conducted to ensure all team members are aware of these policies.
  5. Conduct Regular Audits: Regularly audit digital platforms and marketing strategies to ensure ongoing compliance with the Guidelines. This should include a review of user interfaces, advertising materials, pricing strategies, and overall customer interaction; parts of this review can be automated, as sketched after this list.
  6. Establish a Consumer Feedback Loop: Set up mechanisms for receiving and addressing consumer feedback on potential dark patterns or deceptive practices.
  7. Engage in Continuous Learning: Stay informed about evolving digital laws and ethics to ensure practices remain up-to-date and compliant.
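Part of the audit work in point 5 can be automated. The sketch below assumes a browser or headless-browser context and uses only standard DOM APIs; it flags pre-ticked checkboxes so a reviewer can confirm that none of them adds a paid item or grants consent by default. The function name and finding format are illustrative assumptions, not requirements of the Guidelines.

```typescript
// One automated check among many: pre-selected checkboxes are a common signal of
// basket sneaking or default consent, so surface them for human review.
interface AuditFinding {
  element: string;
  issue: string;
}

function auditPreTickedCheckboxes(root: Document): AuditFinding[] {
  const findings: AuditFinding[] = [];
  const boxes = root.querySelectorAll<HTMLInputElement>('input[type="checkbox"]');
  boxes.forEach((box, index) => {
    if (box.checked) {
      findings.push({
        element: box.id || box.name || `checkbox #${index + 1}`,
        issue:
          "Checkbox is pre-ticked; verify it does not add a paid item or grant consent by default.",
      });
    }
  });
  return findings;
}

// Usage inside a page or headless-browser test:
// const findings = auditPreTickedCheckboxes(document);
// findings.forEach((f) => console.warn(`[dark-pattern audit] ${f.element}: ${f.issue}`));
```

A check like this does not establish compliance on its own; it simply narrows the manual review down to the elements most likely to matter.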
To effectively navigate these changes, corporations may benefit from professional consulting services specializing in digital compliance and consumer rights.