Dark patterns, the dark side of nudges

Nudges gone over to the “dark” side, dark patterns are interface design techniques that deceive or manipulate users into performing actions against their will or interests. Increasingly widespread on the web, dark patterns are deployed to maximize profits or to collect as much personal data as possible.

Within ethical design and user experience, dark patterns are a problem because they undermine the user’s freedom of choice.

But where exactly do dark patterns come from, how can we recognize them, and what are their consequences?

The origin of dark patterns

“Dark” pattern, scheme, template… “dark patterns” has no satisfying French translation. While in software engineering a “pattern” is a way to improve, stabilize, or secure software, dark patterns are “dark ways” of influencing user behavior.

The term “dark patterns” was coined by Dr. Harry Brignull, a user experience specialist, who created the website darkpatterns.org in 2010, which became deceptive.design in 2022, sharing “tricks used on websites and applications to encourage us to do things we don’t want to do”.

In addition to being a wealth of information and examples, this site also lists laws and news around dark patterns.

Types of dark patterns according to Dr. Harry Brignull

Comparison prevention

The user struggles to compare the features and prices of products because they are organized in a complex way or are hard to find.

Example: T-Mobile presents its plans from the most to the least expensive.

Confirmshaming

The user is guilt-tripped into doing something they would not normally do.

Example: Promoting an offer by giving the opt-out button a guilt-inducing label.

Disguised ads

The user believes they are clicking on an interface element or native content when it is actually an advertisement.

Example: Ads with fake download buttons on software download sites.

Fake scarcity

The user is pushed to act (most often to buy) by a claim of limited stock or high popularity.

Example: Displaying low stock and a high number of sales within a very short period creates false popularity. Below, Digitec highlights flash offers with limited stock.

Fake social proof

The user is misled by a flood of testimonials, fake reviews, and similar signals.

Example: Extensions such as TrustPulse display “live” notifications about the site’s activity (bookings, purchases, etc.).
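To make the mechanism concrete, here is a minimal sketch of how such a “live activity” widget can fabricate social proof. Everything in it (names, cities, the notification format) is invented for illustration; no real order data is involved, which is precisely the point of the pattern.

```typescript
// Hypothetical sketch: a "live sales" widget that invents its notifications.
// Nothing here reads real order data; every message is fabricated.

const FAKE_NAMES = ["Emma", "Lucas", "Chloé", "Nathan"];
const FAKE_CITIES = ["Paris", "Lyon", "Geneva", "Brussels"];

function fakeSalesNotification(product: string): string {
  const name = FAKE_NAMES[Math.floor(Math.random() * FAKE_NAMES.length)];
  const city = FAKE_CITIES[Math.floor(Math.random() * FAKE_CITIES.length)];
  const minutesAgo = 1 + Math.floor(Math.random() * 9); // always looks "recent"
  return `${name} from ${city} bought ${product} ${minutesAgo} minutes ago`;
}

// A widget would display one of these every ~30 seconds:
console.log(fakeSalesNotification("the Premium plan"));
```

A visitor has no way to tell these messages apart from genuine activity, which is what makes this variant of social proof deceptive rather than merely persuasive.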

Fake urgency

The user is pushed to act because they are shown a false time limit.

Example: Newsletters with a countdown announcing the end of a promotion.
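A common implementation trick behind such countdowns is that the deadline is not tied to any real promotion at all. The sketch below (names and the 15-minute window are assumptions for illustration) computes the “deadline” from the moment the page loads, so every visitor, on every visit, sees the same nearly-expired offer.

```typescript
// Hypothetical sketch of a fake "offer ends in 15 minutes" timer:
// the deadline is derived from page-load time, not from a real end date,
// so every visitor (and every reload) sees the same "almost over" countdown.

const OFFER_WINDOW_MS = 15 * 60 * 1000;

function fakeDeadline(pageLoadTime: number): number {
  // Not tied to any real promotion end date.
  return pageLoadTime + OFFER_WINDOW_MS;
}

function remainingMs(pageLoadTime: number, now: number): number {
  return Math.max(0, fakeDeadline(pageLoadTime) - now);
}

// Two visitors loading the page an hour apart both see 15:00 on the clock:
const visitorA = remainingMs(0, 0);                 // 900000 ms
const visitorB = remainingMs(3_600_000, 3_600_000); // 900000 ms
console.log(visitorA === visitorB); // true — the "urgency" is the same for everyone
```

A genuine promotion would instead compare the current time against a fixed server-side end date shared by all visitors.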

Forced action

The user wants to take an action, but the system requires them to perform another, unwanted action first.

Example: Forcing a customer to create an account to place an order.

Hard to cancel

After signing up with ease, the user finds it complicated or even impossible to unsubscribe.

Example: Canceling an Amazon Prime subscription requires several actions and goes through several stages of dissuasion.

Hidden costs

The user is lured in by a low price, only to discover unexpected fees late in the process, after investing time and effort.

Example: On UberEats, fees are not shown while the order is being built, only at final validation.

Hidden subscription

The user is unknowingly signed up for a recurring subscription.

Example: On Figma, inviting a collaborator with edit rights silently adds a paid seat to the team’s bill.

Nagging

The user tries to perform an action but is constantly interrupted by repeated requests.

Example: Social networks encouraging notifications to be enabled each time the application is opened (like Instagram in 2018)

Obstruction

The user faces obstacles when trying to perform an action.

Example: A user who wants to download their personal data from Facebook is overwhelmed with information and dissuasive steps (though it is easier today than it was in 2018).

Preselection

An option is selected by default to influence decision-making.

Example: A former employee of a cybercriminal company reveals in an Underscore_ video that some services distribute popular software bundled with hidden malware: unless the user goes through the advanced installation, pre-selected extensions are added to the computer to track everything they do on the web and serve increasingly targeted ads.
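The installer scenario above boils down to a handful of pre-checked checkboxes. This sketch (option labels like “SearchBoost” are invented) shows why clicking “Next” without opening the advanced options is enough to “consent” to every bundled extra:

```typescript
// Hypothetical sketch of preselection in a software installer:
// bundled extras default to checked, so a user who clicks "Next"
// without opening the advanced options installs them all.

interface InstallOption {
  label: string;
  preselected: boolean; // the dark-pattern lever
}

const OPTIONS: InstallOption[] = [
  { label: "Install the software",            preselected: true },
  { label: "Add SearchBoost browser toolbar", preselected: true }, // invented adware
  { label: "Set partner page as homepage",    preselected: true }, // invented adware
];

// What actually gets installed if the user never touches the checkboxes:
function defaultInstall(options: InstallOption[]): string[] {
  return options.filter(o => o.preselected).map(o => o.label);
}

console.log(defaultInstall(OPTIONS)); // all three options, extras included
```

An honest design would default the extras to `false` and make opting in an explicit choice rather than the path of least resistance.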

Sneaking

The user is lured into a transaction under false pretenses by having only partial information.

Example: In 2015, the site sportsdirect.com introduced an unwanted magazine subscription into users’ shopping carts for an extra £1 with every purchase.

Trick wording

The user is misled by deliberately confusing word choices.

Example: In the early 2010s, Ryanair insidiously added travel insurance by asking travellers to select their country of residence from a drop-down list; to decline the insurance, users had to select an opt-out entry from that same list. The instructions explaining how to opt out were printed below the list and hidden by it when it was open.

Visual interference

The user expects information to be presented clearly, but it is hidden or disguised.

Example: In 2019, it was possible to purchase an upgrade (Autopilot, automatic parking, etc.) for a Tesla from the mobile app. Some users who bought the option by mistake filed complaints, but the notice that the purchase was non-refundable was barely visible on the page.

The consequences of dark patterns

Dark patterns are insidious nudges that tend to penalize users whenever their intended action is inconvenient for the company.

Many users pay the price for these dark patterns every day, while companies and marketing teams keep inventing new ways to divert users from their initial action in order to achieve their own ends.

The consequences of dark patterns range from minor to serious depending on how aggressive the patterns are. Users may lose trust, or become more suspicious when they recognize the same pattern elsewhere.

The best-known example is Facebook, which has been heavily criticized for its use of dark patterns. In 2018, the firm Cambridge Analytica was taken to court after harvesting the data of millions of Facebook users without their consent. The affair cost Facebook the trust of its users and made the social network the target of government investigations. Facebook then had to revise its privacy policy to show its users a little more respect.