AI & Addiction By Design

“The pleasures of new media culture ...are consciously designed to be delivered to us in ways that undermine our cognitive autonomy and moral agency. They make it harder, not easier, for us to choose well.” (Shannon Vallor, Technology & The Virtues)

There are two worrisome similarities between people’s addiction to new media technology and people’s addiction to slot machines. Both artifacts are opaque, incomprehensible “black boxes”. And both bring to light the ethical problem of products purposely designed to subvert our sense of autonomy and agency.

First, many casino owners and Big Tech companies are using the same playbook of greed. They’ve figured out how to feed their insatiable thirst for massive profits by exploiting people’s craving for continued dopamine fixes. Many casino owners have made it all too easy for gamblers to wager all the money they have (or money they don’t have), even if it leads them to bankruptcy. Likewise, Big Tech uses highly lucrative business models that seduce people into unabashedly sharing their personal information for free through updates, photos, likes and shares. It then turns around and sells users’ personal information to advertisers without their consent.

Second, many casino owners and Big Tech companies have purposely designed their artifacts to be addictive. Through the decades, casino owners have continuously re-engineered the slot machine’s physical ergonomics (e.g. comfortable chairs, red buttons), sensory experience (e.g. the sounds of winning coins, flashing lights), overly convenient logistics (e.g. easy access to money literally at one’s fingertips) and psychological effects of “near wins” so that gamblers just can’t stop playing. Likewise, Big Tech has carefully designed predictive algorithms to provide personalized experiences at scale to their legions of users by mining their endless streams of personal data and online behaviors. These companies purposely make it increasingly difficult for users to leave their platforms by providing endless personalized news feeds and recommendations, mixed with the dopamine triggers of likes and followers that offer users continuous self-validation.

Interestingly, Vallor warns Big Tech that if it doesn’t start integrating ethics into its business models and culture, it will eventually be stigmatized the way tobacco companies have been:

“It only took a few decades for cigarette companies to go from corporate models of consumer loyalty and affection to being seen as merchants of addiction, sickness, and death, whose products are increasingly unwelcome in public or private spaces. Only extravagant hubris or magical thinking could make software industry leaders think they are shielded from a similar reversal of fortune.”

Will Big Tech heed Vallor’s warning by reducing the addictive design of their platforms? Should they?

Learn more about the dynamic interaction of AI technology, ethics and law.
Take the Skills4Good Responsible AI Program!