“Engineered Addiction”: How Social Media Fuels Mental Health Crises
Opaque, Manipulative Algorithms Endanger Our Well-Being. But There Are Steps We Can Take to Break Free
For the bulk of the last fifteen years, I’ve started businesses with the goal of “hacking” algorithms used by social media platforms. Now, looking back on my experience—and watching where the industry is headed—I’m alarmed by what the future holds.
My first company, launched in 2009, helped businesses and organizations establish their presence on newly commercialized social media sites like Facebook and Twitter. The second, launched in 2011, collected user behavioral data from Facebook to help brands predict the optimal time and frequency to post content. Finally, the third company, launched in 2014, used software and human intelligence to summarize and “de-bias” the exponentially growing volume of online news and noise.
Recent headlines have laid bare how social media platforms have become juggernauts, no longer confined to content-sharing, community building, and messaging. These sites are no longer “hackable” in the way they once were. Today, they deploy what I call “engineered addiction,” a calculated design crafted to keep us endlessly engaged and scrolling, often at the expense of our well-being.
As a tech entrepreneur and novelist, I explore this dynamic in my psychological thriller The Salzburg Executioner, paralleling a centuries-old system driven by fear, secrecy, and injustice with the present-day dangers of algorithmic manipulation. The story aims to explain, in narrative form, how the hidden forces of Big Tech can corral people toward crisis. Even self-harm and death.
During the writing process, I reviewed more than fifteen years of independent and academic studies, which together confirm that social platforms hold an increasing power to mold our experiences.
Take “infinite scrolling,” for example. This feature, which refreshes content endlessly without offering a natural stopping point, has been found to cause the highest levels of regret among social media users, regardless of what they were viewing. This aligns with broader public sentiment: a Pew Research Center study indicates that 64% of U.S. adults regard social media’s influence on society as largely negative, citing heightened stress and polarization. Taken together, these findings show how the architecture of our digital spaces maintains an alarming grip on our emotional well-being. And exploits basic human vulnerabilities.
Behind features like infinite scrolling lie sophisticated algorithms meticulously engineered to tap into our emotional triggers—fear, outrage, and despair.
Leaked internal documents from social media companies show that features like infinite scrolling and autoplay weren’t serendipitous innovations. They were intentionally borrowed from other behavioral reward systems, like casino slot machines, to maximize the time users spend on social media. These strategies fuel revenue streams tied to every minute we remain online.
Beyond blocking individual users or posts, we have very little control over what we see on social media, and almost no knowledge of how the algorithms making content recommendations work. In The Salzburg Executioner, I draw a parallel to a historical form of control. From 1328 to 1803, the Prince Archbishop of Salzburg wielded near-absolute power over what was considered right and wrong, true and false. Ordinary citizens lived under capricious laws, risking everything if they ran afoul of the Prince Archbishop’s whims. Hired executioners carried out brutal punishments as public spectacles, reinforcing fear and subservience in the community.
Just as the Prince Archbishop dictated who lived or died according to his shifting moods, today’s tech giants, through opaque, proprietary algorithms, wield a similarly unchecked power to define what content is prioritized, suppressed, or amplified, all without most users ever knowing how or why.
Essentially, we are in the dark. We click “I Accept” without grasping what data social media platforms harvest, how they use it, or to whom it is sold. We have no insight into the inner workings of the algorithms that measure every flick of our thumbs, every tap of our fingers, and every second we hover over a post. Who outside these tech giants knows what secrets they hold about our tendencies, our mental weak points, or how best to sow division? Do you?
The “gallows” of that bygone Salzburg era have been replaced by more subtle manipulations. Instead of an executioner’s blade, we have quiet nudges that amplify strong emotional responses, keeping us locked to our screens at the expense of our mental health. As reported by The Wall Street Journal in 2021, Meta’s own internal documents showed that social media use is linked to increased depressive symptoms and negative self-image, especially among teenage girls. The Salzburg Executioner mirrors that reality, depicting how lives unravel when technology is calibrated to escalate, rather than temper, sensational topics.
Advocates, myself among them, call for more investment and research into platform content moderation: a hybrid of automated detection and human oversight. We call for design changes that weaken the addictive hooks and that warn users who are slipping into dark or harmful veins of content.
But social media companies, bound to the “engagement at all costs” business model and the pressure to deliver shareholder value, resist any structural pivot that might reduce profit. Two weeks ago, Meta’s CEO, Mark Zuckerberg, announced plans to end the company’s third-party fact-checking program in the US and replace it with a community notes system. Research suggests that such a shift will result in more misinformation, deeper social rifts, and, unfortunately, more self-harm.
Ultimately, these platforms survive on cycles of user engagement: each additional minute spent online harvests more data, which then refines the system’s capacity to tailor both content and advertising. The cycle feeds on itself. Reversing course would require fundamentally rethinking profit mechanisms and abandoning some addictive features. So far, these companies appear unwilling to do so.
Think back to the early days of social media. In 2006, Twitter’s simple “Public Timeline” feature let users browse a chronological torrent of public posts. When we got bored or overwhelmed, we left. There were no cunning algorithms analyzing how long we lingered on one post or how swiftly we scrolled past another. Now it’s a dramatically different story. The “waterfall” of content has a hidden undertow, pulling us deeper the better it knows our behaviors and patterns. And our fears.
In The Salzburg Executioner, I juxtapose Salzburg’s historical cruelty with our current descent into a labyrinth of apps and online platforms built on “engineered addiction.” Then, as now, fear, spectacle, and control collide. Then, as now, the casualties are real people.
I remain hopeful that the early magic of these platforms, the sense of connection and community they once offered, remains rooted somewhere deep in their DNA. But it will take collective action from users, regulators, and the platforms themselves to make social media safer.
Designers and engineers can build features that encourage conscious breaks, whether through timed scrolling limits or gentle prompts to step away. Legislators can enact greater transparency requirements, compelling social platforms to reveal the inner workings of their powerful algorithms. Users, too, can become more intentional, opting out of addictive cycles by prioritizing mindful engagement and digital literacy.
It’s true that these measures demand awareness, resolve, and vigilance. The crucial question remains: will we collectively advocate for a safer, more equitable online experience? Or simply keep scrolling until there’s no escape?
- Salzburg, Austria, 23 January 2025