Your Life in the Algorithm: 9 Oct 24
In this edition: The US judicial system grapples with the “Blackout Challenge,” and other online safety and digital detox news.
In 2021, a ten-year-old girl named Nylah Anderson accidentally hanged herself after attempting the "Blackout Challenge." The challenge, shown frequently on TikTok, encourages people to choke themselves using everyday items, such as belts or purse strings, while recording the act on video. Nylah learned about the challenge from the For You page of her TikTok account.
After Nylah’s death, her mother, Taiwanna Anderson, filed a lawsuit against TikTok and its parent company, ByteDance. The suit alleged that Blackout Challenge videos appeared frequently in her daughter’s feed even though Nylah never searched for them, and that TikTok’s content recommendation algorithms targeted Nylah based on her age, demographics, and viewing habits.
Nylah is not the only child to have died while attempting the Blackout Challenge.
In the lower courts, TikTok invoked what is often referred to as the “guardian angel” of internet-company liability shields: Section 230 of the Communications Decency Act (CDA). Passed in 1996, the CDA effectively treats internet providers and social networks as utilities that simply pass third-party information from one place to another. Without it, companies like Facebook, X, TikTok, and others would be buried in lawsuits. Of note, the CDA became law less than a year after Microsoft released version 1.0 of its Internet Explorer web browser.
Last month, however, the US Court of Appeals for the Third Circuit handed down a surprising and potentially far-reaching decision, holding that the CDA did not provide a safe harbor for TikTok because its algorithm recommended and promoted a video that led to Nylah’s death. The court ruled that when a platform’s algorithms make decisions on behalf of users like Nylah, the platform ceases to act as a mere “utility” and limits or relinquishes its Section 230 protections.
TikTok is appealing the ruling, which could ultimately reach the US Supreme Court. One can assume every social media company in the US is also following its progress closely.
Knock on wood, the decision could ultimately offer parents, educators, and consumers a better understanding of how content recommendation and manipulation algorithms work, and of their impact on our mental health, our youth, and our at-risk populations, so we can make more informed decisions before using, or allowing the use of, these platforms and apps.
Thoughts, feedback, or personal stories? Please drop me a line.
The dangers and opaque nature of content recommendation algorithms are a key theme in my forthcoming book, The Salzburg Executioner, which you can pre-order today.
What you need to know this week
EU requests info from YouTube, Snapchat, TikTok on content algorithms - MSN - Oct 07, 2024
The tech firms must provide the requested information by November 15, the EU said, after which the commission will decide on next steps, which could include fines.
Social media and online video firms are conducting ‘vast surveillance’ on users, FTC finds - The Guardian - Sep 21, 2024
Social media and online video companies are collecting huge troves of your personal information on and off their websites or apps and sharing it with a wide range of third-party entities, a new Federal Trade Commission (FTC) staff report on nine tech companies confirms.
Is big tech harming society? To find out, we need research – but it’s being manipulated by big tech itself - The Conversation - Oct 03, 2024
The whole debacle highlights the problems caused by big tech funding and facilitating research into their own products. It also highlights the crucial need for greater independent oversight of social media platforms.
Has Social Media Fuelled a Teen-Suicide Crisis? - The New Yorker - Oct 03, 2024
Mental-health struggles have risen sharply among young Americans, and parents and lawmakers alike are scrutinizing life online for answers.
The Rising Importance of Digital Detox During a Mental Health Crisis - Star Local Media - Oct 05, 2024
Our youth may struggle to remember a time without devices, but it wasn't that long ago that we didn't have smartphones or tablets, as most millennials and older generations remember.
Thank you for reading The Impetus of L.A. Fatzinger. You can help support this public post, and the work behind it, for free by sharing it with two friends.
An “impetus” is something that makes a process or activity happen - or happen more quickly. The word also reminds me of two others I commonly wrestle with: impulsive and impatient. The Impetus of L.A. Fatzinger is my space to be impulsive with thoughts, impatient with ideas, and, along with the occasional favorable winds, clear skies, and a smattering of good luck, to move thoughts and ideas along more quickly. I hope it makes you think, laugh, and, as I always try to do myself, test your perspective.