This TikTok lawsuit could change the face of social media forever, and it’s about time


Social media and its Section 230 protection may have met their Waterloo. For most of the two-plus decades we’ve been using social media like X (née Twitter), Facebook, Instagram, TikTok, and others, they’ve operated under protections designed nearly three decades ago primarily to shield platforms like CompuServe and AOL.
Those protections, part of the Communications Decency Act of 1996, said that online computer services couldn’t be held liable for content posted on their platforms by third parties. These services were like dumb, vast warehouses with shelves of information placed there by others. A warehouse doesn’t create what’s inside; it just accepts the content and gives consumers access.
This was back in the days of AOL, which controlled the pages you saw using keywords, a rough organizing principle for such a vast amount of information. In some ways, early platforms like Prodigy, CompuServe, and AOL were just one pretty interface removed from the Bulletin Board Systems that preceded them.
Modern digital services, mainly social media, have one major difference: they no longer passively wait for you to discover content and make connections on your own. Everything is tailored by custom algorithms. TikTok’s vaunted For You page, X’s For You page, Threads’ For You feed, Facebook’s feed, Instagram’s recommendations – all of them are driven by algorithms that learn your habits and then deliver other people’s content based on those assumed interests.
AOL wanted people to sign up and stay on, but it mostly kept its numbers up by managing churn. Almost as many people stopped paying for and using the service as signed up each month. That’s why we all got so many disks and CDs in the mail, begging us to join.
Algorithms in control
These days, the platforms are mostly free. Ads and partner deals pay the bills, so it’s crucial that eyeballs remain glued to each service. Hence, the algorithms that do the dirty work of keeping us all engaged.
While AOL, CompuServe, and even ISPs could fairly claim that they had no control over the content we saw online, and that the responsibility still fell on the shoulders of the content originators, the algorithms make the picture far murkier for modern social media, and perhaps even search engines like Google.
Section 230 has been under attack for years. I used to believe that it fairly protected all online services. When you look for someone to blame for seeing unwanted violent, hateful, perverse, or even pornographic content in your feed, the ultimate responsibility lies with the creator of that content and not the host.
I don’t believe that anymore and, as far as I can tell, US courts could soon set a precedent on this point in a closely watched case.
Precedent could be set
In 2021, a 10-year-old girl, Nylah Anderson, found a viral video in her TikTok feed. The video promoted something called “The Blackout Challenge.” Social media is full of these viral challenges, and the vast majority of them are harmless.
This one was not. It promoted choking yourself until you black out.
Tragically, Nylah, according to the filing, died while attempting the challenge, and her family has been suing TikTok ever since. While a lower court dismissed the case, a US Court of Appeals ruled that Nylah’s family could sue TikTok, specifically pointing to the TikTok algorithm as not being protected by Section 230.
From the ruling:
“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech.”
While no one person at TikTok curates content for anyone’s feed, it is fair to call the algorithm the arbiter, and the algorithm is programmed by TikTok, which is owned by the Chinese company ByteDance (the company is currently being told to sell itself to US entities or face a ban in the States).
The Anderson case will continue, and if Nylah’s family wins its suit against TikTok, it could mean a rapid end to protections for all social media platforms currently using algorithms to shape our feeds. A TikTok loss could mean social media companies are held liable the next time you see hate speech, violent imagery, pornography, or suggestions of dangerous actions.
In a separate interview, Nylah’s family said they wanted these Big Tech firms to be held accountable for the algorithms and to do more to protect their users.
The winds of change
Whatever the final result, any platform that programs an algorithm to analyze your interests, then tailors content based on that analysis, has a responsibility to ensure that its algorithm can’t deliver dangerous content.
In my own social media use, especially on TikTok, I’ve marveled at the algorithm’s power and flexibility. It will endlessly fill my For You page, keeping me hooked for hours at a time. It does allow for personal curation, which mostly happens by searching for things of interest.
When I stumble on something I like, I pay extra attention to it. I watch it more than once, pause the video, like it, share it, and then watch a few more videos in the same vein. If I do this a few times, I can shape my FYP feed so that I see more videos about people refurbishing old gadgets or making pasta.
However, these feeds have a needy side. They always throw in a “you might also like” topic that’s been popular with others. They’re trying to prevent you from losing interest in your feed and the platform.
That’s how, I believe, most people end up seeing things like violence and dangerous memes. You need to show the feed how much you dislike that content before you can weed it out – assuming the algorithm allows it.
TikTok will fight this case, as other social media platforms have, but I think the tide has turned and a loss is possible. If that happens, TikTok, X, Threads, Facebook, Instagram, and other social media platforms may be forced to trash and recast all of their algorithms to ensure they don’t repeat the mistakes of the past. Otherwise they could end up buried under costly lawsuits – which they might lose again – until the platforms succumb and disappear forever.