How Instagram can take its child safety work even further


In May, I wrote here that the child safety problem on tech platforms is worse than we knew. A disturbing study from the nonprofit organization Thorn found that a majority of American children were using apps years before they were supposed to, and that fully a quarter of them said they had had sexually explicit interactions with adults. That puts the onus on platforms to do a better job of both identifying child users of their services and protecting them from the abuse they might find there.
Instagram has now made some promising moves in that direction. Yesterday, the company said that it would:
- Make accounts private by default for children 16 and younger
- Hide teens’ accounts from adults who have engaged in suspicious behavior, such as being repeatedly blocked by other children
- Prevent advertisers from targeting children with interest-based ads. (There was evidence that ads for smoking, weight loss, and gambling were all being shown to teens.)
- Develop AI tools to prevent underage users from signing up, remove existing accounts of kids under 13, and create new age verification methods
The company also reiterated its plan to build a kids’ version of Instagram, which has drawn condemnations from … a lot of people.
Clearly, some of this falls into “wait, they weren’t doing that already?” territory. And Instagram’s hand has arguably been forced by growing scrutiny of how kids are bullied on the app, particularly in the United Kingdom. But as the Thorn report showed, most platforms have done very little to identify or remove underage users — it’s technically difficult work, and you get the sense that some platforms feel like they’re better off not knowing.
So kudos to Instagram for taking the challenge seriously, and building systems to address it. Here’s Olivia Solon at NBC News talking to Instagram’s head of public policy, Karina Newton (no relation), on what the company is building:
“Understanding people’s age on the internet is a complex challenge,” Newton said. “Collecting people’s ID is not the answer to the problem as it’s not a fair, equitable solution. Access depends greatly on where you live and how old you are. And people don’t necessarily want to give their IDs to internet services.”
Newton said Instagram was using artificial intelligence to better understand age by looking for text-based signals, such as comments about users’ birthdays. The technology doesn’t try to determine age by analyzing people’s faces in photos, she said.
At the same time, it’s still embarrassingly easy for reporters to identify safety issues on the platform with a handful of simple searches. Here’s Jeff Horwitz today in The Wall Street Journal:
A weekend review by The Wall Street Journal of Instagram’s current AI-driven recommendation and enforcement systems highlighted the challenges that its automated approach faces. Prompted with the hashtag #preteen, Instagram was recommending posts tagged #preteenmodel and #preteenfeet, both of which featured sometimes graphic comments from what appeared to be adult male users on pictures featuring young girls.
Instagram removed both of the latter hashtags from its search feature following queries from the Journal and said the inappropriate comments show why it has begun seeking to block suspicious adult accounts from interacting with minors.
Problematic hashtags aside, the most important thing Instagram is doing for child safety is to stop pretending that kids don’t use their service. At too many services, that view is still the default — and it has created blind spots that both children and predators can too easily navigate. Instagram has now identified some of these, and publicly committed to eliminating them. I’d love to see other platforms follow suit here — and if they don’t, they should be prepared to explain why.
Of course, I’d also like to see Instagram do more. If the first step for platforms is acknowledging they have underage users, the second step is to build additional protections for them — ones that go beyond their physical and emotional safety. Studies have shown, for example, that teenagers are more credulous and likely to believe false stories than adults, and they may be more likely to spread misinformation. (This could explain why TikTok has become a popular home for conspiracy theories.)
Assuming that’s the case, a platform that was truly safe for young people would also invest in the health of its information environment. As a bonus, a healthier information environment would be better for adults and our democracy, too.
“When you build for the weakest link, or you build for the most vulnerable, you improve what you’re building for every single person,” Julie Cordua, Thorn’s CEO, told me in May. By acknowledging reality — and building for the weakest link — Instagram is setting a good example for its peers.
Here’s hoping those peers follow suit, and go further.
This column was co-published with Platformer, a daily newsletter about Big Tech and democracy.