5 reasons why Gemini AI is bad news for kids


As a family tech expert, I’ve seen social media and tech companies do some pretty incredible things. But Google’s plan to roll out Gemini, its AI chatbot, to users under 13 is wild. They gave notice to parents in an email, but it felt much more like a warning than a warm invitation.
So, why are they doing this? No one’s really sure, although Google is simply joining Instagram, Snapchat, and a whole host of other platforms in the race to bring AI to nearly every facet of our lives.
Children, though, are much more vulnerable than adults — especially when it comes to online interactions. Here are my top 5 concerns about Google’s recent and reckless decision to open up Gemini to kids under 13.
Chief Parent Officer, Bark Technologies.
1. It’s teaching kids to outsource thinking and creativity from a very early age
Young kids need to be practicing writing, drawing, and thinking critically with their own minds — not using scraped words and images dredged up from the depths of the internet.
ChatGPT is already proving to be a breeding ground for cheating and shortcuts in schools. Giving younger kids instant access to Gemini will only accelerate this habit of cutting corners when it comes to learning and being creative.
2. Misinformation is rife on the platform
When an AI platform like Gemini provides completely wrong information, it’s called a “hallucination.” That’s a quaint way of saying “making things up that are total nonsense.”
Google even says in its FAQs about Gemini in the Family Link app that “[Hallucinations are] a strange term, but it basically just means that Gemini sometimes gets things wrong or says things that aren’t true. [They] can sound real, and Gemini might even say them confidently.”
For adults, these types of errors may be easy to recognize and ignore, like saying that the capital of France is Cairo. But kids may not know when to double-check a simple answer — let alone something complicated or nuanced. That defeats much of the purpose of having Gemini help children with homework.
3. Inappropriate content can present dangers to kids
Gemini can also act as a chatbot “friend” for kids, which presents multiple dangers. Other similar chatbots have been blamed for exposure to sexual content and even one child’s suicide.
Of course, Google has stated that Gemini for kids will have safeguards, but there’s never a guarantee that inappropriate things won’t slip through the cracks — especially when these AI platforms regularly hallucinate.
Fortunately, apps like Bark can monitor your child’s saved photos, videos, and even text messages for inappropriate content they may save or share from Gemini.
4. Personal info that’s shared can be hacked
Sharing personal information — from sensitive emotional states to home addresses to personal photos — with an AI platform is risky because everything can be hacked. If someone were to gain access to your child’s Gemini chats, it could be stressful and even dangerous.
5. Hate speech and bias can be conveyed in Gemini responses
AI platforms like Gemini work by scraping vast swaths of the internet for information — information that was written by other humans.
That means human biases and viewpoints can be presented by Gemini as fact, even when they aren’t.
Because AI platforms provide answers based on information that humans created, they can mirror prejudices that exist in the data they’re trained on. This can include harmful positions about marginalized groups.
Final word
At the end of the day, technology can make our lives easier, but it’s just that — a tool, not a necessity.
Even though calculators are used every day in advanced math, kids still learn how to count, add, subtract, multiply, and divide the old-school way in elementary school.
The same should go for AI platforms like Gemini when it comes to writing, thinking, and being creative.
Check out our comprehensive list of the best AI tools.
This article was produced as part of TechRadarPro’s Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro