Experts warn some ChatGPT models can be hacked to launch deepfake scams


Getting scammed by a chatbot is unfortunately no longer in the domain of science fiction, after researchers from the University of Illinois Urbana-Champaign (UIUC) demonstrated how it could be done.
Richard Fang, Dylan Bowman, and Daniel Kang from UIUC recently published a paper describing how they abused OpenAI’s GPT-4o model to fully automate some of the most common scams around.
OpenAI’s model offers a voice-enabled AI agent, which gave the researchers the idea of attempting a fully automated voice scam. They found GPT-4o does have safeguards designed to prevent this kind of abuse, but with a few “jailbreaks” they managed to have it impersonate an IRS agent.
Advanced reasoning
Success rates for these scams varied, the researchers found. Credential theft from Gmail worked 60% of the time, while others, such as crypto transfers, succeeded roughly 40% of the time. The scams were also cheap to run, costing between $0.75 and $2.51 per successful attempt.
Speaking to BleepingComputer, OpenAI explained its latest model, which is currently in preview, supports “advanced reasoning” and was built to better spot these kinds of abuses: “We’re constantly making ChatGPT better at stopping deliberate attempts to trick it, without losing its helpfulness or creativity,” the company’s spokesperson told the publication.
“Our latest o1 reasoning model is our most capable and safest yet, significantly outperforming previous models in resisting deliberate attempts to generate unsafe content.”
OpenAI praised the researchers, saying these kinds of papers help ChatGPT get better.
According to the US government, voice scams are fairly common. The premise is simple: an attacker calls the victim on the phone and, while pretending to help solve a problem, scams them out of money or sensitive information.
In many cases, the attack starts with a browser popup showing a fake virus warning from a fake antivirus company. The popup urges the victim to call the provided phone number to “clean” their device. If the victim calls, the scammer picks up and guides them through a process that ends with the loss of data or funds.