Microsoft chief warns more deepfake threats could be coming soon


When it comes to deepfakes, what we’ve seen so far is just the tip of the iceberg. In the near future, we won’t be certain whether the person we’re speaking to on a video call is real or an impostor, and crooks will have no trouble creating an entire chronology of fake videos to support their claims, or to trick people into believing an offer or campaign is legitimate.
These harrowing predictions come from Eric Horvitz, Microsoft’s chief science officer, in a new research paper, titled “On the horizon: Interactive and compositional deepfakes”.
Deepfakes are, essentially, “photoshopped” videos. Using artificial intelligence (AI) and machine learning (ML), a threat actor can create a video of a person saying things they never said. Now, according to Horvitz, crooks are ready to take it to the next level. Interactive deepfakes are just what you’d expect – real-time videos with which users can interact, but which are, in reality, utterly fake.
Synthetic history
Compositional deepfakes, on the other hand, are described as “sets of deepfakes” designed to integrate over time with “observed, expected, and engineered world events to create persuasive synthetic histories.”
“Synthetic histories can be constructed manually but may one day be guided by adversarial generative explanation (AGE) techniques,” Horvitz adds.
He also says that in the near future, it will be almost impossible to distinguish fake videos and fake content from authentic ones: “In the absence of mitigations, interactive and compositional deepfakes threaten to move us closer to a post-epistemic world, where fact cannot be distinguished from fiction.”
Mitigation is difficult because threat actors can pit artificial intelligence against analysis tools, iterating on deepfake content until it fools even the most advanced detection systems.
“With this process at the foundation of deepfakes, neither pattern recognition techniques nor humans will be able to reliably recognize deepfakes,” Horvitz notes.
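The adversarial loop Horvitz describes can be illustrated with a deliberately simplified sketch (this is a toy model for intuition only, not a real deepfake pipeline; the `detector`, `adversarial_refine`, threshold, and step values are all invented for illustration): a generator keeps adjusting its output based on a detector's feedback until the detector no longer flags it.

```python
# Toy illustration of an adversarial refinement loop (hypothetical):
# a "generator" nudges a fake's realism score using feedback from a
# fixed "detector" until the detector stops flagging it as fake.

def detector(realism: float, threshold: float = 0.9) -> bool:
    """Flag the sample as fake when its realism falls below the threshold."""
    return realism < threshold

def adversarial_refine(realism: float = 0.1, step: float = 0.05) -> float:
    """Raise realism step by step until the detector is fooled."""
    while detector(realism):
        realism += step  # the generator "learns" from each rejection
    return realism

final = adversarial_refine()
print(detector(final))  # the refined fake now evades the detector
```

In a real attack the generator and detector are both neural networks, and the feedback loop runs at training time, but the dynamic is the same: any fixed detector becomes a training signal for the forger.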
So, next time a family member calls from abroad to ask for money to pay the rent, make sure it’s not a fraudster impersonating your loved ones.
Via: VentureBeat