An Indian politician used AI to translate his speech into other languages to reach more voters


As social media platforms move to crack down on deepfakes and misinformation in the US elections, an Indian politician has used artificial intelligence techniques to make it look like he said things he didn’t say, Vice reports. In one version of a campaign video, Manoj Tiwari speaks in English; in the fabricated version, he “speaks” in the Hindi dialect of Haryanvi.
Political communications firm The Ideaz Factory told Vice it was working with Tiwari’s Bharatiya Janata Party to create “positive campaigns” using the same technology behind deepfake videos; an actor’s voice was dubbed in to read the script in Haryanvi.
“We used a ‘lip-sync’ deepfake algorithm and trained it with speeches of Manoj Tiwari to translate audio sounds into basic mouth shapes,” Sagar Vishnoi of The Ideaz Factory said, adding that the technique let the candidate reach voters he could not otherwise have addressed as directly. (While India has two official languages, Hindi and English, some Indian states have their own official languages, and hundreds of dialects are spoken across the country.)
The faked video reached about 15 million people in India, according to Vice.
Though most deepfake videos are used to create nonconsensual pornography, the now-infamous 2018 deepfake video of President Obama raised concerns about how false or misleading videos could be used in the political arena. Last May, faked videos were posted on social media that appeared to show House Speaker Nancy Pelosi slurring her words.
In October, however, California passed a bill making it illegal to share deepfakes of politicians within 60 days of an election. And in January, the US House Ethics Committee informed members that posting deepfakes on social media could be considered a violation of House rules.
Social media companies have announced plans to combat the spread of deepfakes on their platforms: Twitter’s “deceptive media” ban takes effect in March, Facebook banned some deepfakes last month, and Reddit updated its policy to ban all impersonation on the platform, which includes deepfakes.
How and when intentional use of altered videos might affect the 2020 US elections is anyone’s guess, but as one expert told Vice, even though the Tiwari video was meant to be part of a “positive” effort, the genie is out of the bottle now.