Deepfakes go viral

This past October, an audio clip circulated on social media in which the U.K. Opposition leader, Keir Starmer, appeared to be swearing at his staffers. It went viral on X, formerly known as Twitter, racking up 1.5 million views.

Earlier this year, on the eve of Chicago's mayoral election, the candidate Paul Vallas, seen as someone who would be tough on crime, was making an all-out effort to win over voters. Then an account called Chicago Lakefront News posted a video of him on social media in which he appeared to declare: “In my day a police officer could kill as many as 17 to 18 civilians and no one would bat an eye.”

The damage was done. He lost to Brandon Johnson.

Both seemingly searing exposés were fabrications: the British Labour leader had never sworn at his staffers, and the U.S. mayoral candidate had never spoken the words heard in the video clip.

Both were “deepfakes.”

How deepfakes work

A 21st-century edition of photoshopping, deepfakes are fabricated media. The name combines “deep,” from deep learning, an artificial-intelligence technique that processes data through multiple layers, and “fake,” indicating that the content isn’t genuine. The technology lets a creator produce video and audio that can trick almost anyone.

Speaking at a recent E.M.S. press event, Sam Gregory, Executive Director of WITNESS, a human rights non-profit that helps people use technology to protect their rights, warned that deepfakes could pose a serious threat in the coming electoral season.

The technology has been around for about five years, but what has changed in the last year is how easy deepfakes have become to create. “Just about anyone, with the correct tool, can generate a deepfake. In fact, you can even pay someone to do it for you for very little money,” Gregory said.

Voice clones

A host of commercial tools, some free and some paid, enable the creation of both video deepfakes and audio deepfakes, the latter also known as “voice clones.” These two innovations have, in turn, enabled a third kind of deepfake: an avatar that looks and behaves like a human and can be made to engage in a particular behavior.

Such AI-generated media can be deployed in a wide range of contexts. In the ongoing war between Israel and Hamas, for example, they have been wielded to whip up sentiments on either side of the conflict, Gregory added.

Deepfake impact on the U.S. electoral landscape

Deepfakes can be deployed stealthily against contenders for public office to tarnish their reputations, without the targets ever becoming aware that they are the subject of a vituperative campaign.

Gowri Ramachandran, Deputy Director of the Democracy program at the Brennan Center for Justice, a New York think tank, added that deepfakes, combined with cruder digital fakery such as bogus webpages and emails sent from spoofed addresses, can amplify false narratives crafted deliberately to hoodwink voters.

Nora Benavidez, Director of Digital Justice and Civil Rights at Free Press, a media advocacy group based in Washington, D.C., said that many such false narratives originate on social media, gain the force of “facts,” and eventually make landfall in reputable outlets, ranging from CNN to Fox. Given the “very porous relationship” between social media and established broadcasting and publishing outlets, and the ease with which content travels from the former to the latter, there is an urgent need to vet content before it starts spreading.

With less than a year until the 2024 presidential elections, Ramachandran outlined some of the other threats that could be coming down the pike as well.

Potential threats

One is the likelihood of physical threats to election officials and election workers, as well as to their family members. Earlier this month, letters containing a white powder were sent to election offices in multiple states.

According to a recent survey by the Brennan Center for Justice, the rising level of threats—ranging from death threats that name workers’ young children to harassment focused on their gender—is forcing election workers to leave their jobs because they feel unsafe.

This could lead to a drop in the number of available election workers, a decline similar to the one seen during the COVID-19 pandemic, when the elderly were afraid to volunteer as poll workers for fear of exposure to the virus.

Beyond that, there is the danger of physical infrastructure breaking down, Ramachandran added.

With Donald Trump signaling that he is out for revenge, and some Republicans continuing to insist that the 2020 election was “stolen,” the addition of this novel technology to the mix raises the chances that the next U.S. election becomes a battleground, one with effects even more pernicious than the last.

Alakananda Mookerjee lives in Brooklyn, and is a Francophile.