Anonymous ID: 1285e1 July 22, 2018, 8:20 p.m. No.2247199

>>2247166

Deepfakes Are Coming. And They're Dangerous.

JENNA LIFHITS


[Image: Deepfake Barack Obama, real vs. fake. Credit: University of Washington]

July 20, 2018 at 6:19 AM

Marco Rubio warns that the United States is not ready for the havoc that impersonation technology can wreak.

Technology is making it easier and easier to create the impression that someone said or did something that, in reality, they did not. For malicious actors armed with that impersonation software, the possibilities for havoc are endless: political sabotage, humiliating fake sex videos, or unparalleled interference in another country’s politics.

 

Lawmakers are increasingly interested in stopping that from happening.

 

“This is an effort to try to get ahead of something,” said Florida senator Marco Rubio in remarks at the Heritage Foundation. “The capability to do all of this is real. It exists now. The willingness exists now. All that is missing is the execution. And we are not ready for it, not as a people, not as a political branch, not as a media, not as a country.”

 

Generating fake faces once took “armies of visual effects artists,” said Chris Bregler, a senior staff scientist and engineering manager at Google AI. But recent strides in machine learning have made it significantly easier to create fake videos. There’s even an app for it.

 

“You don’t have to have software engineers anymore. You just download it on your PC and run it,” Bregler said at the Heritage event. “That changed the game.”

 

Rubio said that growing accessibility, along with the ability to rapidly disseminate information, makes these fake videos all the more dangerous.

 

“In the old days, if you wanted to threaten the United States, you needed 10 aircraft carriers and nuclear weapons and long-range missiles,” said Rubio. “Today you just need access to our internet system, to our banking system, to our electrical grid and infrastructure. And increasingly, all you need is the ability to produce a very realistic fake video that could undermine our election, that could throw our country into tremendous crisis internally and weaken us deeply.”

 

What exacerbates the potential threat of deepfakes is the difficulty of disproving them—especially when a video looks very real.

 

“It’s true that we can, generally speaking, eventually debunk” deepfake videos, said Bobby Chesney, a professor at the University of Texas. “But the truth doesn’t ever quite catch up with the initial lie if the initial lie is emotional and juicy enough.”

 

Rubio pointed to the possibility that foreign states, Russia in particular, could use deepfake videos to up their meddling game: to aid in sowing discord, undermining democracy, influencing elections, or all three.

 

“I know for a fact that the Russian Federation at the command of Vladimir Putin tried to sow instability and chaos in American politics in 2016,” he said. “They did that through Twitter bots and they did that through a couple of other measures that will increasingly come to light. But they didn’t use this. Imagine using this. Imagine injecting this in an election.”

 

The increasing accessibility of deepfake technology could eventually make it so that anybody could abuse it.

 

One appalling and obvious example is deepfake sex videos, in which someone's face is swapped onto the body of a pornography actor. These videos could be used for anything from humiliation to blackmail.

 

“When victims discover that they have been used in fake sex videos, the psychological damage may be profound—whether or not this was the aim of the creator of the video,” write Chesney and Danielle Citron, a law professor at the University of Maryland, in a recent paper on deepfakes. “Victims may feel humiliated and scared.”

 

Chesney and Citron list a number of other destructive options for potential deepfakes: a politician “taking bribes” or “engaging in adultery”; soldiers “shown murdering innocent civilians in a war zone”; “emergency officials ‘announcing’ an impending missile strike on Los Angeles or an emergent pandemic in New York City, provoking panic and worse.”

 

Political deepfakes in particular pose a national security risk. They can strain already tense relationships between nations and intensify a lack of trust in public discourse and institutions.

 

“One of the prerequisites for democratic discourse is a shared universe of facts and truths supported by empirical evidence,” write Chesney and Citron. “Effective deep fakes will allow individuals to live in their own subjective realities, where beliefs can be supported by manufactured ‘facts.’ When basic empirical insights provoke heated contestation, democratic discourse cannot proceed on a sustained basis.”

 

JENNA LIFHITS is a staff writer at The Weekly Standard.