MoMI explores deepfakes, technology and misinformation

A new MoMI exhibit invites visitors to expand their ideas about digital fakes, fabrications and their impacts. Photo via Thanassi Karageorgiou / Museum of the Moving Image

By Rachel Vick

The Museum of the Moving Image is hosting a live event this week to discuss the details of fabricated videos and their consequences.

The discussion is hosted in connection with the museum’s current exhibit “Deepfake: Unstable Evidence in Screen Culture,” which explores the world of deepfakes – videos in which a person has been digitally altered so that they convincingly appear to be someone else.

The March 9 conversation “Deepfakes and Creative Play” will feature artists Chris Umé, Carl Bogan and Paul Shales. Together with co-curator Barbara Miller, MoMI’s deputy director for Curatorial Affairs, they will discuss the different facets of deepfake creation, history and impact explored throughout the exhibit.

Co-curator Joshua Glick, assistant professor of English, Film & Media Studies at Hendrix College, said he hopes to inspire visitors to “talk about how this installation is made and the tech involved, and how deepfakery is created and used, in addition to getting viewers to think historically.”

“We want people to be aware of probable uses, the way this technology is used today and the real world harms it can cause,” Glick said.

Through the exhibit’s different sections, visitors to the Astoria museum are encouraged to question the authenticity of the videos, test their skills at spotting fakes, and learn about the technology’s different uses, whether positive, creative or malevolent, and the cultural context in which deepfakes are created.

It includes a hall of mirrors exploring amateur attempts and an installation featuring the Emmy Award–winning project “In Event of Moon Disaster.”

“Sometimes it's portrayed that media manipulation is a completely new problem, that we’re in this information apocalypse,” he said. “While there is a degree of crisis… [the goal is to get] people to think the problem of media manipulation is not necessarily new. [Manipulated media] have a long history: propaganda, war films, conspiracy theory videos, tabloid TV.”

“We want people to be aware that these malicious deepfakes are part of a broader context of disinformation, how they could be used in the context of a political election or sow the seeds of doubt,” he added.

There are also ways the technology can be “artful, savvy, strategic,” like creating political critiques or disguising the identities of persecuted people to protect them as they exercise free speech, such as LGBTQ advocates in regions where their expression is illegal.

The exhibit also encourages viewers to think critically about the media they consume, with an emphasis on the ease with which deepfakes can be created using current technology and the ways politics and internet culture shape their impact.

“[We are] trying to equip and empower visitors of the exhibition to be discerning members of the viewing public with this idea that there are things everyday people can do to be more critical, more skeptical,” Glick said. “We want viewers, students and consumers of media to be critical about these issues, to dig deeper and look at the context; to really think about authorship, sponsorship, stakeholders and intention.”

The exhibition will be on display through May 15.