Deepfakes for Good: Exploring Prosocial Applications in a Digital World
17 Nov 2023
By Taara Kumar
On the 17th of November, the Department of Communications and New Media had the pleasure of hosting Dr. Hang Lu for a talk titled Deepfakes for Good: Exploring Prosocial Applications in a Digital World. Dr. Lu is currently an Assistant Professor in the Department of Communication and Media at the University of Michigan. In his talk, Dr. Lu discussed the prosocial uses of deepfakes, their effects on audiences, and how they may shape policy support for the use of deepfakes.
The talk began with an acknowledgement of how far artificial intelligence (AI) technologies have advanced, and how this progress has led to the proliferation of deepfakes online.
“Unsurprisingly, what we have heard the most when it comes to deepfakes is that they are often used for deception and as a source of misinformation. There is also widespread use of deepfakes for creating pornography,” Dr. Lu said.
There is no doubt that deepfakes generally have a bad reputation due to their role in nefarious online schemes by bad actors. As a result, much of the current research on deepfakes is centered on their dangers.
However, amid this sea of negative perceptions of deepfakes, Dr. Lu offered an alternative view by posing a crucial question.
“Can this technology be used for the greater good?”
A key topic of Dr. Lu’s talk was how audiences perceive prosocial deepfakes, which he described as deepfakes generated for prosocial purposes. Referencing his paper Let the dead talk: How deepfake resurrection narratives influence audience response in prosocial contexts (Lu & Chu, 2023), he addressed the use of AI-based technology and deepfakes in “deepfake resurrection” videos, and the psychological processes that audiences go through when viewing this type of content.
In the context of drunk driving and domestic violence, Lu and Chu found that, regardless of the prosocial intent, the use of deepfake resurrection videos had a negative effect on prosocial persuasion regarding policy changes for both issues. Other key findings were that perceived realism, identification, compassion, and surprise were positively related to policy support, while, surprisingly, perceived desecration of the dead was unrelated to it.
Another key topic of the talk was how audiences perceive deepfakes made for entertainment purposes, termed “misinfotainment”. In a recent study, “I know it’s a deepfake”: The role of deepfake disclaimer and comprehension in the processing of deepfake parodies (Lu & Yuan, under review), Dr. Lu and a colleague used the Comprehension-Elaboration Theory of Humor to study how viewers identify and process parody videos that utilise deepfakes.
Focusing on climate change as the prosocial context, they conducted a study in which participants were randomly assigned to watch either a deepfake parody video with a disclaimer or one without a disclaimer, in both cases followed by the original, non-deepfake video unrelated to climate change.
Lu and Yuan found that the deepfake parody videos produced minimal beneficial or prosocial outcomes compared to the original, non-deepfake video, and that including a deepfake disclaimer appeared to be more beneficial for policy support than omitting one.
Dr. Lu ended the talk with a discussion of deepfake use, its regulation, and its relationship to online misinformation laws. He added that while some countries are still crafting their own laws to regulate deepfakes, some social media platforms, such as TikTok, have already incorporated deepfake rules into their policies.