
Digital Forensics that Spot Deepfakes
Teaching computers how to recognize altered images could help expose forgeries and combat misinformation that spreads readily online.

_Matthew Stamm

Stamm is an associate professor of electrical and computer engineering in the College of Engineering.

The same technology that makes it possible to edit a wine stain out of a photographed shirt also permits mischievous actors to digitally distort images and videos. And when photo editing crosses the line from sincere to sinister, it can shape viewers’ perceptions in dangerous ways.

Professor Matthew Stamm is a specialist in information forensics who has learned how to pinpoint images that have been altered.

Mood_Mismatch

Deepfake videos often feature a disconnect between the emotion in a person’s face and voice.

Every detail associated with a digital photo, from the camera that creates an image to the software used in processing it, leaves a “fingerprint,” Stamm says.

“If contrast adjustment has been done to a photo, we can see the hallmarks of that in our analysis,” he explains. “We can even trace elements of the photo not just to the brand and model of camera that took it, but to the individual camera.”
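For a concrete picture of what a camera “fingerprint” can mean, here is a minimal sketch in Python of one classic idea (not Stamm’s actual pipeline): a camera sensor’s photo-response non-uniformity leaves a faint noise pattern, so a photo’s noise residual can be correlated against a reference pattern estimated from other photos known to come from the same camera. The function names, the Gaussian denoiser, and the simple correlation test are illustrative assumptions, not the published method.

```python
# Minimal sketch of camera-fingerprint matching via sensor noise.
# Assumes grayscale images supplied as same-sized NumPy arrays.
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(image, sigma=1.5):
    """Expose sensor noise by subtracting a smoothed copy of the image."""
    image = image.astype(np.float64)
    return image - gaussian_filter(image, sigma)

def camera_reference(images):
    """Average the residuals of many photos known to come from one camera."""
    return np.mean([noise_residual(img) for img in images], axis=0)

def fingerprint_correlation(query, reference):
    """Normalized correlation between a query photo's residual and the reference."""
    r = noise_residual(query).ravel()
    f = reference.ravel()
    r = r - r.mean()
    f = f - f.mean()
    return float(np.dot(r, f) / (np.linalg.norm(r) * np.linalg.norm(f) + 1e-12))
```

In a setup like this, a photo genuinely taken by the claimed camera would typically correlate more strongly with the reference pattern than a photo from any other device; deciding where to draw the threshold is the hard part, and it is where trained models come in.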

Stamm developed an artificial intelligence algorithm to detect fake images and videos by identifying inconsistencies among the forensic fingerprints within a single image or video frame. Using machine learning, he can teach computers how to distinguish originals from forgeries.
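As a rough illustration of that idea (and only an illustration; research-grade detectors learn their features with neural networks rather than hand-coded statistics), the sketch below splits an image into patches, summarizes each patch’s noise with a few crude statistics standing in for forensic fingerprint features, and flags patches whose statistics disagree with the rest of the image. All names and thresholds here are assumptions for the example.

```python
# Minimal sketch: flag image regions whose "fingerprint" statistics are outliers.
import numpy as np

def patch_features(patch):
    """Crude stand-in for learned forensic features: simple residual statistics."""
    residual = patch - np.median(patch)
    return np.array([residual.std(), np.abs(residual).mean(), residual.max() - residual.min()])

def flag_inconsistent_patches(image, size=64, z_thresh=3.0):
    """Return (row, col) offsets of patches whose features are statistical outliers."""
    feats, coords = [], []
    h, w = image.shape[:2]
    for r in range(0, h - size + 1, size):
        for c in range(0, w - size + 1, size):
            feats.append(patch_features(image[r:r + size, c:c + size].astype(np.float64)))
            coords.append((r, c))
    feats = np.array(feats)
    mu, sd = feats.mean(axis=0), feats.std(axis=0) + 1e-12
    z = np.abs((feats - mu) / sd).max(axis=1)  # worst-case z-score per patch
    return [coords[i] for i in np.where(z > z_thresh)[0]]
```

A spliced-in region that was processed by different software, or captured by a different camera, would tend to surface as one of these outlier patches.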

Stamm’s work on visual and audio deepfakes has appeared recently in journals including IEEE Transactions on Information Forensics and Security and IEEE Transactions on Image Processing. His work has also led to a partnership with the Defense Advanced Research Projects Agency (DARPA).