Stamm is an associate professor of electrical and computer engineering in the College of Engineering.
The same technology that makes it possible to edit a wine stain out of a photographed shirt also permits malicious actors to digitally distort images and videos. And when photo editing crosses the line from sincere to sinister, it can shape viewers’ minds in dangerous ways.
Professor Matthew Stamm is a specialist in information forensics who has learned how to pinpoint images that have been altered.
Deepfake videos often feature a disconnect between the emotion in a person’s face and voice.
Every detail associated with a digital photo, from the camera that creates an image to the software used in processing it, leaves a “fingerprint,” Stamm says.
“If contrast adjustment has been done to a photo, we can see the hallmarks of that in our analysis,” he explains. “We can even trace elements of the photo not just to the brand and model of camera that took it, but to the individual camera.”
Stamm developed an artificial intelligence algorithm that detects fake images and videos by identifying inconsistencies among the forensic fingerprints within an image. Using machine learning, he can teach computers to distinguish originals from forgeries.
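The core idea — that a tampered region carries a fingerprint inconsistent with the rest of the image — can be illustrated with a toy sketch. This is not Stamm’s actual algorithm (his systems learn rich features with neural networks); it is a minimal, assumed stand-in that uses a crude high-pass “noise residual” statistic per patch and flags patches whose statistic deviates from the image-wide norm:

```python
# Toy illustration (NOT Prof. Stamm's actual method): flag a spliced region
# by comparing a simple "fingerprint" statistic across image patches.
import numpy as np

def patch_fingerprints(img, patch=8):
    """Mean absolute high-pass residual per patch - a crude stand-in for
    the camera/processing fingerprints described in the article."""
    residual = img[:, 1:] - img[:, :-1]          # horizontal high-pass filter
    h, w = residual.shape
    h, w = h - h % patch, w - w % patch          # crop to whole patches
    blocks = residual[:h, :w].reshape(h // patch, patch, w // patch, patch)
    return np.abs(blocks).mean(axis=(1, 3))      # one statistic per patch

def flag_inconsistent(img, patch=8, z_thresh=3.0):
    """Flag patches whose fingerprint deviates strongly from the rest."""
    fp = patch_fingerprints(img, patch)
    z = (fp - fp.mean()) / (fp.std() + 1e-9)     # z-score each patch
    return np.abs(z) > z_thresh                  # True = inconsistent patch

# Synthetic demo: a noisy "camera" image with a smooth pasted-in square.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (64, 64))             # camera-like sensor noise
img[16:32, 16:32] = 0.0                          # "spliced" noiseless region
mask = flag_inconsistent(img)
print(bool(mask[2, 2]))                          # patch inside the splice is flagged
```

A learned detector would replace the hand-picked residual statistic with features trained on many cameras and editing operations, but the consistency-checking logic is the same.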
Stamm’s work on video and audio deepfakes has appeared recently in journals including IEEE Transactions on Information Forensics and Security and IEEE Transactions on Image Processing. His work has also led to a partnership with the Defense Advanced Research Projects Agency.