
How to spot 'deepfake, shallowfake' manipulated videos
AC Coloma,
ABS-CBN News
Published Apr 26, 2024 01:49 PM PHT

MANILA - Cybersecurity advocates on Friday sounded the alarm over manipulated videos of personalities that surfaced online in recent weeks.
Business technology analyst Jerry Liao and Manila Bulletin Tech Editor Art Samaniego told Teleradyo Serbisyo there are ways to spot these manipulated videos.
Samaniego and Liao cautioned that some manipulated videos labeled as "deepfakes" are more properly called "shallowfakes."
Deepfakes use generative AI and machine learning to create realistic videos of personalities, while shallowfakes are created using video-editing software.
To spot a manipulated video, Samaniego recommends checking whether the subject's speech matches their facial expressions.
"Parang mali minsan ang bibig kasi ine-edit nila ang part para ibang bibig ang nangyayari. Yung inconsistencies sa skin tone, iba-iba ang skin tone pati ang shadow. Yun dapat i-analyze mo at mayroon tayong tech tools na pwedeng gamitin," Samaniego said.
(Sometimes, the mouth movement doesn't match the voice. There are also inconsistencies in skin tones, even the shadows. One should be able to analyze and there are also tech tools available to double-check on these.)
Liao and Samaniego said another thing to look at is the shape of the face.
"Walang facial expression. Mahirap gayahin ang facial expression, walang emosyon ang video nila," Liao said.
(Facial expressions are blank. Facial expressions are difficult to emulate and videos are emotionless. Look at the shape of the face, the figures of the face, and then the ears.)
For deepfakes, another clue is the shape of the subject's ears.
"Yung tainga minsan isa sa indicator na makikita mong fake yung video kasi ang machine learning, lalo na pag kulang ang sample. Kailangan mo ng malaking sample," Samaniego shared.
(The ears are sometimes an indicator that a video is fake because of how machine learning works, especially when the samples are lacking. You need a large sample.)
Authorities earlier said they were investigating a viral video that used an audio clip that mimicked the voice of President Marcos Jr.
The Department of Information and Communications Technology earlier warned that deepfakes will spread during elections in the Philippines.
With the midterm elections nearing, Samaniego also urged the government to adopt measures to combat deepfakes, and urged the public to check mainstream media for updates.
"'Pag nakakita kayo, mag-double source ang mga tao. 'Pag nakita nila na nag-post, hanapin kung nag-post sa iba and ang mainstream media. 'Pag nilagay sa mainstream media, na-verify ito. Hindi ito basta-basta pinadala," Samaniego said.
(When you see these types of videos, double-source them. If you see them post, check if they posted this elsewhere and on mainstream media, which verifies these videos.)