Social media, news media, and the Internet in general are littered with fake news, deepfakes, and other manipulated content. Manipulated multimedia can deceive people, change the way they think, act, react, and believe, and in some cases cause real harm.

There are many motives for creating manipulated multimedia. Brand impersonation can drive false narratives that affect share prices, executives, and public opinion. Adversarial nations also weaponize fake media to spread false intelligence and promote fear, threats, and sometimes war.

Analysis of fake images involves categorizing the attributes in and about an image (a simple way to record them is sketched after this list): 

  • Where the photo was taken 
  • Individuals in the photo 
  • Objects in the photo 
  • Words in the foreground or background 
  • Facial expressions 
  • Human characteristics 
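
Concretely, these attributes can be captured in a simple record. The dataclass below is only an illustrative sketch; the field names and example values are ours, not part of FIATS.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ImageAttributes:
    """Attributes cataloged about a suspect image before anomaly analysis."""
    location: str = ""                                       # where the photo was taken
    individuals: List[str] = field(default_factory=list)     # people identified in the photo
    objects: List[str] = field(default_factory=list)         # objects in the photo
    visible_text: List[str] = field(default_factory=list)    # words in the foreground or background
    facial_expressions: List[str] = field(default_factory=list)
    human_characteristics: List[str] = field(default_factory=list)

# Example (hypothetical values):
# attrs = ImageAttributes(location="airport tarmac", individuals=["pilot"], objects=["cockpit window"])
```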

Using intelligent feature extraction, anomalies can be identified that reveal where an image may have been modified. This involves analyzing adjacent pixels for inconsistencies and rendering a new image containing only the anomalous pixels, which highlights the portions of the image that were altered. A minimal sketch of this kind of neighboring-pixel analysis follows.
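
The article does not spell out the exact algorithm, so the Python sketch below only illustrates the general idea of neighboring-pixel anomaly detection: each pixel is compared with the median of its 3x3 neighborhood, and pixels that deviate sharply are written to a new image. The libraries (Pillow, NumPy, SciPy), the threshold value, and the file names are all assumptions, not the FIATS implementation.

```python
import numpy as np
from PIL import Image
from scipy.ndimage import median_filter

def highlight_pixel_anomalies(path, out_path, threshold=18):
    """Flag pixels that deviate sharply from their neighbors.

    Loads an image, compares each pixel against the median of its
    3x3 neighborhood, and writes a new image in which only the
    anomalous pixels are kept (everything else is blacked out).
    """
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float32)

    # Median of each pixel's 3x3 neighborhood; large deviations from
    # this local estimate are treated as potential edit artifacts.
    local_median = median_filter(img, size=3)
    deviation = np.abs(img - local_median)

    # Keep only the pixels whose deviation exceeds the threshold.
    mask = deviation > threshold
    anomaly_map = np.zeros_like(img)
    anomaly_map[mask] = 255

    Image.fromarray(anomaly_map.astype(np.uint8)).save(out_path)
    return int(mask.sum())  # number of anomalous pixels found

# Example usage (hypothetical file names):
# n = highlight_pixel_anomalies("suspect.jpg", "anomalies.png")
# print(f"{n} anomalous pixels flagged")
```

In practice a real tool would work on full-color or frequency-domain data and tune the neighborhood and threshold per image, but the principle of comparing each pixel against its local context is the same.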

A pilot sticking his head out of the window to take a selfie at 35,000 feet is clearly an altered image, but other fakes aren't quite so obvious. Analyzing images in this way is therefore key to uncovering alterations intended to drive fake narratives.

To find out more about our Fake Image Analysis Toolset (FIATS), visit our website at www.silentsignals.com or contact us at [email protected].

This article was written by Mike Raggo of SilentSignals. Original article posted on SilentSignals blog: https://silentsignals.com/582-2/
