The Potential Impact of DeepFakes for Private Investigators
- August 20
- by PInow Staff
Like professionals in any field, private investigators face changes driven by improving technology. At first, video technology revolutionized how private investigators could surveil subjects, gather evidence, and more. Now, however, the ability to alter videos is changing the public’s perception of what is real and what is fake.
What is a DeepFake?
If you have never heard the term “deepfake” before, you are not alone. However, it is important, especially as a private investigator, that you know what deepfakes are and how they are used. At its most basic, a deepfake is a video that has been altered with artificial intelligence — think photoshopping for videos. A deepfake can change what a person appears to be doing or saying at the whim of the creator, with the viewer none the wiser because it looks and sounds so real. From superimposed faces to cloned voices to altered words, deepfakes can change everything. The term first emerged in late 2017, when a Reddit user posting under the username “deepfakes” shared such videos; early deepfakes were largely focused on AI-generated pornography. However, deepfakes are rapidly proliferating beyond porn and into disseminating misinformation, among other fake videos.
Impact to PI Industry
Aside from being a glaring problem that blurs the line between art and malfeasance, deepfakes pose a particular problem for PIs. Private investigators are tasked with gathering evidence to support a case. Whether that evidence is for a divorce case, human trafficking, fraud, or something else, video evidence is often captured and used. The emergence of deepfakes could severely impact the private investigations industry if fake videos are submitted as evidence in a case. Otherwise innocent people could be perceived as guilty, and fiction could be deemed fact. That could not only affect the outcome of a case, but also damage the reputation of a private investigator who was duped. Private investigators must be able to discern real video from fake.
How to Identify a Deepfake
While it may seem impossible to detect a deepfake, detection tools are still developing, and spotting fakes is becoming easier. In fact, Purdue University is taking an even deeper look (at a coding level) to understand and detect deepfakes through media forensics. However, even though deepfakes are proliferating and getting more convincing, you don’t have to be an engineer to spot them.
To create a deepfake, artificial intelligence takes the visual data provided by the videos and pictures already on the internet to replicate a subject’s face and expressions.
For PIs who are not moonlighting as tech savants, there are ways to identify a deepfake without employing algorithmic technology. CNN recently covered this and offered a side-by-side comparison of two videos to test your skills. It is not as easy as you might think, but armed with some knowledge of how to spot deepfakes, you may be able to tell the difference.
Check for blurring or inconsistencies, especially around the face and lips, as these can be tell-tale signs that a video has been altered. Look for mismatched skin tones, extra chins or skin, and blurred edges. Additionally, if the subject seems to blink abnormally — less often than normal, or with eyelids that look “off” — this can be a sign that you are watching a deepfake. Finally, one of the biggest indicators is speech and sound that do not match up with the movement of the mouth and lips.
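The blink check described above can even be roughly automated. A common heuristic from computer-vision research is the eye aspect ratio (EAR): the ratio of an eye’s vertical opening to its width, which drops toward zero when the eye closes. The sketch below, a minimal illustration rather than a production tool, assumes you already have six (x, y) eye-landmark points per frame from a facial-landmark detector (such as dlib or MediaPipe); the sample coordinates here are made up for demonstration.

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks ordered around one eye.
    EAR = (sum of the two vertical distances) / (2 * horizontal distance).
    Open eyes sit roughly around 0.3; closed eyes drop well below 0.2."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_per_frame, threshold=0.2):
    """Count runs of frames where EAR dips below the closed-eye threshold.
    An unusually low blink count over a long clip is one deepfake red flag."""
    blinks, closed = 0, False
    for ear in ear_per_frame:
        if ear < threshold and not closed:
            blinks += 1
            closed = True
        elif ear >= threshold:
            closed = False
    return blinks

# Made-up landmarks for a wide-open eye (not from a real detector):
open_eye = [(0, 2), (2, 4), (4, 4), (6, 2), (4, 0), (2, 0)]
print(eye_aspect_ratio(open_eye))          # well above the 0.2 threshold
print(count_blinks([0.65, 0.6, 0.15, 0.1, 0.62, 0.6, 0.12, 0.6]))  # 2 blinks
```

In a real workflow you would run a landmark detector frame by frame, feed the per-frame EAR values into a counter like this, and compare the blink rate against the human norm of roughly 15–20 blinks per minute; a subject who barely blinks over several minutes of footage deserves a closer look.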
Overall, deepfakes pose a serious threat to our culture. Arm yourself with the knowledge of what deepfakes are and how to spot them, and be prepared to deal with them.