As defined by Wikipedia, “Deepfakes are media that take a person in an existing image or video and replace them with someone else’s likeness using artificial neural networks. They often combine and superimpose existing media onto source media using machine learning techniques known as autoencoders and generative adversarial networks (GANs). Deepfakes have garnered widespread attention for their uses in celebrity pornographic videos, revenge porn, fake news, hoaxes, and financial fraud. This has elicited responses from both industry and government to detect and limit their use.”
A deep-learning system can produce a persuasive counterfeit by studying photographs and videos of a target person from multiple angles and then mimicking that person's behaviour and speech patterns.
This technology has also been used in the film industry. In August 2018, researchers at the University of California, Berkeley published a paper introducing a fake-dancing application that can create the impression of masterful dancing ability using AI. This project extends deepfakes to the entire body; previous work had focused on the head or parts of the face.
In January 2018, a proprietary desktop application called FakeApp was launched. It allows users to easily create and share videos in which faces are swapped. As of 2019, FakeApp has been superseded by open-source alternatives such as Faceswap and the command line-based DeepFaceLab.
Deepfakes as Fraud
Audio deepfakes have been used in social engineering scams, fooling people into thinking they are receiving instructions from a trusted individual. In 2019, the CEO of a U.K.-based energy firm was scammed over the phone when he was instructed to transfer €220,000 into a Hungarian bank account by an individual who used audio deepfake technology to impersonate the voice of the chief executive of the firm's parent company. The perpetrator reportedly called three times and requested a second payment, but was turned down once the CEO noticed that the caller's phone number was Austrian and that the money had not been reimbursed as promised.
Deepfakes have been used to misrepresent well-known politicians in videos. They can also be used to create hoaxes, such as a fabricated video of a president making inflammatory statements.
Moreover, people can be inserted into pornographic videos through this technique with the purpose of harassing or harming them. Accordingly, deepfakes are becoming a critical issue for the authentication and credibility of video content.
This technology already exists and works, which means we have to view media such as video from a whole new perspective. If someone circulates a video on social media of a famous person or politician saying or doing something controversial, you first have to ask whether the video is even real.
How can a deepfake video be identified?
Detection is primarily a technological challenge. The most common approach is to use the same class of algorithms that create deepfakes, namely deep neural networks, to detect them. The other approach is to use technology such as blockchain to verify a video's source.
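The source-verification idea can be illustrated with a small sketch. The `ProvenanceLedger` class below is hypothetical, not a real blockchain or any existing API: it mimics the core blockchain property by chaining SHA-256 fingerprints of registered videos, so an unregistered (possibly manipulated) file fails the lookup and any tampering with past entries breaks the chain.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of `data` as a hex string."""
    return hashlib.sha256(data).hexdigest()

class ProvenanceLedger:
    """Toy append-only ledger (hypothetical): each entry chains the previous
    entry's hash, so altering any recorded fingerprint breaks the chain."""

    def __init__(self):
        self.entries = []  # list of (video_hash, entry_hash) tuples

    def register(self, video_bytes: bytes) -> str:
        """Record a video's fingerprint, chained to the previous entry."""
        video_hash = sha256_hex(video_bytes)
        prev = self.entries[-1][1] if self.entries else "0" * 64
        entry_hash = sha256_hex((prev + video_hash).encode())
        self.entries.append((video_hash, entry_hash))
        return video_hash

    def is_registered(self, video_bytes: bytes) -> bool:
        """Check whether this exact file was ever registered."""
        return sha256_hex(video_bytes) in (vh for vh, _ in self.entries)

    def verify_chain(self) -> bool:
        """Recompute every link; False means the ledger was tampered with."""
        prev = "0" * 64
        for video_hash, entry_hash in self.entries:
            if sha256_hex((prev + video_hash).encode()) != entry_hash:
                return False
            prev = entry_hash
        return True

ledger = ProvenanceLedger()
ledger.register(b"original broadcast footage")
print(ledger.is_registered(b"original broadcast footage"))  # True
print(ledger.is_registered(b"manipulated footage"))         # False
print(ledger.verify_chain())                                # True
```

Real deployments would register hashes at publication time (e.g. by the broadcaster) and anchor them on a distributed ledger; the sketch only shows why a cryptographic chain makes after-the-fact substitution detectable.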
Consequently, deepfakes have been identified as one of the main cyber-security threats of 2020.