FBI reports that people are using deepfakes to apply for remote jobs


It may seem that today’s scammers have already tried everything, but that isn’t true: every day, ordinary people and big companies fall victim to new schemes. The FBI has learned that fraudsters are using deepfakes to gain access to corporate IT networks by applying for remote technology positions under fake identities.

At first glance this may look like an isolated incident that won’t affect anyone else, but it’s worth understanding the technology: not only companies but also ordinary people can become victims, because deepfakes have advanced very quickly.

What is deepfake technology?

Deepfakes are synthetically produced media in which the person originally in a photo or video is replaced with someone else. Deepfake algorithms were originally based on autoencoder neural networks and were later improved with generative adversarial networks (GANs).

Creating a deepfake involves several steps. First, a neural network is fed a large amount of data. It then processes that data and learns the target’s facial features and expressions. In the final stage, the network learns to reproduce the person’s face and can be used to synthesize content.

Deepfake services are built on open machine learning algorithms and libraries, which lets a neural network not only learn a face but reproduce it with a quality close to a real human face.
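The autoencoder idea described above can be sketched in a few lines. This is a toy illustration only: the weights are random rather than trained, and all names and dimensions are invented for the example. A real face-swap system trains a shared encoder plus one decoder per identity on thousands of images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a flattened 8x8 grayscale "face" and a small latent code.
FACE_DIM, LATENT_DIM = 64, 16

# Shared encoder: in a trained system it learns features common to both
# identities. Here random weights stand in for trained ones.
W_enc = rng.normal(size=(LATENT_DIM, FACE_DIM))

# One decoder per identity: each learns to reconstruct its own person's face.
W_dec_a = rng.normal(size=(FACE_DIM, LATENT_DIM))
W_dec_b = rng.normal(size=(FACE_DIM, LATENT_DIM))

def encode(face):
    """Compress a face into a shared latent representation."""
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    """Reconstruct a face from the latent code with one identity's decoder."""
    return np.tanh(W_dec @ latent)

def face_swap(face_a):
    """The swap trick: encode person A's face, decode with person B's decoder."""
    return decode(encode(face_a), W_dec_b)

face_a = rng.normal(size=FACE_DIM)
swapped = face_swap(face_a)
```

The key design point is that the encoder is shared while the decoders are identity-specific: because both decoders read the same latent space, feeding A’s encoded expression into B’s decoder renders B’s face performing A’s expression.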

Quite often deepfake technology is used for entertainment, substituting the faces of famous people into videos that are shared online for fun. However, the same technique creates a risk of becoming a victim of cybercriminals, who can use your photos and voice for their own purposes.

How do scammers use deepfakes to apply for remote jobs?

The FBI has learned of new crimes using deepfake technology: fraudsters are using deepfakes to gain access to information on corporate IT networks by conducting online interviews for remote positions under stolen identities.

In other words, scammers take other people’s stolen personally identifiable information (PII) and use it to construct a false identity. They then use these identities to get jobs at information technology, programming, database, and software companies; accordingly, these scammers are interested only in positions that can be done remotely.

If the deepfake interview goes well, the fraudster gains access to the company’s data. From there, scammers can steal the company’s financial and proprietary information, most often to sell the confidential data or divert money.

What does the FBI say about the use of deepfake technology?

Quite a few company representatives have approached the FBI recently, reporting numerous attempts by fraudsters to obtain remote jobs at their companies. These interviews were conducted using fake videos, images, or voice recordings.

The FBI’s Internet Crime Complaint Center reports a significant increase in complaints about the use of deepfakes, along with stolen personal information, to apply for remote jobs, mostly in technology.

Quite a few of the companies that contacted the FBI reported that the fraudsters managed to obtain the data they were after. The reasons are the steady improvement of deepfake technology and the widespread shift to remote work.

According to reports from the FBI’s Internet Crime Complaint Center, these crimes most often involve remote jobs in information technology, programming, databases, and other software-related functions, and the scammers used stolen PII to apply for these positions.

Still, statistics show that most companies catch an applicant impersonating another person as a fraudster. This happens during pre-hire background checks, which reveal that the PII listed by some applicants belongs to someone else.

Along with this, a large percentage of companies reported discovering the fraud only after a paycheck had been paid to an employee impersonating someone else, or after company data had already been taken and passed on to competitors or other third parties.

Can deepfakes be considered a threat?

Deepfakes are a real threat in today’s world. Advances in the technology help fraudsters commit successful crimes, causing companies to lose money or exposing confidential information to third parties. Worse, the scammers often escape responsibility, because neither their real identity nor their real appearance was ever captured.

In other situations, deepfakes can also be a threat. With the spread of deepfakes, there is a danger of discrediting any user whose photo is online. Given the number of photos that people post on social media, there is no shortage of material for deepfakes.

How to identify deepfakes

Representatives of the companies that contacted the FBI reported that they prevented some attempts because they were able to identify the deepfake. So detection is realistic, but you need to watch closely and have a good internet connection.

There were also cases where the scammer coughed or sneezed, but the video did not show the matching movement; this helped expose the deepfake in the early stages of the interview. It is one of the easiest ways to tell whether the person in front of you is impersonating someone else.

Because today’s deepfakes still have flaws, synthetic video contains visual inconsistencies. Head and torso movements may look unnatural, and there are almost always synchronization problems between the movements of the face and lips and the corresponding sound, which is what gives the cheater away.
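A crude version of that audio/video synchronization check can be illustrated numerically. The sketch below correlates a synthetic "mouth openness" signal (the kind a face tracker might extract per frame) with an audio-energy envelope; the signals, names, and threshold are invented for illustration and are not taken from any real detector.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 200)

# Per-frame mouth opening extracted from video (synthetic stand-in data).
mouth_openness = np.sin(3 * t)

# Audio-energy envelopes: one matching the lips, one shifted out of sync
# the way a poorly generated deepfake often is.
audio_in_sync = np.sin(3 * t) + 0.1 * rng.normal(size=t.size)
audio_off_sync = np.sin(3 * t + 1.5) + 0.1 * rng.normal(size=t.size)

def sync_score(video_sig, audio_sig):
    """Pearson correlation as a crude audio-visual sync score."""
    return float(np.corrcoef(video_sig, audio_sig)[0, 1])

# Genuine footage scores close to 1.0; desynchronized audio scores much lower.
score_real = sync_score(mouth_openness, audio_in_sync)
score_fake = sync_score(mouth_openness, audio_off_sync)
```

Real detectors use learned audio-visual embeddings rather than raw correlation, but the principle is the same: lip motion and speech energy should rise and fall together, and a deepfake frequently breaks that coupling.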

Company owners and anyone whose responsibilities include conducting online interviews should keep a few key rules in mind to prevent deepfake fraud. The FBI offers several recommendations, the main one being to keep an eye out for new ways of detecting deepfakes and methods of combating them.

Also, use multifactor authentication for employees and electronic signatures to protect email. Minimize the number of company communication channels, and insist on an in-person meeting at the office before a new employee gets access to company information.