Deepfakes: the dangers and risks of the technology, how to spot a fake, and liability under the law
While deepfakes were once used for fun and experimentation, they have now become a weapon for stealing money and spoofing the voices of loved ones.
In this article we look at how deepfakes work, how to recognize a fake message or call, and how the law regulates liability for creating such content.
What a deepfake is and how it works
A deepfake is a fake photo, video, or audio clip created with neural networks. The result looks so natural that it is hard to tell whether it is genuine or the product of machine processing.
Deepfakes are based on generative adversarial networks (GANs): a self-training setup built from two neural networks. The first, the generator, creates a fake. The second, the discriminator, checks the result and looks for errors. If the discriminator notices inaccuracies or mismatches, it flags them, and the generator tries again.
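The generator-discriminator feedback loop can be illustrated with a deliberately simplified toy. This is not a real GAN (there are no neural networks or gradients here); it is just a sketch of the idea that the generator keeps adjusting until the discriminator can no longer tell its output from the "real" data. All names and numbers are illustrative.

```python
import random

random.seed(42)

REAL_MEAN = 10.0  # "real" data cluster around this value

def discriminator(sample: float) -> bool:
    """Accept a sample only if it looks like the real data."""
    return abs(sample - REAL_MEAN) < 0.5

def generator(bias: float) -> float:
    """Produce a fake sample around the generator's current guess."""
    return bias + random.uniform(-0.2, 0.2)

bias = 0.0  # the generator starts far from realistic output
for _ in range(5_000):
    fake = generator(bias)
    if discriminator(fake):
        continue  # the fake passed; nothing to correct
    # The discriminator flagged the fake, so the generator adjusts.
    bias += 0.01 if fake < REAL_MEAN else -0.01

# After many rounds, the generator's output clusters near the real data.
print(f"learned bias: {bias:.1f}")
```

In a real GAN both sides are neural networks trained jointly, so the discriminator also gets better over time, which is exactly what makes modern fakes so convincing.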
The technology was first used commercially in 2014, when a young Audrey Hepburn's face was recreated for a Dove advertisement. The artificiality is immediately noticeable.
In 2018, comedian Jordan Peele released a video in which "Barack Obama" calls Trump a jerk. In the four years since those first experiments, the technology had improved, and the video caused a stir. Still, on close inspection you can see unnatural lip movement and sluggish facial motion. Peele's production company made the video precisely to show how dangerous deepfakes are.
By 2026, the technology has been refined further. Neural networks now copy not only facial expressions but also micro-movements. AI accounts for lighting and shadows, synchronizes pronunciation with lip movement, and recreates a person's manner of speech, intonation, and even pauses.
In essence, this is a digital copy of a person, and it puts the security of their personal data at risk.
Main types and examples of use
The table below shows how deepfakes are used both for safe, recreational purposes and for fraud.
| Safe applications | Dangerous uses |
| --- | --- |
| De-aging actors in movies | Stealing money through voice and video impersonation |
| Reviving deceased screen characters | Fraud through fake advertisements |
| Lip-syncing dubbed translations | "Call from the boss" scams |
| Personalized advertising | Disinformation in politics and economics |
According to ANO Dialogue Regions, 342 unique deepfake cases were identified between January and September 2025. The fakes spread across the whole of Russia, from Kaliningrad to Sakhalin.
Regional governors and other officials appear in deepfakes most often. Fake announcements made in their name report absurd regulations, new rules, and problems that stir resentment among residents.
A deepfake is a tool that works for good or ill depending on whose hands it ends up in. For fraudsters it is an instrument of manipulation; for the film and art industry it is an opportunity to create and experiment.
How to recognize a deepfake
Despite advances in deepfake technology, fakes are still imperfect and can be distinguished. Attention to detail helps.
Visual signs. Blurred facial contours, pronounced asymmetry, unnaturally smooth skin, smeared jewelry. Facial contours ripple during movement. Lighting or shadows are inconsistent. The person does not blink.
Auditory signs. Pay attention to how the person speaks, not what they say. The speech is monotonous and lacks their usual intonation. Sentences run on without natural pauses. There is no background noise or sound of movement. Metallic notes and abrupt clipping of words are noticeable in the conversation.
A reliable way to check the person you are talking to is to ask them to perform an active action: turn their head from side to side, smile or frown quickly, pass a hand in front of their face, take off or put on jewelry. Such actions throw the neural network off, and generation artifacts become visible.
The main rule: if something feels "off" during a call, or the caller urgently asks you to transfer money without explanation, stop. Contact the person through another channel and confirm it was really them.
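The warning signs above can be combined into a simple scoring rule. The sketch below is purely illustrative: the flag names, weights, and threshold are my own assumptions for the sake of the example, not an established detection method.

```python
# Illustrative weights for the warning signs described above.
RED_FLAGS = {
    "urgent_money_request": 3,   # strongest signal: stop and verify
    "refuses_active_check": 3,   # won't turn their head, wave a hand, etc.
    "monotone_voice": 2,
    "no_background_noise": 2,
    "lips_out_of_sync": 2,
}

def risk_score(observed: set) -> int:
    """Sum the weights of the observed warning signs."""
    return sum(RED_FLAGS.get(flag, 0) for flag in observed)

def should_verify_elsewhere(observed: set) -> bool:
    """One strong sign, or any two weaker signs, means: hang up and re-contact."""
    return risk_score(observed) >= 3

print(should_verify_elsewhere({"monotone_voice"}))        # one weak sign alone
print(should_verify_elsewhere({"urgent_money_request"}))  # a strong sign alone
```

The point of the threshold is that no single cosmetic flaw proves anything, but a money request or a refusal to do a live check should always trigger verification through another channel.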
Russian legislation: which deepfakes are prohibited
There are no laws in Russia that directly regulate the creation and use of deepfakes. If a deepfake is not based on another person's image or someone else's material, it is allowed to exist.
But if a voice or likeness is used without consent, that already violates the Civil Code, in particular the provisions on the results of intellectual activity and personal non-property rights. Article 152.1 of the Civil Code of the Russian Federation states:
Publication and further use of a citizen's image (including photographs, video recordings, or works of fine art in which the citizen is depicted) are allowed only with that citizen's consent.
Exceptions apply if the image was obtained in the state or public interest, taken in a public place, or the citizen posed for a fee.
If these rights are violated, the citizen may demand removal of the image and the suppression or prohibition of its further dissemination.
There are no dedicated provisions in the Criminal Code yet either. But the Russian Government is considering a bill on criminal liability: up to six years' imprisonment or a fine of up to 1.5 million rubles for using deepfakes for fraud or to obtain property by deception.
At the administrative level, this is regulated by Article 13.15, Parts 9 and 10, of the Code of Administrative Offenses of the Russian Federation:
Dissemination of knowingly false information of public importance under the guise of reliable reports, creating a threat of harm to the life and (or) health of citizens or property, or a threat of mass disruption of public order and (or) public safety, entails a fine of 100 thousand rubles with confiscation of the object of the offense.
The discussion of punishment for using deepfakes in crimes is ongoing. For example, the head of the Ministry of Digital Development proposed treating the use of deepfakes as an aggravating circumstance in cybercrime.
So keep in mind: using someone else's face or voice without consent, even as a joke, can result in administrative, civil, or criminal liability.
How to protect yourself against biometric theft
Below are practical tips on digital hygiene: how to keep intruders from taking advantage of your face and voice.
High-resolution video with clear sound is a perfect training dataset for scammers. Open social networks, stories, and video notes ("circles") in Telegram are a ready source of such material. What can you do?
- Limit access to social media and personal channels;
- Don't post videos where facial expressions are clearly visible;
- Do not send voice messages to strangers;
- Do not post long audio recordings in the public domain;
- Agree on a code word with loved ones for emergencies and unexpected calls;
- Use detector services for video and audio communication.
MWS, a subsidiary of MTS, has launched one such program: it detects fake voices and generated content to protect video calls.
The main rule remains caution and a bit of extra verification. Biometric data has become as valuable as a wallet or passport.
Questions about digital fakes
Can you go to jail for a funny deepfake of a friend?
If the friend does not object and the video is not published to harm or deceive anyone, then no. But if the deepfake is posted without consent, violates civil rights, or causes material damage, it becomes a legal matter.
Are there programs that detect deepfakes with a 100% guarantee?
There is no such program. Detectors keep improving, but so do deepfake generators. There are tools that help with recognition, but they work best in tandem with human verification.
How long does it take scammers to fake my voice?
Fraudsters need as little as 20 seconds of your voice without background noise. A higher-quality fake requires several minutes of speech or more. That is why scammers try to keep you on the line as long as possible.