Deepfake pornography: “My body is not an object that can be used without my permission”


300 pornographic films in which the faces of porn actresses have been replaced by hers, without her consent. This is the mind-boggling number of fakes found on the Internet by Énora Malagré and her lawyer. And even then, the actress and host suspects the latter of not telling her the exact number of fake videos that contain her image. No doubt he wishes to protect her from this vertigo, from this traumatic event, still too fresh.

After twenty minutes of interview, it becomes more and more difficult to testify without sobbing. Barely six months ago, Énora Malagré learned that she was the victim of deepfakes (a contraction of “deep learning” and “fake”) of a pornographic nature. Barely six months ago, she did not even know what this English term meant, nor even that the process existed.

The bewilderment and confusion

It was a journalist from the France 2 programme Further Investigation who came across these pornographic montages featuring her while investigating this serious societal subject. The programme’s team warned Énora and sent her some screenshots so that she would understand what it was about.

I will remember that moment when I clicked on the image all my life. I burst into tears and felt physically nauseous.

“I will remember all my life that moment when I clicked on the image as it was downloading. I burst into tears and I felt physically nauseous, the urge to vomit,” the victim painfully recalls. Alone at her desk, Énora threw her smartphone away “in disgust”.

Shock, anger and then… the tiny but awful moment of confusion. “The famous second when you wonder if it’s you in the picture.”

“Pornographic deepfakes have the particularity of being so well made, so realistic, that you do not realize it is a montage,” warns Maître Rachel-Flore Pardo, who defends victims of sexist and sexual cyber-violence. Hence the confusion it creates in the victim.

“Even today, I can’t explain this feeling, I can’t control this feeling,” admits Énora Malagré, still shaken. The television personality is currently working with her therapist on this “mechanism intended by the aggressors”, this painful moment when she believed it really was her body.

The specialist is also helping her to “accept that deepfake pornography is a form of sexual assault”. “I saw myself – well, my face – having sex that was not consensual. I thought it was sexual assault. Then I felt guilty towards the victims of other sexual violence for having formulated this idea. I felt it almost instantly,” says the 41-year-old with a lump in her throat.

Yet this is precisely the intent of the aggressors hidden behind their screens, according to Ketsia Mutombo, president and co-founder of the collective Feminists Against Cyberbullying: that the victim who discovers herself grafted into a pornographic video, in humiliating positions, feels sexually assaulted, “to destroy her psychologically, to humiliate her in her flesh, to throw her sexuality back at her very violently”.

Harass and humiliate women

One deepfake, two female victims: the one whose face has been glued onto another body, but also the one whose “image of her body is misappropriated”, rightly points out the lawyer Rachel-Flore Pardo, co-founder of the association Stop Fisha, which fights sexual and gender-based violence online, and co-author of the manual Fight Cybersexism (Leduc editions).

Deepfake pornography is gendered violence.

These pornographic videos manipulated “thanks” to artificial intelligence almost exclusively target women and represent 96% of all deepfakes, according to an alarming survey published in 2019 by Deeptrace, a Dutch company specializing in this technology.

“Deepfake pornography is gendered violence. In our societies, which associate nudity with sexuality, men’s sexuality is accepted, so their nudity exposes them to very little violence,” analyzes Ketsia Mutombo of Feminists Against Cyberbullying.

“It’s unbearable. As if we women needed a new form of harassment… The street is not safe for us. Sometimes the workplace and the home aren’t either. And now the Internet. When are we going to stop being objectified and sexualized? I’m fed up. My body is not an object that can be used without my permission,” says Énora, crying and angry.

A taboo and a feeling of loneliness for the victims

The famous woman carries this fight, which fell upon her unbidden, single-handedly. Alone.

Yet since she testified, other well-known women have told her that they too are victims of such montages. Deeply shocked, these women cannot bring themselves to speak about it publicly for the moment, which Énora Malagré, with her strength of mind, understands perfectly. “But if I’m all alone with my little flag, the phenomenon will remain silent,” fears the feminist activist, who experiences a feeling of loneliness.

In the United States, awareness of this malicious practice seems to her less taboo, more publicized. Thanks to Scarlett Johansson in particular. A victim of pornographic deepfakes like many American actresses, the Hollywood star called for means to fight this phenomenon, which is – for the moment – more widespread on the other side of the Atlantic, where the social network Reddit has, since 2014, popularized the consumption of this faked pornographic content. Her powerful testimony alerted public opinion to this misogynistic practice and even led to the adoption, in the summer of 2019, of a law against deepfakes in the state of Virginia.

Énora Malagré worries about the road that remains to be traveled in France when she thinks back to the reactions of those around her. “You are famous, it’s normal,” or “It’s the price of fame,” offers a first category of people devoid of compassion. “It doesn’t matter, you know it’s not really you, it’s just another porn movie,” other friends retort. “It’s not a porn movie, it’s a pornographic deepfake! I did not act in this film. I did not choose to be in it, I was put there,” corrects the actress, who deplores the ravages of pornographic culture. “Does no one realize that there is a victim in the middle of all this?” she says, crestfallen.

Targeted by professionals and amateurs alike

Why her? A haunting question that often comes to mind. How did she become the target of these creators of degrading content? “They must look up the names of the most-searched women on Google,” imagines Énora Malagré. “And then, I was on the air every evening for ten years,” recalls the former columnist of Do Not Touch My TV. “With all those clips online, it must be easy for them to capture many of my facial expressions.”

“Nowadays, many innovative, very cutting-edge applications are able to put a photo in motion and then insert it into a video. All you need is a face shot, sometimes found on a professional social network,” points out the president of the collective Feminists Against Cyberbullying, immediately seconded by lawyer Rachel-Flore Pardo: “I have already accompanied non-famous victims. The creators of deepfakes spare no one.” But who are the men behind these montages?

“Criminal networks with very advanced technical skills, but also malicious amateurs who simply use applications,” answers Ketsia Mutombo. She specifies, regarding this second category: “On Telegram or Signal – where the data is encrypted – men who do not know each other exchange criminal material to create violent media targeting a particular woman. This practice started on Facebook, where they met in closed or secret, unlisted groups.”

Justice can promise nothing to the victim

Énora Malagré appealed to the platforms that host these fake videos to delete them. Then she decided to file a complaint against persons unknown, even though “the time of justice is slower than that of the aggressors, who quickly change IP addresses”, she regrets.

This is exactly what Maître Rachel-Flore Pardo recommends: report the montage and file a complaint. She recalls that it is “a criminal offense, constituting the offense of identity theft, and sometimes cyber-harassment”. “It is a peculiarity of deepfake porn: the face is recognizable. The victim can be identified; she is designated as a target. So there is unfortunately no doubt that the dissemination of these images is followed by the harassment of the person (offensive messages, threats…),” she adds.

We cannot offer the victim any guarantee that the montage will not reappear.

The lawyer specializing in cyber-sexism advises these women to keep the screenshots that will serve as evidence when filing a complaint. Even then, “it is not rare that, when a victim declares that these are montages, those around her say that it is not true, that these are excuses, so plausible do the faked images seem”, deplores the president of Feminists Against Cyberbullying. “Not being believed is already violence,” asserts the young woman.

Uncertainty, helplessness in the face of an image that can reappear at any time despite these steps: this is the other violence a victim is confronted with throughout her legal journey. The co-founder of Stop Fisha describes this terrible situation precisely: “On the Internet, a new terrain for the commission of violence, there are still sometimes lawless zones. Even after reporting an image, even after obtaining a criminal conviction, we do not have the means to prevent it from being republished. We cannot offer the victim any guarantee that the montage will not reappear. We are unable to put an end to the harm, to promise her: ‘It’s fine, it won’t come out again.’” Permanent anguish, lifelong harm.
