
‘Terrifying’: App used to create fake nudes of women is shut down


By Jennifer McShane
29th Jun 2019

An app which claimed it was able to digitally remove clothes from women to create fake nudes for “entertainment” has been shut down. But the very fact that it was created at all – and with the purpose of ‘entertaining’ – is deeply frightening.


A new AI-powered software tool made it easy for anyone to generate realistic nude images of women simply by feeding the program a picture of the intended target wearing clothes.

The now-defunct ‘DeepNude’ is the latest example of AI-generated deepfakes being used to create compromising images of unsuspecting women. The software was first spotted by Motherboard’s Samantha Cole, and was available to download free for Windows, with a premium version that offers better resolution output images available for $99.

The program reportedly used AI-based neural networks to remove clothing from images of women to produce realistic naked shots.

Both the free and premium versions of the app add watermarks to the AI-generated nudes that clearly identify them as “fake.” But watermarked or not, the images could still have been used to target women.

The software was shut down hours after it was spotted available to buy; DeepNude will no longer be offered for sale and further versions won’t be released. The team also warned against sharing the software online, saying it would be against the app’s terms of service.

They acknowledge that “surely some copies” will get out, though.

And the app will still work for anyone who owns it.

Similar to *revenge porn, these images can be used to shame, harass, intimidate, and silence women. And here, at the touch of a button, anyone could do it from a phone.

The term ‘revenge porn’ covers the online posting of sexually explicit visual material without the consent of the person portrayed. It typically includes photographs and video clips which have been consensually generated – either jointly or by the subject themselves (“sexting”) – as well as content covertly recorded by a partner or an unknown third party.

Speaking to Motherboard, Katelyn Bowden, founder of anti-revenge porn campaign group Badass, said she found the app “terrifying.”

“Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo,” she told the site, according to the BBC.

Its very creation is sickening.

It’s outrageous.

And still, there hasn’t been much by way of outrage.

Are we becoming de-sensitised to such a horrific issue?

And, more importantly, why aren’t more laws in place banning the creation of such software?

It matters not that the nudes might be fake – should one go viral, this could see a woman’s life, career and reputation completely destroyed at the click of a button.

And all the creators had to say by way of response was that “the probability people will misuse it is too high,” after they created it for “entertainment purposes” a few months ago.

“Not that great”

Anyone who bought the app would get a refund, they said, adding that there would be no other versions of it available and withdrawing the right of anyone else to use it.

They fail to mention, however, how they will control any variations of the software that make their way online.

In their statement, the developers added: “Honestly, the app is not that great, it only works with particular photos.”

Yet the website still managed to crash, such was the apparent demand for the software when it first went online.

We frequently hear of women in the public eye falling victim to the same thing – Jennifer Lawrence and, most recently, actress Bella Thorne, who released her own nudes after a hacker threatened her. She leaked them herself before he could, by way of “taking back her own power.”

The fact that she had to – or the fact that this is happening at all in 2019 – is truly disturbing.

*If you have experienced this type of abuse and harassment, please contact the Women’s Aid National Freephone Helpline on 1800 341 900, open 24 hours a day, seven days a week, or speak to someone at your local Garda Victims Service Office.


Main photograph: Unsplash