‘You don’t need to whisper rumours any more’: How AI sex videos are harming women


Former actress Kate Bell asked to borrow her partner’s laptop. When she opened it, she couldn’t believe what she found: countless sexualised images altered using artificial intelligence, as well as sex videos of her and two other women.

It got worse. She found dozens more folders dedicated to female friends and acquaintances, containing deep fake pictures. “Some of them are in our work world, in our friendship groups,” Bell said.


Bell never imagined such horrific images could be created by someone she knew, let alone the man she should have been able to trust most. “I died, I absolutely died that day,” she told 60 Minutes. “I felt extremely afraid that I was sharing a house, under surveillance, with someone whose personality was just abhorrent.”

The 42-year-old claims she’s the victim of unfathomable digital abuse. There is no evidence that Bell’s former partner shared the sexual content online, but that provides little comfort for Bell, who now works in cybersecurity.

“This is the quickest way to dehumanise a woman,” she said. “You don’t need to slander her or whisper rumours any more, you can literally create a deep fake of her doing something vulgar, and that’s it, done.

“[It was] really aggressive pornography. Images where I had been superimposed to be completely nude and there was a penis in my face, one up the front, one up the back and my arms were being pulled.”

Bell reported her former partner to police. She spent 12 hours meticulously compiling what she believed was an overwhelming body of evidence. But the investigation hit a wall. Her former partner refused to tell detectives the passcode to his mobile phone, limiting their ability to investigate whether he had shared any of the explicit material online (an allegation he denies).

“Here I am with a laptop full of evidence, but you can’t use it because the original source material is on the phone,” Bell said.

Bell’s former partner faced two criminal charges of filming her without her consent. Despite pleading guilty to both, he appealed his conviction, claiming he secretly filmed their sexual encounters because he was mentally ill.

While the magistrate dismissed the appeal, the man escaped a criminal conviction. “That to me says you do not see this as a real crime,” said Bell.

If Bell’s former partner had produced the same material today, he would face up to three years in jail due to new laws introduced in February to crack down on AI-generated digital abuse.

NSW Cybercrime Commander Detective Superintendent Matt Craft says the case highlights the need for law enforcement to keep up with rapidly evolving technology.

“The legislation is very clear that if you create images using artificial intelligence, whether they’re images or a recording, it is a criminal offence,” he said. “It is also a criminal offence to disseminate that material. So it would be an entirely different set of circumstances.”

While Bell’s former partner’s investigation cannot be reopened, cybercrime police say it sent a clear message. “He’s now on notice that the behaviour that he was engaging in previously is not tolerated, and it is certainly a criminal offence punishable by imprisonment,” Craft said.

Bell hopes sharing her story will highlight the importance of respecting women’s safety. “I’ve been through hell and I want to make sure this never happens to anyone else,” Bell said.
