Rachel wasn't sad when she broke up with her boyfriend; she was relieved. Although the relationship had started well, her partner had become controlling: coming to her flat uninvited and refusing to leave, tagging along on nights out with friends even though they clearly didn't want him there. His actions had escalated – forcing her to sleep in the same bed as him, sending hundreds of texts a day and calling thirty or forty times. His behaviour displayed all of the hallmarks of an emotionally abusive relationship, and it took Rachel several tries before he realised that it was actually over. He eventually stopped calling and the unanswered texts slowly dried up. "Then one night I was having food with friends and he phoned me again," Rachel told WIRED. "Because my phone was on the table I let my friends answer it. They did something stupid: pretended to be a pizza parlour trying to take his order or whatever. After that he texts me saying: 'You're gonna be really sorry you did that'."
When she got home, she realised what he'd meant – several naked photos of her, sent during their relationship, had been posted on Facebook. Mutual friends had been tagged, and so had Rachel – meaning her entire friends list, including members of her family, could see the pictures. She'd become just one of many victims of 'revenge porn', also known as non-consensual pornography, where explicit sexual images of someone are distributed without their knowledge or consent.
Rachel's is not an unusual case. Earlier this year, a group of University College Dublin students were found to be using a Facebook group to share naked photos of women they'd slept with; a female student told the university's College Tribune newspaper that members of the group were "sharing stories about girls they'd slept with, rating them out of ten and sharing the girls' nude photos". In October 2015, an FOI request from The Guardian revealed that there were a total of 139 revenge porn allegations in the UK between January and April of 2015. Ten of these cases involved girls under the age of consent.
Mary-Anne Franks, professor of law at the University of Miami School of Law, has worked on several cases relating to non-consensual pornography. She is an advocate for victims of revenge porn and has been involved in US legislative reforms around the topic, as well as advising technology companies on abuse policies. She describes the internet as "a force multiplier for abusive behaviour". "The internet and social media has made harassment and stalking easy, immediate and virtually cost-free," she told WIRED. "It greatly increases the benefits of abusive behaviour, offering abusers social validation, reputational boosts and, sometimes in the case of revenge porn, money."
Ann Olivarius agrees. Olivarius is a senior partner at McAllister Olivarius, where she works on civil rights cases; she was involved in litigating a case which found, for the first time, that sexual harassment within a university was illegal. She acknowledges that gendered abuse against women is not a new phenomenon ("It's abominably long-standing"), but believes social media and digital technology allow it to occur "on an unprecedented scale and in new transformations". "I regularly hear victims of online harassment advised to just 'log off' or 'leave those sites' – a solution that totally misunderstands the problem and takes for granted that online gendered abuse is just endemic to contemporary digital life," she said.
"The digital world is not separate – it is the real world, where people express opinions and have important social existences. It's not something we can fully opt out of, or should be asked to. We need to treat online abuse as real, as an extension of existing patterns of abusive behaviour."
Women are overwhelmingly the victims of revenge porn. Of the 139 cases reported in the UK between January and April 2015, 80 percent involved images of women.
Research from the End Revenge Porn campaign put the figure even higher, at 90 percent. The same research highlights the very real impact of the phenomenon: 93 percent of victims said they suffered "significant emotional distress"; 82 percent reported "significant impairment in social, occupational, or other important areas of functioning"; 51 percent had suicidal thoughts as a result; and 42 percent sought psychological services. "The public sexual humiliation of women as punishment for perceived disobedience has become an industry in itself," said Franks. "Whether it's for leaving a relationship or expressing an unpopular opinion or simply refusing to conform to men's expectations of what they should be, women increasingly face the threat of vicious, crowd-sourced attacks on their privacy, reputation and physical safety."
Non-consensual pornography was criminalised in the UK in April 2015, via an amendment to the Criminal Justice and Courts Bill which made it an offence for a person to "publish a private sexual image of another identifiable person without their consent where this disclosure causes distress". Distributors of non-consensual pornography can also be prosecuted under the Malicious Communications Act or the Protection from Harassment Act, though to be prosecuted under the latter, the behaviour would have to be "a repeated act". The maximum prison sentence for distributing revenge porn is two years.
In September last year, it was announced that a total of 206 people had been prosecuted under the "revenge porn" law since it came into effect. A report by the Crown Prosecution Service (CPS) also revealed that social media was a "growing trend" connected to revenge porn offences, with defendants in some cases using networks such as Facebook to distribute intimate pictures of victims. It was the first year that revenge porn prosecutions had been included in the CPS figures for England and Wales. Lucy Hastings, director at Victim Support, a charity that provides support to victims of domestic and sexual abuse, said more needed to be done to support victims of revenge porn.
"Revenge pornography is a particularly distressing crime for the victim which is often, but not always, brought about by the vengeful actions of former partners," she said. "It is a violation of trust between two people and its purpose is to publicly humiliate."
But at such an early stage it's hard to know how effective these laws are. Franks told WIRED that recent changes in the law hadn't solved the "real data problem" of revenge porn. "Because very few laws against non-consensual pornography existed until very recently, we don't have robust reporting data on the phenomenon," she said. "Because non-consensual pornography hasn't been – and still isn't – widely considered a crime, we don't have numbers like that for this conduct. Even if we did, the majority of laws against non-consensual porn have been passed so recently that we wouldn't expect to see much of an effect yet."
Franks also worries about "sexist and regressive beliefs about 'proper' sexual behaviour", as well as the "varying levels of technological literacy" found in police teams, which she says can "often translate into hostile or indifferent law enforcement responses".
In August 2015, Alison Saunders, the Director of Public Prosecutions, acknowledged that figures were hard to come by, but was confident the new laws would make a difference. "The new offence has only been in force since April so it is too early for us to be able to say what impact this is having on the number of prosecutions," she said. "But anecdotally we are seeing more of these cases being brought to us by the police and it is clear that the new legislation is having an impact."
The law might have been slow to catch up, but it is now being strenuously tested. In June 2015 a revenge porn case in the Netherlands reached a legal deadlock after Facebook failed to provide police with information about an anonymous user who uploaded a sexually explicit video of a woman. The woman, who identified herself only as Chantal, had originally sued the social network after a video of her performing a sex act was posted in late January.
A Dutch court ordered Facebook to hand over the account details of the person who posted the revenge porn, but Facebook said all details relating to the user had been erased 90 days after the account was deleted. Last week the Amsterdam District Court ordered Facebook to hand over any remaining information or have an "independent expert" access its servers.
For Facebook, revenge porn remains a major issue. CEO Mark Zuckerberg has argued his company is "committed to doing better" in its response to hate speech. The social network recently launched a suicide prevention initiative to better monitor vulnerable and potentially suicidal users and report worrying posts. Its Community Standards reflect this narrative: "We want people to feel safe when using Facebook."
As for revenge porn, Facebook's Community Standards explain that moderators "remove any content that threatens or promotes sexual violence". Its definition of sexual exploitation – which is also banned on the site – includes "solicitation of sexual material... [and] threats to share intimate images". All official statements relating to sexual exploitation, non-consensual pornography or other explicit imagery refer to it as "abusive content".
But the victims of Facebook-based revenge porn tell a different story.
"Facebook were okay to a point," Megan, another victim of revenge porn, told WIRED. "When I complained and reported the images, they came down within a couple of days, sometimes longer when they got loaded on weekends.". What they didn't do, however, was delete the offending account.
Megan's ex-girlfriend initially told her the photos had been uploaded when someone stole her phone. But after a failed attempt to get back together with Megan, the photos reappeared online. "When it first happened I contacted her and she removed them, and swore the others had been deleted. But two weeks later the same thing happened again with more photos," she said. The cycle continued, on and off, for eight weeks – a period Megan describes as "the worst eight weeks of my life".
It might employ teams of moderators working around the clock, but Facebook's system for removing offensive content is far from perfect. Despite numerous complaints to the social network, Megan said the cycle only stopped after she contacted her ex-girlfriend's family in a last-ditch attempt to stem the flow of the images.
This is not just an issue for Facebook. Twitter banned revenge porn in March 2015 in the wake of 'the Fappening', in which a trove of stolen celebrity nude photos, originating on 4chan, was spread across Reddit and Imgur. Reddit introduced a similar ban in early 2015 under then-CEO Ellen Pao. The decision was met with a campaign of trolling and harassment from Reddit users, and Pao resigned in July 2015 after receiving "sickening abuse".
In response to these cases, Facebook told WIRED that it's constantly working on improving moderation, both generally and in cases of revenge porn. Although it couldn't comment on specifics to protect the anonymity of the victims, a spokesperson pointed to Facebook's Help Centre, which allows victims of emotional or domestic violence to report their ex or current partners. The spokesperson also said Facebook was working with charities including Women's Aid and the Revenge Porn Helpline to target non-consensual pornography on the platform.
But Franks said technology companies must put aside worries about user experience and focus on protecting vulnerable users. "The nature of non-consensual pornography is that once it's out there, it's very hard to remove. These companies must implement steps to deter unauthorised disclosures of private information before they happen". Pop-up warnings or algorithms that automatically detect sexually explicit imagery were both good options, Franks argued.
"Whatever inconvenience these pre-emptive measures might impose on companies and users must be balanced against the wide-ranging and often irreversible harm caused by the unauthorised disclosure of intimate data."
Progress is being made. Franks said she was pleased that the likes of Facebook, Google, Reddit and Twitter had recently taken stands against non-consensual pornography, "acknowledging that sexual privacy is no less deserving of protection as other forms of privacy". But, she suggested, companies needed to do more than just announce "after the fact policies".
Olivarius suggested that social networks must "recognise that online harassment is not something that 'just happens'". "Their choices about how to run their platforms significantly shape behaviour," she said. "The services they design and policies they implement can either facilitate or discourage abuse. I'd like to see them make safety a priority, not only in official policies but across the design process."
"There's no silver bullet to ending non-consensual pornography," Franks said. "Like domestic violence and sexual assault, non-consensual pornography is the product of a culture that does not view women as fully human and deserving of the same rights of bodily autonomy as men. It is not a matter of changing our laws or our technology or our culture; it is a matter of changing all of them."
Some names have been changed.
This article was originally published by WIRED UK