The UK is on the verge of placing the world’s internet pornography behind a wall of age checks.
When the Digital Economy Bill comes into force, any provider that benefits commercially from pornographic content must ensure under-18s in the UK can't access that content. If providers fail to carry out these checks, payment providers and other third parties will be legally obliged to step in; failing that, ISPs will be required to block the content. It is unclear how this will be achieved, but it highlights an incredibly complex, expensive and ultimately flawed process, one that will see tech-savvy teenagers continue to access pornography via VPNs and other means.
Differing estimates put the internet’s pornography content at around 20 to 30 per cent of all websites in existence. And that is just on the searchable web. Speaking at the recent Westminster eForum on child online safety, Sky’s director of policy Adam Kinsley said the ISP already filters well over four million pornography sites as part of the opt-out filters scheme pushed through by the Conservative coalition, and due to be cemented in law when the Digital Economy Bill is implemented.
Read more: UK government plans to block porn sites that don’t provide age-checks
There is ample room for concern over how any institution can oversee such an enormous volume of content - the well-established, but relatively small, British Board of Film Classification will be tasked with the fight.
But what of the part content providers, including social networks, play? They are not considered commercial purveyors of pornography, but some argue they are among the world's largest aggregators of adult content.
During MP debates on the Digital Economy Bill, many advocates of age verification spoke of their concern over the “stumbled upon” factor - children accidentally encountering pornography via pop-ups or on social networks. An NSPCC survey published in 2016 found that more than 53 per cent of children had seen pornography online at least once; of these, 28 per cent came across it accidentally for the first time, often via pop-up advertisements, 19 per cent had online pornography shown to them by someone else without asking for or expecting it, and 19 per cent said they searched for it themselves.
Children and teenagers are unlikely to be forthcoming about seeking out adult content, so the results may well be skewed. But taken at face value - and the report was mentioned numerous times during the MP debates - the “stumbled upon” factor should carry the same weight in the Digital Economy Bill as deliberate searching.
This key area led to a heated debate at last week's eForum when David Cooke, director of digital and new media at MindGeek - provider of the 59th most visited site in the world, Pornhub - suggested Twitter had as much responsibility as porn sites.
“The bill is all about protecting children from stumbling across adult content. My concern is that with Twitter, kids are going there; 13-year-olds can get a Twitter account and they allow hardcore imagery on their site without any checks whatsoever. Will that be included in the scope of the regulation?”
The answer? The regulators have said they will be “asking” Twitter to abide by the new law in some way.
“Social media is one of the categories of ancillary services providers,” said David Austin, CEO of the BBFC. “We would ask Twitter to close down an account that had hardcore pornography. But we don't have the power to compel it and we don't know how Twitter would respond.”
Within the bill, ancillary service providers - along with payment service providers - will be tasked with helping shut down the business of any porn site that does not comply with the age verification aspect of the law. The term includes any site that provides, in a business context, “services which enable or facilitate the making available of pornographic material or prohibited material on the internet by the non-complying person; or advertise, on or via any internet site operated by the non-complying person or via any other means of accessing the internet operated or provided by that person, any goods or services provided in the course of a business”. This approach has been in the works since as far back as 2013, when the UK's regulator for on-demand television, ATVOD, said it was seeking industry support to block the flow of money to pornographic websites in the US, including Pornhub.
Stephen Winyard, director and VP of ICM Registry and council member of the Digital Policy Alliance, argued that Twitter is in fact commercially benefiting from the proliferation of pornography on the network: “It’s on Twitter, Reddit, Tumblr, mobile apps - Skype is used hugely for adult content. But Twitter is the largest platform for promoting pornography in the world - and it takes money for it. They pay Twitter money to advertise adult content.” During a WIRED search of Twitter, we had to swiftly close a tab in the office when an adult video in a user's feed began to autoplay. Suffice to say, the content is there.
There were differing opinions among many of the attendees over exactly what content Twitter allows and bans, but for the record, its general policy states: “You may not use pornographic or excessively violent media in your profile image or header image. Twitter may allow some forms of graphic content in Tweets marked as sensitive media. When content crosses the line into gratuitous images of death, Twitter may ask that you remove the content out of respect for the deceased.” In a subsection, it states: "You may not feature graphic content (such as media containing pornography or excessive violence) in live video, or in your profile image or header image."
In summary, any user is allowed to share graphic content that does not veer into “gratuitous images of death” in their feed, just not in their profile image. Under its advertiser's policy, however - and in contradiction to Winyard's view - it states: "Twitter prohibits the promotion of adult or sexual products and services globally." When WIRED approached Twitter for comment, it said: "Our policies are pretty clear" and directed us to that same rule for advertisers. The representative also pointed out that in 2015, just 4 per cent of 12- to 15-year-olds reportedly used Twitter in the UK as their "main social media site or app".
Twitter and similar providers will be the next target for regulators, insisted John Carr, secretary of the Children's Charities' Coalition on Internet Safety and member of the UK Council for Child Internet Safety. But he took umbrage at the comparison MindGeek’s Cooke made between the social network and pornography sites.
“Obviously it’s wrong that pornography is used through Twitter, but we must not confuse Twitter with Pornhub. There is a world of difference between companies that are expressly and solely about the publication of porn and social networks that forbid commercial pornography. It's about how they are failing to enforce their terms of services. The idea that Pornhub and Twitter are the same is frankly ridiculous - but if I was Pornhub that’s the position I would be pushing.”
As stated above, Twitter’s own terms of service do not prohibit pornography and over-18 content as explicitly as many close to the debate believe. Though it could be argued that once the Digital Economy Bill comes into force, the following Twitter rule could be treated as a catchall: “You may not use our service for any unlawful purposes or in furtherance of illegal activities. International users agree to comply with all local laws regarding online conduct and acceptable content.”
Kinsley, Sky’s director of policy, questioned whether children are really coming across content "accidentally" at all.
“The Digital Economy Bill’s exact objectives are a little uncertain, but we are trying to stop children stumbling on pornography – but they are not 'stumbling', they are looking for it and Twitter is where they will [find] it. Whether what the government is proposing will deal with that threat is unclear. Initially, it did not propose ISPs blocking content. When it comes to extremist sites, the Home Office asks social media platforms to take down content. The government does not ask us to block material - it has never done that. So this is a big deal. It doesn’t happen with the IWF; it doesn't happen with terrorist material, and it wasn't in the government’s original proposal. Whether they got it right, and how we will deal with these millions of sites, is unclear.
“We’re not really achieving anything if only dealing with a few sites.”
The bill is incredibly complex as it stands. Austin, from the BBFC, pointed out that for the BBFC to implement the bill correctly, the regime needs to be effective, proportionate, respectful of privacy and accountable - and the “tens of millions of adults that go online to see legal content must be able to continue to do so”.
At the same time, he said: “There is no silver bullet, no one model, no one sector that can achieve all child protection goals.”
This article was originally published by WIRED UK