What you call yourself on Facebook -- or what you are allowed to call yourself -- has been a contentious issue ever since the service's launch.
The social network has always maintained that it requires users to sign up under their real names in order to keep the site safe and reliable. It has in the past cited the existence of up to 83 million 'fake' accounts, and the need to protect against terrorists, bullies and scammers, as justification for a policy requiring users to provide an "authentic name" for their account.
The logic was sound in theory, but human rights groups argued that in practice the policy put marginalised people at risk: many transgender people, for example, use different names than the ones they were born with, and victims of stalking or abuse may use aliases to protect themselves from their harassers. The site was also accused of erasing identities; many Native American names were deemed 'false', and the profiles of drag performers were deleted under the policy.
Recently, in a letter signed by rights groups such as the American Civil Liberties Union, Human Rights Watch, and the Electronic Frontier Foundation, Facebook was criticised for "exposing its users to danger and disrespecting the identities of its users" as a result of the policy.
Facebook has apologised in the past for deleting the accounts of users who signed up under alternative names, even as the larger issue remained unresolved. But now, in a long-awaited response, Facebook has said it will amend the policy.
The focus on real names won't be dropped completely -- it's been part of Facebook's terms and conditions since the site's inception -- but steps will be taken to prevent abuse. In a blog post, Facebook introduced fake-name reporting tools that will require users to explain why they are reporting a name as fake, providing context and nuance. This, in theory, could curb malicious reporting, as it will no longer be possible to flag an account for removal with a single click.
Reported users will also now have a chance to verify their identity -- a test verification form asks users whether or not they're part of a marginalised group, and gives them an opportunity to detail their circumstances. Facebook is not guaranteed to decide in the affected party's favour, but it is a step in the direction for which human rights groups were lobbying. The new reporting tools are only available in the US for the time being, but Facebook says that, "based on feedback", it "will iterate and roll them out globally".
"We want to create the best experience that we can for everyone, and we will continue to make improvements until everyone can use the name that their friends and family know them by," Facebook said.
This article was originally published by WIRED UK