We’re just starting to understand the scale of Russian influence in the US election. Facebook stated that 50 per cent of the 2016 voting-age population, some 126 million people, were exposed to Russian efforts to influence opinion. More revelations will no doubt follow.
But when I saw the evidence described in Special Counsel Robert Mueller’s indictment, I was struck less by the murky activities than by the systematic approach to research, analytics and strategy. The Russians were using the same techniques as some of the most successful startups in the world – tactics I have used every day for a decade.
Today, I’m head of growth at early-stage tech investor Forward Partners, where I provide hands-on support to our portfolio of 46 startups.
But my first internship was at a small independent record label, based outside of London and focused on up-and-coming artists. I started in early 2008, and was told that I would be responsible for helping the artists get more views on YouTube. The unique selling point of the label was that we could support artists with their “digital PR”, the vague term used before we were awash with “social media gurus”.
But there was a twist. It was quickly made apparent that my internship involved creating fake Twitter accounts which – automatically and with little cost – promoted our artists’ YouTube videos. It was questionable how effective these campaigns were at generating real fans. I applauded the creative thinking, but the work felt disreputable and was so repetitive and uninteresting that I swiftly left.
Ten years later, this is still going on – not only in so-called marketing agencies across the globe, but also, allegedly, at the Russian Internet Research Agency, a company with strong links to the Kremlin that runs online campaigns and is named as a defendant in an indictment alleging a “conspiracy to defraud the United States.”
My brief experience at the “Twitter factory” echoes that of Ludmila Savchuk, a former employee of the Russian Internet Research Agency who is now campaigning for an end to its practices. She believes that “information peace is the start of real peace”.
Savchuk confirmed what many in Russia had already suspected – that the “trolls” are running wild and are fed by the Kremlin. Through social media and news sites, they are influencing public opinion globally, promoting Putin’s views via fake online profiles. And while I don't condone the intent, I have to say, the execution was impressive.
Once you dive into the detail of the indictment, the strategy and tactics the Russians used to influence American opinion are not dissimilar to those I use regularly to help startups:
In order to gauge the performance of various groups on social media sites, the ORGANIZATION tracked certain metrics like the group’s size, the frequency of content placed by the group, and the level of audience engagement with that content, such as the average number of comments or responses to a post.
(Point 29, Indictment, Case 1:18-cr-00032-DLF)
Regular data analysis is an important part of my job and anyone leading growth for a startup. I make sure companies I work with have a dashboard that tracks monthly, weekly and daily performance of the business, including engagement levels on social media.
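Point 29 of the indictment reads like a standard social analytics checklist: group size, posting frequency, average engagement per post. As a rough illustration of the kind of dashboard numbers I mean – the data and figures below are invented, not drawn from any real group – those metrics take only a few lines of Python:

```python
# Illustrative sketch of basic group-performance metrics.
# All post data and the follower count are made up for the example.
from statistics import mean

posts = [
    {"date": "2018-03-01", "comments": 14, "shares": 3},
    {"date": "2018-03-01", "comments": 2,  "shares": 0},
    {"date": "2018-03-02", "comments": 31, "shares": 9},
]
group_size = 12400  # follower count (illustrative)

# Posting frequency: posts divided by distinct active days.
posts_per_day = len(posts) / len({p["date"] for p in posts})

# Average audience engagement per post.
avg_comments = mean(p["comments"] for p in posts)

# Total interactions relative to the size of the group.
engagement_rate = sum(p["comments"] + p["shares"] for p in posts) / group_size

print(f"Posts/day: {posts_per_day:.1f}")
print(f"Avg comments per post: {avg_comments:.1f}")
print(f"Engagement rate: {engagement_rate:.4f}")
```

In practice these numbers would be pulled automatically from a platform’s analytics export rather than typed in by hand, but the calculations on a real dashboard are no more complicated than this.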
Starting in or around June 2016, Defendants and their co-conspirators, posing online as U.S. persons, communicated with a real U.S. person affiliated with a Texas-based grassroots organization.
During the exchange, Defendants and their co-conspirators learned from the real U.S. person that they should focus their activities on ‘purple states like Colorado, Virginia & Florida.’ After that exchange, Defendants and their co-conspirators commonly referred to targeting “purple states” in directing their efforts.
(Point 31, Indictment, Case 1:18-cr-00032-DLF)
I don’t begin working with a business until it has done extensive customer development interviews with prospective customers, and I’m adamant that this continues systematically. Surprisingly, many companies fail to understand the importance of this and think they can build products and campaigns in a vacuum, which rarely leads to good outcomes.
This key piece of insight about “purple states” allowed the Internet Research Agency to improve their targeting radically, and drive growth in their core mission more effectively.
Specialists received feedback and directions to improve the quality of their posts. Defendants and their co-conspirators issued or received guidance on: ratios of text, graphics, and video to use in posts; the number of accounts to operate; and the role of each account.
(Point 38, Indictment, Case 1:18-cr-00032-DLF)
I always advise startups that it’s difficult to improve unless you’re humble enough to know that you don’t have all the answers. We build in significant feedback loops from stakeholders and the audience themselves to make sure we’re communicating effectively. The Russians followed this method, establishing the same kind of feedback loops, which in turn allowed them to develop their skills rapidly.
Defendants and their co-conspirators also created thematic group pages on social media sites, particularly on the social media platforms Facebook and Instagram.
Starting at least in or around 2015, Defendants and their co-conspirators began to purchase advertisements on online social media sites to promote ORGANIZATION-controlled social media groups.
(Points 34 & 35, Indictment, Case 1:18-cr-00032-DLF)
Facebook Groups can offer a more satisfying experience than your general newsfeed. This has not passed Mark Zuckerberg by.
This is currently one of the most effective ways to drive traffic from Facebook: it’s favoured by the Facebook algorithm.
The Internet Research Agency understood this perfectly, and used their paid marketing spend to drive traffic to the groups rather than a website. For this, they get full marks.
You can see how I manage Facebook groups, by joining the Founder Community.
Automated and semiautomated Twitter accounts, bots, have recently gained significant public attention due to their potential interference in the political realm... we find suggestive evidence that one prominent activity that bots were involved in on Russian political Twitter is the spread of news stories and promotion of media.
("Detecting Bots on Russian Political Twitter," Denis Stukal, Sergey Sanovich, Richard Bonneau and Joshua A. Tucker, Big Data, December 2017, Vol. 5, No. 4: 310-324)
Extensive research has shown the clear use of "bots" by the Internet Research Agency. These are software applications that run repeated scripts on the internet, and the software is readily available. It’s a grey area, but it would appear that as long as you provide ‘helpful’ information and use the official API, Twitter endorses it.
Scary stuff perhaps, but when Twitter (and other platforms) are measured on Monthly Active Users by their shareholders, they don’t seem incentivised to fix this. And they’ve got some way to go, as the platforms weren’t built with this use-case in mind.
I’ve helped companies succeed with their social media automation without straying into black-hat territory. It’s incredibly powerful, but unlikely to be a tactic that lasts – especially now Russian factions have exploited it for geo-political gain.
The Internet Research Agency was combining aggressive automation with fake account creation – something I advise against on both counts.
You got a list of topics to write about. Every piece of news was taken care of by three trolls each, and the three of us would make up an act. We had to make it look like we were not trolls but real people.
(Marat Mindiyarov, former employee in the Facebook department of Internet Research Agency)
A mix of automation and manual work appears to have been at the heart of the Internet Research Agency’s approach, which shows a sophisticated understanding of the limitations of social media automation.
But the deeper point here is that real human interaction is very hard to replicate. Fake interaction is relatively easy to detect – you can fish out fake accounts simply by following particular US news sites on Facebook. All it takes is a few clicks on profiles with high engagement – they are often full of pro-Trump memes and nothing else. All political pomp and personality, no human identity.
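That observation – all politics, no personal life – can even be turned into a crude heuristic. The sketch below is purely illustrative: the keyword list and threshold are my own assumptions, not a real detector, but it captures the idea that a feed made up entirely of political content is a red flag:

```python
# Toy heuristic: flag an account whose posts are almost entirely political,
# with nothing personal mixed in. Keywords and threshold are illustrative
# assumptions, not a production bot detector.

POLITICAL_KEYWORDS = {"maga", "election", "vote", "hillary", "trump"}

def looks_automated(posts, threshold=0.9):
    """Return True if the share of political posts meets the threshold."""
    if not posts:
        return False
    political = sum(
        any(keyword in post.lower() for keyword in POLITICAL_KEYWORDS)
        for post in posts
    )
    return political / len(posts) >= threshold

suspect = ["Vote Trump! #MAGA"] * 19 + ["Great election meme"]
human = ["My cat", "Vote tomorrow!", "Lunch pics", "Holiday in Wales"]

print(looks_automated(suspect))  # True  - nothing but politics
print(looks_automated(human))    # False - a mixed, human-looking feed
```

Real platforms use far richer signals – account age, posting cadence, network structure – but even this naive ratio separates the two feeds above.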
As a business, assuming you’re not looking to influence an election anytime soon, you should strive to create legitimate communities of engaged customers. This is achieved by being genuine, transparent and sometimes even vulnerable. You need to instil the importance of speaking to your customers regularly into your teams, via whatever means are most appropriate to your business.
As for those who have been indicted, I’m sure they’ll have nothing to fear as long as they remain in Russia. We wait to hear how the United States government responds; there’s no doubt the social media companies are anticipating strong regulatory pressure, which would force them to make significant changes.
The future of social media and its place in society seems assured: we’re addicted to it. Sit in any restaurant or tube carriage in the world and you’ll see just how attached to our phones and our social media accounts we really are.
The addictive qualities of social media are no accident. Facebook’s ex-president Sean Parker recently admitted that the founders built the platform to exploit “a vulnerability in human psychology”. Zuckerberg and the team were motivated from day one by the question: “How do we consume as much of your time and conscious attention as possible?”
More than a decade after I was asked to build fake Twitter profiles for musicians, and after social media took off in earnest, we should take a serious look at how it’s impacting our society. It’s not just that our collective hive mind was (allegedly) hacked by rogue actors; our individual minds are being hacked by the social media platforms themselves.
I believe those building the products at these companies should not have an unchecked ability to exploit well-known neurological patterns to drive more engagement, or the freedom to turn a blind eye to millions of fake “bots”. Otherwise the socio-political uncertainty and mental health issues caused by social media will continue.
Social media is a tool that has the potential to connect and empower people. It’s fair to say that as a medium it’s going nowhere, but much tighter regulation and clearer guidelines are needed to ensure that the tool is used positively and the murky practices of the past are governed more effectively.
Jake Higgins is head of growth at Forward Partners
This article was originally published by WIRED UK