In the 18th century, those operating at the highest levels of society, from London to Moscow, needed to be able to speak French, then the language of status, the nobility, politics, intellectual life and modernisation. A hundred years later, British advances in industry, science and engineering meant that English succeeded French: a tongue with West Germanic origins replaced a Romance language as the means of conducting business and diplomacy on the international stage. Today English remains the global lingua franca, spoken even in parts of China: a leveller that enables deals to get done and the wheels of commerce and technology to spin.
Around a decade ago, another type of language – one that was written rather than spoken – was held up as a decisive factor for those seeking to gain influence or advantage in the digital age: coding. Its champions proselytised that proficiency in programming would determine employability and access to a thrusting, energetic entrepreneurial future. Within two or three years, a small industry had sprung up, intent on teaching those with no formal education in computer science to create products using C++.
Coding is a worthwhile skill in the way that any kind of education is advantageous. Knowledge is valuable, and being able to code at a high level undoubtedly increases employability. But the key here is the phrase ‘high level’. Coding isn’t a panacea. Some people excel at writing prose and are able to make a living from doing so. But not everyone who can string a sentence together can pay the bills in that way. Similarly, people who demonstrate talent at coding will create wealth and benefit themselves by doing it. But those who can merely string together a few commands in Java probably won’t.
However, at some point a narrative emerged that the world was going binary: if you could code, you were a winner; if you didn’t know your literals from your identifiers, you were destined for low-skilled drudgery serving the digital overlords of Shoreditch. But recent research suggests that a more nuanced view is emerging. While those who code at the highest levels will be much sought after by recruiters, large organisations are increasingly seeking talent with non-cognitive skills.
Our education system is largely built to standardise and measure our abilities in cognitive skills such as English and maths. Coding fits neatly into this mindset: by its very nature it is rules-based in the way that, say, algebra is. In 2014, the UK government took a bold and enlightened step by introducing computing to the school curriculum. Two weeks ago, it went further. Speaking at the World Economic Forum in Davos, the Prime Minister announced a £20 million investment in the establishment of the Institute of Coding, a consortium of more than 60 universities, businesses and industry experts. The mission: close the UK’s digital skills gap. The initiative sits at the other end of the educational spectrum: as part of the UK’s Industrial Strategy, it will see industry and universities attempting to boost employability by devising new standards, digitising the workforce (including the professions), boosting diversity and inclusiveness, and sharing best practice.
This is laudable, but what researchers have recently discovered is that organisations are increasingly searching for employees with skills that are hard to quantify and, by their very definition, can’t be standardised. And, as artificial intelligence continues to advance, might the coders themselves find that technology is moving beyond them? The job of engineers is to write instructions for computers to follow: if this, then that. With advances in machine learning and deep neural networks, we won’t be issuing orders; we’ll be training machines.
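To make that contrast concrete, here is a minimal, hypothetical Python sketch, not drawn from the article or from Google’s systems: in the first function an engineer writes the rule by hand, while in the second the rule is derived from labelled examples, so the programmer supplies data rather than instructions. The spam-filter framing, the word counts and the example messages are all illustrative assumptions.

```python
# Hypothetical sketch: "if this, then that" vs. learning from examples.
from collections import Counter

# 1. The traditional approach: the engineer writes the rule by hand.
def is_spam_rule_based(message: str) -> bool:
    # The logic is fixed by the programmer: if this phrase appears, then flag it.
    return "win a prize" in message.lower()

# 2. The learning approach: the engineer supplies labelled examples
#    and the program derives its own (very crude) rule from them.
def train_spam_filter(examples):
    """Count how often each word appears in spam and in non-spam messages."""
    spam_words, ham_words = Counter(), Counter()
    for message, is_spam in examples:
        bucket = spam_words if is_spam else ham_words
        bucket.update(message.lower().split())
    return spam_words, ham_words

def is_spam_learned(message: str, model) -> bool:
    """Flag the message if its words were seen more often in spam than in non-spam."""
    spam_words, ham_words = model
    words = message.lower().split()
    return sum(spam_words[w] for w in words) > sum(ham_words[w] for w in words)

# Illustrative training data -- in practice this would be thousands of examples.
training_data = [
    ("win a prize now", True),
    ("claim your free prize today", True),
    ("meeting moved to friday", False),
    ("see you at lunch", False),
]

model = train_spam_filter(training_data)
print(is_spam_rule_based("Win a prize today"))              # True: rule written by hand
print(is_spam_learned("claim your prize tomorrow", model))  # True: rule learned from data
```

The learned version is deliberately crude, but it captures the shift described here: the engineer’s job moves from specifying logic to curating the examples a system learns from.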
In February 2016, Google appointed John Giannandrea to run its artificial intelligence research. Within months, this change in leadership gave rise to a new sensibility: engineers were no longer giving orders, they were teaching. “By building learning systems,” Giannandrea said later that year, “we don’t have to write these rules anymore.”
Our relationship with computing will change over time as more of it takes place inside a black box. Beyond the inputs and outputs, we simply don’t know what’s going on under the hood, which makes many business leaders uncomfortable. Why should we follow what an AI tells us when we don’t understand how it came to its decision?
The narrative is that the coming age of automation will lead to two waves of mass job losses: first the truckers, taxi drivers and retail assistants; then white-collar roles in insurance, legal services and the back office. But, in March last year, a paper published in the MIT Sloan Management Review presented the results of a global study of more than 1,000 large companies that are using or testing AI and machine learning. It revealed the emergence of new roles and categories of jobs that are uniquely human. The report suggested that cognitive technology will require skills that machines are unlikely to possess any time soon.
The authors divide the AI-driven jobs into three categories. The first is trainers, who will teach AIs to mimic human qualities, for instance ensuring that Alexa or Siri’s answers to queries don’t sound canned. The second is explainers, people who will act as a bridge between AI and organisations, working to understand why an AI has come to certain decisions and investigating when it makes a recommendation that has a negative outcome. This role could be crucial: GDPR, for instance, gives consumers a “right to explanation” for any decision made on a purely algorithmic basis. The third is sustainers, who will ensure that AI systems operate as designed and that unintended consequences are dealt with urgently. This third category is one of the fastest-growing areas in AI, as organisations realise the potential ethical implications of the technology.
Clearly, all three of these categories blend cognitive and non-cognitive skills. A degree of technical proficiency is required, but qualities that depend on social, behavioural, leadership and emotional comprehension are prerequisites. Despite the divinations of the champions of code, the workplaces of the future will not be binary environments split between the haves and have-nots. As the ability to think and learn, to process and reason, takes centre stage, the coming age of automation means that, in order to thrive in the modern workplace, we must become more human.
This article was originally published by WIRED UK