Amazon’s Alexa deal with the NHS can help patients, but at what cost?

Digital devices are invading healthcare. For some, it’s a lifesaver. For many others, it could be a dangerous attempt to save the NHS money

Amazon’s digital assistant Alexa doesn’t know much – yet. Apart from reliably playing music and giving the date and time, the weather forecast, and various odd definitions or bits of historical data, one of the pebble-like device’s favourite answers seems to be: “Hmm, I don’t know that one.” Alexa is now promising to up its game and start giving reliable medical information.

Amazon is partnering with the NHS to offer the health service’s advice, already available online, through Alexa by voice. In principle, it sounds like an interesting proposition – especially as the government tries to encourage people to judge their symptoms better and avoid visiting the country’s overburdened GP practices when they don’t need to; more than 50 million GP consultations each year fall into this category.

These are complaints about blocked noses, dandruff and travel sickness, “costing the NHS billions that could be better spent providing care to those who need it,” says Farah Jameel, IT lead on the British Medical Association (BMA) GP committee executive team. So getting advice from the NHS website, or asking Alexa about a sore throat at dinner time, might be a positive step – with “Amazon directing them to a reliable source that is clinician-led,” Jameel adds – especially for patients with visual impairments or other disabilities. But will it really work as intended?

“The worry is all about the information you put in,” says Elliott Singer, a GP partner in East London and medical director for Londonwide LMCs. Suppose a user says they have a sore throat; Alexa may advise gargling. But a GP would take a little longer, studying the medical history, looking down the throat and examining the patient to rule out, say, a peritonsillar abscess that needs surgical drainage, or another serious illness.

“The difference is that the tech we've got at the moment doesn't have the ability to really differentiate between an isolated symptom and when that symptom is a sign of a minor illness, or when that symptom actually could suggest a more significant underlying illness that needs further investigation or treatment,” Singer says. “The technology and the algorithms it uses are pretty basic and cannot make situational judgments to enable you to differentiate between a minor illness and a serious illness.”

It’s not just Alexa, of course – people have been using gadgets such as wearable devices that check their pulse and heart rate for years, or relying on gym equipment for the same purpose. “I do have instances of people being at the gym and putting their hands on the pulse region of a running machine, and then coming in to see a doctor because it's jumping all over the show,” says Singer. “It turns out that they've got a condition called atrial fibrillation, an irregular pulse. So there are times where this is helpful in diagnosing and helping people to realise what's going on in their own health.”

But one cardiologist in London who didn’t want to be named says the number of patients he’s been seeing has surged – with people coming in complaining that “my Fitbit just told me my heart rate is too high, I must be ill,” he says, when in reality the person is absolutely fine. The increased number of patients, he adds, is putting even more strain on already stretched NHS resources.

Sam Finnikin, a GP and clinical research fellow at the University of Birmingham, welcomes patients who come in prepared, who have researched their symptoms – “even if it's completely wrong, it doesn't matter. As long as they've gone online and looked at their symptoms or their questions or their problem, then they come to the consultation prepared, and with questions, the questions I can answer,” he says. But, he adds, with the deluge of information, opinion and quackery online, it’s crucial to do that research on websites where quality is assured, such as the NHS website.

So while there’s nothing wrong with Alexa giving out information, it should not be interpreted as medical advice, adds Finnikin. “Straying into giving advice is very different and difficult territory,” he says. But reading about symptoms – with websites giving you information passively – and listening to someone’s voice saying the same information out loud can be perceived differently. “If somebody every day is developing a new symptom and asking about it, that could signify an underlying health anxiety, for instance, which an algorithm is never going to pick up,” says Singer.

One major caveat with health information technologies, whether that’s Alexa or Fitbit, is that they are not regulated – making them an unwise source of advice for people with serious health concerns. “This is unregulated activity in terms of the devices,” Finnikin says. “Lots of things are being introduced, with no evidence of benefit and no evidence that they don't cause harm. So I do think that we need to look at how we're dealing with all these options that are coming people's way to reduce the chance of any harm.”

Still, it’s not just machines that can have a poor track record; human caregivers make mistakes too. When patients call 111, the operators won’t always reach the right outcome. People do get things wrong – but that’s where the safety net comes in. Singer says a GP's advice would be: “I expect this pain to go away in this certain time if you do this. And if it hasn't, I want you to come back to me.”

“And so it's whether or not there's a way of introducing the equivalent of safety netting into any advice given from a device like Alexa to actually say – if your sore throat hasn't gone away in two days, and you start having a high fever, you do need to seek further advice,” Singer adds.

If Alexa remains a symptom-sorter similar to the Babylon app, that’s one thing – and technology like that is likely to become more common. But “this is a very grey area and very difficult,” says Finnikin. For a piece of technology to actually provide a diagnosis, it has to be regulated by the FDA in the United States and by the MHRA (Medicines and Healthcare products Regulatory Agency) in the United Kingdom. Alexa’s advice through the NHS website does not claim to offer a diagnosis.

“People think that what they're doing is giving their symptoms to get a diagnosis, which is very understandable, because that's how we think that these systems work,” says Finnikin. “But the people who sell or promote these systems are very clear that what they're not doing is giving a diagnosis. But patients don't understand this.” So proper regulation of these products is crucial, he argues; at the moment there’s simply a caveat at the bottom saying the device does not give a diagnosis. “Patients or members of the public have no idea about the differences.”

Another concern is that the people most likely to use Alexa for health information are those who are already digitally savvy and use the internet to research their symptoms. “Ownership of these kinds of devices is higher in groups that already use the internet more. So I think a big challenge for the NHS is how to make digital technology more accessible, so that those people who need it most don't get left behind,” says Sarah Scobie, deputy director of research at the Nuffield Trust.

This article was originally published by WIRED UK