Most apps designed for people with mental health problems – including those endorsed by the NHS – are clinically unproven and potentially ineffective, a recent study has shown.
In research published in the journal Evidence Based Mental Health, a team at the University of Liverpool found that many mental health apps and online programmes lack an underlying evidence base and suffer from "a lack of scientific credibility and limited clinical effectiveness".
The study also suggests that many mental health apps can lead to over-reliance and anxiety around self-diagnosis.
Simon Leigh, co-author of the study, argues that apps should be "well-informed, scientifically credible, peer reviewed and evidence based". "It's a really, really big problem," Leigh told WIRED. "The rate at which apps come out is always going to outweigh the rate at which they can be evaluated. Evaluation takes time – you have to design a pilot study and a retrospective observational study, randomise patients – and these apps can be knocked up in a matter of days."
Mental health apps have grown in popularity at a time when psychological services have faced increased demand and decreased resources. Referrals to community mental health teams and crisis services have increased by 15%, despite the loss of around 200 full-time mental health doctors and 3,600 nurses.
Many patients, faced with waiting lists, have turned to alternative sources of support – such as apps. George Freeman, Minister for Life Sciences, has even launched a £650,000 fund for developing mental health apps, and a web-based mental health service is planned for London.
But this reliance on apps can be a double-edged sword. "If you're on a waiting list and you spend money on an app and then nothing happens, it can make you feel like 'well, I've tried and nothing works'," Leigh said. "If you go through the process of downloading and using an app and there are no benefits, it can compound your anxiety about your mental health problems. But apps that actually are good can play a really great role in terms of waiting lists. They can act as a triage for less serious mental health problems and can be the perfect remedy in some cases. Apps can be beneficial, but we need to ensure that with wider usage we also invest in further research to ensure that they're robust."
Jen Hyatt is CEO of Big White Wall, an online community for those experiencing mental health problems. She's passionate about what she calls the "transformative" role of mental health apps. "They can provide access to services from the comfort of the home. Many people find it hard to access services because of geography, because of mental ill health, because of physical disability. We've also found that, in the 50% of cases that do get to a GP, they're not able to guide mental health problems adequately."
Despite the increase in demand for health apps, the NHS's own App Library has been dogged by controversy, with researchers recently discovering a potential data leak. The library will be closing down, although the closure stems from the government's work on a new endorsement model for health apps, not from the data leak findings.
The four NHS apps found to be clinically effective were Big White Wall; Moodscope, a self-tracking and peer support network; Happyhealthy, a mindfulness app; and WorkGuru, an occupational stress-management programme.
Outside of NHS-endorsed apps, other mental health apps have also proved clinically effective. Sleepio, a cognitive behavioural therapy (CBT) based app for sleep management, has been shown in several trials to help long-term insomnia – one trial put efficacy at 75%. And Wizard, a brain training app developed at the University of Cambridge, was found to improve the memory of patients with schizophrenia.
Stephany Carolan is CEO at WorkGuru and is currently studying for a PhD in digital intervention engagement. She told WIRED that mental health apps and programmes allowed "broader access to psychiatric help" as well as "normalising" mental illness. But she stressed that people should be wary of apps that promise too much. "There are no quick fixes," she said. "If an app says you only need to log in for ten minutes, it's just not true. It just doesn't happen like that. Take mindfulness, for example – there's a strong clinical basis for it, but the interpretation of that in many online programmes is wrong. Mindfulness is a philosophy, it's more than a daily ten-minute meditation – but a lot of programmes have taken it out of context. It's important that people know how complex a lot of these ideas are."
Big White Wall and WorkGuru, among others, are keen to make sure that mental health apps are clinically sound and socially responsible – but many other apps show no such rigour. There are thousands of unverified mental health apps available for iOS and Android, encompassing mindfulness, CBT, mood tracking, peer support and more. So how can we make sure we're not being duped?
"There are a few simple steps you can take," Leigh told WIRED. "Systematic reviews have shown that apps that have involvement with clinicians are on average twice as effective, so if you go to the website of an app you're thinking of downloading and there's contact details for a mental health practitioner, it's one indication that the app is going to be clinically viable." "The most obvious thing really is to see if the apps are forthcoming with the information – it's easy to say "we can cure your depression!", but not so easy to show any proof of this. If an app gives patient numbers and statistics? It's not totally robust, but it's a good start.
"You have to have clinicians involved in developing of the app. And you really need to have a process for 'red flags'– you need to know what you're going to do if you're concerned about somebody's mental health if they come onto your site or app. You need to make sure people are encouraged to contact their GP, mental health professional or even the Samaritans if the app isn't working for them." "Ultimately, maybe we need to move to the stage where mental health apps are being validated by an external body, but it needs to be a transparent process – and of course it needs to involve users and members of the public
Jen Hyatt describes a similar process at Big White Wall. "I have no tolerance for developers who try to avoid taking responsibility for the safety of people online. We have a responsibility to our users – it's the only route to a good, rigorous resource. We have support staff 24/7. We have data analytics, screening tests and a clinical governance handbook that has protocols for issues like suicidal ideation, self-harm and other crises. They can be escalated to a clinical psychiatrist within two minutes.
"And Google and Android should make prominent those apps that have this kind of strong basis. The whole industry has a responsibility to promote those that work."
This article was originally published in October 2015 and has been updated to mark Mental Health Awareness Week.
This article was originally published by WIRED UK