
An elderly man with cardiovascular disease tests his own blood pressure, and sends the results to an online application that his doctor can access. Another patient with depression living in a rural area far from health services tells a psychiatrist how he is feeling via a video connection. All of this occurs without the patients leaving their homes.

These scenarios may sound like science fiction, but such ‘telehealth’ has the potential today to bring high-quality and specialised care to previously underserved populations. Studies indicate patients respond positively to using the technology, and it increases their access to health services. Health professionals report that it reduces the need for patient visits, and assists with clinical decision-making. There is also evidence suggesting that telehealth can improve patients’ ability to manage their own health, not to mention lower the cost of healthcare through fewer hospitalisations.

Despite these clear advantages, health systems have yet to abandon their hospital-centric approach to care. If telehealth is such a good idea, why is it not being given wide support? One of the most intractable problems holding the service back is the lack of a much-needed rethink of the types of workers we will need in future health systems.

“50% of doctors and 40% of nurses report being under-qualified for some of the tasks they have to do”

Up until now, much of the discussion on the healthcare workforce has centred on the shortage of doctors or nurses. It’s true that many doctors in particular are approaching retirement and will soon need to be replaced. An OECD report released this month has shown that countries have substantially increased their training of doctors and nurses, and the numbers are growing. But this is an expensive approach. A recent report by the UK’s National Audit Office has indicated it takes three years and costs about £79,000 to train a nurse, ten years and £485,000 to train a general practitioner, and 14 years and £727,000 to train a specialist. These huge investments deliver healthcare professionals with astonishing skills, but regrettably we do not always take advantage of these skills wisely.

There is evidence of a considerable skills mismatch in the health sector, with a large proportion of health workers over-qualified for the work they do. The 2011-12 OECD Programme for the International Assessment of Adult Competencies survey showed that between 70% and 80% of doctors and nurses report being over-qualified for some aspects of their work. This suggests an inefficient use of their time and a waste of human capital. To be blunt, is it really worth ten years of training for someone to spend much of their day looking into children’s ears to confirm that they are a little bit red and might require some basic antibiotics? Is there not a way their skills could better serve the population’s health?

At the same time, after all that training, about 50% of doctors and 40% of nurses report being under-qualified for some of the tasks they have to do. Education and training programmes need to transform so as to make health workers ‘fit for practice’. The outlook at present is discouraging, as many health programmes teach little about the skills we know will be needed in future systems, such as ICT and people management.

Perhaps the biggest challenge will be to rethink ‘who does what’ – or ‘scopes of practice’, in the health jargon. This means letting appropriately trained clinicians perform tasks they were previously not permitted to do. The most common example of this is the nurse practitioner. In some countries, these more advanced nurses, who usually have a Master’s qualification, can prescribe limited medication and order diagnostic tests under controlled conditions. An OECD review in 2010 showed that advanced-practice nurses are able to deliver the same quality of care as doctors for a range of patients. Most evaluations find high patient satisfaction, mainly because nurses tend to spend more time with patients as well as provide more information and counselling.

“The evidence base for change is growing, but it needs to be matched by a growing political will”

Is it so difficult to imagine that diabetes workers, when backed with strong ICT support and clear protocols about what to do when symptoms fall outside a prescribed range, can be trained to ensure that treatments are followed correctly, leaving those with more expertise to focus their attention on problematic cases? The barriers to realising this vision remain considerable. There are strong lobbies against change, particularly from professional associations. Policymakers need to engage these groups boldly, so they too can begin to see change as a tremendous opportunity to gain new skills and focus on what they do best, rather than succumbing to the impulse to feel threatened.

Traditional roles and responsibilities need to transform, and alongside them so do the antiquated ways of thinking. The evidence base for change is growing, but it needs to be matched by a growing political will. The question is, are governments bold enough to meet the challenge?

IMAGE CREDIT: CC / FLICKR – NEC Corporation of America