6th May 2016, Volume 129 Number 1434

William Diprose, Nicholas Buist

The increasing economic demands of the growing and ageing population on our already overburdened healthcare system make our current model of care unsustainable.1 Novel ways of delivering care are needed, and consequently, there is growing interest in using artificial intelligence (AI) to aid medical decision-making.2,3 However, the impact of AI on the future landscape of medicine remains unclear. We briefly explore how, in the coming decades, the traditional role of the doctor will be challenged by AI in a) autonomously performing diagnosis, and b) autonomously making treatment decisions.

Artificial intelligence in other industries

Driven by the economic benefits of tireless labour, machines have been replacing human workers since the industrial revolution.3 Historically, tasks such as manufacturing have been most susceptible to automation. However, due to recent advances in computing, such as machine learning,4 cognitive tasks, such as decision-making, are becoming increasingly susceptible to automation through AI.3 In fact, given the exponential nature of technological advances, almost half of current jobs in the US are considered at ‘high risk’ of technological unemployment over the next one to two decades.3 A range of industries are being affected by AI, from technologies such as self-driving cars,5 through to software that writes plain English news stories from structured data.6

Artificial intelligence in medicine: automated diagnosis and treatment decisions

Turning to the healthcare industry, to what extent will AI be able to carry out the cognitive tasks traditionally performed by doctors?

The British Medical Association states that diagnosis “largely differentiates doctors from other health professionals.”7 However, this ‘unique’ role of diagnosis is ultimately a pattern-recognition algorithm. Information is gathered, synthesised, and compared with predefined categories we call diseases. If a patient’s pattern of symptoms, signs and test results match that of a known disease, then we classify and treat them accordingly. Clearly, this process could be performed by an appropriate AI.
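This pattern-recognition view of diagnosis can be sketched as a toy scoring algorithm. The disease profiles and findings below are made-up illustrations, not clinical data, and a real system would weight findings by likelihood rather than simply counting them:

```python
# Toy illustration of diagnosis as pattern recognition: match a patient's
# findings against predefined disease profiles and rank diseases by overlap.
# Profiles and findings are illustrative examples only, not clinical data.

DISEASE_PROFILES = {
    "influenza": {"fever", "cough", "myalgia", "fatigue"},
    "strep throat": {"fever", "sore throat", "tender cervical nodes"},
    "allergic rhinitis": {"sneezing", "itchy eyes", "clear rhinorrhoea"},
}

def rank_diagnoses(findings):
    """Score each disease by the proportion of its profile present in the findings."""
    findings = set(findings)
    scores = {
        disease: len(profile & findings) / len(profile)
        for disease, profile in DISEASE_PROFILES.items()
    }
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# A patient with fever, cough and fatigue matches 3 of influenza's 4 features.
ranked = rank_diagnoses({"fever", "cough", "fatigue"})
print(ranked[0])  # ('influenza', 0.75)
```

Systems such as Watson and Isabel are, of course, vastly more sophisticated, but the underlying task is the same: classify a pattern of inputs into a predefined disease category.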

Indeed, IBM has already created an AI known as Watson that is able to perceive, ‘understand’ and make decisions based on natural language. In addition to defeating the champions of Jeopardy! (an American television game show in which contestants are presented with general knowledge clues in the form of answers, and must phrase their responses in the form of questions), Watson is used at Memorial Sloan Kettering Cancer Center to aid diagnosis and produce management plans for oncology patients.3 In contrast to humans, who can only learn from personal experience, Watson synthesises information from millions of medical reports, patient records, clinical trials and medical journals. Furthermore, Watson does not eat, sleep, go on holiday, or get sick.

According to principal investigator David Ferrucci, Watson is already “out-diagnosing” medical residents in certain situations.8 Similarly, Isabel—a web-based clinical decision support system (CDSS)—suggested the correct diagnosis in 96% of 50 consecutive cases published in the New England Journal of Medicine.9 This is comparable with human doctors, who have been shown to make the correct diagnosis in 95% of outpatient cases.10

Notably, medical specialties that rely on images for diagnosis are particularly amenable to automation by AI. This is exemplified by an algorithm that ‘learned’ from a database of normal and abnormal images to diagnose and classify diabetic retinopathy as accurately as human doctors.11 Similarly, when applied to a dataset of 340 brain magnetic resonance images, an algorithm developed at the University of Malaya classified images as either ‘healthy’ or ‘diseased’ with 100% accuracy.12 Even aspects of the physical examination can be performed by AI: a computer-vision algorithm achieved 79% accuracy in classifying a group of 55 patients as either ‘healthy’ or ‘Parkinson’s disease’ based on automated analysis of their handwriting.13
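The supervised learning behind these imaging studies, in which an algorithm ‘learns’ a decision rule from labelled normal and abnormal examples, can be sketched with a minimal nearest-centroid classifier. The feature vectors below are synthetic stand-ins for image-derived features, not real imaging data:

```python
# Minimal sketch of supervised learning from labelled images: represent each
# image as a feature vector, average the vectors per class ("training"), then
# label a new image by its nearest class centroid. All data are synthetic.
import math

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labelled_images):
    """labelled_images: {label: [feature_vector, ...]} -> {label: centroid}."""
    return {label: centroid(vecs) for label, vecs in labelled_images.items()}

def classify(model, vector):
    """Return the label whose class centroid is closest in Euclidean distance."""
    return min(model, key=lambda label: math.dist(model[label], vector))

# Synthetic two-feature 'images', e.g. mean intensity and lesion count.
training_set = {
    "healthy":  [[0.2, 0.1], [0.3, 0.0], [0.25, 0.2]],
    "diseased": [[0.8, 0.9], [0.7, 1.0], [0.9, 0.8]],
}
model = train(training_set)
print(classify(model, [0.75, 0.85]))  # a new image near the 'diseased' cluster
```

Modern image classifiers use far richer models (for example, deep neural networks), but the workflow is the same: learn from a labelled database, then classify unseen images.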

Although these solutions are intended as physician assistants rather than physician substitutes, the findings have huge implications for doctors: diagnosis, our defining role, could soon be performed better, faster and more cheaply by AI. If nothing else, these findings suggest that AI could substitute for human diagnosis in ‘visual’ medical specialties such as radiology, pathology, dermatology and ophthalmology in the very near future.

Following diagnosis, the doctor and patient must decide on appropriate treatment. This process relies on the doctor applying their clinical acumen to a particular problem, in combination with available evidence and patient preferences.14 In the same way as making a diagnosis, the process is largely algorithmic. As a result, there is growing use of treatment CDSSs that range from simple information resources, to ‘intelligent’ algorithms that suggest patient-specific evidence-based treatment recommendations.15 Antimicrobial Resistance Utilisation and Surveillance Control (ARUSC) is an example of an ‘intelligent’ antibiotic CDSS that is fully integrated with the electronic health record. In a recent prospective cohort study in Singapore, use of ARUSC halved mortality rates in patients who were initially started on empiric antibiotics.16 Similarly, Watson is currently making useful patient-specific treatment suggestions to leading oncologists.3 Clearly, when making treatment decisions, humans and machines combined are superior to humans alone.
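The patient-specific, rule-driven behaviour of such treatment CDSSs can be sketched in miniature. The drug names and rules below are illustrative placeholders only, not clinical guidance, and real systems such as ARUSC encode far richer logic against the electronic health record:

```python
# Toy sketch of a rule-based treatment CDSS: combine a diagnosis with
# patient-specific factors (here, drug allergies) to suggest a treatment.
# The rules and drugs below are illustrative placeholders, not clinical guidance.

TREATMENT_RULES = {
    "community-acquired pneumonia": [
        {"drug": "amoxicillin", "avoid_if_allergic_to": "penicillin"},
        {"drug": "doxycycline", "avoid_if_allergic_to": None},
    ],
}

def suggest_treatment(diagnosis, allergies):
    """Return the first treatment option whose contraindication does not apply."""
    for option in TREATMENT_RULES.get(diagnosis, []):
        if option["avoid_if_allergic_to"] not in allergies:
            return option["drug"]
    return "refer for specialist review"  # no rule applies: escalate to a human

print(suggest_treatment("community-acquired pneumonia", {"penicillin"}))  # doxycycline
```

Note that the fallback when no rule matches is escalation to a human clinician, which mirrors how CDSSs are deployed as adjuncts rather than replacements.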

Where does this leave the doctor?

As these systems become more intelligent, diagnosis and routine treatment decisions could, in principle, be performed independently by AI. As a result, the human clinician would only need to perform tasks that are beyond the capability of AI, such as communicating with patients, performing procedures, or making the final treatment decision in combination with the patient. Therefore, the clinician does not necessarily need to be a doctor. The cognitive tasks, which require many years of medical school training and decades of clinical experience, would no longer be the role of the doctor. This shift would be more apparent in the hospital setting, where there is a greater emphasis on the diagnostic process, than in primary care, where the relationship between doctor and patient is often more important.

However, in both community and hospital settings, health professionals requiring less intensive training than doctors, such as clinical nurse specialists, could be trained to ‘fill the gaps’ where AI remains less capable—for instance, in history-taking, physical examination or basic procedures. Indeed, it has been shown that with appropriate training, nurse practitioners are comparable to physicians when treating patients in primary care.17 There may still be a role for a small number of doctors to oversee these processes, but the current role of the doctor as an expensive problem solver would become largely redundant.


Conclusion

Over the coming years, AI will challenge the traditional role of the doctor. Human doctors make errors simply because they are human, with an estimated 400,000 deaths associated with preventable harm in the US each year.18 Furthermore, the relentless growth of first-world healthcare demands in an economically constrained environment necessitates a new solution. Therefore, for a safe, sustainable healthcare system, we need to look beyond human potential towards innovative solutions such as AI. Initially, this will involve using task-specific AI as adjuncts to improve human performance, with the role of the doctor remaining largely unchanged. In the longer term, however, AI is likely to consistently outperform doctors in most cognitive tasks. Humans will still be an important part of healthcare delivery, but in many situations less expensive, fit-for-purpose clinicians will assume this role, leaving the majority of doctors without employment in the role they were trained to undertake.


Abstract

Artificial intelligence (AI) is a rapidly growing field with a wide range of applications. Driven by economic constraints and the potential to reduce human error, we believe that over the coming years AI will perform a significant amount of the diagnostic and treatment decision-making traditionally performed by the doctor. Humans would continue to be an important part of healthcare delivery, but in many situations, less expensive fit-for-purpose healthcare workers could be trained to ‘fill the gaps’ where AI is less capable. As a result, the role of the doctor as an expensive problem-solver would become largely redundant.

Author Information

William Diprose, Whangarei Hospital, Northland District Health Board, Whangarei; Nicholas Buist, Whangarei Hospital, Northland District Health Board, Whangarei, New Zealand.


Acknowledgements

We would like to thank CGP Grey for producing his inspirational YouTube video, Humans Need Not Apply, available from: https://www.youtube.com/watch?v=7Pq-S557XQU


Correspondence

William Diprose, Whangarei Hospital, Northland District Health Board, Whangarei, New Zealand.

Correspondence Email



References

  1. Gorman D. The future disposition of the New Zealand medical workforce. N Z Med J 2010;123(1315):6–8.
  2. Deo RC. Machine Learning in Medicine. Circulation 2015;132(20):1920–30.
  3. Frey CB, Osborne MA. The future of employment: how susceptible are jobs to computerisation? Oxford: Oxford Martin School; 2013.
  4. Hastie T, Tibshirani R, Friedman J. The Elements of Statistical Learning. New York, NY: Springer New York; 2009.
  5. Shim I, Choi J, Shin S, Oh T-H, Lee U, Ahn B, et al. An Autonomous Driving System for Unknown Environments Using a Unified Map. IEEE Trans Intell Transport Syst 2015;16(4):1999–2013.
  6. Carter J. Could robots be the writers of the future? [Internet]. TechRadar. 2013 [cited 10 January 2016]. Available from: http://www.techradar.com/us/news/computing/could-robots-be-the-writers-of-the-future--1141399
  7. BMA. The Role of The Doctor | British Medical Association [Internet]. 2016 [cited 10 January 2016]. Available from: http://www.bma.org.uk/developing-your-career/medical-student/the-role-of-the-doctor
  8. Castaneda C, Nalley K, Mannion C, Bhattacharyya P, Blake P, Pecora A, et al. Clinical decision support systems for improving diagnostic accuracy and achieving precision medicine. J Clin Bioinforma 2015;5(1):1–16.
  9. Graber ML, Mathew A. Performance of a Web-Based Clinical Diagnosis Support System for Internists. J Gen Intern Med 2007;23(S1):37–40.
  10. Singh H, Meyer AND, Thomas EJ. The frequency of diagnostic errors in outpatient care: estimations from three large observational studies involving US adult populations. BMJ Qual Saf 2014;23(9):727–31.
  11. The Economist. Now there’s an app for that [Internet]. 2015 [cited 10 January 2016]. Available from: http://www.economist.com/news/science-and-technology/21664943-computers-can-recognise-complication-diabetes-can-lead-blindness-now
  12. Siddiqui MF, Reza AW, Kanesan J. An Automated and Intelligent Medical Decision Support System for Brain MRI Scans Classification. PLoS ONE 2015;10:e0135875–16.
  13. Pereira CR, Pereira DR, Silva FAD, Hook C, Weber SAT, Pereira LAM, et al. A Step Towards the Automated Diagnosis of Parkinson’s Disease: Analyzing Handwriting Movements. 2015 IEEE 28th International Symposium on Computer-Based Medical Systems (CBMS), IEEE; 2015, pp. 171–6.
  14. Haynes R. Clinical expertise in the era of evidence-based medicine and patient choice. Evidence-Based Medicine 2002;7(2):36-38.
  15. Garg AX, Adhikari NKJ, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA 2005;293(10):1223–38.
  16. Chow AL, Lye DC, Arah OA. Mortality Benefits of Antibiotic Computerised Decision Support System: Modifying Effects of Age. Sci Rep 2015;5:17346.
  17. Mundinger MO, Kane RL, Lenz ER, Totten AM, Tsai WY, Cleary PD, et al. Primary care outcomes in patients treated by nurse practitioners or physicians: a randomized trial. JAMA 2000;283(1):59–68.
  18. James JT. A new, evidence-based estimate of patient harms associated with hospital care. J Patient Saf 2013;9(3):122–8.