But advancements in technology will have a significant impact on healthcare.
By Dr Barry Finette
If you ask OpenAI’s ChatGPT, recently integrated into Microsoft’s Bing, whether technology like ChatGPT will replace doctors, here’s the response:
As a language model, I have been trained on a diverse range of text sources, so I can provide information and answer questions on a wide range of topics. However, while I can certainly provide information and advice on various medical topics, I am not a substitute for a licensed physician. Doctors have received specialized training and education, and they have hands-on experience diagnosing and treating patients. They also have access to a patient’s medical history and can perform physical examinations, which can be important for making an accurate diagnosis. So, while I can provide information, it’s always best to consult with a doctor for medical advice.
The importance of trained clinical health professionals cannot be overstated. The repercussions of the growing global shortage of healthcare professionals, in particular physicians, are being felt in every corner of the world. This may be because physicians are unavailable or because individuals or families cannot afford to see them when they are concerned about their health. Access to healthcare is in crisis. With technology such as ChatGPT going viral, it’s easy to imagine a future where we are not so dependent on doctors and nurses: just type your symptoms into a chatbot and find out exactly what is wrong with you, what next steps to take, where to go and how to be treated. The concept is not novel and is one that has been explored globally for quite some time. Have you ever “googled” your symptoms?
As a Professor of Pediatrics, a research scientist and a clinician for over three decades, I would suggest that the question is not if AI will replace doctors, but how technology can be used to complement clinicians: providing medical information that improves healthcare-seeking behavior and finding a “place” in extending the reach of health systems and the global health workforce. When it comes to examples like ChatGPT, there is enormous potential for information sharing, such as explaining what it means if you have yellow eyes or what COVID-19 is. This is of very high value and empowers individuals who want to play a more active role in their own health. A key role of physicians is to educate their patients about the medical issues they are worried about or directly facing. Going the “next” step and performing high-quality clinical assessments, however, may not be as straightforward for AI.
For starters, there is only so much clinical data that one can gather via a conversation, which makes even the most sophisticated symptom-checking, data-mining, machine learning-based technology limited. As a doctor, I interpret nonverbal visual cues from my patients, which are a critical part of my clinical evaluation. I also use specialized communication skills to acquire key clinical information that patients, in many cases, do not know is important. This helps me map the timeline and course of many diseases and conditions and understand how that information fits into the puzzle. In addition, interpreting not just “what” patients share but “how” and in what “context” are key skills and experiences that physicians acquire, and they are essential for clinically evaluating patients. Doctors spend years building on their training by interacting with patients, which positions them not only to take health data into account but to interpret it. Machine learning-based data mining and NLP AI, the processes used by technology like ChatGPT, are still limited in this regard, as conducting a high-quality clinical assessment includes weighting and interpreting all available health information in a meaningful way. Others have concluded that the largest barriers to the implementation of ChatGPT in clinical practice are deficits in situational awareness, inference, and consistency, which you can read here.
Second, a lesson we have learned at THINKMD is that in many areas where this empowering health information is most needed, specifically low- and middle-income countries, there is no internet. Technology therefore needs to be server-independent. In addition, the information generated needs to be accurately translatable into local languages and dialects for it to have any value.
Third, there are many legal barriers to sharing medical information in relation to clinical evaluations and decision-making that will need to be overcome in careful and considerate ways before medicine can truly be “practiced” by AI/ML-based platforms.
AI and machine learning may not yet replace clinicians, but engineering and technological advances offer the opportunity to empower communities where clinicians are not available, by supporting clinical assessments and decision-making that come as close to a doctor’s capabilities as possible. This depends on technology that has been scientifically validated and proven, in order to aim for the highest quality of care possible, a feat that may not be achieved by engineering alone but by a combination of engineering and clinical expertise.
In this brave new world, tools like ChatGPT that are being developed should be used to support health systems globally. However, data-mined medical information and high-quality clinical care are not synonymous, and it will be quite some time before they are.