A recent article on Medscape opined that the overwhelming majority of physicians are making diagnostic errors and that increased use of artificial intelligence (AI) could help alleviate this problem. Art Papier, MD, a dermatologist and medical informatics specialist, described some specific devices currently on the market designed to help the modern practitioner. The article sparked a lively debate in the comments section among a broad range of healthcare professionals.
Some were not fully on board with all of this new technology. One acerbic physician assistant predicted the course that medicine might take:
The day is coming when you will pull up to the first kiosk, tell it your symptoms, pull up to the second kiosk where you give a drop of blood, then pull up to the next kiosk for your prescription. CMNHI (Corporate Medicine No Humans Involved) is the wave of the future, and I wasted all this time on education and experience.
A primary care physician thought the traditional methods of examination were still best:
I find that an appropriate history with clinical exam and a fair amount of differentials to exclude still works fine!
And another healthcare professional saw increased use of AI as a crutch that was weakening the very skills it was designed to support:
This is just one of the multiple negative outcomes of the "Medical Profession" becoming the healthcare industry. Most college graduates could do as well as med school graduates as long as they know how to use a symptom checker as their diagnostic guide. The simple truth is that these new doctors are doing harm. They cannot diagnose and they don't make referrals or seek consults, so potentially critical, crippling or fatal conditions go unrecognized and untreated. It may take a significant increase in medical malpractice lawsuits before we see a return to patient-centered medicine.
But others saw no reason to fear the machines. An obstetrician wrote:
These apps are great aids. We use many every day just like we use computers to read EKG patterns. Many computers read mammograms and other x-ray patterns. Many more will come.
A dermatologist also proudly used new technology:
Patients like a physician who admits that they do not know everything... You can get stumped if someone comes in with a variant of the norm you are used to seeing... This is a helpful tool in our toolkit as physicians.
Another dermatologist was similarly optimistic:
Tools that increase diagnostic accuracy will only benefit patient care. Ensuring that the patient has been correctly diagnosed is imperative to initiating effective treatment. Support systems [can] provide efficient and easy access to extraordinary databases capable of providing an accurate differential diagnosis, relevant clinical photographs, and up-to-date diagnostic options. These resources are quickly becoming the future of medicine.
A registered nurse agreed and offered her perspective as a patient:
1. It [using AI] reinforces that my doctor will not wing his/her way through an appointment.
2. When the doctor is definitive in a recommendation I know it's because he/she is absolutely sure about the treatment plan and I'm on board right away.
3. I know he/she will seek a colleague's opinion when necessary.
Another healthcare professional also preferred this form of care:
I'm more impressed by and trusting of my doctors who say, "I'm not sure, let me look into that and get back to you," and then actually do get back to me with an answer... I know when someone is winging it and I'll bet patients without medical or nursing backgrounds can tell, too.
But some remained unconvinced. A physician wrote:
You can be 100% sure that AI is making mistakes but of a different kind... Every physician has made mistakes during his working time, but some should never be made! So read (to be a more skilled physician) a book for differential diagnosis! Or is reading old-fashioned?
A colleague worried about the quality of the information:
Artificial intelligence is only as good as the data input. If today's doctors are making many diagnostic errors and the same data used in medical education is fed into diagnostic programs, then the diagnoses will still be misleading.
More than a few respondents felt that diagnostic error came not from a lack of modern technology but from the failure to properly use traditional methods of examination. One internist wrote:
One of the main reasons for these diagnostic errors is the inability or unwillingness to do a proper and complete history and physical examination... There is a tendency to order unnecessary lab tests and imaging studies without examining the patients thoroughly, and even when a physician attempts to do physical exams, improper and cursory techniques are used.
A medical administrator also felt that the article failed to spot the true source of misdiagnoses: "Teach physicians to listen [and then] some of these errors would not occur."
The final word goes to a physician who tried to find common ground:
Keep on reading every day. Acknowledge your limitations. Ask colleagues. Share interesting cases. Medicine is a journey and you should learn something every single day of your life. Even with the most common complaint like backache, there is something to know more about it every day! And if AI can improve your knowledge, why not. But AI must not become the alpha and omega of medical care.
The full article can be found on Medscape.
Medscape Family Medicine © 2018 WebMD, LLC
Cite this: Artificial Intelligence vs Doctors: Diagnostic Errors - Medscape - Aug 16, 2018.