Dr. Kevin Most: Robotics in Medicine
Many of us remember The Jetsons cartoon and its robot that did just about everything. We heard about IBM’s Watson winning on Jeopardy!, and we hear daily about robots doing more and more in many industries. Is this impacting medicine? Will we someday reach a point where we don’t see a doctor at all, just a robot? Anything is possible, and although that day is far away, we are relying more and more on technology in many areas of medicine.
Do we have robots and computers in medicine now? Well, I am sure many of you have heard about robotic surgery. Let’s clear up that misconception so everyone understands what it actually is, but let’s also recognize that the world of Artificial Intelligence is here and advancing quickly.
First, robotic surgery: is it really a robot doing the surgery? No. The surgery is performed by a physician, but the physician is not standing next to the patient while performing it. The surgeon sits a few feet away at a console. Instruments are placed into the patient’s abdomen or chest, and the surgeon views the operative area, magnified, on a large high-definition screen through optics in the instruments. He then uses controllers that act as his hands to perform the surgery. The great thing about this is that it allows the surgeon’s movements to be much more exact. Think about something as simple as your computer mouse: you can change its sensitivity so that your hand movements translate into more precise motion on the screen. Robotic surgery is now used for many conditions, including prostate surgery, gallbladder surgery, and many gynecologic conditions.
So are we able to use technology to improve healthcare, and do we see major changes happening? Absolutely. Just one simple example: EKG tracings of the heart’s electrical activity used to be read by hand by cardiologists, who would take the measurements with calipers and do the interpretation themselves. Now we have taught the machine to do the measurements and compare them with known definitions; the EKG machine then gives its opinion, which is validated by the cardiologist. This has saved time and in many cases is more accurate.
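To give a feel for what “comparing with known definitions” means, here is a minimal sketch of the kind of rule-based check an EKG machine performs. The thresholds are standard adult textbook reference ranges, but the function and rules are greatly simplified for illustration; this is not an actual device algorithm.

```python
# Illustrative sketch of automated EKG interpretation: the machine measures
# intervals from the tracing, then compares them with textbook limits.
# Thresholds are standard adult reference ranges; the rules are greatly
# simplified and do not represent a real device's algorithm.

def interpret_ekg(heart_rate_bpm, pr_interval_ms, qrs_duration_ms):
    """Return a list of flagged findings for a cardiologist to validate."""
    findings = []
    if heart_rate_bpm < 60:
        findings.append("sinus bradycardia (rate < 60 bpm)")
    elif heart_rate_bpm > 100:
        findings.append("sinus tachycardia (rate > 100 bpm)")
    if pr_interval_ms > 200:
        findings.append("first-degree AV block (PR > 200 ms)")
    if qrs_duration_ms >= 120:
        findings.append("wide QRS (>= 120 ms), possible bundle branch block")
    return findings or ["within normal limits"]

# A normal tracing and an abnormal one:
print(interpret_ekg(72, 160, 90))    # normal measurements
print(interpret_ekg(48, 220, 130))   # slow rate, long PR, wide QRS
```

The machine’s “opinion” is just the output of rules like these run over its measurements, which is why a cardiologist still validates the result.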
Telehealth is becoming more and more popular. A video link, essentially Skype, allows a specialist miles away to evaluate a patient and decide where the best site of care for that patient will be. This is done for chronic conditions in rural areas, but also in acute settings for stroke patients.
Doctors are now able to review CT scans of patients from anywhere in the world, and a video exam can be done as part of the physical exam. The specialist reviewing the images can then decide where the best site of care for the patient will be. That information can be pushed ahead to the receiving site so that the proper team is available and waiting when the patient arrives.
At CDH we are bringing care closer to the patient by sending a mobile CT scanner to the home of a suspected stroke patient based on the 911 call. This allows lifesaving and life-changing medications to be given in a much shorter time; the patient can actually receive clot-busting drugs while being transported to the hospital. Without this technology, the drug is given much later, after the patient has been transported and imaging is completed. This is a very new technology, and data on its impact is just now being collected.
What about the impact of AI? As physicians we are educated for years, and as we practice we are exposed to more patients and information, expanding our knowledge base. However, we now live in an age of data explosion, and change is happening faster than anyone can imagine. The amount of research being done is overwhelming; no doctor in the world could possibly keep up with the amount of data generated daily. So what does that hold for the future? How do we know, as patients, that we are getting the best and newest options for our care?
AI will certainly impact healthcare, but not as fast as we all might expect. Think about the doctor’s office and hospital: not long ago we were all documenting on paper, and each of us had a medical record chart, often several. The ability to learn and improve care in that setting depends on the experience of the physician, so busy doctors saw more patients, were exposed to more conditions, and thus had a better knowledge base for treating future patients. I worked with a dermatologist in my training who was very good. He said, “If I don’t know what is wrong within one minute of walking into the room, that patient is in big trouble.” He was not cocky; he was experienced and had seen a lot of skin problems.
Let’s take a look at radiology. In this discipline, physicians are taught what to look for by reviewing a great many images during their residency, and the radiologist continues to learn from the images he reads throughout his career. This discipline is perfect for the field of Artificial Intelligence. For AI, the need in radiology at present is to collect and process data: it needs many images of the normal, the abnormal, and the rare to build a data set that can compare and identify abnormal images, similar to what has happened with EKG readings. IBM understands this portion of knowledge for Watson, and recently purchased a company called Merge, whose 30 billion images will be used to teach Watson what is normal, what is abnormal, and what to look for. That is the first step: teaching the computer what to look for based on the history of radiology. The next step, and probably the more exciting one, is teaching it what to look for before there is disease. What I mean is reviewing images that were considered normal at the time from a patient who later developed cancer, and allowing the computer to look back and compare those images for subtle changes that were not noted in the past. Having learned from those cases, the computer will then look in a healthy individual for the same subtle changes that advanced to cancer in others. It could flag such an area as a concern based on the images of many patients who came before, not just the current image, which might still be read as normal.
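At its heart, “teaching the computer what normal looks like” is supervised learning: label many examples, then classify a new case by its similarity to what has been seen before. Here is a toy sketch in pure Python, using made-up two-number feature vectors standing in for images; nothing here reflects Watson’s actual methods, and real systems use deep networks trained on millions of images.

```python
import math

# Toy supervised classifier: each "image" is reduced to a small feature
# vector (invented numbers standing in for real image features). We average
# the labeled examples into one centroid per class, then label a new image
# by its nearest centroid.

def centroid(vectors):
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def train(labeled_examples):
    """labeled_examples: dict mapping label -> list of feature vectors."""
    return {label: centroid(vecs) for label, vecs in labeled_examples.items()}

def classify(model, features):
    return min(model, key=lambda label: math.dist(model[label], features))

# Hypothetical training data: [lesion_size, tissue_density]
examples = {
    "normal":   [[0.10, 0.20], [0.20, 0.10], [0.15, 0.15]],
    "abnormal": [[0.90, 0.80], [0.80, 0.90], [0.85, 0.85]],
}
model = train(examples)
print(classify(model, [0.12, 0.18]))  # near the normal cluster
print(classify(model, [0.88, 0.82]))  # near the abnormal cluster
```

The principle is the same at any scale: the more labeled history the system sees, the better its notion of “normal” becomes, which is exactly why Merge’s image archive matters.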
One of the goals of AI in radiology is to learn from others’ experience and data points, and to detect conditions earlier based on how disease has historically advanced. Radiologists review old films when available, but often they are forced to make a determination based solely on the current images.
AI will not completely replace the radiologist, but it will decrease the need for radiologists, as the time-consuming portion of reading a film may be performed by the computer. GE, which makes a lot of medical imaging equipment, is now partnering with UCSF to build imaging equipment that will read the films and help radiologists identify critical findings more quickly.
With AI, the computer can be taught to look for specific anatomical changes; this will be relatively easy as the database of images grows and can be shared. The larger impact will be true AI, where in radiology the result of one image triggers the monitoring of another body part or system based on data collected over the years from individuals with disease, or where imaging for a specific patient complaint points toward a future diagnosis.
AI will go well beyond imaging. Consider the patient who presents with left arm pain. As a family doctor, I take a history, do an exam, and then decide what to do next. I may send the patient to cardiology if I am concerned about a heart condition, to an orthopedist if I am concerned about a bone or tendon issue, or to a neurosurgeon if I am concerned about a nerve coming out of the neck causing the pain. Each of us brings some bias when we listen to and examine a patient. A computer with AI would have essentially no bias; it would base its recommendations on a database of millions of people with sore left arms. The ability of AI to take in more data points and look at the historical outcomes of millions of patients with the same condition will help us streamline care, save healthcare dollars by decreasing unnecessary testing, and achieve better outcomes.
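In its simplest form, an unbiased, data-driven referral could just count what actually turned out to be wrong in past patients with the same complaint. A toy sketch follows; the counts and specialties are entirely invented for illustration, and a real system would weigh far more data points across millions of cases.

```python
from collections import Counter

# Toy "unbiased" triage: given historical final outcomes for patients who
# presented with the same complaint, recommend the specialty that most often
# turned out to be appropriate. The records below are invented.

historical_records = {
    "left arm pain": [
        ("cardiology", 420),
        ("orthopedics", 350),
        ("neurosurgery", 180),
    ],
}

def recommend_specialty(complaint):
    counts = Counter(dict(historical_records[complaint]))
    return counts.most_common(1)[0][0]

print(recommend_specialty("left arm pain"))
```

The point is not the arithmetic but the source of the answer: the recommendation comes from recorded outcomes rather than from any one physician’s personal experience.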
IBM Watson is partnering with the UNC School of Medicine in the field of oncology. They took the records of over 1,000 of their oncology patients and ran them through the Watson Oncology database, comparing Watson’s recommendations for care against the patients’ current plans. In 30% of the cases, Watson was able to provide additional, previously missed options based on the most recent research. UNC is not the only health system working with Watson: a few years ago IBM partnered with 14 major cancer centers to start and continue this work.
How far away is this? Probably not as far off as we all think. Look at the work IBM has done with Watson: it can read and absorb every medical journal published in a matter of seconds, while most doctors have a stack of journals on their desk and, at some point, get a few minutes to scan the contents and pick out a few articles that interest them. There is no way any doctor can keep up with the vast amount of data published daily.