When I was a medical student, most of my brain didn’t have to work that hard. The parts of my frontal lobes responsible for memorization definitely worked overtime, though. I was very lucky in that my undergraduate degree was in both computer science and physiology. As I quickly discovered after my first day of medical school classes, the material I had mastered as a physiology student was often at a higher level than what was taught in medical school. As such, the synaptic connections were already there to provide me with the answers to the questions on my exams. In fact, I even tutored other medical students [for free; remember, I am a socialist] to help them through these courses.
Once I was doing rotations on the floors, I still did not need to activate other parts of my brain. Once again, I was required to memorize standard protocols and traditions. I appreciate that the term tradition is not often used at professional medical conferences. But as I have come to realize over my 25-year career, medicine is still very much in transition from an apprenticeship to a formal scientific field.
I quote the following from a recent article that I read just today. The author states: “Although guidelines exist for use of bridging anticoagulation during short-term interruptions of warfarin in patients with atrial fibrillation, they are based largely on expert opinion”. Let me first explain a couple of the terms used in this quote. Anticoagulation refers to a process by which the body’s natural clotting systems are deliberately interfered with. This can certainly be life-threatening, but under close observation, anticoagulation is responsible for major successes in the treatment of various life-threatening diseases.
One of the most common medications used for medically anticoagulating patients is warfarin, better known by its brand name, Coumadin. Under certain circumstances, it is necessary to stop medical anticoagulation so that the patient can undergo a procedure without the additional risk of enhanced bleeding. Historically, such patients would stop their Coumadin days before the procedure and be placed on another medication called Heparin.
Heparin made it possible to continue anticoagulating the patient right up until the moment it was necessary to stop, because Heparin is short-acting, i.e. its anticoagulation effect wears off within a very short time after its delivery is stopped. Coumadin, on the other hand, would continue to anticoagulate a person for days after it was stopped. So the switch-over from Coumadin to Heparin made it possible for doctors to manage the patient’s anticoagulation much more closely around the time of the invasive procedure.
When I was a medical student, this switch-over was standard procedure. Part of my duties as a medical student was to make sure that all of the patients who were waiting for surgery had stopped their Coumadin and were being properly medicated with Heparin. How did I know that this was the best way to manage these patients’ anticoagulation? Like most medical students and even doctors, I did not take the time to go to the library and look up the original papers that provided the science behind this protocol. I accepted the validity of this protocol simply because my senior resident told me so. And my senior resident accepted this protocol because the chief surgeon told him so.
I think the significance of the statement quoted at the beginning of this post is now clear. I purposely highlighted part of the sentence to emphasize the fact that the present protocol for peri-procedural anticoagulation is based on “expert opinion”. The obvious question now is: what constitutes “expert opinion” if there is no hard science to back up this switch-over protocol? The answer is: nothing. When medical studies cite expert opinion as the basis for a particular protocol, this is the equivalent of saying that tradition and personal experience are the basis for accepting the protocol’s validity. Considering that there are already countless situations in which “expert opinion” was reversed by formal study, one really cannot legitimately rely on expert opinion.
This poses a major problem for medical science. If one investigates the basis of the thousands of protocols that physicians follow, many of them will turn out to be based only on expert opinion. In some cases, small studies have been done to validate a protocol. But even their authors admit that these studies are too small to truly end any further discourse on the topic. Oftentimes, large-scale studies are necessary, and these are extremely time-consuming and expensive.
What does this mean for the average patient? I wish I could say something incredibly comforting, but in practice, medicine is still very much an art form whose practices are passed down from mentor to apprentice, generation after generation. Unless we find a way to dramatically streamline the way formal medical studies are done, we will always find ourselves behind the eight ball. Every doctor will continue to practice in ways that have not been confirmed by sufficient research. There is a reason why physicians are the worst patients.
Are there any alternatives to the standard means of medical research that would provide us with the data necessary to answer all of the pending questions in medicine? At the moment, there really isn’t anything that compares with a large-scale study in which two groups are followed: one group is managed with the treatment in question, and the other group receives a placebo. But, as I stated above, this approach is too slow and too costly to answer all of our questions. I personally believe that collecting data on every single patient could be a reasonable alternative to formal studies.
If medical researchers have access to the entire medical and even personal history of each patient, then relatively new analytical tools can be used to sift through this huge amount of data and look for patterns of significance. If you have 10,000 such patients with similar histories, it is most likely possible to extract meaningful conclusions from the analysis of this data. It is actually possible to run such studies and to compare their results to those of studies done according to the formal method. With enough examples showing similar conclusions between the “massive data collection” approach and the formal study approach, the research community would likely conclude that massive data collection is an acceptable proxy for formal studies. If this conclusion comes to pass, then the genie can truly escape the bottle.
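For readers who like to see things concretely, here is a toy sketch in Python of the simplest possible version of such a comparison. Everything in it is fabricated for illustration: the `Patient` record type, the numbers, and the bare outcome-rate comparison all stand in for the far more sophisticated analytical tools (matching, confounder adjustment, statistical testing) a real retrospective analysis would require.

```python
# Toy sketch: compare an outcome rate between patients who received a
# treatment and those who did not, using fabricated records. A real
# analysis would need cohort matching, confounder adjustment, and
# proper statistics; this only shows the shape of the comparison.
from collections import namedtuple

Patient = namedtuple("Patient", ["treated", "had_complication"])

def complication_rate(patients, treated):
    """Fraction of patients in the given arm who had the complication."""
    arm = [p for p in patients if p.treated == treated]
    return sum(p.had_complication for p in arm) / len(arm)

# Fabricated records standing in for a massive collected dataset:
# 100 treated patients (30 complications), 100 untreated (10).
records = (
    [Patient(True, True)] * 30 + [Patient(True, False)] * 70
    + [Patient(False, True)] * 10 + [Patient(False, False)] * 90
)

print("treated:  ", complication_rate(records, True))    # 0.3
print("untreated:", complication_rate(records, False))   # 0.1
```

The point of the sketch is only that, once the data exists in structured form, the comparison itself is trivial to compute; the hard part is collecting the data and ruling out the biases that formal studies are designed to eliminate.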
If massive data collection holds the answers to many of our questions, the next phase is to get people to accept being monitored 24/7 for most of their lives. Why is such universal and constant monitoring necessary? Because the people who never end up needing medical care become the necessary comparison group against which we can see how the people who do need treatment fare.
Obviously, what I am suggesting is that people accept the value of constant monitoring. People would have to appreciate that the benefits to all more than outweigh the potential for leaks and thus breaches of privacy. I personally accepted this long ago and have no qualms about my personal and medical information becoming part of a worldwide database intended for medical research.
I personally believe that the younger generation, already so comfortable with sharing their private moments on Facebook, WhatsApp and every other social medium, will actually have no problem with having their data constantly uploaded. What they will want in return is information. They will want to be able to see what data has been collected and what conclusions have been drawn. These new-generation data sharers will want access to an online dashboard that shows how their personal data is being used and what the research outcomes mean for them.
Perhaps this will become one of the primary roles of physicians in a future where computer systems can answer all of our questions by analyzing massive collections of data. The purpose of a doctor in this future world will be to help the patient ask the appropriate questions. In some ways, this would be similar to the role of a librarian who assists a student in finding the best collection of books to answer the question posed. Knowing how to ask the right question is extremely important in this scenario. And it could very well be that future doctors will have to become experts in this very specific task. Considering that asking the right question could make the difference between a life-saving answer and a useless one, it seems that physicians will still have a very important role to play in the decades to come.
For those of you who did not break out into song when I first mentioned the term “tradition”, I think it’s been too long since you’ve watched “Fiddler on the Roof”.
Thanks for listening.