Re-researching research
In past blogs, I have spoken about the way in which medical research is done. Very simply, one identifies two equivalent groups of patients. One group is given a placebo [non-active] treatment, and the other group is given the actual treatment being studied. This sounds rather simple, but in real life, this type of research involves many steps and can easily cost tens of millions to hundreds of millions of dollars to complete, depending on the specific treatment being studied. There are also serious limitations on what types of studies can be done, because we cannot experiment on humans in ways that would put them at risk. For example, we can’t withhold a life-saving medication in order to see if a new treatment is really effective.
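To make this concrete, here is a minimal sketch [in Python] of the comparison at the heart of such a trial. The numbers are simulated, not real patient data; the group size, effect size and outcome scale are purely illustrative assumptions:

```python
# Simulated two-arm comparison: all numbers are illustrative, not real data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500  # hypothetical number of patients per arm

# Outcome, e.g. change in blood pressure; the treatment shifts the mean.
placebo = rng.normal(loc=0.0, scale=10.0, size=n)
treated = rng.normal(loc=-3.0, scale=10.0, size=n)

t_stat, p_value = stats.ttest_ind(treated, placebo)
print(f"mean difference: {treated.mean() - placebo.mean():.2f}")
print(f"p-value: {p_value:.4f}")
```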
Another problem with modern-day research is that we have come to the realization that people’s physiology differs sufficiently from one person to the next that population studies on hundreds to tens of thousands of patients may in fact totally miss important findings. If a medication works very well on people who have a specific gene mutation, this critical finding could easily be lost in the mountains of data that come out of large studies that include patients with a whole range of genetic profiles.
None of these variations in genetic code makes a person ill. But they will have a direct effect on how these individuals respond to specific medications. So until we are effectively doing a full genetic screen on every person undergoing treatment, we may never find better treatments for specific groups of patients. I can tell you that this single realization is extremely disheartening. It can also explain how one study can find significantly different results from another, especially when the second is done in a geographically distant population [which often also possesses significant differences in genetic makeup].
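A tiny simulation can show how a genuine benefit in a genetic subgroup nearly disappears in the pooled numbers. Everything here is invented for illustration: the mutation frequency [10%], the size of the benefit, and the outcome scale are assumptions, not data from any real study:

```python
# Invented data: a drug that helps only carriers of a hypothetical mutation
# found in ~10% of patients. The pooled average dilutes a large real effect.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000                          # patients per arm (illustrative)
carrier_t = rng.random(n) < 0.10    # carriers in the treated arm
carrier_c = rng.random(n) < 0.10    # carriers in the control arm

# Carriers get a large benefit [-8 units]; everyone else gets none.
treated = rng.normal(np.where(carrier_t, -8.0, 0.0), 10.0)
control = rng.normal(0.0, 10.0, size=n)

print("pooled effect:  ", round(treated.mean() - control.mean(), 2))
print("carriers only:  ", round(treated[carrier_t].mean()
                                - control[carrier_c].mean(), 2))
```

In the pooled comparison, the average benefit looks negligible; among carriers, it is large. This is exactly the kind of finding that gets buried.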
Very interestingly, we are at a point in history when the “quantified self” is already a term being readily passed around. These days, a watch that only tells time seems antiquated. There is a growing expectation that any wearable should also have some sensing capability. This explains the explosion in medical information that is expected to come at us within a matter of years. Entire infrastructures have already been created that can handle the input from the constant monitoring of millions of people. These infrastructures are not yet ready to handle such an influx of information from billions of people, but within a few years, this too will be possible.
In the near future, we will have practically moment-by-moment health information about millions of people. One question that comes to mind is whether such information can be used for research. For example, many people may choose not to take a certain medication [like one for high cholesterol] even though their doctor recommends it. In the near future, it will be possible to compare all those patients who do not take the recommended medication versus those who do. The assumption is that there isn’t some basic physiological factor that causes one person to agree to take the medication and another to refuse.
So basically, we would have two populations of people who can be matched up, one to the other, on the basis of tens of physiological parameters or more. We can get to a point where almost the only difference between these individuals is the fact that one is taking the medication and the other is not. With genomic information, we can even dissect these million-patient groups into genetically matched subgroups. For example, some people in each group may have a greater likelihood of responding to a cholesterol-lowering medication. This subgroup of individuals could be selected out for their own mini-study. The potential of doing such research is astounding. In practice, it might be the only ethical way to do certain studies.
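As a rough sketch of how such matching might work in code, here is a toy example that pairs each medication taker with the most similar non-taker across a few physiological parameters. The column names, values and plain distance-based matching are my own illustrative assumptions; a real system would standardize the features, use many more covariates [or propensity scores], and run over millions of records:

```python
# Toy matching of medication takers to their most similar non-takers.
# Columns, values and Euclidean distance are assumptions for illustration.
import pandas as pd
from scipy.spatial import cKDTree

# Hypothetical patient records.
patients = pd.DataFrame({
    "age":          [54, 61, 58, 49, 63, 55],
    "bmi":          [27.1, 31.0, 29.4, 25.2, 30.8, 27.5],
    "ldl":          [160, 180, 170, 150, 182, 158],
    "takes_statin": [True, False, True, False, False, True],
})

takers    = patients[patients["takes_statin"]]
nontakers = patients[~patients["takes_statin"]]

# Find, for each taker, the nearest non-taker in feature space.
features = ["age", "bmi", "ldl"]
tree = cKDTree(nontakers[features].to_numpy())
dist, idx = tree.query(takers[features].to_numpy(), k=1)

for (_, taker), d, i in zip(takers.iterrows(), dist, idx):
    match = nontakers.iloc[i]
    print(f"taker (age {taker['age']}) matched to non-taker "
          f"(age {match['age']}), distance {d:.1f}")
```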
The potential of capturing all of this information extends also to predicting the likelihood of a patient responding positively to a treatment. Imagine you have a patient who has just had a heart attack, and the question is whether this patient should take a special medication to thin the blood. The doctor could search through a huge database of medical information and find a large group of individuals who effectively match the patient sitting before the physician. The doctor could then look into these matched patients’ medical histories and see how they responded to the treatment in question. If the matching patients responded well, the doctor would tend to offer the treatment to the patient presently being cared for. On the other hand, if the experience of other patients was negative, the doctor would tend not to prescribe it.
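In code, this “patients like mine” lookup could be sketched as a nearest-neighbor search over historical records. Again, the features, the choice of three neighbors, and the response labels are all hypothetical, included only to show the shape of the idea:

```python
# "Patients like mine": find the k most similar historical patients and
# check how they responded. Features, k, and labels are hypothetical.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Historical records: [age, systolic blood pressure, LDL cholesterol]
history = np.array([
    [62, 145, 170],
    [58, 150, 160],
    [70, 160, 190],
    [65, 148, 175],
    [55, 135, 150],
])
responded = np.array([1, 1, 0, 1, 0])  # 1 = responded well to the drug

new_patient = np.array([[63, 147, 172]])

nn = NearestNeighbors(n_neighbors=3).fit(history)
_, idx = nn.kneighbors(new_patient)
rate = responded[idx[0]].mean()
print(f"response rate among the 3 most similar patients: {rate:.0%}")
```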
There are many other examples of the power of doing research in this fashion. I expect to cover these topics in future blog posts. Almost every day, our potential to diagnose and treat disease grows, specifically due to our ever-increasing ability to collect, save and analyze huge quantities of data. The sky is not the limit. It’s fair to say we’re not even sure where the limit lies. And that is a very good thing.
Thanks for listening