Where there’s smoke, there might just be smoke

One of the many thousands of principles in Jewish law is that you should not mislead people. This is a separate principle from not lying to people, not cheating people and the like. If you are asked a question, and you answer in such a way that the other person will most likely misinterpret your words, that would be considered misleading.

The legality of misleading someone is quite distinct from the ethics of it. You could easily argue that what was said was equivalent to advertising, which is expected to exaggerate certain claims. You could argue that your intent was only to gain someone's attention, with the expectation that the person would investigate further before making any kind of decision. From a legal point of view, it is unlikely that someone would suffer any consequences for the simple act of misleading another. But from an ethical point of view, I think it is clear how wrong and dangerous this can be.

One of my primary reasons for writing this blog is to dispel myths that still readily flow between people, with a far greater “infection rate” than the flu or Ebola. At some point during my career as a physician, I lost track of how many patients refused vaccinations because of a single paper that they read online. At different points throughout the years, the claimed dangers associated with vaccination varied. But the end result was always the same – an otherwise intelligent and well-read person decided that their children would not be vaccinated. For some reason, the same parent was more than willing to trust the physician when it came to any other medical issue. But when it came to vaccinations, the parent insisted that they “knew the truth”.

In an excellent piece by Paul Sonnier, he notes that he had recently read an alarmist op-ed with the title “The New Health App on Apple’s iOS 8 is Literally Dangerous.” I have already discussed Apple’s new HealthKit and its potential to help countless people. Understandably, a title like this caught my eye as well. As it turns out, of the thousands of health tracking apps in the Apple app store, there was one that was considered dangerous because it had the potential to exacerbate obsessive behavior in people with obsessive compulsive disorder [OCD]. Apparently, this particular app cannot be removed. Therefore, one could argue that owning an iPhone could be problematic for some people with OCD.

I need to be brutally honest and admit that I really don’t like Apple products. There are many reasons for this, and the list has grown over the years. Nevertheless, I am also the first to admit that Apple products are of excellent quality, deliver a unique and enthralling user experience, and that the iPhone alone changed the world forever, in a positive way. Independent of my personal feelings, to argue that iOS 8 is dangerous is horribly misleading, and could in fact hurt people. If Apple does succeed in becoming the common pathway for all medical information from every EMR on the market, then Apple will save lives on a day-to-day basis. Misleading people into thinking that Apple’s software is nevertheless dangerous to the public should be criminal.

Who ultimately decides whether an app or desktop program or any other form of software is medically problematic? Ultimately, the makers of the app in question above should be able to apply to a committee and receive some form of certification. That certification should indicate what role the app plays [general information, intended only for physicians, diagnostic, for treatment, etc.]. Then, if someone makes an inappropriate claim about the app, they could be legally challenged. If the designer of the app claims that it can be used to diagnose a disease, but the app is not certified to do so, this is grounds for bringing charges. Conversely, if someone claims that the app is dangerous and should not be used for a given purpose, the person making the claim could be sued, depending on the certification status of the app.

Generally speaking, the FDA regulates medications and treatments in the United States. Until a few years ago, I suspect that few people ever thought that the FDA would have to monitor and certify software of the kind available today. Considering that apps are a relatively new entity, the legalities of their use are still not fully defined. To make things more difficult, new apps are being released on a daily basis. Imagine an FDA reviewer coming in on a Monday morning, and seeing 100 emails about new medically related apps that need to be classified and then certified. The task is Sisyphean.

There are protocols for classifying software and apps. That is why you will sometimes see a notification on an app that says “this is not meant to be used for diagnostics”. I especially like this notification on specialized software for viewing x-rays. I have seen many a radiologist use x-ray viewing software that clearly states “this is not intended for clinical use”.

To be clear, these specific software tools go to great lengths to explain how they adhere to all of the very strict standards for viewing x-rays. And these same tools are specifically designed to display x-rays that have been formatted for clinical use in hospitals and by physicians in their offices. But clearly, the companies making this software did not want to deal with any certification process. The developers relied on the fact that doctors would trust the software once they read its specifications.

I have used such software many times and it is more than adequate for reviewing films. This is unfortunately one of those cases where a company with an excellent product is frightened away by occasionally overzealous regulation. I guess it is fortunate that doctors can still make certain decisions on their own, even if it means cutting corners, as long as it is for the patients’ benefit.

As time goes on, more and more of our care will be provided by software. It might be software that a doctor uses to look up a protocol [like the well-known medical software called Google]. It might be software that drives a robot and is wholly responsible for making sure that a patient is kept safe throughout a surgery.

Getting FDA or other clearance for such software might be considered fundamental to its implementation in a hospital. But just as certified MRIs can break down, and pins used to hold fractures in place can themselves snap, software will not be perfect. For all of the testing that can be done to verify the correct functioning of software before it is released, no system is perfect. The question will be the same as for everything else in medicine: is the benefit significantly greater than the cost/risk? If the answer to this question is yes, then the system should be used.

This single question about cost versus benefit may sound like an oversimplification of what is a very complicated situation. But it ultimately really is the only question of importance. So, the next time you see a blaring headline purporting to warn you of some hidden danger, take a moment and ask the important questions before deciding how to act.

Thanks for listening

About the Author
Dr. Nahum Kovalski received his Bachelor of Science in computer science and his medical degree in Canada. He came to Israel in 1991 and married his wife of 22 years in 1992. He has three amazing children and has lived in Jerusalem since making Aliyah. Dr. Kovalski was with TEREM Emergency Medical Services for 21 years until June of 2014, and is now a private consultant on medicine and technology.