Health + Tech Perspective: Access to Symptoms > Better Algorithms

 

Here’s a dangerous meme that I keep running into:

“A doctor’s job is basically to look at symptoms, make a diagnosis, then prescribe treatment. Medicine is just a big decision tree — ripe for optimization and automation.”

Of course, it’s mostly my friends in tech saying these things.

It’ll come as no surprise that the medical community rejects this viewpoint. The prevailing view in healthcare is that tech will streamline some business processes and create some new sensors and devices, but never “disrupt” the industry as a whole. When I ask why, the answer usually comes down to “human factors.”

Let’s talk human factors, because I’m mostly with the doctors on this one: “medicine = diagnosis + treatment” is a limited, doomed-to-failure approach to healthcare. Techies who think this need some fresh perspective.

Yeah, I’m going to be this guy for a bit.

The Straw Man

Here’s the limited view of medicine that I’m arguing against:

1. Diagnosis = f(Symptoms)

2. Treatment = f(Diagnosis)

3. Success = f(Diagnosis, Treatment)

I can tell when someone subscribes to this model, because they always gravitate to the same playbook:

  1. Gather data about symptoms
  2. Use machine learning to minimize the error rate on all your f’s
  3. Profit

This model is just right enough that I keep thinking about it, and so wrong that I grind my teeth every time I do.
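
To make the straw man concrete, here’s roughly what that three-step playbook looks like when someone sits down to write it. This is a minimal sketch with synthetic data and an off-the-shelf scikit-learn decision tree; the symptom flags and the “diagnostic rule” are invented for illustration, not drawn from any real clinical dataset.

```python
# A minimal sketch of the straw-man playbook, in scikit-learn.
# Everything here is synthetic: the symptom flags and the "diagnostic
# rule" below are invented for illustration, not real clinical data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Step 1: gather data about symptoms.
# Pretend each row is a patient and each column is a binary symptom flag.
X = rng.integers(0, 2, size=(1000, 20)).astype(bool)
y = ((X[:, 0] & X[:, 3]) | (X[:, 7] & ~X[:, 2])).astype(int)  # made-up rule

# Step 2: use machine learning to minimize the error rate on f.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = DecisionTreeClassifier(max_depth=5).fit(X_train, y_train)

# Step 3: profit.
print(f"Diagnosis = f(Symptoms) accuracy: {model.score(X_test, y_test):.2f}")
```

On toy data like this, the playbook works beautifully, which is exactly the trap: the hard part was never the fit call.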

There’s way too much here to cover in a single post, so I’m going to focus on the very top of the data funnel: Access to Symptoms.

Access to Symptoms

In the real world, “symptoms” don’t arrive on a silver platter served up by an EMR. They’re the product of a messy human process of observation and deduction.

1. Diagnosis = f(Available symptoms)

2. Available symptoms = f(Physical contact, trust, history, context)

  • Many symptoms require physical contact with the patient. Doctors ask “Does it hurt when I touch here?” all the time. Can your AutoDoc 5000 do the same? If not, you’re at a serious disadvantage for diagnosis. After all, Clean data > More data > Better algorithms. (There’s a quick sketch of this point after the list.)
Here’s where to place a stethoscope to listen to a patient’s heart. Physical contact matters in medicine.
  • Trust matters. I spent the last year leading development of algorithms/data systems at Aspire Health, a fast-growing provider of in-home nursing services. Despite the fact that our nurses were very busy, our lead physician always counseled them to “spend the first 10 minutes talking about the pictures on the mantle.” He knew that our ability to understand and treat was sharply constrained by relationships.
  • It’s not always easy to discern what’s a symptom. Remember House? He’s the Sherlock Holmes of medicine. Like Sherlock, his superpower is the ability to “separate the relevant and important facts from the unimportant or accidental.” Many episodes in House turn on a deliberate search for contextual information: breaking into the patient’s apartment, or DNA tests from the patient’s parents. House is fiction, but perceptive fiction: context matters for diagnosis.
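
Here’s a rough sketch of how that gating plays out, reusing the same synthetic decision-tree setup from the earlier sketch: the algorithm stays exactly the same, but we hide the handful of symptoms that (in this made-up example) only surface with physical contact and trust. The split between “remote” and “exam-room” symptoms is an assumption for illustration, not a real clinical taxonomy.

```python
# A rough sketch of symptom access gating model quality. Same synthetic
# setup as the straw-man example above; the split between "remote" and
# "exam-room" symptoms is invented for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 20)).astype(bool)
y = ((X[:, 0] & X[:, 3]) | (X[:, 7] & ~X[:, 2])).astype(int)

# Pretend symptoms 3 and 7 only surface with physical contact and trust
# ("does it hurt when I touch here?"), so a remote system never sees them.
feature_sets = {
    "all symptoms": list(range(20)),
    "remotely reported symptoms only": [c for c in range(20) if c not in (3, 7)],
}

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
for label, cols in feature_sets.items():
    model = DecisionTreeClassifier(max_depth=5).fit(X_train[:, cols], y_train)
    print(f"{label}: accuracy = {model.score(X_test[:, cols], y_test):.2f}")
```

The model in the second run isn’t dumber; it just never got to ask “does it hurt when I touch here?” That’s the sense in which access to symptoms sits upstream of better algorithms.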

Recap

For these reasons and many others, “slap a decision tree on it” is the wrong model for health+tech. Yes, clinical decision-making needs improvement — in some areas, desperately — but tech’s ability to improve decision-making is gated on upstream access to symptoms.

Access to symptoms depends on physical contact, relationships, and broader context, not to mention data interoperability that doesn’t suck.

At this stage of the game, the real challenges for clinical decision support are data problems, UX problems, and service delivery problems, not pure machine learning problems.

The problem in microcosm: a nurse holding a thick binder of patient data (which can’t be shared meaningfully online) talking to a robot tele-doctor (who can’t actually touch a patient).

More to come…

 

Original Source.

Abe Gong


I solve human-centric problems with data. I build teams that ask the right questions, and data systems to get the right answers.

I am deeply committed to unlocking human potential with technology. I believe that ubiquitous computing offers great opportunities to understand and improve ourselves---identity, habits, and relationships---in ways never before possible. A lifelong student of the developing science of behavior change, I speak and write regularly on the intersection of data science, human behavior, and the Internet of Things.

I'm an expert at building and deploying world-class data products. My technical skills cover full-stack web development, “Big Data” architecture and engineering (NoSQL, AWS web infrastructure, Hadoop, and Spark), and a multi-disciplinary analytical toolset (machine learning, econometrics, Bayesian statistics, and research design). More importantly, I know how to manage the technical roadmap for smart, personalized products. Great data products aren't built all at once---they're bootstrapped, by aggregating and refining data and feedback from many sources. I know how to pick the right methods and tools for each stage, to rapidly unlock new capabilities and experiences on a steady cadence.

I understand how culture shapes data, and data shapes organizational culture. I'm comfortable building healthy, productive team dynamics in a wide variety of settings---from research and engineering teams, to grassroots activism and policy initiatives, to government and non-profit organizations.

If you want to build great things with data, let’s talk.
