Thursday, September 06, 2007

Problem Knowledge Couplers

We’ve been running some posts on material that’s been coming out lately on how doctors decide diagnoses and subsequent appropriate treatment with the goal of improved health. Our point has been that that material relates very well to how judges decide offender culpability and risk and subsequent appropriate treatment with the goal of improved public safety. Many of the same cognitive traps that doctors fall into have already snared judges, and many of the remedies proposed for the doctors would apply equally well to judges. Well, here’s another article discussing doctors, diagnoses, and treatment for preferred outcomes that holds potential, I think, for what we do in corrections sentencing as well. Some of the key points made:

According to a 2003 Journal of the American Medical Association review of autopsy studies, doctors misdiagnose 8% to 24% of the time. Cognitive errors, such as latching onto a diagnosis that seems the most likely without considering other possibilities — which experts call "anchoring" — are among many root causes, according to Jerome Groopman, chairman of experimental medicine at Harvard University and author of the book How Doctors Think.

The solution for some is technology. Doctors are increasingly using the Internet, even search programs as basic as Google, when they're stumped, according to "Googling for a Diagnosis," a British Medical Journal study last year.

Although Isabel [a diagnostic decision-support system] is used in only 18 hospitals, interest in similar decision-support systems is growing in the medical community, according to the American Medical Informatics Association. Priced at about $750 per hospital, Isabel is considered a robust tool, highly rated by the Healthcare Information and Management Systems Society.
[...]
When doctors diagnose, they "match" a patient's symptoms against the patterns of several likely diseases, narrowing down the list as they go, according to Lawrence Weed, professor emeritus of medicine at the University of Vermont. This decision-making process, known as combinatorial thinking, involves juggling too much information for it to be successful without the aid of technology, Weed says.

"The mind can't possibly deal with the complexity of the problem that a patient presents," Weed says. "What if you said, 'Let's give (doctors) eight years of geography at Harvard and then let them to drive across the country without a map'?"

Weed developed Problem Knowledge Couplers — a technology that "couples" patient symptoms with relevant medical literature. With Couplers, patients can enter their symptom information into the Web-based tool and walk through the medical knowledge with their doctor. About 50 private employers now provide access to Couplers.
[...]
Groopman, however, fears that placing too much emphasis on technology will take the spotlight off clinical judgment.

"With these cookbook-type recipes for diagnosis and treatment, the risk is that's it's garbage in, garbage out," Groopman says. "They're only as good as the physician who is identifying what the key symptom or key finding is about the patient."

Many doctors who use Isabel argue that technology doesn't supersede individual judgment but rather acts as an aid in the decision-making process.

Hear any echoes of the arguments we always get about sentencing guidelines? See why this doctor stuff is so relevant to what we do? I’m bringing this article to your attention because it coincides well with some small movement going on in our field, such as Judge Marcus’ work on “smart sentencing” in OR and a recent meeting of East Coast sentencing commission states to talk about data sharing and finding commonalities that could allow comparative analysis and broader study of offenses and effective sentencing. We hear concerns raised that go beyond the “don’t make sentences by computer” complaints to the “our burg II is different from your burg II, so our data can’t ever be matched for useful comparison” objection. There’s truth to that, of course, but, having worked with crime codes in three states now, I think that fear is overrated and can be dealt with.

I do think it’s possible to set up a system that lets people in different states tune into a database and learn that the usual sentence for a 22-year-old guy convicted of burg II (generally defined, with provisos abundant as necessary in the end) with two nonviolent priors and one violent prior (what counts as nonviolent or not being one of many areas that raise eyebrows across states) is such-and-such, and that that guy returns to correctional supervision within 36 months 21.6% of the time, on average. If he’s married, has a diploma, has done military service, whatever, those would be nice add-ons and theoretically possible if the system got wide support, but the basics could be available for practitioners to use and academics to work statistical magic on.
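Just to make that concrete, here’s a bare-bones sketch in Python of the kind of query such a shared database would have to answer. Everything in it is invented for illustration: the table layout, the field names, the “BURG2” crosswalk code, and the handful of rows are all hypothetical, and the real work would be getting states to agree on the offense and prior-record definitions behind them.

```python
# Hypothetical sketch only: a toy stand-in for a shared multi-state
# sentencing database. Schema, codes, and numbers are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sentences (
        state             TEXT,     -- contributing state
        offense_code      TEXT,     -- crosswalked offense, e.g. 'BURG2'
        age               INTEGER,
        priors_nonviolent INTEGER,
        priors_violent    INTEGER,
        sentence_months   INTEGER,  -- sentence imposed
        returned_36mo     INTEGER   -- 1 if back under supervision within 36 months
    )
""")

# A few made-up rows standing in for pooled data from several states.
conn.executemany(
    "INSERT INTO sentences VALUES (?, ?, ?, ?, ?, ?, ?)",
    [
        ("WI", "BURG2", 22, 2, 1, 24, 0),
        ("MD", "BURG2", 22, 2, 1, 30, 1),
        ("OR", "BURG2", 23, 2, 1, 18, 0),
        ("WI", "BURG2", 21, 2, 1, 24, 0),
        ("MD", "BURG2", 22, 2, 1, 36, 1),
    ],
)

# "What's the usual sentence for a 22-year-old burg II guy with two
# nonviolent priors and one violent prior, and how often does he come back?"
n, avg_months, pct_return = conn.execute("""
    SELECT COUNT(*),
           AVG(sentence_months),
           100.0 * AVG(returned_36mo)
    FROM sentences
    WHERE offense_code = 'BURG2'
      AND age BETWEEN 21 AND 23          -- small band around the profile
      AND priors_nonviolent = 2
      AND priors_violent = 1
""").fetchone()

print(f"{n} matching cases: average sentence {avg_months:.1f} months, "
      f"{pct_return:.1f}% back under supervision within 36 months")
```

The point isn’t the code; it’s that once the definitions are crosswalked, the “usual sentence” and “comes back X% of the time” answers are a few lines of aggregation that practitioners could query and academics could build on.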

But we’re not just talking quantitative data systems here. Most judges already know what the usual sentence is for that guy, at least in his/her jurisdiction. In my experience, it’s not the usual case that judges need help with (although, as the article reports, doctors often think they’ve made the best diagnosis and are wrong, so why shouldn’t judicial decisions that continue to produce bad recidivism results also have light shined on them?). It’s the unusual cases, like the ones it sounds like the doctors in this story use the “Couplers” for, that judges express interest in having help with. Those, by definition, don’t lend themselves well to quantitative analysis, but a repository of qualitative information of the sort this article describes, maintained perhaps by eager law or other grad students mining case decisions in some or all of the states, might produce valuable, searchable materials just like the ones these doctors receive.
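And for the qualitative side, a first cut could be equally modest. Here’s another purely hypothetical sketch, this time of a keyword search over short write-ups of unusual cases; the case IDs, the summaries, and the tags are all made up, and a real repository would stand or fall on how carefully those write-ups were mined and tagged in the first place.

```python
# Hypothetical sketch only: a toy keyword search over short case write-ups,
# standing in for a searchable repository of unusual-case information.
from collections import defaultdict

# Invented example entries of the kind grad-student "miners" might contribute.
CASES = {
    "OR-2006-113": "burglary II, elderly defendant, dementia diagnosis, restitution-centered disposition",
    "MD-2005-078": "burglary II, juvenile waived to adult court, extensive treatment history",
    "WI-2007-241": "burglary II, veteran with PTSD, deferred prosecution with treatment condition",
}

# Build a simple inverted index from keyword to case IDs.
index = defaultdict(set)
for case_id, summary in CASES.items():
    for word in summary.replace(",", " ").lower().split():
        index[word].add(case_id)

def search(*keywords: str) -> list[str]:
    """Return IDs of cases whose write-ups contain all the given keywords."""
    matches = [index.get(k.lower(), set()) for k in keywords]
    return sorted(set.intersection(*matches)) if matches else []

# A judge facing an unusual case looks for anything comparable:
print(search("burglary", "ptsd"))   # -> ['WI-2007-241']
```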

No one’s talking guidelines or mandatory anythings here. Just the development and dissemination of a widely available information resource to aid decision-making and to allow accountability for the decisions made, in guidelines states or not. Knowing that quantitative data are available, or that qualitative information about similarly odd cases is at hand, removes the justification of “well, I made that mistake because I didn’t know better,” while at the same time making clear just how complicated sentencing is and how poorly it fits “one size fits all” approaches like traditional guidelines. IOW, this kind of system could provide an alternative, along the “sentencing information systems” line being advocated by Steve Chanenson and Marc Miller, to guidelines, the single fall-back reform we have now. It’s at least worth putting some time and resources behind, isn’t it?

Any friendly little philanthropic orgs out there reading this?
