Friday, August 17, 2018

My Father Killed Himself — Advice from a Physician and Suicide Survivor.

Originally published in MedPage Today August 16, 2018 (https://www.medpagetoday.com/psychiatry/depression/74589)

“My father killed himself when I was 13 years old.” That was the opening line of my medical school admission essay, the one in which I tried to convince an anonymous administrator in another state that, as a suicide survivor, I was specially possessed of both strength and empathy—and that these traits made me specially worthy of becoming a doctor. In the essay I reminisced about the fun times with my dad, like when he would paint and I would sit on the floor poring over his medical textbooks. I described the medical diagrams he sketched on the backs of napkins at the KFC, when he took me out to dinner on our mandated Thursdays and every other weekend. My parents were divorced by then, and these were the only times I got to see my dad. I treasured them.

I didn’t tell the whole story in that essay, though. I didn’t mention how, a week before his death, my dad showed me where his will was located. “Just in case something happens to me,” he said. I also left out the night that he attempted suicide the first time. He called my mom to tell her he’d taken pills. She got the police to trace the call to a hotel. They found him, rushed him to the hospital, and pumped his stomach. He lived two more days, until he hanged himself. I didn’t tell those stories. I was focusing on the positive, you see. I wanted to get in. The real story was too messy and painful for the confines of a university admission essay.

In my first year of medical school we learned the warning signs of suicidality, and how to intervene by asking “are you thinking of hurting yourself?” and “do you have a plan?” That was when I realized that his showing me the will, and his first attempt, were our chances to intervene. In medical school I learned how to contract for safety and, maybe, save a life. I learned that suicide is preventable, and that public safety measures—restricted access to guns, the installation of barriers on tall buildings and bridges—have not only thwarted individuals but have also lowered the overall suicide rate in entire regions where they have been implemented. I learned that most patients who survive a suicide attempt do not go on to try again and die by suicide. Learning how to prevent suicide made me feel empowered, but it also made me feel guilty, because I didn’t know these things when I was 13, and I wish I had.

I see the bodies of the suicidal on my autopsy table every week: the hangings, the incised wounds and gunshot wounds to the head and chest. Family members, survivors, are often in denial. The manner of death “just doesn’t make sense.” Sometimes I will confide that I am a survivor too, and that, even after 35 years, my father’s death still doesn’t make sense to me, either. I’ve been criticized for writing that suicide is a “selfish act”—but I stand by those words. Not out of a sense of abandonment by my dad, but because I too have lived long enough to be suicidal myself. In those moments, when suicide “made sense,” I was so engrossed in my own pain that the tunnel vision of depression made everything else—and everyone else—irrelevant. In that way the suicidal person is selfish, as in “self-focused”: unwilling to believe that their pain can fade over time, and that other people rely on their presence in this world. Having come through those periods myself with the unwavering love and support of my mother and, later, my husband, has made me stronger and more empathetic. I have overcome my father’s suicide, even if the wound it left in me has never fully healed.

I have learned that it helps to talk openly about it. The more I speak out about my father’s death, the more I discover friends who have also been touched by suicide. Silence = death when it comes to suicide, just like with HIV/AIDS. By talking openly about suicide, we can overcome the stigma associated with mental illness, and signal to those who may be suffering and suicidal (even when they aren’t mentally ill, but just going through a difficult time) that it’s okay to reach out for help. Others like me have been through it. We’ve made it to the other side. It may be hard to imagine—but it gets better. 

If you are thinking of hurting yourself, or you know someone who needs help, call the National Suicide Prevention Lifeline at 1-800-273-8255.


Bio: Dr. Judy Melinek is a forensic pathologist in San Francisco, California, and the CEO of PathologyExpert Inc. She is the co-author with her husband, writer T.J. Mitchell, of the New York Times bestselling memoir Working Stiff: Two Years, 262 Bodies, and the Making of a Medical Examiner. They are currently writing a forensic fiction series entitled First Cut. You can follow her on Twitter @drjudymelinek and Facebook/DrJudyMelinekMD. 

Monday, May 21, 2018

Reasonable Uncertainty: The Limits and Expectations of an Expert’s Testimony

Originally published in Forensic Magazine September 2017, 14(3):18-19 (https://www.forensicmag.com/article/2017/09/reasonable-uncertainty-limits-and-expectations-experts-testimony)


The lawyer had a gleam in his eye. He had backed me into a corner—or so he thought.


“So, doctor, you just said that the article you referenced indicates that only twenty percent of patients with this disease die a sudden cardiac death, correct?”
“Yes.”
“And your testimony is given under the standard of more likely than not, right?”
“That is correct.”
“That’s the same thing as a balance of the probabilities—more than fifty percent likelihood, correct?”
“Yes.”
“Yet less than half of patients with this disease die of the cause you are advocating. How can you possibly testify that it is more likely than not?”

I have faced versions of this question over the years from countless attorneys and I can’t always tell whether they are truly confused by statistics and probability or whether they think I am. They are employing an unsound argument called the ecological fallacy: applying a statistical finding generated by a large population to an individual case. For people who have no understanding of math it “makes sense.” For an expert, these types of questions are an opportunity to educate the judge and jury—and, perhaps, even the questioning attorney—about the science of statistical power and how probability really works.
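The fallacy can be made concrete with a toy conditional-probability calculation. The numbers below are invented for illustration (only the “twenty percent” figure comes from the courtroom exchange above; the rest are assumptions): the point is that the expert is not opining on a patient drawn at random from the cohort, but on a decedent already known to have died suddenly with no competing cause found at autopsy.

```python
# Illustrative numbers only; apart from the 20% figure quoted in court,
# these are hypothetical, not drawn from any cited study.
cohort = 1000                  # patients with the disease
sudden_from_disease = 200      # 20% die a sudden cardiac death of the disease
sudden_no_other_cause = 20     # hypothetical: sudden deaths in the cohort
                               # where autopsy finds no cause at all

# The population base rate (20%) describes a random patient. The expert,
# however, is conditioning on what is already known about this decedent:
# sudden death, disease present, nothing else found at autopsy.
p = sudden_from_disease / (sudden_from_disease + sudden_no_other_cause)
print(f"{p:.0%}")  # 91%: well past "more likely than not"
```

Under these (made-up) assumptions, a disease that kills only one patient in five still accounts for roughly nine out of ten deaths that look like this one, which is exactly why the base rate alone cannot defeat the opinion.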

Courts require experts to testify “within reasonable scientific certainty.” Sounds legit, right? It isn’t—not to scientists. “Reasonable scientific certainty” holds no currency in the scientific community. The National Commission on Forensic Science was a federal body created in 2013 to address conflicts of professional culture between the law and the sciences. Before its charter expired in April 2017, the NCFS released a document that called for the cessation of the use of the term “reasonable scientific certainty” in forensic expert testimony. (https://www.justice.gov/archives/ncfs/page/file/641331/download). There is no agreement among experts on the actual meaning of the phrase. For one thing, nothing in science is based on reason alone; it’s based on evidence and testing. Science is rarely certain—it relies on statistical probability, and acknowledges that outliers can and do exist. We forensic scientists operate in a liminal space between science and law. In the past, we have had to accommodate our professional rhetoric to the demands of attorneys by accepting the use of this phrase when we testified. Why have we done so? Because the cost of rejecting this accommodation was to open ourselves to attack. An opposing lawyer would declare that we have no credibility as an expert, and the judge might dismiss our testimony as not adhering to evidentiary standards. If you can’t testify to something with “reasonable scientific certainty” because the evidence is insufficient, the court might throw out all your testimony rather than allow you to testify to the uncertainty of the science.

Okay, I hear you: This is absurd! Uncertainty is not a sign of poor scientific testimony—it’s the hallmark of the honest scientist! Our paradox arises from the United States Supreme Court’s Daubert decision, which limited the bounds of scientific opinion testimony by expert witnesses. Daubert v. Merrell Dow Pharmaceuticals (1993) set the federal standard for admissibility of evidence. It never required “reasonable certainty,” but it did set guidelines for testimony to be admissible if the science is reliable. What did the Supreme Court say is reliable? Reliable expert testimony in science and technical fields must be testable and subject to peer review, must have a known or potential error rate, must be governed by standards, and must be generally accepted by the relevant scientific community.

As a consequence of Daubert, forensic science disciplines that rely on inference and experience (such as forensic pathology) have been subjected to accusations of unreliability, because practitioners have no published error rate and they incorporate ancillary evidence—such as information from witnesses or police—which can be inaccurately perceived as a generator of cognitive bias. While testimony about the likelihood of a particular event being within a 95% confidence interval may be appropriate when describing epidemiological research or while performing lab studies on fruit flies, these types of statistics have no bearing on the day-to-day work of forensic disciplines such as pathology and criminalistics. Our branches of science rely on training, experience, observation, and scientific inference. There is ample inferential literature to support our observations, but not enough statistics to allow us to report on our own error rate. We don’t operate in the zone of experimental science. We can’t run double-blind tests on murdered human beings. We can’t generate fatal industrial accidents to study the mechanical dynamics at play. And so, thanks to the Daubert ruling and the persistent repetition by lawyers of the magical phrase “reasonable scientific certainty,” we have to sit up there on the stand and teach juries about statistics and probability, about inductivism and the scientific method.

Humility is baked into scientific semantics. There are things that are not knowable based on the current state of your field of specialty. But if you are asked a question on the stand and your answer is just “I don’t know,” then some lawyer will find an “expert” with no credentials or integrity to follow you and declare “I know!” In my experience, juries and lawyers prefer the expert who is willing to speak with confidence and certainty. They will defer to the voice who “knows,” even if it is not based in good science. It’s not good enough to say “I don’t know.” If you care about justice prevailing, you also have to take the time to explain why you don’t know what you don’t know. You have to explain the limits of your science.

The stakes are high. In many states, once a defendant is convicted it is not possible to appeal based on scientific advancement or factual innocence, but only on procedural grounds: that the original trial judge or attorneys erred in some way. Expert testimony that had been deemed reliable by the courts in the past is now being questioned by scientists because the science has advanced, but the new data can’t be used to free those who were incarcerated based on expert witness testimony that is now obsolete or was overstated.

The future of forensic science in the United States is in flux, and scientific literacy is becoming harder to come by in the courtroom and outside it. Nowadays one of the most daunting challenges an expert has to face is to convey uncertainty without appearing unqualified. It is an unfortunate feature of our nature that we human beings gravitate toward the person who can exude conviction with charisma, no matter his or her actual base of experience. Couple that with the testimonial result of the Dunning-Kruger effect—that the expert with the least experience and qualifications is the most likely to testify with absolute certainty—and it becomes even more critical that we forensic professionals train ourselves to express clearly the limits of scientific testimony when the evidence in a case just isn’t there. As individuals with integrity we have to apply rigorous scientific principles in our reports and our testimony, and we have to acknowledge that uncertainty exists. Your training and experience are reliable. Your professional opinion (even when you are uncertain) is reasonable. Be certain that attorneys, judges and juries get that.


Bio: Dr. Judy Melinek (link to: http://www.pathologyexpert.com/drjudymelinek/) is a forensic pathologist and does autopsies for the Alameda County Sheriff Coroner's office in California. Her New York Times Bestselling memoir Working Stiff: Two Years, 262 Bodies, and the Making of a Medical Examiner (link to: http://www.drworkingstiff.com), co-authored with her husband, writer T.J. Mitchell, is now out in paperback. She is the CEO of PathologyExpert Inc.


Sunday, March 4, 2018

Jury Duty: Inside the Box.

Originally published in Forensic Magazine June 2017, 14(2):23-24 (https://www.forensicmag.com/article/2017/06/expert-witness-jury-duty-inside-box)

Last week I was called to jury duty at my local courthouse, where I have testified as an expert witness before. I showed up. I signed in. I watched two videos explaining how our justice system works. Then I was assigned a courtroom—and this time, instead of addressing the people in the jury box, I found I was one of the people in the jury box. A lawyer stepped before us and announced that the legal action for which we’d been selected was a civil matter: asbestos injury litigation.

No way, I said to myself. No way a forensic pathologist can serve as an impartial juror in an asbestos lawsuit. I have got to get excused from this. If this were a lawsuit about larceny, or financial fraud, or building code violations, it would be perfectly appropriate for me to be impaneled. But a case that relies on medical expertise as the evidentiary lynchpin? I felt confident that the attorneys litigating the case would agree that it would be inappropriate to have a medical expert on their jury.

Each of the several attorneys trying the case had a set of questions. They went one by one through the jury candidates. Do you know anyone who died of asbestosis? Do you have any strong feelings against corporations? If a smoker has a smoking-related illness, would you consider it their fault? Every lawyer finished with the same catch-all: Do you have any biases that would influence your judgment in a case like this one?

Each time I was asked this last question, I gave the same answer: Yes. I am a medical doctor and I diagnose asbestosis when it is present in a dead body that comes to my autopsy table. I have my own very definite opinions in the matter, professional opinions founded on years of medical education and practice. To my shock, attorney after attorney then nodded, thanked me—and told me to sit back down in the jury box.

I have a lawyer brother-in-law. That evening, I called him and asked what the heck was going on. Why wouldn’t they excuse me?

“Simple," he replied. "They want someone smart on the jury. They want someone who is going to listen to the facts they present. They don't care about bias as much as they want someone rational."

The next day, when another one of the lawyers asked how I was biased, I had an answer ready. "Let me explain something to you. I am a professional expert witness. I can’t say right now exactly how I am biased, whether it is for the prosecution or for the defense, but I guarantee you that I cannot be impartial when it comes to scientific testimony. If you have medical ‘experts’ get up there and testify? I'm not going to listen to them. I'm going to listen to this”— and here I pointed at my own head—“and then I'm going to convince all these people”—and I pointed at the rest of the members of the jury—“of my opinion. Because that’s what I do."

So that's how I got excused from jury duty. I don't know if my argument finally convinced the attorneys on its merits, or if my hand-waving and air quotes when I sneered medical ‘experts’ convinced them I was a lunatic. Whichever the case, they made a wise decision. I would not be capable of accepting what another medical expert told me was the absolute truth. I would question that expert’s opinion. I would challenge it, based on my training and experience, in the jury room; and, in the end, I would come to my own opinion. I can’t do that in a case about insurance fraud, but I can’t avoid it in a case about asbestosis. It is the litigants’ job to convince the entire jury as individuals. You need everyone on the jury to participate in the process of deliberation. If one juror is instead able to sway everyone else to her opinion because she has relevant expertise and professional authority, the trial lawyers have failed.

Attorney friends have confided—and griped—that they don’t get much law school training on jury selection. They learn it on the job; and, unfortunately, that means they apply their own biases when trying to elicit the biases of potential jurors. They might make assumptions based on a combination of factors, including race, gender, religion, educational background, and professed life experience. They might make the assumption that a well-educated person is going to be able to evaluate an expert’s testimony. In fact, the opposite may be the case. A juror with education and experience in the relevant area of expertise may instead dismiss what an expert witness says, even if the expert is correct. This is a flaw in jury selection as it is currently practiced in the United States: It doesn’t succeed at eliciting implicit bias.

Implicit bias has been in the news lately. These are the prejudices and predispositions you don’t even know you have. In the course of jury selection, lawyers rely on jurors to self-report their biases—but there is nothing predictive that will tell a lawyer whether jurors are biased if the jurors aren’t even aware of the preconceptions they carry with them. Lawyers may ask about educational level as a benchmark of a juror’s knowledge base and reasoning ability, but that is a poor way to assess implicit bias. You need to delve deeper, into which sources of information people consume—especially if the case involves a scientific expert witness. In a “post-truth” age of fake news and intentional, sophisticated campaigns of mass-media deception, those who rely on the internet as their primary news portal may be woefully misinformed. And a misinformed juror is worse than an uninformed one.

An expert doesn't win or lose a case: We neutralize other experts. If we’re going to do so effectively, we need to be aware of which false news stories might bias the public. A good expert will address this challenge directly: teach the jury the basic forensic terminology, and explain how misconceptions are spread by television dramas and police procedurals. For instance, bullets do not spin people around. Not everyone who confesses to a crime under police interrogation is telling the truth. Time of death determinations based on postmortem changes are not always accurate or ironclad. 

Just because people subscribe to scientific myths doesn’t mean a good forensic expert can’t challenge misinformation and get the jury to understand the science of a case. Unscientific misconceptions might appeal to common sense and might calcify a juror’s biases that have built up over years spent absorbing the zeitgeist’s pseudoscientific claptrap. That doesn’t mean we should surrender to scientific illiteracy. It is our duty as forensic experts to provide the remedy to implicit bias—one jury at a time.

Bio: Dr. Judy Melinek (link to: http://www.pathologyexpert.com/drjudymelinek/) is a forensic pathologist who does autopsies for the Alameda County Sheriff Coroner's office in California. Her New York Times Bestselling memoir Working Stiff: Two Years, 262 Bodies, and the Making of a Medical Examiner, (link to: http://www.drworkingstiff.com) co-authored with her husband, writer T.J. Mitchell, is now out in paperback. She is the CEO of PathologyExpert Inc.