MIC CAVAZZINI: Welcome to Pomegranate Health, a podcast about the culture of medicine. I’m Mic Cavazzini for the Royal Australasian College of Physicians. In the last episode we talked about open disclosure of medical injury and what patients or their families want to hear from those responsible for their care. National best practice standards also require health practitioners to explain what is being done to remedy the situation and make sure that something similar doesn’t happen again.
But incident disclosure is rarely as thorough as it should be, if it takes place at all. As Professor Simon Willcock told me, guidelines issued by government or hospital management can drift off like a helium balloon unless they are modelled by leadership at every level of a health service. And we heard how the reluctance of health practitioners to be more transparent is in part due to a misplaced fear of litigation. In fact, civil liability laws in all jurisdictions ensure that doctors can disclose critical events to patients and apologise for them without this being admissible as a confession of liability.
But perhaps the greatest barrier to incident disclosure is the professional culture within medicine. As we’ll hear, the historic tropes of the infallible physician and the heroic surgeon are still strong today. Though team-based practice has become the norm, many doctors find it hard to admit to a mistake, not just to patients and colleagues but even to themselves. There’s a cognitive dissonance or even sense of shame that one might have failed in one’s role as a healer. It was intensive care specialist Stuart Lane who first got me interested in this topic, when I came across a lecture of his three years ago.
STUART LANE: So, Associate Professor Stuart Lane, I'm an intensive care specialist working at the Nepean Hospital in western Sydney. I’m employed by the University of Sydney to look after years three and four, so the clinical aspects of the Sydney Medical program, as well as ethics, law and professionalism and interprofessional education.
MIC CAVAZZINI: You've been interested in the way that making a mistake can jar with professional identity. I'm curious what got you interested in this in the first place.
STUART LANE: As a junior anaesthetist in the UK, I was once talking to a patient about their postoperative pain relief, and they were gonna have an epidural. And at the time I was young, I was enthusiastic to try and do all the procedures that I could. And I was saying, “The epidural gives you really good pain relief postoperatively for this major abdominal surgery you're having. And there’s like a 1 in 500,000 chance of an epidural space infection.” And this patient goes, “Is that serious?” “Well it could leave you paralysed.” And he said, “Oh, well, in that case, I'd rather not have the epidural. Thank you very much. I'll just take the morphine.” And I said, “You know, it’s a really small chance. There's no difference in the infection rate between a spontaneous abscess, and one that’s epidurally-induced.” And he went, “Well, isn't that interesting for your statistics. It's not a small chance if it happens to me, is it?”
He said to me as well, “If this happened, what would you do?” I said, “Oh, I guess I'd say I was sorry,” and he goes “What good’s that to me if I can't walk.” And I sort of thought, you've got a point there. And I sort of took a step back and reflected on myself and thought, my language has probably been coercive in the past to get people to do this, without even realizing that. And it's taken someone to say, “No, I don't think so,” for me to really think about what I'm doing in patients' best interests. So I started to get really intrigued by the whole issue of complications that happen and the way that patients react to things.
MIC CAVAZZINI: So you interviewed junior doctors who had each been involved in disclosure of medication error and looked at how they rationalized it. So let's describe some of the cognitive defence mechanisms you observed in your interviewees.
STUART LANE: You often hear things… here’s an example. When people are looking at radiology, it could be not just the radiologist, but any person looking at a chest X-ray. “The unappreciated lesion” is really a very different way of saying “the actual severe cancer that was missed.” “Lesion” is changed from “cancer”, “unappreciated” is changed from “missed”, but the essence is the same. And there may have been other reasons why that had been missed, but the language in which you say something—it isn't dumbed down, but it softens the blow. So for example, “having a medication that was just a bit too strong for the patient.” Well, “strong” is really not how people react pharmacokinetically or pharmacodynamically.
MIC CAVAZZINI: It sort of displaces the responsibility, doesn't it?
STUART LANE: It frames it as a weakness of the patient rather than the prescriber and deliverer. And let's face it, in hospitals there are three drugs which are always causing huge problems because of prescriptions and maladjustments. They are anticoagulants like intravenous heparin, insulin and opioids—it's an ongoing issue for prescriptions, and even the new IT systems and electronic prescribing haven't solved those. In my studies, interns spoke about people that were unconscious from opioids: “They’re just having a sleep and snoozing it off,” because they're making a noise which sounds like snoring. But ultimately, they've got airway obstruction from being rendered unconscious by too much opioid. Even the language around that makes it seem a very natural situation. So there's significant vernacular, there are euphemisms, to make these events seem less offensive to your psyche and to help you work through them.
MIC CAVAZZINI: So give us an example of “framing error relative to the outcome.” What’s going through the trainee’s head—the trainee who says, “They got her back with naloxone.”
STUART LANE: So when you listen to that phrase, something has been reversed, but it's really framed as a success. This is a success story, because the patient's finally had no actual adverse outcome, they're back now. But once again, that shifts attention away from the error that actually occurred to allow that situation to happen in the first place. And so the outcome is the triumph, and you can move on from the error that actually occurred. And I guess what needs to happen is an actual reflection on the intent and what happened. And if it really was that nothing could have been done, it was done by best practice, and the best titration, the best thoughts, and they were in the best position to do it, then that's really unfortunate. But in many cases, that isn't the situation.
MIC CAVAZZINI: If these errors are being veiled—not necessarily that they're not being reported or racked up—but if the individual is sort of keeping the error at arm's length with this kind of language, is there a risk that they're not actually taking on board the lessons?
STUART LANE: So, if you've rationalized that decision, it's actually a bit of a triumph or a non-event, then you're not likely to reflect on the situation that led that occur, and therefore you're not likely to learn and change in future situations. Therefore, the likelihood of repetition is always there.
But that's not uncommon amongst the whole of hospital staff. So, you know, when I went through it, you know, I didn't think to myself, “Oh, this junior person's got no idea.” They were all willing to listen and willing to be re-educated. And that's, I think, a very good thing. My interviews were with junior people—for senior people, I can't comment, but we do know that with senior people it can be a very, very different discussion. Re-education is not as easy sometimes.
MIC CAVAZZINI: Some of your interviewees agreed that everyone makes mistakes, but that it's okay if they're picked up by someone else. To some extent that's true—the Safety-II concept of having systems absorb inevitable human errors. But is there a lack of taking responsibility that is necessary…
STUART LANE: There's a very famous case from the UK back in 2005 about a young guy that came into hospital for some intrathecal chemotherapy. And what happened was that he ended up getting the wrong drug injected intrathecally, which led to multi-organ dysfunction, and he died. There was a huge report and inquest into the whole thing, and it was felt to be a system error, or lots of system errors that happened. But I guess the difficulty that you'd have if you were the family or related, or even looking into the error from that perspective, is that whilst there were lots of holes in the so-called Swiss cheese system, somebody could have blocked a hole at every single opportunity. Somebody could have said, “This is the wrong ward for this to happen. This is the wrong day for this to happen. This is the wrong person to do this. This is the wrong drug to potentially use.” And I think to frame it as system errors can move away from, “Okay, what was the responsibility of the people involved themselves?” And whilst we don't have a blame culture, there has to be a recognition that in a lot of the errors that do occur, there are clinicians or a practitioner involved.
MIC CAVAZZINI: In a 2006 paper, researchers from Washington School of Medicine surveyed 2600 North American physicians and surgeons, who were presented with case studies in which an error had taken place and given a range of possible disclosure scripts to respond with. When the error was very apparent, half of the respondents did choose to explicitly acknowledge the error, while the remainder chose more veiled explanations about adverse events. When the mistake was less obvious, however, the proportion who fessed up to an error dropped to 32 per cent. Is it a case of hiding the truth from the patient, or of not facing up to it themselves because of their own cognitive dissonance?
STUART LANE: So ultimately, only they could tell you that. I guess there are some options we can look at as to what makes sense. So yeah, one of the things that people talk about is that if there's been no actual outcome that's detrimental to the patient, then, “If I tell them about the error, it's just going to upset them.” But there's always that position of moral attentiveness. I've got two choices here; one's a really hard conversation, which is probably the right thing to do, and one is just to move on from that difficult conversation because no real damage has been done. And I think it's a very strong person that constantly takes that harder road, because it's exhausting. But in the end it doesn't matter what the actual outcome was—they've got a right to know. Because when someone does finally find out and it hasn't been disclosed to them, the upset is even worse. And it's my experience, for the small errors that have happened where there's been no detriment, that when it's been mentioned to patients up front, they've been, “Oh, yeah. Okay, thanks for mentioning that,” and they want to move on. They actually rationalize it the same way, but I guess the point there is, you have to give them the option to frame it, to see it from their perspective—it's not for you to decide for them.
MIC CAVAZZINI: And you've drawn attention to medication errors, which are maybe somewhere in between procedural errors in their directness, in the obviousness of the causality, and then diagnostic errors, which can be a bit more detached—might take time before they become obvious. Does the directness of that causality to the injury, does that affect the way that the error weighs upon the clinician and the way they respond to it?
STUART LANE: I think so. I think with a procedural error there’s something very tactile involved. You know, I've obviously had things that haven't gone perfectly in my clinical career with practical procedures, and there's something very, very personal about that, because it's very much at the time, and you're there in person when it occurs, and I think you feel that a lot more. With the medications, as you mentioned, they can happen hours, days, weeks after the first wrong prescription has occurred, or whatever wrong medication. And diagnostic errors, once again, can often happen at the very beginning of someone's admission; it can be one person making the diagnosis and everyone else taking that on board—momentum bias continues.
MIC CAVAZZINI: And going back to procedural errors; in that simulation study from Washington I mentioned earlier, physicians were three times more likely to mention an error had occurred compared to surgeons. But without indulging too much in stereotypes, surgeons and obstetricians, of course, are actually way more likely to get sued, because of that more obvious connection between their actions and the injury.
STUART LANE: Yeah, I do think a lot of the discussion around surgeons is very stereotypical. And I think that's happened because of some of the examples, the language that's associated with practical mistakes—but because the implications are so huge, the amount of rationalization seems to be larger. Another thing you see in hospital doctors when they talk about intubation of patients: “They were a difficult intubation,” whereas the frame should be, “Was it a difficult intubation for you, personally? What is your level of seniority? What is your level of expertise here?” It's very, very, you know, it requires self-reflection on what your level of experience is.
MIC CAVAZZINI: In the conclusions to that paper, they looked at what properties might make an error more or less likely to be disclosed. And the conclusion was, “Respondents disclosed more information if they felt responsible for the error, had prior positive disclosure experiences, and were Canadian.” So I wonder if we need more Canadian simulation programs or something.
STUART LANE: I guess there are quite clearly different disclosure laws between Canada and America. And so they might have learned in their medical schools very different approaches to using simply the word “sorry”. When I did my trials for my interviews with the Center for Simulation over at Harvard, it was very clear that, you know, New York State had very different apology laws or policy requirements from other states in America, and definitely from the provinces in Canada. But I think the good point there is that a positive disclosure experience leads to ongoing positive feelings towards disclosure—that's actually a really good thing to see.
MIC CAVAZZINI: Stuart Lane’s research was published in BMJ Open last year and he’s incorporated the findings into simulation training for medical students and junior doctors. One goal is to train doctors to avoid rationalisations and language that come across as evasive to victims of medical error. And also to help health care teams work through these events without a sense of blame or guilt.
Few have described that environment as well as Minnesota GP David Hilfiker did in his 1984 confessional titled “Facing our Mistakes”. On his blog he describes this as the piece of writing for which he is most “notorious.” I tried to contact Dr Hilfiker but he’s been less public in recent years since his cognitive health has declined. In his place, I’ve asked my colleague and part-time thespian Michael Pooley, to read from this essay.
MICHAEL POOLEY as DAVID HILFIKER: Everyone, of course, makes mistakes, and no one enjoys the consequences. But the potential consequences of our medical mistakes are so overwhelming that it is almost impossible for practicing physicians to deal with their errors in a psychologically healthy fashion…
Most people, doctors and patients alike, harbor deep within themselves the expectation that the physician will be perfect. No one seems prepared to accept the simple fact of life that physicians, like anyone else, will make mistakes… it is highly likely that sooner or later we will make the mistake that kills or seriously injures another person. How can we live with that knowledge? And after a serious mistake has been made, how can we continue in daily practice and expose ourselves again? How can we who see ourselves as healers deal with such guilt?
… The climate of medical school and residency training, for instance, makes it nearly impossible to confront the emotional consequences of mistakes; it is an environment in which precision seems to predominate… And when a physician does make an important mistake, it is first whispered about in the halls, as if it were a sin. Much later, a case conference is called in which experts who have had weeks to think about the situation discuss the way it should have been handled… I cannot remember a single instance in which another physician initiated a discussion of a mistake for the purpose of clarifying his or her own emotional response or deciding how to follow up.
STUART LANE: There's a real cultural drive to the way that people deal with error, in medicine especially. You know, doctors don't make mistakes—that’s one of the biggest drivers—and to admit mistakes is often seen as a sign of weakness. It's a hard conversation, even to inform on errors you've found made by your colleagues. People don't want to dob on each other—dobbing is a terrible thing, you know, you get taught at school, “Don't dob on your mates.” But, you know, we're meant to be a patient-focused service here. And so if errors have occurred, then the whole point is that patients get the actual feedback they need as to what happened. But also doctors that have made mistakes get the feedback that they've made a mistake so they can improve their practice.
With junior doctors especially, with their rosters, they can make errors and go off on a different shift. And because the error has been inherited by the team afterwards, they often don't get a chance to have these errors fed back to them, so they can't even learn from them. Most doctors want to know if they've made errors so they can actually amend their practice, but they don't get the chance to be informed because of the structural and cultural ideals of how the healthcare system works.
MIC CAVAZZINI: Hilfiker was writing about his experiences of training in the 70s and 80s; that he was never aware that his senior supervisors—experienced and competent physicians—also made mistakes.
STUART LANE: Yeah, I completely agree, because that's what you aspire to be—someone senior who was perfect and error-free, and it never happened to them. You assume as a junior person, “I'm making mistakes, and how can I ever be part of this medical profession if I'm making these errors, because my seniors don't do that.” And that drove a culture of perfection and exceptionalism, which obviously was detrimental to discussing and learning from errors.
MIC CAVAZZINI: These days, we have morbidity and mortality meetings and entire conferences and podcasts dedicated to unpacking medical error. Do you feel like your students today are growing up with a much healthier attitude?
STUART LANE: Yeah, so, you know, as part of the medical program that I deliver, we actually have seminars on how people rationalize error and the cognitive biases that they have; we have seminars going through root cause analyses of our serious adverse events. And we look at all the public documents that are available—the language behind the reports that people put out—and we look at what's a system problem and what's a human problem; we discuss that it's not always straightforward and can be quite complex. And that goes into postgraduate education too. However, the difficulty is that there's still a huge amount of hidden curriculum within, you know, medical school, postgraduate and hospital environments; it's still one of perfection, and the things that Hilfiker spoke about there still exist in today's medical societies. We haven't got rid of it yet.
MIC CAVAZZINI: Yeah. As much as we hope the blame and shame culture is gone, what is it, five or six departments across New South Wales have lost their accreditation to train because of the bullying culture? You know, humiliation by…
STUART LANE: Yeah, so there are lots of parts that, you know, maybe lead towards what people would perceive as a bullying culture. It's not just the fact that we shouldn’t make mistakes, but then you should “just get on with things—suck it up—you know, that’s life, you're gonna make them.” But that's very different to actually speaking through with someone as to why this may have happened, what was going on for you at the time, and trying to work out where you need to go from here. And in fact, what we should be doing is recognizing that we're all gonna make mistakes, and being part of that ongoing culture which is much more open and honest.
MIC CAVAZZINI: To go back to David Hilfiker—in a 2005 book titled “On Apology”, psychiatrist Aaron Lazare described guilt as something you might feel from a one-off failing that you can, to an extent, make amends for. But shame, he says, “shame is an emotional reaction to the experience of failing to live up to one's image of oneself. And the most common reaction to shame is to hide.” And Lucian Leape uses similar language: “For the physician, making a serious error is not just a practice failure, it's a character failure, and the shame can be overwhelming.” Do you talk about that to your…
STUART LANE: Yeah, we talk about that to medical students these days, and to junior doctors, and even senior colleagues. If somebody were going to really work on moving past the shame they might feel, they'd have to really work on what their professional identity was. Because the identity that doctors seem to have is this perfectionist, you know, infallible one. So I think if people are going to work through things, they have to work through, “Well, what is my professional identity? What does it mean to me, and what is expected by other people?” That's why talking through things is really, really important, because you're not gonna move on from those feelings if you can't recalibrate what you should be, you know, visualizing—if you've got the wrong identity visualized in your mind, then you're not gonna move on from those emotions. And if it happens again, you're going to go back to them.
MIC CAVAZZINI: And I guess, in a way, the more that the public understands that it's not a perfect system, the more forgiving they might be. I don’t know if that's an optimistic way of viewing it. But if we never tell them about even near misses, then they'll assume that it's always perfect.
STUART LANE: Yeah, and then I guess when something does happen, it's even more of a shock. And I guess my thoughts are that as a medical profession, we're also not very good at admitting something simple, like, you know, “I don't know.” And that's often a very common thing that you see. There was some good work done by Stuart Dunnart out of Royal North Shore Hospital, many, many years ago, that looked at interns, residents, registrars and seniors. And the more senior people got, the more likely and the more able they were to say, “I don't know.” Which I think is something to do with the fact that the more experience and expertise you have, the better you get at dealing with uncertainty, and actually appreciating that, “I'm uncertain, but I'm happy to work with it.” Whereas when you start in the medical workforce, there's a huge, I guess, cultural expectation that you think, “If I admit that I can't work something out, then I don't belong here.” A real imposter thing for people to go through, which actually is ironic when it's happening to the most junior of the doctors, because they're the ones that should be asking for support more than others, and being supported more than the rest.
MIC CAVAZZINI: That's interesting, yeah; maybe the more senior you are, the more confidently you can say, “Yes, I know the field. I know all the pathways we could go down. And to my best knowledge, we don't know the answers to this, that and the other, but we should try this.”
STUART LANE: Yeah. So the caveat there is that it's easy for them to say, “I don't know,” because it's followed by, “But to find out, we'll go and do this, this and this.” Whereas junior staff might not know what to do next.
MIC CAVAZZINI: Serious medical injuries are traumatic for patients and their families, but they can also have a great impact on the health professionals involved. To describe this phenomenon, Baltimore physician Albert Wu coined the term “the second victim” in an editorial from 21 years ago and there’s been a wealth of literature on the subject since.
According to a systematic review in the Journal of Patient Safety from last year, more than half of practitioners involved with injurious healthcare incidents describe troubling memories, remorse, anger at oneself, distress, fear of future errors, and trouble sleeping. With observational studies, you can never be too sure about cause and effect, but there was a prospective cohort study published in JAMA in 2006 that presented some compelling findings. Over 180 internal medicine residents were followed throughout training with regular quality-of-life survey tools. A third of the residents reported making at least one major medical error over the three-year study period, and these were associated with a subsequent worsening on all measures of burnout. In fact, the odds ratio of screening positive for depression at the next survey time point was more than threefold.
In a qualitative study titled, "The natural history of recovery for the healthcare provider 'second victim' after adverse patient events" responses were highly variable between individual clinicians. Some of the participants took the incident in stride as a learning experience. Others moved on to other units where they wouldn’t be reminded of it anymore. And a few made comments suggestive of post-traumatic stress disorder; “There isn’t a single day that this doesn’t affect me”, “I haven’t figured out how to forgive myself for that yet or to forget it” and “I cried a lot over this case.”
To learn how physicians can recognise and respond to such distress in their colleagues, I spoke to Professor Simon Willcock, an academic GP who has studied wellbeing in doctors over many years.
SIMON WILLCOCK: My name is Simon Willcock. I'm a Professor of General Practice, currently the program head of primary care and wellbeing at Macquarie University, and formerly a board member and chair of the Avant group of companies, which includes the largest medical indemnity insurer in Australia.
I think it's a larger question of how do you monitor the health and the culture of the environment that you're working in? As a society, we've become much better at that sort of low-level recognition—the R U OK? model. And again, saying to people, “You don't have to provide the assistance to somebody”—in other words, you don't have to cure the problem for them—but being aware that they have a problem, and acknowledging it, is really significant. We did a study many years ago that showed that the junior doctors knew who among their colleagues was struggling, whereas the senior management, the doctor-managers and the nurse-managers, knew who was incompetent, but they really didn't know who was struggling. And I think the important message there is that you cannot at a high level know what's happening for everybody in your organization; you need to encourage that sort of cultural awareness within individual work units, and you need to give people a toolkit to say, “What do I do if I'm worried about so-and-so?” And I think that R U OK? conversation is a lovely, simple way of recognizing that.
MIC CAVAZZINI: Yeah, just as a routine tea-room conversation.
SIMON WILLCOCK: And it boils down to a simple message: it's very hard to work in health and not be exposed to stressful situations. We know that burnout is common among health practitioners. And we also know that I can't fix your burnout simply by completely changing the environment that you work in—you actually have to have some responsibility for self-awareness. One person's, sort of, exhilaration, the adrenaline flow and the stuff that they love at work, is another person's trigger for burnout. And that's why you can't have a system that just completely locks down behaviours according to an algorithm; you need to work out what your vulnerabilities are and how you're going to manage that constant level of exposure to stress. In the same way, when we move to the pointy end of the spectrum with doctors who do have complaints against them, I talk about whether they're “insight-positive” or “insight-negative”. For doctors who are insight-positive—in other words, they have self-awareness as to their role within the whole situation—you can usually move to a good outcome for all parties. If a doctor is insight-negative, or blames the system for all of the errors, then it's very hard…
MIC CAVAZZINI: Of course, the kind of stress we've described gets a whole lot worse when things are ramped up in the medico-legal process. You've published a couple of papers along with my former boss, psychiatrist Louise Nash. You found that of GPs with ongoing medico-legal matters, 45 per cent met criteria for psychiatric morbidity on the General Health Questionnaire. That's almost twice the already high average of 27 per cent in GPs with no medico-legal cases ongoing. And this was associated with impairments in work and daily life, and higher rates of alcohol misuse. One of my reviewers wanted to know, “What can your indemnity organization do for you?” Is there a process for guiding their members through dealing with an adverse event?
SIMON WILLCOCK: Yes, so the first thing is to notify. But these days, the insurers will actually have an automatic system of responses, saying, “Would you like to talk to one of our staff? We can organize—a bit similar to an employee assistance program—we can put you in touch with counsellors who are independent of our organization, to assist you through this time.”
MIC CAVAZZINI: From the literature, it seems that there are very few programs that are specific to clinicians; most hospitals just rely on generic employee support programs for all their staff, which aren't necessarily fit for purpose for clinicians and these very fraught situations.
SIMON WILLCOCK: That's a good point to make, and doctors are notorious for not using employee assistance programs, and I guess there are a whole lot of reasons for that. But I think the biggest issue is often the gap between the management and their policies and the clinicians themselves. It's very high-level and it doesn’t indicate any real empathy or compassion.
MIC CAVAZZINI: What about that idea that “the mistake is whispered about in the halls as if it were a sin”?
SIMON WILLCOCK: I think that quote is very interesting—“whispered about in the halls”—it promotes that message of the mystique and the mystery of medicine. And just the fact that that phrase is used very much shows how inculcated in the writer that whole culture is. As somebody who very early on in my career said, “God, I'm not sure I made the right choice here. I like being a doctor, I like other doctors, I like my patients, but boy, this is an unhealthy culture to work in,” I don't identify with that mystique and I wouldn't use that language. But that feeds into the second point: we work very hard to move into that sort of ethereal world of the health practitioner, and we're very reluctant to give it up.
MIC CAVAZZINI: Well, it's an institution that is thousands of years old, and those tropes are still reinforced on TV, from within and without.
SIMON WILLCOCK: Absolutely. That’s a part of the problem with medicine: we keep saying we're different. My best friend, until he retired, was CEO of a bus and coach association in this state; he was dealing with transport and bus drivers and commuters, and I was dealing with health and patients. Eighty per cent of our issues were exactly the same. So my feeling is that we’ve suffered from falling back on those tropes in the end. And we also have to start breaking down some of the barriers between what's medicine and what's not.
MIC CAVAZZINI: Only 10% of over 3100 physicians surveyed in the US agreed that healthcare organizations adequately supported them in coping with error-related stress. One of the few health systems that has a specialised program is Washington University School of Medicine. Clinicians who’ve been involved in an adverse event are actively referred to the program and they’re paired with a colleague who can provide support without making any judgement on the case. The peer support is focused on wellbeing and on helping the impacted clinician see that this one incident does not define them or their career; it’s just one challenging experience in their learning and in the improvement of the health system. You can read more about this program in the 2019 Journal of Patient Safety but let’s go back to that essay by David Hilfiker, written at a time when no such support existed.
He wrote the essay as a reflection on his sense of guilt following a grievous error he made as a keen young rural generalist. One of his patients, Barbara Daily, came in with a hunch that she was pregnant, but one pregnancy test after another turned out negative; four in the space of six weeks. Clearly the pregnancy had terminated but hadn’t been followed by a miscarriage. There was no need to send the Dailys on a long drive to an expensive specialist consult, so Barbara was booked in for the D and C procedure in the familiar environment of the country practice. She was anaesthetised and Dr Hilfiker got to work with care. It was only as he began removing the pieces of foetal tissue, unusually pink and healthy, that he realised with horror that he had aborted a living foetus.
MICHAEL POOLEY as DAVID HILFIKER: “Although I was as honest with the Dailys as I could be in those next months… I never shared with them the agony that I underwent trying to deal with the reality of the events. I never did ask for their forgiveness. I felt somehow that they had enough sorrow without having to bear my burden as well. … it was my responsibility to deal with my guilt alone… How can I not feel guilty about the death of Barb's baby?...
Even the word "malpractice" carries the implication that one has done something more than make a natural mistake; it connotes guilt and sinfulness.…[This results] in an intolerable paradox for the physician. We see the horror of our own mistakes, yet we are given no permission to deal with their enormous emotional impact… Although mistakes are not usually sins, they engender similar feelings of guilt. The only real answer for guilt is spiritual confession, restitution, and absolution. Yet within the structure of modern medicine there is simply no place for this spiritual healing… there is no place for real confession; "This is the mistake I made; I'm sorry."
How can one say that to a grieving mother, to a family that has lost a member? It simply doesn't fit into the physician-patient relationship. Even if one were bold enough to consider such a confession, strong voices would raise objections. The nature of the physician-patient relationship makes such a reversal of roles unseemly…
Little wonder that physicians are accused of having a God complex. Little wonder that we are defensive about our judgments. Little wonder that we blame the patient or the previous physician when things go wrong; that we yell at the nurses for their mistakes; that we have such high rates of alcoholism, drug addiction, and suicide. At some point we must bring our mistakes out of the closet… We need to find healthy ways to deal with our emotional responses to those errors. Our profession is difficult enough without our having to wear the yoke of perfection.”
MIC CAVAZZINI: David Hilfiker’s observations raise difficult questions about how the people causing and receiving the injury might move on. In the last episode we heard how victims of medical injury wanted to hear a sincere apology and show of accountability. I referenced legal ethicist Lee Taft describing how an apology forges a dyad between the clinician and victim, and empowers the latter to grant forgiveness. Taft argues that apologies protected from their consequences, especially legal ones, don’t close this loop and are therefore lacking in moral weight.
But there’s another boundary to this moral space explored by Cornell philosopher Jeffrey Helmreich in his essay titled, ‘Does Sorry Incriminate?’ He gives the example of a driver who sees a child run unexpectedly onto the road from behind some bushes and isn’t able to stop in time to spare them. While anyone learning of the event will be shocked, as the agent of the fatality the driver has an inevitable sense of horror at his own involvement even if he’s not liable in a legal sense. As Helmreich puts it, “one can appropriately feel guilty, or take a morally self-critical view of one's past behavior, without finding oneself guilty.” And later adds, “apologizing is an appropriate moral remedy even for blameless harms. There are, in other words, sound moral reasons to apologize to those one harms blamelessly, which resemble the reasons to offer them aid or compensation.” Helmreich recognises that there’s an intrinsic value to apologising, that self-criticism represents an aspiration to always do better regardless of the degree of culpability.
I think this accurately reflects the plight of the clinician who’s made an error at the tail of many failures in the system. And importantly, it may provide a response to that tension identified by Hilfiker: what if you can’t ask the victim for forgiveness, or they’re not prepared to grant it? This kind of variability was observed in a JAMA paper from 2003; some of the victims surveyed were moved to hear that their doctor expressed sorrow and guilt, as it demonstrated accountability. But others weren’t interested—they just wanted to know what the institution was going to do to make up for the injury. I asked Simon Willcock and Stuart Lane how a clinician should make sense of these traumatic events.
SIMON WILLCOCK: One of the things to remember is people respond to adverse circumstances differently. And one of my previous senior partners and a life mentor and very wise guy, now deceased, said to me very early in my practice, “the patients who should sue you don't, and the patients who sue you shouldn't.” And that sort of underpins so much of the whole error and indemnity and negligence argument; that the patients in whose best interests you have genuinely acted, and who do have bad outcomes—and as a rural practitioner they can be pretty horrendous—forgive you because they know that you acted in good faith with their interests at heart. And often the angry patient that feels a need for some sort of punishment or to exact some sort of revenge, if you actually objectively look at the circumstances, it's unreasonable, but you know that something has taken them to that point.
You know, people become angry, it's often a manifestation of depression and that's more common in men. Now, it's quite easy to sympathize with somebody who's tearful; it's hard to sympathize or empathize with somebody who's angry and loud. And yet both those responses may be coming from the same place, they may be coming from a reaction to a very emotionally challenging event. I think the way to mitigate is to be open, to have the patients there with you on the journey, to not pretend to be the font of all wisdom. These days, patients bring information to you and, with them, you sift through it and you act as a guide and you incorporate the wisdom of your colleagues in team-based care management. I think that's the real answer, when we're talking about reducing indemnity claims and bad outcomes for both patients and practitioners in the future.
MIC CAVAZZINI: It's understandable that some people, some victims, won't readily heal or forgive a health practitioner when a loved one has been injured or died. So practitioners can't rely on this absolution for their own peace of mind. Where do they go then? Does that kind of judgment need to come from one's peers in the profession?
SIMON WILLCOCK: Yeah, I don't think—I mean, there are different things we're looking for here, we're looking for judgment, but we're also looking for empathy. And they're two very different things, and quite possibly need to be given—or likely need to be given—by different groups or different individuals. Empathy, you need for people to be able to understand: “I feel for you. Are you okay?” all of that sort of stuff, you know, “What can I do to help?” Whereas the person may still need some sort of judgment, which may be sort of adverse to them. But I think most of my medical colleagues are very happy to offer themselves for judgment. But judgment is different to punitive judging of their actions, you know, we need things to be done in an objective way.
STUART LANE: Personally speaking, you shouldn’t be asking a patient to forgive you, should you? What you should be doing is informing them of what's actually happened, and then they react in the way in which they want to react; you shouldn't tell them something because you want to get something from them. The words you quote from Hilfiker there, I think it was framed in that confessional way many years ago; you know, “Forgive me for my sins.” But it's not as if they're revealing their sins to a patient. They're informing the patient of something that's happened in that patient's health care. I think confession is some aspect of admitting to errors that you've been involved with, but I think you need to take away the religious connotations of sin and absolution. But yeah, I do think ultimately, self-reflection is all about how people move on from error they've been involved with, and it is their peers and their colleagues who are going to be able to help them make sense of that. Because ultimately in the end, the person has to make sense of it themselves. It shouldn’t be the responsibility of the patient to make sense of it for them.
MIC CAVAZZINI: Yeah, the patient, understandably, is suffering, then they might not be able to understand all the systems issues, or whatever. But there was a paper in the BMJ called “The Natural History of Recovery for the Healthcare Provider ‘Second Victim’”. The tone of it was a little self-helpy, but they described the six stages of recovery for someone who's made an error, and two of them are “enduring the inquisition” and “restoring personal integrity”. It gives the sense that your peers are this, kind of, Greek chorus that has access to all the knowledge that can evaluate how big a mistake it really was.
STUART LANE: And I think obviously the first part of that has to be, you know, admission and recognition that the error occurred in the first place, before the people around you can help you through. But I think the peer feedback and the peer assistance has to be more constructive and honest than, “You're going to be alright.” Yes, you may be alright in the future, but if a significant error has occurred then there has to be, well, “Look, you know, we're here to support you and we’re here to help you through; however, this was a significant error, and as judged by your peer body, this was not best practice. Now we've got to work out, you know, why that best practice didn’t happen for you on that day.”
We recognize that certain things happen in certain circumstances; we have the mnemonic HALT. Because if you're feeling ‘hungry’, you know, ‘angry’, ‘late’ or ‘tired’, then you have to really think about the actions that you're doing at that time, because the risk of making errors has significantly increased. I call it HALTS because ‘stress’ on the end of that is actually an extra. Not just the stress of your job, but maybe stresses from outside your job, in your personal life, may be impacting the way in which you're seeing things. We've got to work out why that happened for you at the time, and what you need to obviously move on from that and make sure that doesn't happen again, to the best of your ability.
And I think one of the things that you talked about there from the 80s, it was that there were corridor conversations because there weren't the robust governance structures to report or discuss errors that there are now. So that's what happened, everyone heard about it, but it stayed in the corridors and it became whispered and it became things of legend and myth. And you know, when these whispers started to get louder, you know, the words “malpractice” or “negligence” would actually be mentioned, rather than “mistake” or “error” and disclosure with the family—it’s a different language that we have these days. So I do think those areas of guilt did come from the lack of ability to talk about it safely, which I would hope is very, very different now.
MIC CAVAZZINI: Final question is kind of the converse to this self-identity. In 2019, there was an editorial in the BMJ that called for the term, “the second victim” to be retired. That editorial was authored by families and patients harmed by medical errors, as well as medics, who wrote that the term, “subtly promotes the belief that patient harm is random, caused by bad luck and simply not preventable. The second victims bear no responsibility for causing the injurious event and no accountability for addressing it. The label is a threat to enacting the deep cultural changes needed to achieve a patient-centred environment focused on patient safety.” Had you come across that?
STUART LANE: Yeah, I had been aware that the actual term had been debated—it's been around for many, many years. But I think there’s pros and cons to that term, what it means. My interpretation of that was, if patient groups feel that that term should be moved away from because it's not helping them to move on from the error that's happened to them then it should be removed. There does need to be a recognition that patients are the priority here, because that's what doctors are there for, but a recognition that doctors are also struggling by their mistakes and have the appropriate support networks. So I've got no problem with the term being lost because I think it distracts from what the conversation really should be about.
I've certainly been aware of, and been criticized myself for, using the term second victim. I don't have any problems using it because I've spent a career working with doctors who are significantly damaged through, you know, exposure to adverse events in which they have varying levels of culpability, and often no real culpability at all.
MIC CAVAZZINI: I mean, if I wanted to play devil's advocate, even “the first victim”—to me, the term victim implies that there is a “perpetrator.”
STUART LANE: I mean, I suppose you can be a victim of bad luck, but it's often a victim of violence and malintent, which automatically creates the wrong dynamic in most cases.
And again, you're right, I think the terminology is so fraught. “Error” is fraught—there's no subtlety to error: something was wrong, rather than there was an adverse outcome from an event but, you know, the factors behind it were very complex. And any black and white view of the process is immediately under suspicion. They lend themselves to a legal environment, where you have that black and white, and who can prove that my black is the right black. And of course, that's the inherent problem with medicolegal matters, is that we're talking about cultural issues, communication issues, and we're trying to define them in very legalistic ways. Which is why I think most sensible organizations try to move away from legal resolutions, because mediation and listening are far better. And often provide, I think, better satisfaction for all parties than somebody trying to prove they're right or wrong. But I'd have to say that I completely disagree with the stance that you can't acknowledge the victimhood of the clinician.
MIC CAVAZZINI: In Australia, victims of medical injury need to sue healthcare institutions or professionals in order to receive financial compensation, but it’s a lottery that rewards only a tiny fraction of people. By contrast, Aotearoa-New Zealand has a no-fault injury compensation model that provides prompt support without the angst and cost of litigation. That will be the topic of the next episode of Pomegranate Health.
For now I want to thank Stuart Lane and Simon Willcock for sharing their great experience with me, and my colleague Michael Pooley for lending his voice to the words of David Hilfiker.
If you’re affected by any of the themes discussed today please do reach out to someone. There are several phone services providing support and confidentiality at any time of day. The best-tailored is probably the Doctor’s Health Advisory Service. It has different numbers in every jurisdiction of Australia and Aotearoa-NZ, so look them up at dhas.org.au. The RACP partners with the counsellors at Converge International whose number in Australia is 1300 687 327 and in New Zealand it’s 0800 666 367.
You can call either of those services for advice about work-place stress or interpersonal conflict, issues around trauma and grief, mental health or drug and alcohol problems or any other challenge. Beyond Blue provides specialised counselling for depression, and if you need crisis support right now, please call Lifeline. Their number in Australia is 13 11 14 and in Aotearoa-NZ it’s 0800 54 33 54.
Finally, at racp.edu.au/fellows/wellbeing there’s guidance on how to check in on a colleague who may not be coping, and eLearning courses on caring for trainees under your supervision. You’ll see from the members’ stories page that even for physicians at the top of their careers the pressures sometimes get too much. There’s no need to play it down—you’re of much more use to your patients if you look after your own health too.
I want to thank all the people who provided feedback on this podcast. They’re credited by name at the episode web page, where you can also leave your own comments on this story. And always feel free to send me your feedback and ideas using the email firstname.lastname@example.org.
I’m Mic Cavazzini. I hope to hear from you.