Craig Clapper
Datica Podcast

Patient Safety: Lessons from a Decade of Improvement Efforts

July 23, 2019   Healthcare Cloud Engagement

In this episode of 4x4 Health, we explore the history and progress of the patient safety movement with Craig Clapper, a founding partner at Healthcare Performance Improvement (HPI). Mr. Clapper’s 25 years of experience in improving reliability in nuclear power, transportation, manufacturing and healthcare gives him a unique perspective on developing a culture of safety and demonstrating measurable progress.

We also look at how the general principles of safety and reliability can be applied to health information technology.

Episode Transcript

Dr. Levin: Welcome to 4x4 Health, sponsored by Datica. Datica; Bringing health care to the cloud. Check them out at www.datica.com. I’m your host, Dr. Dave Levin. Today I’m talking with Craig Clapper, a founding partner at Healthcare Performance Improvement, or HPI, a Press Ganey company. Mr. Clapper has over 25 years of experience improving reliability in nuclear power, transportation, manufacturing and health care. He specializes in cause analysis, reliability improvement and safety culture improvement, and he leads multiple safety culture engagements for a variety of health care systems. Prior to forming HPI and focusing on healthcare, Craig served in a variety of roles including Chief Operating Officer of Performance Improvement International, Chief Engineer for Hope Creek Nuclear Generating Station and Systems Engineering Manager for Palo Verde Nuclear Generating Station. Craig is a registered professional engineer, has an MBA and is a Certified Manager of Quality and Organizational Excellence by the American Society for Quality. Craig’s experience is wide, deep and varied, and includes work on nuclear power events and component failures, commercial aviation components and the Texas A&M bonfire structure collapse. His healthcare experience is equally broad, making him a formidable force for improving safety in our industry. I got to know Craig when he led efforts to develop a culture of safety at my former employer, Centura Health, an experience that was rewarding and has made a lifelong impression on me. I’m sure this conversation will make a big impression on you as well. Welcome to 4x4 Health, Craig.

Mr. Clapper: Thanks for having me Dr. Dave. I’m glad to be here.

Dr. Levin: Craig, we’re going to cover a series of four questions today and we’ll take about four minutes to answer each one to get us started tell us a little bit about yourself and your organization.

Mr. Clapper: I’d be glad to. As you said, I’m one of the partners at Healthcare Performance Improvement. When we get to September it’ll be four full years as part of the Press Ganey family. And we do basically two things: we apply safety science and high reliability organizing. I got into this work as an engineer in the power industry, where I looked mostly at equipment failures. Then one day I thought, gee, we have a lot of these equipment failures, and I became more and more interested in how to prevent recurrence. So it became mostly about systems thinking and preventing the human error that occurs in complex systems.

Dr. Levin: And how has that work evolved over time? What are some of the sort of basics that you’ve identified and applied to health care in particular?

Mr. Clapper: Yeah, that’s a great question. I think over the last 20 years we’ve made some good progress. You know, 20 years ago we were talking mostly about the individual event and how people can prevent the few human errors that lead to patient harm. I think people now are much more attuned to systems thinking. We put a lot more effort into human factors in processes, human factors in devices and equipment, and we’re better about leading the system. Maybe that’s the big difference: today the health care leader thinks of themselves as a systems engineer for their health care delivery system.

Dr. Levin: Yeah, this is a really big deal, and I’m old enough to have lived through the way it was and the transition that we’ve been going through. Frankly, when I was in training, I think the idea of safety and preventing error was: try harder, Dr. Dave, don’t make mistakes. A lot of it really was viewed almost as character flaws when things went wrong. You weren’t careful enough, you didn’t try hard enough, or you were stupid. You’re talking about a very, very different kind of approach, and I was hoping you could give us some really specific examples. The one I always cite to colleagues is that in the bad old days it was easy to connect the wrong gas lines in the OR. The poor anesthesiologist could mix up the oxygen and the nitrous and critically injure or kill a patient. And the systems thinking there was: well, let’s redesign those fittings so that you simply can’t connect the wrong gases to the wrong lines. To me that’s the concrete example of engineering errors out of the system. But you’re talking about things that go far beyond that. So give us some real specific examples if you could.

Mr. Clapper: Yeah, I’d be glad to, Dave. I think your example is the prime example. My friend Dr. Todd Strumlaucer, who is an anesthesiologist, pointed out that Bud Coleman, the songwriter for Herb Alpert and the Tijuana Brass, died under anesthesia from just that problem. So after that we put a lot more work into the anesthesia machine. Not me personally, but, you know, health care folks. And we had that good human factors. We had the connectors. But then there were some later cases where some enterprising clinical engineers built adapters for it, so that either could be connected, and that opened the door again to a repeat event. So as we think of engineering solutions, we tend to think of them as forcing functions, and in return there tend to be the workarounds. If we concentrate a lot on safety culture, that tends to be the antidote for the workaround: the culture is more of a choosing function than a forcing function. So we’ll have people that will maintain the equipment and use the equipment and follow the protocol and use the checklist. I think that’s a big difference.

Dr. Levin: Yeah. So you’ve used this term culture and culture of safety several times. Let’s get a little deeper on that. What do you mean by that and what can an organization do to assess itself and improve its culture when it comes to safety? Seems like a really tall order and kind of complicated.

Mr. Clapper: I think it is, but it pays off handsomely. Culture is the shared values and beliefs of people, and when you say safety culture, it’s a culture that puts safety first. So whenever people are doing something related to safety they tend to pay more attention and put more thought into it. They communicate a little better. They’re more likely to stop and get help. They’re definitely more compliant. So culture not only is the single biggest behavior-shaping factor in a complex system, it also holds all the other ones together. Maybe a way to look at it, Dave, would be: a checklist in the OR doesn’t keep patients safe. A team that thinks together using that checklist, that helps quite a bit. And barcode scanning at the bedside doesn’t keep patients safe. The caregivers who scan meds and think for themselves about what that beep or blip means, that helps quite a bit. So behind all these little clicks that we have on computers, and all this equipment and widgets and pieces of paper, it’s still mostly us. The people, who have a head to think and a heart to care, are more resilient than the other parts of the delivery system.

Dr. Levin: Well, we’re going to definitely come back to those clicks from the computer in a few minutes. I really love the example of the checklist, though, and I think you described it well, because to me it embodies both. There is an opportunity to design this up front, so the checklist should incorporate the best practices: doing the right things in the right order, doing them consistently, and all the rest. It’s a great tool to support that. But as you pointed out, if we don’t also have a culture that values that, values the use of checklists and reinforces their use and their thoughtful use, then it’s probably not going to make that much of a difference. It’s of limited value if you haven’t built the culture around it, if you haven’t built the resiliency into it. Am I describing that accurately? And by the way, all guests on 4x4 Health are free to call BS on the host, so don’t hesitate to do that if needed.

Mr. Clapper: I’ll have my BS meter set at 100 percent power. You got that exactly right.

Dr. Levin: So let’s go even a little deeper on that because the other thing that I learned from working with you and I had the great pleasure of learning it also from some pilots and some astronauts was this concept of crew resource management or CRM, not to be confused with customer relationship management. But this was really about how we work together and teamwork and that sort of team performance. At least that’s what I remember from 15 years ago. Craig, enlighten us a bit on this piece of it and how it all fits together.

Mr. Clapper: Yeah, that’s another good topic. Teams that can think together are our single biggest advantage in safety, quality, patient experience, and reliability of systems. From the aviation world we tend to say crew resource management; in healthcare we could talk about collegiality, or the collegial interactive team. Right now, where we have data at HPI, we think 40 percent of the acts that lead to serious preventable harm would be solved by us thinking together as a care delivery team. And I would point out that a lot of work has been done, particularly in the last 10 years, around cognitive debiasing, how we could think more clearly, and that adds nicely to the original CRM work, which helps more with the communication aspect. So I think it should be part communication protocol, but also part thinking, or critical thinking skills if you like that expression. And the big issue there is the ability to flatten the authority gradient, which is fueled in part by power distance. So we use a lot of relationship skills in our work, and we have them switch-hit: we use them for teams and safety, but we also use them to connect a provider to patients and family. The same skills that connect us with our care delivery team also connect us with patients and family. It’s one of the few places where you can double up and have one thing do two functions.

Dr. Levin: I think that’s really interesting. And to sort of tie this all back together in my simple little mind, going back to the example of a checklist being used in the OR: the place where CRM could come into play is, let’s just say for example, we might have a surgeon who is reluctant to adopt and use a checklist. I know that’s a far-fetched example, a rare hypothetical, but let’s just imagine that might happen. Part of what I learned in CRM, at least, was that it creates an environment where people can challenge that. As you said, to some degree it flattens the hierarchy, so in theory anyone in that OR can raise that question. And we train on how to raise those questions in a way that hopefully will be constructive and not just generate more conflict. But to me, again, the way this all fits together is: there’s a checklist that’s been, you know, sort of pre-designed to help us do things the right way consistently. But then there’s this whole cultural thing that has to be built around that, including the kinds of communication I’m referring to, to really make it reliable and consistent and have an impact. That’s the way I understand it. Again, please elaborate or correct where necessary.

Mr. Clapper: Yeah, I think you’re right on track again. We pick out the items on the checklist because we think they’re best practices; they are very meaningful. And I think the value in thinking together using a checklist is that you get that nice cross-monitoring effect. Whenever you cross-monitor, you multiply small error probabilities. So if the surgeon is very good, and they typically are, they might have a defect rate like one defect out of a thousand, or one defect out of 10,000. And if we’re with them and we’re just as good on things like right patient, right site, right procedure, we also have a low defect rate, like one out of a thousand. You start multiplying those numbers together: 1 out of 1,000 times 1 out of 1,000 is one defect out of one million. So we go from as good as anybody, which is just a nice way of saying as bad as everybody, to better by far, because we have that great cross-monitoring effect. And Dave, you could build that with just two simple things: whenever one of your teammates has the courage to say, wait a minute, I don’t think that’s right, and that provider turns and says thanks. That helps quite a bit. You build that multiplication factor every time you do that.
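To make the cross-monitoring arithmetic concrete, here is a minimal sketch in Python. The function name and defect rates are illustrative, not anything from HPI; the only assumption carried over from the conversation is that the checkers are independent, so their small error probabilities multiply.

```python
from math import prod

# Minimal sketch of the cross-monitoring arithmetic described above.
# Assumes each checker is independent, so per-person defect rates multiply.
# The rates below are the illustrative figures from the conversation.

def residual_defect_rate(defect_rates: list[float]) -> float:
    """Probability that a defect slips past every independent checker."""
    return prod(defect_rates)

# A skilled surgeon working alone: 1 defect in 1,000 opportunities.
print(residual_defect_rate([1 / 1000]))            # 0.001

# Surgeon plus one equally good cross-monitor: 1 in 1,000,000.
print(residual_defect_rate([1 / 1000, 1 / 1000]))  # 1e-06
```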

Dr. Levin: That’s a really good example. And again, the part I remember as so powerful when I was getting this training was that there were very specific techniques for engaging in those conversations, for raising those questions, that had been practiced and proven out, at least as I recall it, in the field of aviation. And they really work; they worked in environments where you might wonder how well they would carry forward. There are two other concepts I want to touch on quickly. One of those is the so-called Swiss cheese model of failure. It’s interesting: I got into a conversation with someone about this the other day and I was shocked that they had never actually heard of the Swiss cheese model of failure. And I have to say, since I learned it, pretty much everything in life looks that way to me now. So maybe I’m dating myself and it’s no longer a popular model within the world of safety, but I’ll assume it is. Could you take a moment and tell us a little bit about that idea?

Mr. Clapper: I’d be glad to. It’s not outdated; it’s vintage. It’s a nice model, and it comes from a class of event models called linear complex models. Health care is very complex but it’s not very linear, so we’re always stretching the model a little further than it should go. The father of the Swiss cheese effect is James Reason. He noted that complex systems fail in complex ways, where multiple active errors coincide with multiple latent weaknesses. The word latent means it’s present but unseen by the people, and in our root cause practice we think the number is eight. So if you think of a Swiss cheese event, you might have an active error with seven slices of Swiss cheese. The holes in the cheese are the latent problems, and when that error is able to travel in a straight line through all seven of those layers, it reaches the patient and causes harm. So in your work around electronic health care records you often find problems; we’re trying to take problems out of the system, and each one of those is like a hole in the Swiss cheese. If you reduce the number of holes, like an EHR with fewer defects, then those few errors in practice don’t carry forward to reach patients and cause harm.
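As a rough illustration of the layered-defense idea (our sketch, not HPI’s model), an active error reaches the patient only if it lines up with a hole in every layer, so shrinking any one layer’s holes cuts the harm rate. The layer count follows the conversation; the hole probabilities are made up.

```python
from math import prod

# Rough sketch of the Swiss cheese model: an active error causes harm only
# if it passes through a hole in every defensive layer (assumed independent).
# Seven slices per the conversation; the hole probabilities are invented.

def p_harm(hole_probabilities: list[float]) -> float:
    """Chance a single active error finds a hole in every layer."""
    return prod(hole_probabilities)

seven_layers = [0.1] * 7         # seven slices, each with a 10% hole
print(p_harm(seven_layers))      # ~1e-07

patched = [0.1] * 6 + [0.01]     # shrink one latent weakness tenfold
print(p_harm(patched))           # ~1e-08: harm becomes 10x rarer
```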

Dr. Levin: Yeah, those are really good examples, and I’ve sort of adopted it as a life wisdom: when things go wrong there’s usually more than one reason, and that means when we go to correct them, we should probably look for more than one thing to do to address the problem as well. The other term that we frequently hear in this realm is high reliability, and the high reliability organization. So take a minute and tell us what those things are and what they mean.

Mr. Clapper: So this might be my rant for the discussion.

Dr. Levin: Then rant away, but please remember we are a PG 13 show.

Mr. Clapper: I’ll try to rein it in. Yeah, because they’re not interchangeable. When you say high reliability, you’re just describing the sociotechnical system. A sociotechnical system means you have people and you have process and you have technology, all working together to accomplish something, and you’re just describing how well it does that. Is it moderate reliability, high reliability, or ultra-high reliability? When we say HRO, we’re talking about a very specific body of knowledge: high reliability organizing, an expression coined by Karlene Roberts at Berkeley. One of her proteges, or students, was Karl Weick, who went on to Michigan. So you have the Berkeley school and then the Michigan school. In high reliability organizing they try to point out that there are some super principles about how the people work in the sociotechnical system. Most of us know about HRO through Weick and Sutcliffe’s Managing the Unexpected, where they had five of those super principles. But notice as you go through those five super principles, they kind of leave things out, like: the caregivers have the right knowledge and skill for the job, or your work processes function reasonably well, or you’ve put some effort into your electronic health care record so it has usability for caregivers and providers. My thought, Dave, on why they don’t talk about those things is that they considered them already covered in earlier versions of reliability. So when they go on to talk about high reliability organizing, they’re talking about even bigger constructs. That’s my thought. But maybe only a systems engineer gets upset when somebody says HRO when they meant to say merely high reliability.

Dr. Levin: If you’ve just joined us, we’re talking with Craig Clapper, founding partner at HPI, about safety in general and safety principles applied to health care. The next question I’d like to ask you, just broadly, is: what’s the most important or interesting thing that you’re working on right now?

Mr. Clapper: Yes, we have a lot of exciting work. You know, you mentioned earlier that we’ve been doing this for about 20 years, and I think we’ve advanced. I really believe that care is better today and patient safety is better today than it was 20 years ago. 20 years ago we were talking mostly about human error, and today we talk mostly about transformational change around performance cultures. We have a single culture that puts safety first but also thinks about quality, thinks about patient experience, engagement of caregivers and providers, efficiency of care, and improvement, and runs all of those things on that high reliability platform. So instead of being a safety person, where I use the word as a noun, like I prize safety, I want it to be used as an adverb: safely. I’m not interested in safety; I’m interested in caring safely. I still want the care, but I want it done safely. Just like when you fly on an airplane, you want to fly safely. You don’t want to sit there at the gate and have them tell you, we’re not flying today, but at least we’re safe. There you have safety, but you didn’t want safety; you wanted to fly safely. So I think the big exciting thing we’re doing is the integration of all those things together: the safety, the quality, the experience, the engagement, the improvement and the efficiency, all together.

Dr. Levin: That sounds both really exciting and kind of daunting. Tell us a little bit about how you’re going after that practically, and what kinds of results you’re seeing.

Mr. Clapper: We’re doing pretty well, especially in the safety, or maybe the safely, aspect. We’re still seeing nice reductions, ranging between 50 and 80 percent, in serious preventable harm. And we’re consistently seeing some nice boosts in clinical quality, especially around hospital-acquired conditions, and then the measured appropriate-care items. I think where we’re not being successful, Dave, is having breakthroughs across the board. So safety is a lot better, engagement sometimes is a little better, and experience is a little better; sometimes safety is only moderately better. So what we’re trying to do now is have better solutions where you can have a breakthrough in safety and experience at the same time, and one doesn’t have to be a little more modest compared to the other.

Dr. Levin: And of course that all has to be done in a way that’s economically efficient and sustainable as well.

Mr. Clapper: Yeah, definitely. You could probably say it goes without saying, but whenever you say that, it needs to be said.

Dr. Levin: In health care these days, that definitely needs to be said.

Mr. Clapper: With reliability, you’re not going to have to pay extra for it. I think what we’ve paid extra for is that we’ve sub-optimized care delivery systems: we added a lot of stuff in one area and it made that one area a little better, but it just kind of slowed us down and made the other areas worse. So I would maintain that a better care delivery system will be safer and more effective, and you won’t pay more for it.

Dr. Levin: Well, to me that makes a great deal of sense, and hopefully you’re accumulating data over time to prove that is in fact the outcome, at least in some settings. The other thing I want to delve into a little bit is specifically how some of these principles can apply to healthcare IT. I would argue that with the advent of electronic health records, the stakes grew pretty dramatically. There is just far more clinical activity now based on information technology; not that it didn’t happen before, but in terms of scope and scale and intensity, this is a relatively new phenomenon in healthcare. It seems to me that this would be an obvious place to begin to apply and adapt some of those basic principles as well. I could speculate in front of the expert about what those might be, but it’s probably smart for me to ask you first. You’ve clearly got a background in engineering and you’ve worked within health care for a while. Any specific observations you’d make about either problems in health IT or opportunities to apply these concepts to improve safety from that perspective?

Mr. Clapper: Yes, although I think you’re the expert. I think I might be talking to the expert, and maybe that’s part of the opportunity: everybody’s an expert now, and a piece of that is that IT right now is blamed for 50 percent of the problems but is looked to as the solution for 100 percent. It may only cause half the trouble, but we want it to solve all of our problems. I think we’ve made some good progress in the electronic healthcare record, especially with connectivity: being able to see images and lab results and maintain some situational awareness across time and geography. I think also we’ve just kind of opened the door to what will be good human factors integration. I think with the work that you’re doing, when you look back two years from now you’ll kind of laugh at how primitive things tended to be. And I think you’re really poised for a new generation of health care records, where they’re not just an automated version of the original paper chart; they’re more intuitive, and they present data to individual caregivers and providers in forms they’re more used to seeing and thinking in.

Dr. Levin: So this is where I have to be careful; I’m the one that’s going to go off on a rant on this topic. But absolutely. I sort of refer to this as Health IT 1.0: we more or less paved the cow paths. Health IT 2.0 is going to be about workflow and solving real business problems, which gets at this idea that we need less data and more actionable information. And there are known ways to do that, to present information in ways you can act upon. I give us a very bad grade in health IT when it comes to human factors engineering. There are exceptions, but I think for the most part we didn’t recognize that there are important lessons and principles out there that we could have drawn upon in designing these systems. Myself and a few colleagues over the years have talked some about how the basic, if you will, tricks and techniques that we learned about patient safety might also apply in this realm. So I think there is a place for checklists. We didn’t talk about it today, but there’s a concept of red rules, where there are just a few rules, but you never violate those rules, and we’ve wondered whether there’s a place for a concept like that in health IT. I think there are some crew resource management principles to be tapped as well. I like how you referred at the start to how we’re each experts in our own way. I feel like some of the best work I’ve seen is when you bring those folks together and there’s a mind share that results in something very different. Let me stop there for a minute. Again, I feel like I’m speculating in front of the expert here, but those ideas at least have the kernel of something worthwhile in them.

Mr. Clapper: Yes, I think so, especially the expression that we’ve paved the cow path. We’ve kind of institutionalized the traditional workflow.

I think if we think more broadly about what’s a more sensible workflow, and then build the environment of care, and then the EHR, to reflect that, then we’re doing that true human factors integration; we’re really building the sociotechnical system around people. Here’s a patient and here’s a provider; we need to be serving those two people. I think historically we’ve just kind of used what we have and we’ve always tried to make it better, but it’s been a real n-plus-one strategy, where n was what we had last year and the plus one is a little betterment that we came up with. And I think you can see this best around that 40 percent thinking-errors statistic I touched on earlier. You know, originally we talked about artificial intelligence, or AI. I’m sure that’s an expression you’ve heard from time to time on this podcast.

Dr. Levin: Every once in a while.

Mr. Clapper: Yeah, and we realize that we’re always five years away from artificial intelligence. When I was in college it was five years. Today we are five years away. When my son Jack graduates, it will still be five. So while we weren’t looking they changed it: it’s still called AI, but it stands for augmented intelligence now. The idea that I’m not going to think for you, but I’m going to support you in your thinking. And I think that with a more graphical, more friendly interface between us and the computer, that could just open up a whole different world. And as a little bonus: in Jef Raskin’s book, The Humane Interface, his first four chapters have to be the best insight into human performance that I’ve read. And remember, Jef was a computer guy; in fact he invented the graphical interface and the mouse. And he says: I want to talk about computers, but before I can talk about computers I need to tell you about people, and what people can do and can’t do. And in those four pithy chapters he just kind of walks us through huge bodies of knowledge that we could all use every day.

Dr. Levin: I hadn’t thought about that book in a long time, but you’re right, it’s a classic, and that part of the discussion really framed things beautifully. I’m sure that you’ve spent many hours poring over the proposed rules from the Office of the National Coordinator for Health IT. Actually, I’m pretty sure you haven’t.

Mr. Clapper: I woke up last night with a gas leak. I wonder, when does it go into effect?

Dr. Levin: Hopefully soon. There’s an aspect to the rules that I found really fascinating, and it’s also, frankly, an interesting window into the health IT industry at the moment. There are some very specific proposals in there that I think go directly to this issue of safety and building a culture of safety in health IT. So, for example, it has been difficult to share screenshots of some of these IT systems. Mostly I’m talking about EHRs, but this may extend to others as well. This is viewed as intellectual property, and some vendors are known for policing it very strictly. There are licensing agreements that include what some people would term gag clauses, which can inhibit discussion around performance, safety, or impact on efficiency. I don’t know that there’s been any formal study or publication around this, but I can say with some authority that I’ve had a number of colleagues over the years tell me anecdotes that go basically like this: we studied the XYZ impact of our EHR, and when we went to publish it, our lawyers told us this might be a violation of the confidentiality clause in our license, so we weren’t able to share the information. Now, I guess I’m anticipating a rant here, because one of the things that I definitely learned from working with you in the early days was that collecting and sharing this information is a critical first step. In fact, prepare yourself, because when you start to do that it may look like your error rates are going up. That’s probably not the case; you’re just reporting more accurately. But this is foundational. If we don’t exchange that information and talk about it, it’s hard to know how we’re going to improve. So either reel me in or cheer me on here: are these real issues, and what’s your take on this idea of proposing these kinds of changes?

Mr. Clapper: They are real issues. The transparency and the sharing help us to advance the reliability of the care delivery system. Historically, as you pointed out about your training, quality was a competency of the individual. Today we at least consider it an emergent property of the systems we work in. And I think what those new rules will do is enable us to be better systems engineers; that’s the change that we need to make between this year and the next. Instead of managing the systems as hobbyists, where we just get together in committees and councils, we create ways that we can systematically improve the reliability of the system. And outside of health care, the number one way to do that is to learn from others. In fact the saying is: you have to study other people’s problems, because you just won’t live long enough to make all the mistakes yourself. So I see that as a big breakthrough, especially the transparency and the sharing of data and solutions.

Dr. Levin: Well, as we’ve discussed in a different context on this podcast, we’re far better at seeing other people’s problems than our own. We recognize all those things in other people and other situations; it’s a little trickier when we’re looking at ourselves. You said something in passing, though, that I think is a really important fundamental issue, and I think it’s also tied to where we’re going next in health IT. I think it was fair when you said we got together as, you know, sort of semi-informed experts and hobbyists, and it’s kind of an interesting dilemma that we faced in the early days of deploying these clinical systems. Because it’s almost a sure thing that if you had shown up with, you know, just take the EHR out of the box and turn it on, we’ve already coded everything and you’ll just do it the way it says, that would not have worked in the culture of health care at the time. You had to allow people to do a lot of localization and customization and kind of put their own spin on it. So what you typically saw was what you described: the health system would organize to design and deploy an EHR, and they’d gather groups of varying degrees of expertise, and they might start with some guidelines, but a lot of it was, you know, invented there locally. As I said, it probably was necessary to do it that way to get started, but it also creates a bunch of problems. I think it’s part of why the designs aren’t that great, and part of why workflow sometimes took a backseat; so let’s just, you know, pave the cow path, and we didn’t have access to some of the expertise that we would’ve needed to do this in a different way. Let me just pause there for a second: is that a fair description and amplification of the point you were trying to make, before I go on to my next point?

Mr. Clapper: Yeah that is right on.

Dr. Levin: And again, I want to be clear: I was one of those people. I was one of the pseudo-experts in the room, so I include myself in that. What I find really fascinating and intriguing about this is that I think this next generation, Health 2.0 if you will, is going to be much more about competition. I think we’re on our way back to more of an ecosystem of applications, and my hope is that through that approach we’re going to see better designs. Better things will emerge, and to some degree some of this expertise is, if you will, outsourced; it’s baked into the cake when I buy it. That doesn’t absolve me of a whole bunch of other things I need to do as a system. But I’m excited about these proposed rules, because I think they are going to create a more level playing field. I think they are going to spur competition and innovation, and I believe that’s going to lead, at least to some degree, to improvements, not just in performance, efficiency if you will, and clinical impact, but in safety as well, which I know you would count as part of performance. So again, reel me in if I’m way out there too far, but that’s what I see coming, and it’s yet another reason why I’m excited about the next generation that’s starting to emerge here.

Mr. Clapper: It’s very exciting. Remember, in my practice half of the problems are attributed to the EHR, but we’re looking for all the solutions to come from the EHR. [Unclear]

Dr. Levin: Yeah, but you know, I think there’s another challenge wrapped up in there, which my IT colleagues and I talk about frequently: IT can enable solutions, but it’s not in and of itself a solution. I think far too often, particularly in healthcare, we thought, well, here’s this problem, I’m going to buy this piece of software and that will fix the problem. And I don’t know about you, but I’ve never seen that work. You have to very thoughtfully define what the problem is, what the variety of things we’re going to need to do is, and how the technology can enable that. I think that’s a mindset we probably should adopt more broadly.

Mr. Clapper: Without a doubt. High reliability is the right mix of people, technology, process, and protocol, embedded right in the process. And one thing that we overlook quite a bit is the organizational structure: the way we’ve done the division of labor and whether we have the job functions correct. I really agree that it’s more of a systems engineering problem than an IT problem.

Dr. Levin: For the last question today, hopefully you have some left: do you have any additional sage advice for us?

Mr. Clapper: Maybe a good way to finish is this: I think the best advice is to concentrate on the reliability of the sociotechnical system and have one performance culture to drive the safety, the quality, the experience, the engagement, the efficiency and the improvement. Because right now we just sub-optimize care. You know, I am the safety guy, so if I’m working on safety, it’s getting better. But then after me there will maybe be an efficiency person, and the safety work will suffer. I just think we need to end that era of sub-optimization by doing all those things that are important to us together, in a way that we can lift the reliability all at once.

Dr. Levin: It’s a more holistic approach that looks at those pieces and how they fit together and reinforce each other. Is that a fair way to look at it?

Mr. Clapper: Exactly. Like the Buddhist hot dog vendor, who said: make me one with everything.

Dr. Levin: Make me one with everything, exactly. You know, this subject of culture is one that’s near and dear to my heart, and I’ve become, I will own publicly, a bit of an extremist on this topic. Craig, I’ve come to believe culture is the work, and everything else is kind of a byproduct of that. And certainly in the last five years, in the organizations where I’ve been able to exercise a degree of influence, we’ve made this a priority, and I really have come to believe that if you do that, you’re building a great organization. It’s got a lot of the properties you talked about: it’s got resiliency, it’s thoughtful, it’s adaptable. There are so many parallels between that sort of general culture work and what we’ve been talking about here today when it comes specifically to performance and safety. I’m not asking you to agree with me, but I suspect you would agree that culture is just a huge foundational piece.

Mr. Clapper: I do. I think you’re right, because the technology comes and goes, and hopefully it’ll improve continuously. But if you can build a strong performance culture within an organization, that can be very enduring. In fact, Dave, maybe a good question for you to ask your listeners is: is culture easy or hard to change? Everybody tells me it’s hard to change, and I think that’s the answer. That’s why you’re so interested in it. If it were easy to change, any work that you did this year would be gone next year. But the fact that it’s difficult to change means that the changes you do make can stick with you for quite a while, and I think that’s what makes it worthwhile.

Dr. Levin: Yeah, I would agree. I think the other thing that comes up is how you measure it and how you show performance over time in culture. And interestingly, I think you and I have approached this from perhaps different domains and ended up in similar places. You’ve talked today about a number of things that can be measured in terms of performance. Likewise, there’s some very interesting work around organizational culture and how to measure culture within organizations, some of it validated, that I’ve found particularly useful and that we use in the companies we work with. So it’s fundamentally a very different way of looking at work and what we’re trying to do. I’ve found it to be incredibly powerful, and I think you have as well.

Mr. Clapper: Dave, that was another excellent point. If I just said that culture is the strongest of the behavior-shaping factors, then one would think you’d want to measure it and improve it. And remember that HPI is part of Press Ganey; we were a measurement company before we were an improvement company, and we do have some good psychometrics available, not just within Press Ganey but around the country. I’d recommend that people listening have a nice balanced scorecard with some leading indicators, like safety climate and engagement, some real-time indicators that talk about systems, and then their lagging or outcome measures, which are the proof of the pudding. So they could be measuring their safety and their quality and their experience.
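A hypothetical sketch of what such a balanced scorecard might look like as a data structure; the grouping into leading, real-time, and lagging indicators follows Mr. Clapper’s description, but the specific measures are our own examples, not HPI’s or Press Ganey’s instruments.

```python
# Hypothetical balanced scorecard grouping following the discussion above.
# The three categories come from the conversation; the example measures do not.
scorecard = {
    "leading": [            # predict future performance
        "safety climate survey score",
        "caregiver engagement score",
    ],
    "real_time": [          # how the system is behaving right now
        "near-miss reports per month",
        "checklist compliance rate",
    ],
    "lagging": [            # outcomes: the proof of the pudding
        "serious preventable harm rate",
        "patient experience score",
    ],
}
```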

Dr. Levin: Yeah I think those are really terrific points and I’ll just add that just like so many things I learned from working with you and from learning more about safety, these are general ideas and principles you can apply in a lot of different areas and so your description of you know a solid way to approach measurement. I think that can be applied in a lot of different kinds of domains. It’s just a strong general principle.

Mr. Clapper: You bet.

Dr. Levin: We’ve been talking today with Craig Clapper, founding partner at HPI. Craig, thanks so much for joining us.

Mr. Clapper: It’s my pleasure. Always good to talk with you Dave. Thank you for the work you do.

Dr. Levin: You’ve been listening to 4x4 Health, sponsored by Datica. Datica; Bringing health care to the cloud. Check them out at www.datica.com. I hope you’ll join us next time for another 4x4 discussion with health care innovators. Until then, I’m your host, Dr. Dave Levin. Thanks for listening.

Today's Guest

Craig Clapper

Partner, Healthcare Performance Improvement (HPI), Press Ganey

Craig Clapper was a founding Partner of HPI. Mr. Clapper has over 25 years of experience improving reliability in nuclear power, transportation, manufacturing and health care.

He specializes in cause analysis (including nuclear power events and component failures, commercial aviation components, and the Texas A&M bonfire structure collapse), reliability improvement (including feedwater and main turbine systems in nuclear power, manufacturing at Baker Hughes, and chemotherapy processes at St. Jude Children’s Research Hospital), and safety culture improvements (for Duke Energy, U.S. Department of Energy, ABB, Westinghouse, Framatome ANP, Sentara Healthcare, and others). He is now the lead partner on several safety culture engagements for health care systems.

Prior to forming HPI, Mr. Clapper was the Chief Operating Officer of Performance Improvement International, Chief Engineer for Hope Creek Nuclear Generating Station, and Systems Engineering Manager for Palo Verde Nuclear Generating Station. He is a registered professional engineer in Arizona, has a Master of Business Administration, and is a Certified Manager of Quality and Organizational Excellence by the American Society for Quality (ASQ).

Our Interviewer

Dave Levin, MD

Chief Medical Officer

David Levin, MD, is a physician executive with over 25 years of experience in healthcare information systems, clinical operations and enterprise strategic planning.