Ben Cooperman
Datica Podcast

Making Health IT Safe and Effective

July 30, 2019   Healthcare Cloud Engagement

In this episode of 4x4 Health we dive into building safer and more reliable health information systems with guest Ben Cooperman. Mr. Cooperman is a Senior Associate at Healthcare Performance Improvement (HPI) and has extensive experience working in healthcare technology and safety. He shares why putting safety first is a winning strategy, what high reliability means for the health IT industry, and how we can make technology work for us to transform how healthcare is delivered.

Episode Transcript

Dr. Levin: Welcome to 4x4 Health, sponsored by Datica. Datica: Bringing healthcare to the cloud. Check them out at www.datica.com. I’m your host, Dr. Dave Levin. This week we continue our exploration of safety and reliability with a deeper dive into the issue of safer health information technology with our guest Ben Cooperman, Senior Associate at Healthcare Performance Improvement, or HPI. HPI is a Press Ganey company that offers consulting services to organizations seeking to improve human performance in complex systems, using evidence-based methods from high reliability organizations. With over 10 years of experience in health care technology and safety, Ben brings a wealth of experience and knowledge to engagements with leading health care organizations such as MD Anderson Cancer Center and the Mayo Clinic. Today on 4x4 Health he shares his insights into building safer, more reliable and more effective health information systems, and how putting safety first is increasingly a winning strategy. Welcome to 4x4 Health, Ben.

Mr. Cooperman: Thanks Dave. Happy to be here.

Dr. Levin: I’m going to ask you a series of four questions today, and we’ll take about four minutes to answer each one. To get us started, tell us a bit more about yourself and your organization.

Mr. Cooperman: Sure. So as you mentioned, I’m a senior associate with Healthcare Performance Improvement, or HPI for short. We are a Press Ganey solution. What we do is work with health care organizations, namely hospitals, hospital systems and physician groups, and we facilitate their journey to high reliability and bring high reliability organizing principles to those organizations. My background is slightly different than some of my colleagues’. We have a multidisciplinary group: a number of folks coming from health care directly, physicians, nurses and health care administrators; a number of folks coming from the nuclear power industry; and some folks coming from naval aviation. But as you mentioned, my background is a little different from those folks in that I come with a health care technology background. I spent the majority of my career working in the technology space, but solely focused on health care as it pertains to hospitals, hospital systems and our delivery of care here in the United States. And my focus was always around the EMR and other technology that we use in health care. So that’s a little bit about myself and who we are at HPI.

Dr. Levin: Well I definitely want to get deeper into what you’ve observed, particularly around the use of technology like EHRs. But before we go deep on this, let’s talk about a couple of general concepts. Several times you’ve used the words reliability and high reliability organizations. Take a moment and help us understand the connection between reliability and safety: exactly what is a high reliability organization, and why should we care?

Mr. Cooperman: Yes, sure, that’s a great question. So oftentimes we’ll use a definition from Weick and Sutcliffe’s Managing the Unexpected about what a high reliability organization is, or what highly reliable organizing is, and that’s an organization that operates under very trying conditions all the time yet manages to have fewer than their fair share of accidents. It’s a pretty canned response, but it puts an image in your mind. A lot of these principles and practices stem out of the typical industries that are considered HROs, such as nuclear power and naval aviation. Now these organizations, as you can imagine, operate in very trying conditions, as we mentioned. So think about an aircraft carrier. What does that mean? Well, we’ve got a lot of jets landing and taking off from an aircraft carrier and a lot of sailors trying to stay safe in a pretty complex environment on that ship. But we don’t hear about those jets crashing all that frequently; it’s pretty rare. So we consider that an HRO: they have fewer than their fair share of accidents, if you will. So what does that mean in health care? How do we translate it? I think in health care we operate in a very trying environment all the time. We have a lot of very sick patients coming to us with grave illnesses or some pretty significant injuries. But how are we working to stay safe? How are we working to keep our patients safe, and ourselves safe as well? So that’s a little bit about it. HROs, as we say, have five characteristics: a preoccupation with failure, a sensitivity to operations, a reluctance to simplify interpretations, a commitment to resilience and a deference to expertise. Those five traits apply to the work that an HRO does, and we bring those five traits, as well as many other principles and behaviors, into health care.

Dr. Levin: Well this to me makes a lot of common sense, from the standpoint that these are entities, organizations, industries that can reliably perform at a high level and do so safely. And the other thing that I have found appealing over the years is this notion of health care as another one of these industries. It makes sense conceptually, and I think it also resonates with people who work in health care, who can see themselves in that kind of an industry. You know, as we’ve talked about with your colleague Craig Clapper, culture and the way we receive this information is a critical component of reliability as well. And so both these metaphors, whether it’s aerospace or nuclear power, resonate with people in health care, and they also provide a really useful frame, very often, for thinking about the issues specific to health care.

Mr. Cooperman: Yeah. You know, I think when we go out to organizations, one of the first impressions from some of the folks we work with is that it’s kind of refreshing. They’ve realized that they work in a high-risk, high-consequence environment. They realize that the work they do is dangerous and there’s harm that does occur, but they didn’t always know what to do. They didn’t always know how to frame it, how to classify it and then what to do about it to try to remove it from health care. There’s been a lot of work around process improvement in health care, and we try to tackle maybe one type of harm at a time, say [unclear] or things like that, but it hasn’t been holistic in nature, so we’re very limited in our focus. And when folks see the high reliability principles and what we start talking about, it’s kind of refreshing in nature. It gives you a path, a guiding framework, to try to remove that harm from health care: the harm that people see but don’t always know what to do about.

Dr. Levin: So let’s pick up that lead and go a little deeper, specifically around these issues as they apply to technology in health care, particularly information systems and information technology. What are some of the general principles that you’ve observed that map to health IT? What’s worked? And then we’ll get into some of the challenges as well.

Mr. Cooperman: Yeah, sure. So the first step a lot of times is just defining what we’re talking about. We have this term e-iatrogenic harm, or e-iatrogenesis; it was coined in an article that I think comes from around 2007. Some folks were trying to put a label on what this really means, and the definition we have is patient harm caused at least in part by the application of health information technology. Over the years, the technology around us has improved and gotten far more advanced, just as it has in our personal lives; think of where we were, say, 10 or even 15 years ago with technology in our personal lives. It’s the same thing in health care. We’ve advanced greatly with our technology, with the implementation of electronic medical records and all these other pieces of technology that we use in our delivery of care. But as it’s become more advanced, it’s also become more complex, and as part of that we are experiencing harm that’s caused as a result of our technology. So what does this really mean for us? Well, thinking about those HRO principles, oftentimes we don’t apply them when we’re thinking about HIT, health information technology for short. HIT, in my mind, encompasses all the different technology that we use, not just the electronic medical record or the electronic health record. We have all these different definitions and different names for different technology; we’ve got a lot of it. So what do some of these principles really mean to us? Well, do we have a preoccupation with failure? I don’t know that we’re always preoccupied with how something can go wrong when we implement this technology in health care. I think we are to an extent, but we’re not hyper-aware of it. We think about all the great things this technology can do, how it can really help us in providing care and help our patients, and that’s what we think about. We think about the positive, which is great in one respect. But we’re not always hyper-aware of what can go wrong when we implement this technology. Maybe we have a couple of plans in place, but that’s not at the forefront of our minds. It’s never what can go wrong; it’s always how great this is. So we don’t always do that, and I would say that’s something we should start thinking about a little bit more as we bring more and more technology to health care. That’s one example, Dave.

Dr. Levin: Okay Ben, I’m jumping back and forth on one foot over here; I’ve got to get a couple of things in edgewise. I mean, that just packed a real punch, and I want to unpack a couple of things. So first, this term iatrogenesis. This is an old medical term, and essentially it’s any harm caused inadvertently by medical care. And as you’ve pointed out, we’ve appended the letter E, as we often do in modern times, to make the point that iatrogenic harm can come in many different ways, in this case caused by our electronic health information systems. The other thing you said just kind of lit me up for a moment, and I know you’re going to walk us through other guiding principles, but as you tick them off it frankly triggers memories, maybe a little bit of PTSD, from my own time implementing these systems. I immediately thought of an example from the bad old days that involved implementing a forcing function around some ordering. There was an orderable, and some well-intended folks put in place a forcing function where you had to answer certain questions and take certain steps in order to get a medication dispensed. And it worked fine in the routine setting, but it turned out to be disastrous during a code blue. So I think that is maybe an example of where we think about these things in health IT design, but we don’t always, as you say, preoccupy ourselves with failure and the ways it can present itself in the real world later.

Mr. Cooperman: Yeah, I think you bring up a good point there. Not necessarily your PTSD, because I can certainly speak to that myself. But I think this is a good example of where maybe we weren’t completely preoccupied with failure. In the early days we thought this would be a great idea: we’re going to add a forcing function around ordering this medication. We talked to our folks down in pharmacy, we talked to some of the physicians and other clinical folks, our nurses, and we said, you know, it’s really important that we have this one piece of information when ordering this drug. We’re going to make this a required field, and you always have to have it in order to order the medication. But we didn’t think about all the different scenarios that could occur here, specifically a code blue, as you mentioned. Now we’re in this situation: we’ve got a code blue, we really need this medication, we really need to place this order, and now we’re stuck at a screen, because we can’t go forward in any way, shape or form unless we fill out this field. And you and I both know, Dave, we don’t have this information at this point in time, so where do we go? What do we do? We don’t have, say, an N/A, not applicable, option. We put in false information just to get past the screen, and then it brings us to the point where we’re simply going to go around the system in some way, shape or form, because it’s not working for us. That’s a perfect example of how we haven’t implemented a preoccupation with failure. Maybe we didn’t do enough testing; maybe we didn’t get enough end users to the table when we were thinking about implementing this type of technology. This is where that preoccupation with failure comes in: is safety being talked about first? How is this going to fail around us, and what are we going to do to mitigate those types of situations? Is that top of mind before we hit that go-live button, metaphorically speaking? And yes, as you mentioned, e-iatrogenesis is kind of a mouthful at times, and as you said, we put an E or an I in front of something and all of a sudden it means technology these days. But you’re right, it’s an old term meaning harm, and we can call something like this e-iatrogenic harm. Now we have a name for it: harm caused at least in part, maybe completely, by the technology that surrounds us. We want that technology to make us safer all the time and help us out, but in these instances we’re talking about, it goes the other direction on us.
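
The failure mode described here, a hard stop with no escape path, can be made concrete in code. The sketch below is purely illustrative; the field names, function names and override behavior are hypothetical assumptions, not drawn from any real EMR. It contrasts an unconditional hard stop with a variant that anticipates the emergency case and offers a documented override instead of a dead end.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MedicationOrder:
    drug: str
    indication: Optional[str] = None      # the hypothetical required field
    emergency: bool = False               # e.g., placed during a code blue
    override_reason: Optional[str] = None
    warnings: List[str] = field(default_factory=list)

def validate_hard_stop(order: MedicationOrder) -> bool:
    """Naive forcing function: blocks the order unconditionally,
    which in a code blue invites false data or workarounds."""
    return order.indication is not None

def validate_with_override(order: MedicationOrder) -> bool:
    """Same forcing function, designed with a preoccupation with
    failure: the emergency path proceeds with a documented override
    and a follow-up flag instead of a dead end."""
    if order.indication is not None:
        return True
    if order.emergency:
        order.override_reason = "emergency override: indication unavailable"
        order.warnings.append("indication missing; flagged for pharmacy review")
        return True
    return False

if __name__ == "__main__":
    order = MedicationOrder(drug="epinephrine", emergency=True)
    print(validate_hard_stop(order))       # False: the clinician is stuck
    print(validate_with_override(order))   # True: order proceeds, flagged
    print(order.warnings)
```

The design point is not the override mechanism itself but the question behind it: someone asked "how does this fail?" before the go-live, which is the preoccupation with failure the conversation describes.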

Dr. Levin: Well this is really great. Let’s keep going. And perhaps you could give us one or two more examples of these general principles and how they might apply specifically to Health IT.

Mr. Cooperman: Yeah, sure. So another one that I like talking about frequently is deference to expertise. This is for when we’ve gotten ourselves in trouble; maybe we implemented a piece of technology and it’s not working out for us. Deference to expertise: are we calling in the person or the people who have the most relevant knowledge or the correct expertise at this point in time to help us out? Or are we relying on some other variable to identify who’s going to help us? Do we pull rank? Okay, this is a big situation here; we’re going to pull in the highest-ranking person we can find, maybe somebody with Chief in front of their job title, something like that. But the reality of the situation is we might need somebody else. We weren’t really deferring to expertise there; we were going by some other variable. I see this happening sometimes: the provider isn’t too happy with the situation and they go right to the top of the metaphorical food chain. But is that deference to expertise? Are we pulling in the right people at the right time, or are we just going up the chain of command because that’s what we feel is necessary, even though it’s not the right direction to go?

Dr. Levin: Ben, I think you came on this show deliberately to press my hot buttons, because you just hit another one, and I’ve talked about this in other settings before. The example I would give is even more basic than that: we go to design a new function or capability and we decide, you know, the frontline staff are too busy, we’ll get someone to be a proxy for them. That’s fine, it’s fine to have a proxy in the room. But they’re just a proxy. And the reality is that the way work is actually done at the frontline, at the sharp end of the spear if you will, is often quite different from the way it’s imagined by a manager or a director. There’s a description in the book, somewhere, of what people are supposed to be doing, and then there’s what actually happens on the frontline in the care of patients. To me this is another example of where we don’t have the right expertise in the room. So we design something that makes perfect sense if your understanding is theoretical, but all of a sudden doesn’t work very well in the real world when it meets real people and real culture.

Mr. Cooperman: Yes. So what are we doing in actuality, as opposed to what we’re doing in theory or what we believe is happening at the frontline? Somewhere up at the blunt end, on that blunt-end/sharp-end model you mentioned there, a great reference, we have some idea of what’s happening at the sharp end. But we don’t actually go out there and solicit what is occurring in real life; we just believe what we want to believe. We don’t pull in those folks working at the sharp end and say, hey, how is this actually working for you, and let’s build that into our model. We just go with what our notions are. So I didn’t purposely push your buttons, but I’m kind of glad I did, because it’s a principle that we don’t always live and breathe, unfortunately.

Dr. Levin: And isn’t it also about diversity, and making sure that if you’re working on a complex workflow, all the stakeholders involved in that work are at least in some way contributing to the analysis and the design? It’s a team sport, isn’t it?

Mr. Cooperman: Yeah, I think you hit it. I mean, those are the words that were in my head as you were talking right there. Health care used to be so top-down in nature, but now our environment is so complex that, as you mentioned, it is one hundred percent a team sport. Everyone has to work together and everyone has to be on the same page to move forward. Otherwise we’re going to have these incidents of e-iatrogenic harm that you and I are talking about quite a bit.

Dr. Levin: All right, well let’s pick another one, and let’s see if maybe I can avoid one of my minefields. But go ahead.

Mr. Cooperman: Sure. So how about a commitment to resilience. What do we do when we’re in trouble? Again, what’s our commitment to resilience? Do we have capabilities to detect, contain and bounce back from the events that do occur? Do we have those plans in place? Think about, say, downtime: how robust are our downtime procedures when our system, or systems plural, go down? We maybe have a rough plan in place, it’s there, but does it take into consideration as many possibilities as we can think of? And then, to push it even further, Dave: do we practice our downtime procedures?

Dr. Levin: Yeah. Well, again, you’re giving me bad flashbacks today. You know, here’s an example that I’ve seen and still see occasionally: we don’t plan for how we’re going to write prescriptions for patients when the system is down, whether it’s a scheduled downtime or an unplanned one. And this is not meant as a criticism, but many young practicing physicians have literally never written out a prescription by hand. They’ve only ordered using a computer, which is great when you can do it. But what happens when the system is down? You get into these really practical things, like do we have prescription pads? And even more practical: does this individual have the training to know how to write out a prescription on paper? Have they actually even had that experience, much less the training to do it? I’m sure there are many more sophisticated examples, but that’s one that I’ve personally observed, and observed repeatedly, in these situations.

Mr. Cooperman: Yeah, it’s a tough situation to be in. Unlike some other industries, where we can take a step back and have one of those days where we do some planning, in health care our stance is always that health care never stops; patients are always coming in the door. So we can’t really take a step back and plan. But if we’re creative in our thinking, I’m sure we can find scenarios where we can still practice our, say, downtime procedures. In this specific instance, as you mentioned: how do we write a prescription when e-prescribing is not working for us? Let’s take some time and run through that, and we’ll start finding problems with our downtime procedure as it’s written out. We’ve got great plans in place, but nobody’s thought ahead to order some prescription pads, because they’re just not around anymore. We start finding all these little details that we missed. So this is certainly another principle that we talk about in HRO that is applicable, without a doubt, to health care technology.

Dr. Levin: This is really helpful. I feel like we have largely been talking about these issues from the standpoint of the clinical application of the technology. It’s the doctor or nurse using one of these information systems to care for patients. I want to shift the focus a little bit and talk about this from the perspective of a member of the IT department. How might some of these principles apply very specifically to the practice of information technology rather than the practice of clinical medicine?

Mr. Cooperman: Yeah. So let’s think about another characteristic of an HRO, and that’s a sensitivity to operations. Say we are health care IT workers; Dave, you and I are at our health care IT desks in our office and we’re doing our work diligently. But how sensitive are we to the operation that’s going on? I think one of the problems we experience sometimes is that, in our imaginary roles here, we are removed from health care delivery quite frequently. I think this is a problem experienced across health care technology. Maybe we are trying to implement this characteristic, a sensitivity to operations, but we’ve got a lot of barriers in place. You and I, Dave, may be sitting at our desks 30 miles down the road from the nearest hospital. We’re very much disconnected from health care delivery, so we don’t always have a feel for what’s going on at the sharp end, because we’re so far away from it. Now how can we bring that principle to these health care IT workers? Well, I think we should get them involved with clinical operations, even just a little bit. Maybe you have a kind of bring-your-health-care-IT-employee-to-work day type of deal, and we get up from our desks and see what’s going on at the front line. This is certainly a way that you could take one of these HRO principles and put it in place from the health care IT perspective.

Dr. Levin: Well, you’ve hit another hot button of mine, but this is a good one, a positive one. I very much believe in this, and when I had the opportunity, in roles where I could foster those kinds of relationships, I worked hard at it. I happen to think this goes both ways. What I discovered was that connecting my IT colleagues to the deeper meaning of the clinical work was inspiring to them; it helped them appreciate the important value of what they were doing, its impact on health, its impact on their neighbors and fellow citizens, and gave them a deeper understanding of some of the clinical challenges and workflow challenges. But the reverse was also true: as my clinical colleagues began to grasp the challenges that their IT colleagues faced, that deeper understanding led to better conversations and, I think, better decision making as well. So there’s something on both sides of this. If you’re just joining us, you’re listening to 4x4 Health and we’re talking with Ben Cooperman, senior associate at HPI. The other thing that’s just so interesting about what you’re talking about here is that this is a cultural activity. I think this begins to open the window on another important aspect of all of this: what you experts refer to as the sociotechnical environment, and the role that culture plays as an important underpinning in all of this. The example that you and others have given me is this: we might have a checklist, for example, around an important process. Say I’ve been working on some code and I’m now going to move that from testing into production, and perhaps I have a checklist of activities that I’m supposed to go through, just like a pre-flight checklist for a pilot. Well, it’s great to have the checklist, but if I’m not part of a culture where that’s valued and the expectation is that I will consistently do it, then it’s not clear it’s going to have much of an impact. A simple example, but is that a reasonable example of where culture and safe practices meet in the creation of the actual environment we operate in?

Mr. Cooperman: Absolutely, and you touched on that phrase, sociotechnical system, where we have people, processes and technology working together in concert every day. Now if we have that checklist, that’s all well and good. That’s our process that we follow; that’s a tool we use to make sure that we’re hitting all the steps along the process. But if our culture says, well, we’re not very reliable in using the checklist, or we’re kind of lax in using the checklist, it’s not really that important to us, well, we’ve just skirted around a tool that we put into place to make sure that we don’t make mistakes, a barrier against experiencing adverse events, if you will. We really need to work on that culture, and you mentioned that before with teamwork. If we have our health care IT folks out understanding what’s going on for our clinical folks, and the same thing reversed, clinical folks understanding the challenges of our health care IT folks, we’re going to increase that teamwork and we’re going to start affecting the culture in our organization. And our culture is one of those very strong behavior-shaping traits at the blunt end, from that model we mentioned before.
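
The testing-to-production checklist Dr. Levin describes can be made tangible in a few lines. This is a minimal sketch with checklist items invented for illustration, not drawn from any real release process at HPI or Datica. The point it demonstrates: when the checklist is encoded, skipping a step becomes a visible, logged choice rather than a silent omission.

```python
from datetime import datetime, timezone

# Hypothetical pre-release checklist, in the spirit of a pilot's
# pre-flight checklist; items are invented for illustration.
CHECKLIST = [
    "unit and integration tests pass",
    "change reviewed by a second engineer",
    "clinical stakeholder sign-off recorded",
    "rollback plan documented",
    "downtime procedure updated if workflow changed",
]

def ready_for_production(confirmations: dict) -> bool:
    """Return True only if every checklist item was confirmed.
    Each unconfirmed item is logged, so a skipped step is a
    visible choice rather than a silent omission."""
    ready = True
    for item in CHECKLIST:
        if not confirmations.get(item, False):
            stamp = datetime.now(timezone.utc).isoformat()
            print(f"{stamp} BLOCKED: {item}")
            ready = False
    return ready

if __name__ == "__main__":
    confirmations = {item: True for item in CHECKLIST}
    confirmations["rollback plan documented"] = False
    print(ready_for_production(confirmations))  # False, with a logged reason
```

Of course, as the conversation notes, the tool only works inside a culture that values using it; the code can surface a skipped step, but it cannot make anyone care.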

Dr. Levin: It’s an interesting contrast to the idea of a forcing function. Your colleague Craig Clapper and I had this conversation recently, and I referred to things like hard stops and other forcing functions, which he acknowledged, and then he said, you know, culture is a choosing function. So to go back to my simple example: the checklist is there, and it can be used as a kind of forcing function to check all these things. But if culturally I don’t make the choice to use it, then it’s of little value. I think this whole idea of the sociotechnical environment resonates very deeply with those of us at work in the field. You may not be surprised to hear that you are not the first person on this podcast to refer to what I call the iron triad of people, process and technology. And I don’t know about you, Ben, but I have never seen a successful project that did not address all three, and as you point out, there is no reliability and no safety without addressing all three as well.

Mr. Cooperman: Yeah, as soon as you forget one of those three, you’re leading yourself down a path to disaster, in my opinion, and I think a lot of folks, as you mentioned, would probably agree with that. We need to address all three; otherwise our chances of success are going to be pretty low.

Dr. Levin: Yeah. The other thing that I find really interesting about the way you and your colleagues talk about these subjects is that you talk about their impact on all the human beings involved. Clearly in health care our primary mission is first, do no harm, and that’s about not harming patients. We have to take that very, very seriously, of course. But what I also find appealing about this topic is that it’s also about how we care for each other: how we care for the health care providers, for the IT workers, for everyone engaged in what I consider to be a very sacred task, which is healing the community. Talk a little bit more about this aspect of it, its impact on people in general.

Mr. Cooperman: Sure. So we talk about safety, and safety first, and safety for all. We don’t come in and simply talk about making sure our care for our patients is safe; we talk about safety in general being first in our decision making. We think about this sometimes in our daily lives without even realizing it: when you get on the airplane, they tell you to put the mask on first in an emergency, before you help the person sitting next to you. If we’re not safe, we can’t provide safe care. So we need to worry about how safe we are as well, and many of these principles and practices help us stay safe. Now, health care is a very dangerous field to be working in, especially when you start looking at the numbers around it. It is very dangerous for us as employees, and some of that is very obvious to us. When I’m out there with our clients talking to different individuals in these organizations, I hear it most loudly from our emergency department folks, about the patient population coming in and how dangerous it is for them sometimes. But these principles and processes and HRO tactics are about how safe we are as well: how do we experience safety in our work every single day?

Dr. Levin: So you mentioned earlier in this discussion the electronic health record specifically, and I think that bears exploring a little bit deeper. I imagine you agree with me that the widespread deployment of these systems, if you will the first phase of digitizing health care, has been largely successful, at least in terms of getting people to put down pen and paper and start using keyboards. But it has also introduced significant complexity into the sociotechnical environment. Tell us a little bit about some of the basic things you’ve observed, and what you see as hope for the future, if you will.

Mr. Cooperman: That’s a great question. I think what I’ve seen and observed over the past decade is that we did exactly as you mentioned: we decided it was time to put down the pens and the paper and use technology in health care. That was really stimulated by the government providing some funds for this and basically saying to hospitals, we’re going to cut some of your reimbursements unless you hit some of the goals that we’ve put in place; a carrot-and-stick type deal. But we haven’t focused on, say, the human factors component of health care technology. It’s out there, it’s working for us. But is it working really well for us? Well, that’s obviously a topic of debate. I think, Dave, you and I are probably on the same page, just a hunch here, that we’re experiencing a lot of pain in health care, metaphorically speaking, with the technology; it’s not always working that well for us. It’s doing some of those basic functions, as you mentioned. It’s taken away the pen and the paper; it’s digitized what we’re doing. But have we really reinvented or transformed the way we deliver care with technology involved in it? I would argue that we have not. And that’s what I’d like to see in the future. My vision for the future is that we start integrating human factors, that we really start making this technology work for us to truly transform how health care is delivered, and, as you mentioned, how safe things are for us as health care employees.

Dr. Levin: Unfortunately, I do have to agree with you. I think we had to go through this first step of basic digitization, and I certainly hoped it would go faster and farther than it has. And I think it is not just, you know, two guys with an opinion. There is an emerging base of data to support this, whether we look at clinician reports of burnout related to these systems, or the slow dribble of actual studies around their impact, or just the fact that a lot of these basic ideas of human factors design and workflow design were either ignored or given short shrift in the rush to get these systems deployed. And now we’re going to have to go back and redo a lot of that. I think it presents a great opportunity, but we’re trying to cross this chasm right now, and that’s very painful, and I think difficult for most of our end users. It hasn’t delivered as much as we hoped it would in terms of benefit for patients or in terms of efficient operations.

Mr. Cooperman: Yeah, I think one thing that we did do over the past 10 years is look at the way health care operated with all of our analog tools and mimic that with a digital tool. Instead of saying, here’s how it works from an analog perspective, how can we make this better from a digital perspective, we simply mirrored it with a digital tool. And now we’re still experiencing some of the same pains that we had with analog, but we’re kind of blaming them on the EMR or other technology, because that’s what’s in place now. We didn’t actually transform our processes or how we delivered care. I think that’s what comes next, and I think that’s what people are hoping for. I think that’s what some of the data, as you mentioned, is supporting: we really need that transformation of how care is delivered with our digital tools.

Dr. Levin: Well, we do, and I also take some hope from just looking at the sea of technology around us in other industries. You know, as I often say, every day I work in health care and then I go home and live in the 21st century. We know it is possible to design systems that are efficient and elegant and safer and support new ways of doing work, because we see it in virtually every other industry on a daily basis.

Mr. Cooperman: And I think we’re seeing some of that bleed into health care, on the positive end of it. Not to be the negative Nancy of the podcast, but take things like our texting technology. You and I in our personal lives can reach friends and family quite easily via text message on our mobile devices, and it’s a great convenience of life. Some of that’s bleeding into health care: we see companies introducing tools that are HIPAA compliant, where providers can use their cell phones and communicate via text message. I see that as tools bleeding into health care, but it’s still going to take some time for it to really mature.

Dr. Levin: Well, I certainly agree with you. You can see it happening; I wish it would happen faster. And I would still argue that we don’t yet have an appreciation for the role that culture plays in high performance in health care in general, and also specifically when it comes to things like reliability and safety. You know, I want to touch on one other general topic here, which is measurement, because an easy dig at this kind of work is, well, that all sounds kind of soft. How do you measure culture? How do you measure safety? And it’s a good question. I imagine you share the belief that I do, which is that if you’re going to manage something, you do need to be able to measure it. It’s your compass and your speedometer and a couple of other things as well. I know this is a really deep topic; we could probably spend a whole podcast on it. But give us a couple of highlights about how you measure safety in an organization.

Mr. Cooperman: Sure. One simple way we go about doing that is to implement at an organization something like the Safety Event Classification system, which is HPI’s system. We look at events of harm that have occurred and we ask ourselves, did we deviate from generally accepted performance standards? If we did, if we deviated in some way from best practice, then we call that a safety event, and if we harmed a patient at a level that we would consider moderate to severe harm or death, we call that a serious safety event. From that we can determine a serious safety event rate, which is the answer to your question, how do we measure ourselves? Well, what is our serious safety event rate? That’s our measurement tool. Now, if you want to dig further, as you mentioned, we could probably spend a whole podcast, or maybe a whole series of podcasts, on how we dig deeper into the how and the why these events occurred. We have tools and ways of looking at these events; we have cause analysis to determine what some of our people causes were and what some of our system causes were. That’s where I think we see some of our technology come into play, in our system causes: how did that system influence Dave and Ben to experience some type of error in their work? I think that’s how we do some measurement, and that’s how we do at least some initial cause analysis to determine how technology plays into this.

Dr. Levin: And of course you can develop a sort of balanced scorecard, and you can also look at things like leading indicators, like compliance with checklists and things like that. I’m sort of making some of this up, but there’s this mix of leading and lagging indicators that feed into the ultimate serious safety event rate as well.

Mr. Cooperman: That’s correct. We look at three different types of indicators. We’ll look at leading indicators; that’s something that’s going to tell us about the future. We look at real-time indicators from time to time; that’s what’s going on at the moment. And then something like a serious safety event rate is a lagging indicator; that tells us what happened in the past and how we can improve in the future. So those are the three different types of indicators.
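
As a rough numerical illustration of the lagging indicator just described: a serious safety event rate is an event count normalized by a volume measure over a trailing window. The sketch below assumes a per-10,000-adjusted-patient-days denominator and a twelve-month window, which are common conventions in safety reporting but are assumptions here, not HPI's published specification.

```python
from datetime import date

def serious_safety_event_rate(event_dates, adjusted_patient_days,
                              window_start, window_end, per=10_000):
    """Lagging indicator: serious safety events per `per` adjusted
    patient days within [window_start, window_end]. The denominator
    and window length are assumed conventions, not HPI's exact spec."""
    in_window = [d for d in event_dates if window_start <= d <= window_end]
    return len(in_window) / adjusted_patient_days * per

if __name__ == "__main__":
    events = [date(2019, 1, 14), date(2019, 3, 2), date(2019, 6, 21)]
    rate = serious_safety_event_rate(
        events,
        adjusted_patient_days=250_000,        # illustrative annual volume
        window_start=date(2018, 8, 1),
        window_end=date(2019, 7, 31),
    )
    print(f"SSER: {rate:.2f} per 10,000 adjusted patient days")  # 0.12
```

Because the measure lags, a falling rate confirms improvement only after the fact, which is exactly why it is paired with the leading and real-time indicators mentioned above.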

Dr. Levin: Well, I’ve had the distinct pleasure, I guess is the word, of reading very deeply into the new proposed rules from the Office of the National Coordinator around the exchange of health information. One of the aspects that I found genuinely fascinating, but that I don’t think has gotten the attention it deserves, is the set of proposed rules that relate to safety. I want to specifically call these out, because they relate to the point you were just making: in order to develop these measures you have to collect information about performance, you have to be able to categorize it, and you have to be able to carry out meaningful analysis to get a deeper understanding. This is a core capability for creating a safer system, correct?

Mr. Cooperman: I would agree. Looking at the data, measuring how we’re doing, is a core capability of some of the systems that we have in place. It makes it easy to see what we’re doing; it makes it easy to collect and analyze what we’re doing. I’d say that’s a core capability of our systems. I would 100 percent agree with that.

Dr. Levin: And so to tie this back to the proposed rules: they very explicitly address some things that have been barriers, or perceived barriers at any rate. Things like sharing screenshots from your EMR, or sharing or publishing performance information, whether it’s economic or safety or quality or otherwise. The situation, real or perceived, and we could debate that, is that some vendors, some EHR vendors in particular, have rather vigorously policed the sharing of screenshots, even for these kinds of purposes. They are concerned about protecting, I guess, intellectual property or trade secrets of some sort, and some of the license agreements for this technology include effectively gag clauses: things that make it at least difficult, or cause the licensee to think more than once about whether they even want to share this kind of information. I don’t know that there’s published literature on this, but I can say with confidence, having spoken to many colleagues around the country, that they’ve encountered situations where they have collected some sort of performance data but their organization has been unwilling to share it publicly for fear of violating a contractual agreement with a vendor. To me, this is like job one in making the system safer. We need a method for collecting this information and, you know, sorting it, categorizing it, labeling it, analyzing it. Again, this was sort of safety 101 that you and your colleagues drilled into me back in the early days of patient safety: if we don’t measure it, then we really don’t have a basis for understanding what’s happening.

Mr. Cooperman: Yeah, I think you hit on a couple of things there, and I’m not sure that we have data about it. I’ve certainly been out on the internet like you have, Dave, and I’ve heard some of the commentary on the message boards about the gag clauses: can one say this, can we not say that? But one of the first steps that we talk about is that we need to make the harm visible. Whether or not we measure it, which we should, I 100 percent agree, we need it to be visible. Maybe our measurement is one way to make it visible. But we need to know about it before we can fix it. And if our rules and our processes and our protocols and the laws say that we can’t talk about it, we can’t share it with each other and we can’t see it, then how in the world are we going to be able to fix any of it? We need to see it to fix it.

Dr. Levin: Great. I want to raise one more issue with you, which is the cost of all of this. Because, again, I’ve tried to put myself in the place of the skeptic, and the skeptic might say, well, this all sounds great in theory, but it also sounds like it would take a lot of time and be really expensive. I mean, at some point we’ve got to get on with it. We’ve got patients to take care of, we have a business to operate, and the safety stuff sounds kind of expensive. How do you address those kinds of concerns?

Mr. Cooperman: We typically talk about HRO as a [unclear] in our work; HRO is a great enabler of many different aspects of health care, and one of them is the financial aspect. If we are not safe in our work, I’m not really sure how we’re going to be all that successful from a financial aspect. If we’re having all of these mistakes and errors in our work, it’s certainly going to impact our bottom line. And then another perspective is that we’re also not purchasing a whole lot of equipment or different types of materials. A lot of HRO is simply how we operate; many times we’re just changing our mindset about how we operate, and I would say that’s an expense we can’t really spare. We need to be safe first, and if we’re operating in a safe manner, it’s going to enable us in many different aspects, one of those being the financial aspect.

Dr. Levin: I think this is really helpful. Again, I’d say we step back and view these things as investments that pay off over time, both very directly in terms of their impact on patients, but also in their impact on the reputation of the organization, on the quality of the workforce that you’re able to attract and retain, and on that workforce’s satisfaction and performance. To me these things are all embedded in this work. Not that you can do it endlessly, we do have to have some guardrails here, but properly managed, this is an investment that pays dividends over time.

Mr. Cooperman: Absolutely. There are a lot of indirect benefits, from our perspective.

Dr. Levin: So Ben you and HPI have your hands on a lot of different things. What’s the most important or interesting thing that you’re working on right now?

Mr. Cooperman: Well, I’d say the most interesting thing that I’m working on right now is maybe not one thing; it’s the fact that the work we do touches the entire health care system. We get the opportunity to work with clients big and small across health care in the United States. We go to every kind of facility you can think of, from the small critical access facilities out in the middle of Kansas to the large facilities, the large systems, the corporate home offices in any major city you can think of, maybe Denver or Dallas or even New York City. I’d say that’s the most interesting part of our work, the most interesting thing we’re working on these days: being able to work with the entire spectrum of health care in the United States.

Dr. Levin: All right. So for this next question, I want to remind you that the show is PG-13, so keep it family friendly. You’re a pretty upbeat kind of guy, but I’m really curious: what’s your favorite pet peeve or rant these days?

Mr. Cooperman: I’m glad you asked that question, because I do have a pet peeve that I’d like to talk about, and that is our perspective on who owns health care technology, Dave. I had the opportunity to speak to an audience recently, I’m not going to name who it was, and I played a very fun game. At least I thought it was fun; the audience maybe had a different opinion. It was a word association game, Dave. Basically, I put a tool on the screen and asked the audience to tell me what profession is associated with that tool. I gave some pretty softball questions, like a hammer; the audience unanimously said construction worker. I put an airplane on the screen, and again everybody unanimously said pilot. And then I put the names of some health care technology companies on the screen, think Epic, Cerner, McKesson, companies like that, and the profession that everybody threw out as owning those was, I’m paraphrasing here, the IT people. Right? And my biggest pet peeve, my stance, is that a tool like that, whether it’s our EMR or our barcode scanner, is not owned by IT. It’s owned by all of us. It’s a tool that we use to get our job done. So, as we mentioned before, it’s kind of a team effort here; no single group owns it. It’s not just the IT people that own it; we all own it. That’s my pet peeve. I’d like to change that thinking, so that we start thinking about these health care technology tools as our tools, not the IT people’s tools.

Dr. Levin: So I share that pet peeve. I could go off on quite a rant about this myself, but the insight that I would add, having had the pleasure and experience of working with a lot of different health systems around information technology, is that I feel like I can walk into an organization and, in a matter of minutes, tell: is this an organization where IT is a team sport and people are doing IT together, or is this an organization where IT is being done to people? And sadly, the latter tends to predominate. And it’s not as fun, it’s not as productive, it’s not as safe; it’s not any of the things that it can be when it’s being done to people. So, as a sort of companion or bookend to your pet peeve there, I couldn’t agree more. And I think it’s really obvious in most organizations whether that’s happening or not.

Mr. Cooperman: Yeah, I think you’re right there. Sometimes, when it comes to culture and we’re trying to do some measurement, as we talked about, we can just do a gut check when we walk into that organization: what does it feel like? Is IT being done to people, or are we approaching this as a team sport? I like your words there.

Dr. Levin: So for our last question today: you’ve already offered us a lot of sage advice, but what’s your most sage advice?

Mr. Cooperman: Well, my most sage advice is patience, Dave. HRO, and looking at HRO principles and practices, is a journey, Dave. This is not something you can go out and purchase and install. Culture is not something you can go out and purchase and install. When we’re talking about health care IT, you and I, Dave, can go out to one of those vendors, whether it’s Epic or Cerner or any other vendor we talk about, and we can buy a system. We can buy a tool and we can just install it. That’s what we talked about with IT being done to us, as you mentioned. But culture is very, very different. You can’t just purchase it and plug it in and expect to move forward, and it takes a lot of time to shape a culture. So my sage advice is patience: keep putting one foot in front of the other, and over time your culture will look the way you hoped it would.

Dr. Levin: Well, again, I couldn’t agree more. Listeners to the podcast know I’m a bit of an extremist on this subject of culture, and I own that publicly. I’ve come to believe culture is the work, and pretty much everything else is a byproduct of that. So not only do I agree with you about being patient, because it’s hard, it’s also tremendously rewarding in many, many different ways. It’s rewarding to the organization that makes the investment, because they’re more likely to achieve their goals, and it’s rewarding to all the people involved, because it’s frankly a more humane, and a more productive, way for all of us to work together. So your admonition, your advice for us to be patient, is excellent.

Mr. Cooperman: Good things come to those who wait Dave.

Dr. Levin: That’s right. And persistence. The good news is there’s a whole body of work out there now; the patient safety movement is not new, and there’s much to be gained by studying that body of work. And there are wonderful opportunities to apply it as we go forward.

Mr. Cooperman: I agree. I agree.

Dr. Levin: We’ve been talking with Ben Cooperman, senior associate at Healthcare Performance Improvement, or HPI. Ben, thanks for joining us today.

Mr. Cooperman: Thanks for having me Dave.

Dr. Levin: You’ve been listening to 4x4 Health, sponsored by Datica. Datica: Bringing healthcare to the cloud. Check them out at www.datica.com. I hope you’ll join us next time for another 4x4 discussion with health care innovators. Until then, I’m your host, Dr. Dave Levin. Thanks for listening.

Today's Guest

Ben Cooperman

Senior Associate at Healthcare Performance Improvement (HPI), Press Ganey

Ben has over 10 years of experience working in healthcare, specializing in healthcare technology and safety. Ben has worked on engagements with many notable healthcare organizations such as MD Anderson Cancer Center and the Mayo Clinic. Prior to working for HPI and Press Ganey consulting, Ben was a Senior Associate with PricewaterhouseCoopers Advisory Services in the Data and Analytics Strategy practice.

HPI is a consulting group that specializes in improving human performance in complex systems using evidence-based methods from high reliability organizations.

Our Interviewer

Dave Levin, MD

Chief Medical Officer

David Levin, MD is a physician executive with over 25 years of experience in healthcare information systems, clinical operations and enterprise strategic planning.