Webinar Transcript

This transcript is intended to provide webinar content in an alternate format to aid accessibility. We
apologize for any inaudible or unclear content as a result of audio quality.
Introduction To Creating Just Culture
Presented by [Marie Dotseth] (53-minute Webinar) [date e.g., 02-12-2014]
Kristi Wergin: Hello everyone, this is Kristi Wergin with Stratis Health, the Quality Improvement
Organization in Minnesota. I’d like to welcome you to this educational session entitled
Introduction to Creating a Just Culture.
Nursing homes in the United States will soon be required to develop quality assurance
and performance improvement (QAPI) programs. QAPI will take many nursing homes into a new
realm in quality: a systematic, comprehensive, data-driven, proactive approach to
performance management and improvement.
One important component of QAPI is that the governing body ensures that while staff are
held accountable, there exists an atmosphere in which staff are encouraged to identify
and report quality problems, as well as opportunities for improvement. Leadership should
be working on creating a climate of open communication and respect, where caregivers
feel free to bring quality concerns forward without fear of punishment. This is sometimes
referred to as a ‘just culture’.
We’re pleased to have Marie Dotseth as our guest speaker this afternoon, to help us get
a better understanding of what a just culture is and how we can work to create it. Marie
is the Executive Director for the Minnesota Alliance for Patient Safety (MAPS). This is
Minnesota’s public and private patient safety coalition. She’s been in this role since July
2012. Marie was instrumental in helping organize MAPS in 2000, while working for the
Minnesota Commissioner of Health at the State Health Department, and she’s been
active in various MAPS committees and work groups throughout its history.
Welcome Marie, and thank you for being with us today. Let me turn the presentation
over to you.
Marie Dotseth: Thank you very much for inviting me to do this. I’m looking forward to giving the
presentation and answering any questions you may have about just culture. It’s an
interesting topic area and it’s challenging, but also very rewarding.
I’m going to talk to you briefly about MAPS and our role in this whole thing, then I’ll talk a
little about why a just culture matters, and then we’ll go through an overview. It’s high level and
won’t be a comprehensive presentation. One of the things Kristi didn’t mention is that
prior to my work at MAPS I did consulting and worked with an organization called
Outcome Engineering. I used to teach just culture across the country at different
healthcare organizations, including long-term care organizations and this was a two-day
introductory course.
Therefore, we’re only skimming the surface here, but hopefully I can leave you with
enough to get started and point you in the direction of some other resources.
MAPS: some of you are familiar with this, and I wish we were all here in one room so I
could get a better sense of who knows about this. We were established in 2000 by the
Health Department, the Medical Association, and the Hospital Association.
Stratis Health | 952–854-3306 | www.stratishealth.org
Page 1
A few years ago the group decided on a bold new strategy to redefine itself. Five
organizations, including Stratis Health, came together to commit significant time and
resources to essentially recreate MAPS: hiring staff, myself and a project manager, and
then more actively recruiting members from across the community. Part of that active
recruitment was going out and seeking long-term care organizations; Care Providers of
Minnesota and Aging Services of Minnesota now have seats on the MAPS board, and
we’re now a 501(c)(3) organization. That’s just a couple of words about MAPS.
Now I’ll try to answer the question of why you should care about a just culture.
The simplest reason is that QAPI requires it, as Kristi just mentioned. From the overview,
leaders need to create an environment where caregivers feel free to bring quality
concerns forward without fear of punishment. In the self-assessment for QAPI, the question
is: has your organization established a culture in which caregivers are held accountable
for their performance, but are not punished for errors and do not fear retaliation for
reporting quality concerns?
There is mention throughout the QAPI materials about the need to create a culture that
has quality assurance and performance improvement built into it. So, one reason to do
this is because it’s required by the new regulations. Another reason is that we like the
concept of having a place to work and being an employer that is fair and just. We
appreciate those values and who wouldn’t want to be fair, who wouldn’t want to treat
folks in a fair way? I would argue, the reason we would want to work in an environment
that was a just culture environment is because that’s the place where we can achieve
constant learning and improvement.
It’s because we open ourselves up to learn about the risks in our environment. We won’t
be able to understand what’s going on if every time someone raises their hand with a
question or concern they get punished. We want people to speak up to inform our risk
model, that is, to give us a better understanding of the risks around us. We aren’t going
to get where we need to go by simply reacting to errors after the fact. We really need to
figure out what is going on with our humans and the systems they’re working in, before
an adverse event occurs.
So this framework is a way for us to understand things before the bad things happen and
to really redesign the way we provide the service, both the human and technical systems
that support those outcomes. To circle back: the first reason you’d want to implement a
just culture is the reason QAPI is requiring it in the first place.
We all know that our adverse events, the things that result in a bad outcome for
residents, patients, and consumers, are just the tip of the iceberg.
- There are also other things we sometimes call ‘near misses’. They are a lot more
frequent, don’t result in any harm, and sit at the tip of the pyramid.
- Below that, we have the adverse events and human errors we have to live with. Those
are outcomes of whatever system we designed to get those results.
- Below that we have systems, for example, medication administration systems, dietary
systems, or lifting systems, whatever they are in order to accomplish a task.
- Then there are the choices and behaviors of our managers and staff, which combined
give us the adverse events and errors.
What we’re hoping to achieve with a just culture is to get that feedback loop going, so we
base everything on a culture that learns and that is just. A just culture really is all about
supporting that learning culture; it focuses on the management of the system and of
behavioral choices, and hopefully provides an objective and fair response to events that
will ultimately happen.
Back in 2000, when the Institute of Medicine produced its report To Err is Human, there
was a lot of confusion about patient safety and resident safety. There was a leader who
stepped forward with testimony to Congress over a decade ago. No physician, nurse,
nursing assistant, or administrator wants to hurt our patients or residents. Everyone we
work with is highly trained. They hold themselves to very high standards, but
paradoxically it’s that focus on individual responsibility, and the expectation to never
make mistakes that we reinforce with punishment, that makes healthcare so terribly unsafe.
The person who said that is Dr. Lucian Leape, and in his testimony before Congress, he
said the single greatest impediment to error prevention in the healthcare industry is that
we punish people for making mistakes. Isn’t it ironic that we punish people for making
mistakes and that’s actually impeding us from having safer healthcare? What he meant
is that by having a blame/shame culture, we squash our ability to learn and prevent the
next bad thing from happening.
With that as background, I’m going to talk about a scenario. Think about this. An
experienced nurse manager sees a piece of resident lifting equipment at a conference.
Back at the nursing home, the sales rep persuades the manager to use the equipment for
a lift. The manager has never used the equipment before and accidentally drops a
resident while positioning them. Other staff are present during this incident. The nursing
home has a policy that says new equipment will be officially approved and training will
be conducted prior to its use.
So, there are a couple things I want to point out about this scenario. First is that on the
far right-hand side there is a box that says increased risk of resident harm. Notice in
the scenario I didn’t say whether or not the resident got hurt seriously or otherwise, or
whether there was no harmful outcome, because in a just culture model we try to
approach things and the risk without regard to outcome. We try not to have an outcome
bias.
If you were doing an event investigation, we have a couple things going on here. The
error that happened was that the resident was dropped. That was something other than
what was intended. We also had a manager using equipment without any training. Why?
Usually we don’t go back far enough and ask why. We also had something else going
on, in that the staff didn’t speak up and stop the action of the manager.
Therefore, where we find ourselves is that a lot of what’s going on goes unexplained and
part of the just culture model is to be able to explain more of what’s going on and to
understand more of what’s giving us the results that we’re getting. If you’re thinking
about the best system you can design to support safety, you want it somewhere along
the line between blame-free and punitive. There are a couple of myths that I hope I can
dispel as I go through this, and one of them is the idea that a just culture is about being
blame-free.
In some of the QAPI materials they talk about a blame-free reporting culture and while
reporting is blame-free, having a blame-free culture isn’t the same as having a just
culture. A just culture is an accountable culture that requires accountability at all levels,
but we don’t blame the human error and you certainly wouldn’t blame reporting. So,
we’re looking for that place between punitive and blame-free where we can get a
maximum support for patient and/or resident safety.
What are the cornerstones of a just culture? We create an open, fair, and accountable
culture, so systems and humans share the accountability, and all members are held to
the same standards. We create a learning culture; I talked about that when discussing why
we would want a just culture. We spend a lot of time in a just culture looking at safe
system design: thinking about the systems we have designed, the unintended
consequences and work-arounds that are created by our policies and procedures, and
better ways to design those systems. And a just culture focuses on managing the
behavioral choices of the human component of the system.
Just culture is all about shared accountability. It’s not blame-free or overly punitive; it’s
about shared accountability, and again the things we can control are system design and
the management of human behavior, which are the things we work on. We have to base
it on a just culture, to help us learn. Another way to look at it: you start on the left with
your value and expectation. Today we’re talking about quality and safety, but this would
work for other values that an organization has.
For example, you could say we have expectations around customer satisfaction. There
really are only two things we can do to affect our errors and outcomes, which we
otherwise have to live with: we can design a system and manage behavioral choices, and
those two combined will give us our errors and outcomes.
Now I’ll talk about safe system design and managing behavior, giving you some of the
key components of a just culture in each of those areas. Those two things are key when
talking about just culture.
Managing behavior – we have an example of sub-optimal behavior from Britney Spears
back in the day, when she was driving with a small child on her lap. That’s obviously not
good behavior; it falls under reckless behavior, and we’ll talk about the different
categories soon. We can expect these three human behaviors to occur.
1. Human error – inadvertently doing other than what should have been done; we
call it a slip, lapse, or mistake.
2. At-risk behavior – defined as a behavioral choice that increases risk, where the
risk is not recognized or is mistakenly believed to be justified.
I like using driving examples because driving is outside our healthcare world but is
something we all do daily and are familiar with. An at-risk behavior is driving 5-7 miles
over the speed limit. We’re making the choice to do it. We think it’s okay, and most of the
time it is, except on a day when there is black ice on the road or when suddenly we’re in
a residential area and didn’t expect to be.
3. Reckless behavior – that is a behavioral choice to consciously disregard a
substantial and unjustifiable risk.
Again, an example of that would be drunk driving, or driving not 5-10 miles over the
speed limit but 75 miles over, like when you see people weaving in and out of traffic at a
clearly dangerous speed.
So those are the three kinds of behaviors, and again, if we were in a room together I
would ask for your assessment of which type of behavior we see most often in work or
other settings. I will tell you, pure human error is predictable but fairly uncommon.
Reckless behavior is also fairly uncommon. The one that’s tricky, and is our most
common behavior, is at-risk behavior. You’ll find it most often when you do event
investigations: folks have figured out some type of work-around or are cutting corners,
and that’s where you’ll have your trickiest behavior.
For human error, in the example there’s a forklift carrying a missile, and I believe the
way the story goes is that it was an armed missile, that was driven right off the end of a
loading dock. That is clearly inadvertently doing something other than what should have
been done, and it’s often a product of our current system design. Another example of
human error is where a car drove away with the gas nozzle still attached.
In my experience and that of my colleagues, this actually turns out to be about a one in
one hundred event, so if there were 100 of us in a room, one of us could or should raise
their hand saying that’s happened to them.
There are system designs involved here as well. In the next example the gas nozzle is
on the side opposite the driver, so one safe system design is to turn it around so the gas
tank is actually on the driver’s side, although in my training I’ve encountered folks who
have stepped over the hose to get into the car and driven away with the nozzle still
attached.
The other thing, because this is a frequent human error, is that nowadays there’s an
automatic shutoff at the gas pump, so if the hose is disconnected gas isn’t spewing out
all over the place. Safe system design has led to auto shutoff.
Every human task will have some rate of human error. It cannot be zero. If we design a
system that requires humans to be perfect, it will fail. Normal human reliability is on the
order of one error in one thousand. The next example demonstrates that certain kinds of
skills have observed rates of actual human error. Again, we design a system with human
beings in it, and at some point that system will fail at these rates.
Then, of course, in our environment we find our most difficult errors coming when there
are many things going on: stressful situations, loud, busy, and low-light conditions, all
things which increase the rate of human error. So for human error, you ask whether the
employee was making the correct behavioral choices and managing their own personal
performance-shaping factors, things like getting enough sleep. If they were, then the
only answer is to console them, to say we are sorry this event happened to you. It’s
tough to do.
Policies are not set up to console human error, and it’s especially daunting because we
do this evaluation regardless of the outcome. I know of circumstances where there was
a bad outcome and it was difficult to console, but it’s the right response in a just culture
model.
Let’s talk about reckless behavior now. This is the type that is managed through
disciplinary action or punishment.
At-risk behavior is trickier. Think of cutting corners on a policy: the nurse carrying a
couple of medications for a couple of patients in her pocket, or driving five miles over the
speed limit. Maybe the nurse example sounds like something reckless, but usually folks
are trying to speed up and make their jobs more efficient, so they end up taking risks
without recognizing them or while mistakenly believing the risk is justified.
The thing about human beings is that we drift. The next example is Steve Irwin, who
became desensitized to working around big, dangerous animals, like the time he was
holding his newborn son while working with a crocodile. We stop seeing the things that
are risky around us, begin to not perceive the risk as it is, and we drift. That’s why at-risk
behavior is particularly difficult to manage.
At-risk behaviors are managed by raising the perception of risk, increasing situational
awareness, and through coaching, so people are coached back into a safe place. Like
human error, at-risk behavior happens at fairly predictable rates, and depending on the
systems folks find themselves in, they will drift at predictable rates.
If you’re familiar with just culture you’ll understand the next example. In just culture we
have the three behaviors… human error, at-risk and reckless… and some tips on how
they’re managed. In summary, we console the human error, coach the at-risk and punish
the reckless. Again, there’s more to this but in the overview these are the three
behaviors and how they’re managed.
I’m going to move on now to talk about system design to give you a high level overview.
There are two components to a system.
1. Human
2. Technical
If we’re talking about human reliability, we have an example representing how we can
move from poor to good reliability, but we never reach 100%: we can make humans
more reliable, but we can’t make them perfect. The things we can do to make humans
more reliable are listed in the material, and you’re probably familiar with several of them,
but if you want to systematically figure out how to improve the reliability of the human
component of your system, those are some of the factors.
Similarly, for technical systems, we can control the rate of reliability: the rate at which the
system succeeds and the rate at which it fails. We can manage that through some of the
factors previously listed. Things like barriers, recovery, and redundancy are what we
would put into place to make our systems a lot more reliable.
The thing about system reliability is that all the things you can do to make your systems
more reliable come at a cost, which is often financial but not only financial; sometimes
the system we design comes at a cost to other values. For example, one that you could
think of in a healthcare setting is patient or resident comfort and satisfaction, which
sometimes competes with safety.
Therefore, if you give medications in the middle of the night or in low-light situations, the
residents and patients appreciate it when you aren’t turning on the light or making noise,
but a safe system design would say we need to be able to clearly see the medicines
we’re administering. It’s easy post-event to say we could have done all these things to
make our systems safer, but realizing that we live in a real world with constraints, again
often financial as well as others, we have to put safety in the middle of a constrained
system.
Again, systems can be designed to be reliable, but perfection isn’t possible. We know we
have human components and equipment and technical components that are flawed, and
we have to figure out what level of reliability we want. Commercial aviation is one
example. It’s a bit easier to think about reliability in an engineering-type system; it’s more
difficult when you’re thinking of systems that involve outcomes with people. We can
design systems for certain reliabilities.
Commercial aviation is designed to be safe to about one in six million. So we have
systems designed toward one-in-a-billion reliability, and even then they’re not perfect.
We’ve heard about this notion of six sigma, how many defects per million; six sigma is
about three defects per million, and I will tell you that there is probably almost nothing in
healthcare that is at a six sigma defect rate, although some things are getting close
through the science of good system design.
Current wrong-surgery rates in the U.S. are about one in thirty thousand. The current
rate of iatrogenic death in the healthcare system, in hospitals, is largely due to infections,
and it’s not a great rate. The rate of space shuttle accidents while that program was
intact was about one in sixty. You can tell, then, that we have a range of reliability, and
we can design for it in advance and keep track of it over time.
There are a lot of things you can do to manage risk. Things like barriers, recovery, and
redundancy are the higher-level design strategies to catch and prevent mistakes. I won’t
go into all of them, but note that ‘make no mistakes’ isn’t a design strategy; it’s on the list
because we often use it as our top strategy when something bad happens.
Unfortunately, oftentimes our response to an error is to tell the nurse or housekeeper or
dietician, don’t do that again. That doesn’t get us far in either managing the human
component or redesigning the system to get different results.
One of the things that’s a component of a just culture is doing a good event
investigation, because before you can evaluate either the system or human behavior,
you really need to understand what happened. I would say that in my work with
organizations implementing just culture, what happens oftentimes is that they get
excited to look into their systems and evaluate human behavior, but they don’t have all
the input information they need.
I know you all have done work with Stratis and others on root cause analysis and similar
approaches to do better event investigations. I can’t emphasize enough how important it
is to have a good event investigation system as part of your just culture work and
improvements. You’ve probably been to day-long root cause analysis training or other
event investigation training, but I will say that this framework has been helpful when
we’ve worked through events in trial situations where we mock up an event.
It’s important to understand what happened, but not just that; you have to ask yourself
what normally happens, and then ask what the procedure requires. I will tell you that in
the organizations I’ve worked with, the tendency is to say that what normally happens is
the same as what the procedure requires. However, if you do a good event investigation,
you can and often will find that what normally happens is the drift: the thing we did
because the procedure won’t allow us to actually deliver all the meals on time or to make
sure that the medicines are administered at their proper times.
So sometimes we write policies and procedures that everybody who works in that setting
says we never follow that way, because if you did, you’d never get the work done or
done right. I remember an organization that discovered it had a policy that the
medication cart was supposed to be in the room with the patient when delivering
medicines, except the medicine carts they ordered were too big to fit through the door,
so of course staff figured out a work-around.
Again, there is a lot of value in understanding the difference between what normally
happens and what your procedures require, and then asking yourself why the event
happened, because that can be quite a different thing from what actually happened. It
means really understanding what kind of behavior was at play and what kind of failure
happened in our system. The other question is: how were we managing the risk before
the event? There’s often a tendency after an event to throw out all these things: we
should do this, we have to do this, we have to do that, etc.
But it could very well be that, because no system is perfect, we were managing the risk
the best we could at the time with the constrained resources we had. The question of
how we were managing it is actually more instructive than it looks on the surface,
because you take the chance as a team to treat it as a good question: what was going
on here, how were we minimizing the risk of a fall or minimizing the risk of a pressure
ulcer?
Everyone on this call probably has great systems in place in regard to those two things,
but think about some other events where it might not be as obvious how the risk of some
adverse event was being managed. Event investigation is critically important, and I’m not
giving it the attention it deserves here, but it’s an important part of a just culture.
What is this all about and what’s different? What’s different is creating a proactive
learning culture. Because if you don’t, with our adverse events, our institutions and
organizations are playing a game of Whack-a-Mole. We’re just waiting for that next thing
to happen, and then we’re going to bring the hammer down. I’m sure we all feel this is
also how the regulatory response works: they just wait around for the next thing to
happen. But I will say that I think QAPI is an opportunity to begin to change that
framework, so it’s not just about coming down hard when bad things happen; it begins to
be about looking at the systems, the culture, and the learning that goes on within the
organization to manage events before they happen.
A just culture is really about opportunities to inform our risk model, to think about
system risk and behavioral risk, and to understand what’s key. Our management
decisions are based upon where limited resources, financial and otherwise, can be
applied to minimize the risk of harm, knowing our system is comprised of sometimes
faulty equipment, imperfect processes, and fallible human beings. None of that lets us off
the hook or says we’re not accountable for outcomes, because we’re always
accountable for those. But it asks us to learn as much as we can about the risk, use our
constrained resources to minimize the risks to the extent we can, and make sure we
track and learn from our outcomes and redesign our systems.
So, it’s a continuous learning process which is what QAPI is asking of us.
What about the regulator? I have also done just culture work with a lot of regulatory
organizations and, believe it or not, they also want to improve their outcomes. I think
they understand the Whack-a-Mole game that’s often played, or at least some do, and
the value of shifting focus from outcomes and errors to system design and behavioral
choices. Those are more important than the outcomes and errors.
Organizations produce the outcomes, and to do that they have to design good systems
and help employees make good choices; then individuals, as components of those
systems, have to make good behavioral choices within the system. So with this
accountability, no one is off the hook. Organizations are accountable for designing safe
systems and managing the behavior of their employees, and individuals are accountable
for their behavioral choices within the system.
In a nutshell that’s it for all of us, but if we had an ask of the regulators, we would ask
them to change their focus: hold organizations accountable for their system design and
management, and individuals accountable for their behavioral choices.
What can you do to create a just culture? Culture change truly starts from the top, so
senior leadership exposure and training is key, as is finding champions to do the hard
work. Change is hard. New things are hard. There will be a lot of resistance, but if you
have a group of champions and senior leadership buy-in, and you get the managers to
understand the concepts, then you’re well on your way.
There are safety culture surveys, which can help prioritize the needs in your own
organization. AHRQ just came out with their safety culture survey for long-term care
organizations. They’ve had one for quite a while that a number of hospitals use. The
next place you can look is your HR disciplinary policies to see if they punish human error
and at-risk behavior. Remember, the appropriate response to human error is to console
and you rarely see any HR policy that mentions consoling.
Looking at at-risk behavior, the appropriate response there is to coach. Now, usually in
HR policies you’ll see counseling, which isn’t the same thing as coaching. Coaching can
be an informal conversation to help people understand the risk of the behaviors they’re
engaging in. Look at your disciplinary policies also to see if there’s an outcome bias.
There isn’t a lot we can do in our current regulatory environment about the outcome
bias; when bad things happen, they need to be reported under the current structure of
things. But with your internal policies, you can learn a lot about the risks inside your
organization if you systematically try to get rid of the outcome bias. You might find there
is reckless behavior going on that, just by sheer luck, hasn’t resulted in a bad outcome.
Or there could be a system we’ve designed, maybe in regards to dietary, where we’re
just lucky nothing has happened to this point. It’s not that you necessarily have the right
structural components; you’ve been lucky, and that’s the good side of it.
The bad side is that you could have been doing everything right and had a bad outcome,
because remember, there’s no perfect system or perfect human being.
I can’t emphasize enough the need to improve your event investigation process,
because there’s a lot of information we aren’t getting that we could learn from the events
that take place.
Clearly identify the external barriers. I know folks have said to me that just culture
doesn’t apply or can’t apply to long-term care, and I hope you see that that’s not true. I’ve
also had folks tell me there’s nothing we can do because of CMS, or the health
department, or the department of human services, etc. When you get into looking at
your own internal processes for managing human behavior and redesigning systems,
there are actually fewer barriers than you think.
It isn’t naïve to expect that you will encounter inconsistencies, situations where human
error has happened and you still have to report the individuals involved to the nursing
board or pharmacy board, because that happens. But believe it or not, a lot of those
agencies are also realizing that disciplinary action for human error isn’t going to get
them anywhere productive down the road. Again, I’m not naïve in thinking those barriers
aren’t there; I just think there are fewer than you expect.
I’ve provided you with some resources for additional information on just culture. This is a
big process. There’s the group that co-founded Outcome Engineering, and Scott Griffith
has also started his own organization, and we at MAPS have a very comprehensive
guide that I hope many of you will look at. One of the domains of our just culture road
map is justice, and we have some tools available for free. There’s also the safety culture
survey from AHRQ, which is also free.
If you go to our website you can see a map where you’ll also find some tools broken
down by care settings. There are several for long-term care and I would look also at
those identified for hospitals because many would apply across settings.
Kristi, I’m going to turn it back over to you.
Kristi Wergin: Thank you Marie.
Lori: Hi. This has been very intriguing, and this language is going to come out in the
regulations, so it’s quite valuable. I’m looking at at-risk behavior: we work for a large
company, so we aren’t able to change our policies from counseling to coaching. I
wonder if I need to come up with one-on-one education strategies. When you’re in a
union environment, you have to have documented evidence of that step. What about
when someone is coached about at-risk practice and then it continues?
Marie Dotseth: There are a number of great points in this question. You brought up some things that
are both helps and constraints: working in a large organization and how you can manage
your HR policies at a local level. I’ve seen that cut both ways, where folks say we were
able to implement just culture on this unit but couldn’t do it organization-wide, or in this
one local building, or whatever. It is difficult. I would say that you can do informal
coaching before you bring something to an official HR process.
Let’s say you notice somebody not doing a certain check they’re supposed to do.
Coaching can be informal and not taken to HR; a manager can keep a running tab of it
on their own. Again, it’s more challenging in some environments, but we’ve worked with
labor organizations, and oftentimes just culture is an area where labor and management
actually come together. It’s about being fair and accountable, which is a win-win.
The other thing is repeated at-risk behavior. There is a point with repeated at-risk
behavior where reassignment, termination, or something similar can happen; when
people are not coachable, it can lead to disciplinary action. That’s a good point to bring
up.
Lori: Thank you.
Kristi Wergin: With no further questions we’ll conclude our educational session. Thank you again Marie
for sharing this good information with us.
What you will all receive is a link to a short evaluation via email, so please provide
feedback for this session. Once you do that we’ll send you a certificate of completion.
Everyone have a great afternoon and thank you for participating.
This material was prepared by Stratis Health, the Minnesota Medicare Quality Improvement Organization, under
contract with the Centers for Medicare & Medicaid Services (CMS), an agency of the U.S. Department of Health and
Human Services. The materials do not necessarily reflect CMS policy. 10SOW-MN-C7-14-26 031014