Welcome to this video on Teamwork and the Science of Safety.
Here we're going to dig a little deeper into what we mean by teamwork, talk about the relationship between teamwork and safety, and explain why this concept has emerged as one of the critical themes in high reliability, in safety science, and in the operational world of high-risk industries.
This is a magazine ad from the mid-70s picturing Captain van Zanten of KLM, who at the time was their chief flight instructor. Captain van Zanten was also the pilot most responsible for the worst aviation disaster in recorded history, which occurred on the island of Tenerife in the Canary Islands.
In 1977, he flew his fully loaded 747 jumbo jet into another fully loaded 747 jumbo jet,
resulting in 583 people losing their lives.
This accident spurred a lot of conversation, a lot of reform, and a lot of safety initiatives across the aviation industry. Many things came out of this incident. To give a little background and context:
What happened is that a terrorist attack at the main airport for the Canary Islands diverted much of the air traffic to a smaller airport on Tenerife that was not accustomed to that type of aircraft or that volume of airplanes. All of the aircraft there were scrambling to take off for many reasons, the biggest being regulations at the time: if they did not take off within a certain window, the crews would not be able to fly for another eight hours or so.
Everyone wanted to leave, and the weather conditions were awful: there was heavy fog over the airport.
The airport was not equipped with ground radar.
The fog inhibited everyone's situational awareness of what was going on.
The air traffic controller was attempting to position the plane to be ready for takeoff. The pilot, Captain van Zanten, interpreted that instruction as clearance to take off. At that point, he pushed the throttles forward and began rolling down the runway, and in the fog, the crew could not see the other aircraft crossing the runway until it was too late to avoid crashing into it.
And what is really terrifying is that if you read the transcripts of the cockpit voice recorders for this incident, it is clear that the copilot in Captain van Zanten's plane interpreted the air traffic control command appropriately, tried to warn the pilot that they were in danger, and was shut down.
This incident really illustrated how status and hierarchy can negatively influence someone's behavior. The copilot in that plane had the correct information, information that could have avoided this disaster, but because of the social influences in that cockpit, he did not press the point, and that's terrifying.
Put yourself in that position: no one thinks that they would do that, but the evidence is clear that most of the time, most people will. They will sit down and be quiet when they're told to do so by someone in a position of high authority.
This led to many, many reforms. Again, there were many issues in this environment, and part of the response was structured communication. Now, no one says the words "take off" unless they mean one very specific thing, so there's no opportunity for people to miscommunicate.
But there was also a huge initiative on team training, which they labelled crew resource management training, to try to combat the types of pressures that underlie the behaviors we exhibit when we're in these situations.
And that has led to a lot of parallels being drawn with some of the issues in health care. It is a very different setting, obviously, but some of these same issues impact us, and clearly communication is still a challenge for us.
The Joint Commission's reviews of sentinel events, which are incidents of severe and lasting patient harm, continually show that communication failures are one of the leading causes of, or contributing factors to, those types of incidents.
A fantastic review of the Australian healthcare system found that a preventable death in that system was twice as likely to be caused by a failure of communication as by a failure of technical competency.
If you recall the difference between being an expert and being part of an expert team: we have plenty of experts, but we have few expert teams, and that's where we can introduce risk into our system.
There are similar findings in much more qualitative but thorough reviews of what happens in operating rooms, for example, where about 30% of all communications were found to be failures: communicating the wrong information, communicating information too late, or communicating information to people who cannot act on it. There are very similar findings in reviews of adverse events in critical care as well.
And in closed-claims reviews of malpractice suits, communication is the most frequently occurring kind of behavioral failure that gets raised.
So there is a lot of evidence that the same factors driving risk in aviation and other industries are with us as well.
These are issues about how we interact in social groups and in teams, issues that can introduce risk, and if we don't manage that mindfully and effectively, we're putting patients at risk.
And there is one more study that I think encapsulates a lot of this. It was published in 2014, and what they did is ask people to report how much of their work happens in a real team, which they define as two or more people with shared, valued goals.
So again, this gets at our interdependence: are we trying to achieve something together? That interdependence means I can't get to that goal without you, and you can't get there without me.
And the last piece was that we engage in collective reflective activities. So, with some regularity, we stop, we reflect on how we're managing our work together, and we talk about that with an improvement lens in mind.
So they asked people how much of their work happens in real teams versus how much happens in co-acting groups. They define co-acting groups as two or more people whose members have a common purpose, but one that's really not as specific as a goal. You know, "we're here for the patient", which is a great sentiment, but it's not exactly specific enough to help you manage your work very well.
And then the accountability focus is on the individual. If something goes wrong, we look for the person who was responsible, and we don't necessarily attend to how the group, the team, or other factors may have played a role.
What's really fascinating, and this was done in the National Health Service in the UK, with 62,000 people responding: people who reported spending more of their time in real teams witnessed fewer errors and incidents, which we would expect from the previous data. But they also experienced fewer work-related injuries and illnesses themselves, they were less likely to be victims of violence and harassment, and they were also less likely to intend to leave their current employment.
At the hospital level, higher membership in real teams was associated with lower levels of patient mortality and also with less sickness absence among staff in those organizations.
So those are pretty broad findings but they make a lot of sense when we think
back to this idea that our tasks in healthcare are inherently interdependent.
I cannot do everything I need to do to manage a patient's care on my own, but if we're not set up to manage that interdependence effectively, I can end up with a confrontational relationship with the system at times. It feels like I'm the one trying to do everything for this patient.
That can create a lot of stress and anxiety
and create situations that are not just bad for
patients but that are also bad for the people working within that system.
So we want to do what we can to take these inherently interdependent tasks and have real teams manage them, teams in which we're explicit about interdependencies and we manage them well.
High reliability organizing is a set of principles, a set of values, that describes expert organizations: organizations that are able to maintain high levels of safety when, given the complexity and risk that's there, it looks like they maybe shouldn't. They do much better than you'd initially think they would.
The first of these values is deference to expertise.
What this means is that people respect input, judgment, analysis, and opinions based on the expertise and experience of the person, not their formal role, title, or responsibility. So we pay attention to the actual content of the contribution, and not necessarily to the signs and signifiers of who it comes from.
This is a way to get around what we saw with Captain van Zanten: shutting people down because of role and status rather than the value of the information, the value of what they have to offer the group.
The next one is sensitivity to operations. Here, to be reliable, we really need to understand what the work practices are and what a patient is experiencing at any one point in time.
We cannot do this if information is not flowing openly, and we need to actively build that shared situational awareness among everyone who is responsible for hands-on care of that patient.
And again that cannot happen unless we have a strong team dynamic.
Organizations that engage in high reliability organizing also have a reluctance to simplify interpretations. Here, team members are comfortable questioning the assumptions in the plans that are put forward. This does not happen in organizations with a rigid hierarchy, where people do not feel comfortable speaking up against or speaking out against plans put forward by someone in a position of higher authority.
Then there is commitment to resilience: team members are proactive in identifying and learning from errors. This requires an openness that does not quash input based on hierarchy. And with preoccupation with failure, team members share their concerns about potential risks without fear of retaliation.
These are the core values of a high reliability organization, and they are not achievable without high-functioning teams. They are social in nature; they're about how we work together as individuals, again, to manage those tasks and to stop and reflect on how we're collectively managing them.
On the interpersonal piece, people need to feel trust in their colleagues and trust in the organization: trust that if they do offer advice that's contrary to the prevailing plans or ideas, it will be welcome, and they're not going to face any personal retribution for it.
So those are the key points of why teams are central to reliability in any industry: interdependence, the need for openly flowing information, and the need to respect input based on its value and not on the role of the person who's putting it forward.
In the next video, we're going to talk about one of the key interventions for improving teams, in healthcare and in general. We'll dig into the details of team training.