Good evening, everybody. My name is Andy Mee from Veterinary Management Consulting, and tonight it's my pleasure to introduce Dan Tipney, who's going to give us a talk about human factors in practice.
This webinar is sponsored by MWI Animal Health. Dan brings an array of insights into human performance gained from numerous disciplines: formerly as an international athlete, sports coach and pilot instructor, and currently as an airline pilot and human factors trainer.
He delivered the human factors training programme at a major UK airline and has since developed and delivered non-clinical training and in-practice support to both veterinary and healthcare professionals. He represented Great Britain both as an athlete and as a coach, and has a great passion for supporting teams so that they can consistently achieve their goals. Across all of these fields, Dan has consistently observed positive change as a result of non-technical factors including leadership, communication, well-being, and systemic support tools such as checklists.
Underpinning much of his work is the study of human behaviour and the associated impact of workplace environments and culture. Through VetLed, Dan collaborates with experts from veterinary medicine, aviation, psychology and healthcare training, with the mission of enhancing performance in practice through the reliable delivery of clinical skills. Over to you, Dan.
Brilliant, thanks Andy, and good evening everyone, thank you very much for joining us. This follows on from the previous session I ran on human factors; for anyone who was there, that was an opportunity to introduce the topic, very much the what and the why of human factors in veterinary practice.
Tonight we're zooming in more specifically on the elements of safety culture and how that applies both to the patients, so we're talking about patient safety, and to the perception of safety in a more psychological sense for the team. It's a very important topic and something I'm looking forward to talking about.
So, very generally, human factors, for those of you who didn't get a chance to join the introduction session a few months ago, is effectively the study of the gap that exists between possessing clinical skills and knowledge and achieving the outcomes we want. As we've found across lots of professions and lots of research, originally from aviation and then into healthcare and all sorts of safety-critical professions, very rarely when things go wrong is it actually because of a lack of technical skills, and human and animal healthcare are no different.
In our case the technical skills are the clinical skills. So when we explore that gap between clinical skills and knowledge and the outcomes we want to achieve, specifically here for your patients, what we find is that what fills that gap are elements of communication, leadership, decision making, and the impact of our emotional state on our decision making.
We think about human limitations around memory and our ability to focus and pay attention. But surrounding all of that is this huge topic of culture. In one sense, culture is almost one of the most complicated things you could ever talk about, if you consider the study of anthropology, the history of our species, all the cultures that have surrounded it, the religious sense of the word, and everything that has accompanied our development on this planet.
So on the one hand culture is incredibly complicated, but on the other hand it's actually incredibly simple: it's a collection of what everybody does. People talk about beliefs, intentions and attitudes, and that's absolutely right.
But it's really what everybody does on a day-to-day basis. So one of the single biggest things we can do is change the language and really empower people to believe that everything they do, all the time, is influencing what other people see, and therefore what they feel, what they think and what they do, which in turn influences what others see, feel and do.
So on the one hand culture is incredibly complicated; on the other hand it's incredibly simple. It's what people do all the time.
In this sense we're talking about the impact that has on patient safety, but of course there are lots of different elements of culture, and the expression "the way we do things around here" is the very broad phrase that covers this idea of culture within an organisation or within a team. So that's the very general overview, and I'm talking about it from my own experience: originally from sport, because my passion is human performance and looking at performance at a fundamental level, at the human being;
then my experience within aviation and the human factors training I've delivered for pilots and cabin crew, and the culture that has evolved within the professions I've worked with, aviation and healthcare, and in the last four or five years the veterinary profession. So that's the perspective I'm bringing to this.
And that's the general intro. So what to expect from tonight? I'm going to talk a bit about systems thinking.
I want to talk about the components of a safety culture, and about alternatives to a blame culture, what that looks like and what we might call it. I'm really thinking about psychological safety here, because this applies to the people as much as it does to the patients: that sense of feeling safe within our team, and why that's so important for performance, for safety and for wellbeing. We'll look at balancing learning with accountability, where we have to draw a line, but why ultimately it's about the learning, not about blame, and how we can achieve that.
And I'll introduce learning beyond failure: it's not only when things go wrong that we can benefit, we can be learning from things every day, and we'll look at what that might look like. Before I go into any of that, I want to talk about human behaviour at a very fundamental level, and about how we all have a tendency for our boundaries to migrate, based on all the things you can see here: life pressures, vulnerabilities, belief systems. We can relate this to something we're all familiar with: the legal speed limit of 70 miles an hour.
Of course we all have that tendency; we've all experienced it. Based on our experience of success, we go driving, we have a successful drive, we don't get caught, and it feels like a safe experience. As a species we are hardwired to push boundaries, and so we end up in what's considered an illegal but normal space.
It's technically not within our protocol (I'm using the word protocol because that's the closest equivalent we've got on the road): it's a rule, it's a law, but it's considered normal. And doing that day after day conditions us to believe that it's OK, that it's normal. Quite often we then drift into a genuinely dangerous space, where eventually, for most of us, something happens.
It might be an accident, but it might simply be that we get caught or have a scare. Quite often what that means is that we end up back at the start. It's a very fundamental human condition, and it's known as a systemic migration of boundaries.
One of the leading figures within the world of patient safety and human factors, René Amalberti, who has written some really interesting work, has discussed this, and it's a fundamental principle underlying a lot of what I'm going to talk about. We can also look at it in a clinical sense, and this applies to both human and animal healthcare.
Think about hand washing, especially in any healthcare organisation that follows the WHO guidelines on the five key moments for hand hygiene. You have the initial rule or protocol, which says hand washing for every patient, every time, at all the different moments where it should occur. But then, of course, we are inclined to push boundaries.
We are inclined to develop workarounds to get jobs done. That's not because we're bad people or bad professionals; it's because, as a species, that is how we're hardwired, and it's one of the reasons we have developed as a species to where we are. So you can imagine how things can drift: hand washing only when patients have suspected infectious disease.
Eventually it only happens on audit days, and then we end up in the same situation. Again, that's not to say that's exactly what happens to everybody, but it's an example of the sort of thing that can happen to a lot of people. And as I said, it's not because they're bad people, it's not because they need to be blamed or punished; it's a human tendency, and the more we can understand that tendency, the more we can learn to understand our humanness.
That is such a key element: the better we understand it, the better we are able to develop safe systems that allow for our tendencies as human beings. One of the things related to this is a model called the varieties of human work. Steven Shorrock is a psychologist,
a key figure within Eurocontrol, the air traffic control body across Europe, a chartered psychologist and a chartered member of the Chartered Institute of Ergonomics and Human Factors. He talks about this model that he's developed, and it's very interesting because it effectively describes different versions of work. On the far left, work as disclosed is what people say they do, because we all want to sound as though we abide by the rules and follow the protocols.
Work as imagined is, in a lot of organisations and settings, what the people who set the rules and write the protocols and procedures imagine people are doing on a day-to-day basis.
Work as prescribed is what they then actually ask them to do, and work as done is what's really happening out there, based on all these different factors, one of them being this systemic migration of boundaries, our tendency to find workarounds to get the job done for whatever reason. Work as done is also shaped by many other aspects of being human, our emotional response being one of them. We can think about this in lots of ways: work as imagined versus work as done.
It's all very well designing something, telling people where they should go and putting signs up to tell people where they should walk. But ultimately, most of the time, people are going to do what people do. And again, that doesn't make people bad; it's easy to say, "I wish people would just be more like..."
That's an expression you hear everywhere. And quite often, the further detached we become from frontline use (in this everyday example, frontline use being the people who use the path), the more inclined we all are to start using those expressions: people need to be more like this, they need to follow the rules.
Actually, what we need to do is understand what people are really doing and why. This is a great example of desire paths, something I came across recently.
At Ohio State University they looked at old photographs of the pathways people had actually worn into the ground, and when they redesigned the pathways they based the new ones on those old pictures of where people actually walked.
So it's the idea of developing something based on work as done, or in this case walking as done, as opposed to what designers imagine people will do or think people should do, and designing a system that fits around what people really need. It sounds like the most obvious thing in the world, but it's actually not very easy to get right, and it's not always intuitive.
When we're sitting at the other end, particularly in leadership and management roles, it's sometimes very hard to know what people really need and really do, because finding out takes a lot of time and energy. So when we're developing safe systems and supporting cultures (I never use the expression "creating" when I talk about culture), it's about supporting an environment that will
encourage behaviours that are consistent with the culture you want, because it's not as simple as creating it. But when we're doing those things, it's sometimes very hard to really know what people do, and it takes a lot of effort. Work as imagined and work as prescribed: here are some examples. With something like medications, it's very easy to say that people need to pay attention; we know the best-practice standards, we know the sorts of things we need to avoid, and we know the sorts of things that can happen, like selecting the incorrect medication or picking up the wrong syringe, that can lead to a medication error.
And it's very easy to simply say that people need to pay more attention. The problem with just pointing out a potential attention lapse is that we're not really solving the problem, we're just highlighting it. Work as imagined and work as prescribed set the guidelines around what people should be doing, and that's one thing.
Imagining work as done is to actually stop and consider what might really be causing the issues. This is a photo we were sent: doxycycline and dexamethasone syringes, which technically are indicated in terms of what they are. The writing is there, and you could argue it's very clear, but the problem is that on a dark night, when someone is stressed, distracted or tired,
with lots of other things going on around them, it's all very well to say what they should do, that they should check their medications carefully. But if they go to a drawer, and the last 999 times they've gone to that drawer they've been familiar with finding one medication, and the next time, for some reason, there's another one there which looks almost the same,
it's very easy to say what they should have done, and harder sometimes to understand why they would be inclined to do something else. That's the difference between work as imagined, work as prescribed and work as done. It takes energy and it takes effort, but it's crucially important.
And the reality is that in practice there are all sorts of examples of things we can do. Again, it's not to say that these things are OK, but in terms of thinking "why is this relevant to me?" (and by "me" I mean the broader "we" across the profession): do we routinely develop these kinds of workarounds? Which boundaries have migrated and which behaviours have been normalised? What sorts of things are easy to end up normalising, where it becomes OK to do this, and when do our boundaries need to be re-established? Consent forms are an example: are our consent forms always signed?
Of course we know they should be, and I'm sure in lots of places they absolutely are. But it's an example of something that, under time pressure and with everything else happening on a day-to-day basis, even though we know in theory it should be happening, is one boundary that could migrate. As soon as enough people start to do it and it becomes normalised, it's surprising how easily it becomes that illegal normal we talked about, even though everyone knows that technically it shouldn't be.
This is the element of culture we were talking about earlier. As soon as you reach a tipping point where enough people do something a certain way, other people see that, and it's not a conscious thing; it's a subconscious desire to belong.
Again, this is a fundamental human tendency: the desire to belong, to be part of a group and to do things the same way. We're subconsciously driven by that.
And as soon as something is normalised, it's very hard to keep track of exactly where you are between that illegal normal and the genuinely dangerous space. There are so many examples of it: the discipline around counting swabs, for instance, and whether it's being done in the way a protocol prescribes, or in a way that's consistent with achieving the safest outcome. ID collars are another one, particularly in bigger practices
and referral centres where there are lots of patients, but honestly anywhere; it's an example of culture. You could go to one place where the people might know, in theory, about the risks around patient identification, and it's surprising how quickly not applying ID collars becomes normalised if people aren't used to doing it.
Whereas someone from that practice could go to another practice where it is normal, and would very quickly start doing it naturally, because that's what other people are doing. It's amazing how these norms influence what other people do. And of course there's the example we used earlier about whether or not we double check medications.
Again, double checking is something we all know is likely to improve patient safety, but even if it's a protocol within a practice that medications are double checked, and even if people have been through training on the importance of using open questions rather than leading questions when performing drug checks, it's very easy for a new normal to become accepted when people develop workarounds. I'm using these examples because
we all make errors, but as well as that, it's about trying to understand not only the errors we might make but also these unhelpful norms we can all develop, because our human tendency is to migrate boundaries and cut corners to achieve goals; it's one of the reasons we've developed as a species, though not the only one. So it's a starting point and a significant piece of the jigsaw when it comes to taking a systems approach, which we'll talk more about in a second, and developing this culture:
a safety culture, which means supporting an environment in which it's normal to acknowledge this humanness (I use that word because it's almost the best way of describing it), and an environment where it's safe to talk about error and about our tendency to migrate boundaries. This use of language, and normalising these things, is incredibly important.
So those are just a few examples. Another example is one from aviation which I touched on briefly last time, but just as a reminder, this is another situation where a systems approach was necessary, and we're coming on to talk about that.
This is the B-17 bomber from the 1930s, made by Boeing, the American manufacturer. Boeing were incredibly keen for the world to see it and for it to be demonstrated, because, as you can imagine, in the 1930s this aircraft (this is a picture of the cockpit) was a bit of a spaceship.
People had never seen anything so impressive or so complicated. As a result, even though some of the best pilots in America were test flying it and demonstrating it to the world, they were experiencing some fairly major errors, and some very embarrassing and very tragic accidents during those test and demonstration flights, because the aircraft was complicated and things were getting missed for a variety of reasons.
It would have been very easy for Boeing to simply say "don't forget", in the same way that it's very easy, when a medication error occurs, to say "we need to pay more attention". It's very easy to focus on work as imagined or work as prescribed and just tell people what they should do.
These are the standards you should strive for. And it's not that that's wrong; Boeing weren't wrong to want their pilots not to forget things, and they weren't wrong to say "we want you to remember".
And no one is wrong to say "pay more attention" when a medication error occurs. The problem is that it's very rarely going to solve the problem. Boeing realised that, and that was the reason they got together and said:
we acknowledge that these are the best pilots in the country and it's clearly not working for them, so we probably need to take some responsibility for the fact that we have developed this very complicated aeroplane, and maybe we need to do something that will help make it easy to get it right and hard to get it wrong. I really like that expression: what can we provide that will make it easy to get it right and hard to get it wrong? So, as far as we know, in the 1930s they developed
what was the first professional checklist used on an aeroplane, to make sure that before you move on to the next critical stage of what you're about to do (before you taxi to the runway and take off, before you land, and so on) you haven't forgotten anything really important. As far as we know that was the first professional checklist, and the practice has continued through aviation. It was also a large part of the motivation behind the work that Atul Gawande did, which led to the research from the World Health Organisation and the development of the surgical safety checklist, variations of which have been used in lots of different clinical settings.
Human and animal healthcare have benefited from that enormous piece of research, built on the same principle, and it's an example of taking a systems approach. It's all very well to identify the standard you want to achieve, which is safe surgery; this is the original World Health Organisation version of the checklist, and you want to achieve outcomes consistent with the points on it, but it's easy to talk about what you want to achieve. Sometimes a systems approach means asking what all the things are that need to happen
to contribute towards that, beyond simply stating what the standards and the outcomes should be. It's about doing something to make it easy for people to get it right and hard to get it wrong, and that's what aviation has done so well. That's almost the reason it's very hard to compare aviation to any form of healthcare: the system you operate in
in aviation has become so incredibly safe for so many reasons, because of the level of manufacturing, because of air traffic control, because of things like checklists and procedures, and because, of course, aeroplanes are on the whole a lot more predictable than patients, whether animal or human. But the system has become very safe, and that's why I'm so interested in this.
So, a systems approach in veterinary practice. This is a quote from someone I've had the chance to do some work with, Matt McMillan, who has written a paper on checklists in veterinary anaesthesia. He talks about the idea that achieving best-practice standards is one thing: you can say what those standards should be, in the same way that aviation can set out what it wants to achieve in terms of a safe outcome for a flight,
or in terms of the things it wants to make sure pilots do. But sometimes it's not enough to simply say what the standards should be; you need specific safety tools to assist people in the implementation of those standards. That's exactly what a checklist does; it's an example of systems thinking in practice. Developing that slightly, let's look at a systems model of accident causation, which some of you may have come across before. It starts with the concept that in any environment, and we're going to talk generally about safety-critical environments, so when I say safety critical, I mean professions where the difference between a good outcome and an unexpected or negative outcome doesn't just involve the loss of money, as it would in retail, for example.
A safety-critical environment is one where a negative outcome potentially involves the loss of life, whether human or animal. We're talking about healthcare, veterinary work, aviation, the military, oil and gas,
and lots of other professions. In those professions you have a million and one potential threats and hazards, things that could spoil everyone's day and prevent the outcome from being what you want it to be. But in every profession there are always barriers, things put in place to prevent those potential threats and hazards from becoming adverse events.
Professor James Reason was a professor of psychology at the University of Manchester. He did a huge amount of work and was certainly one of the original researchers and authors on much of the work around safety culture, which features predominantly in aviation and healthcare but is just as relevant to what we're doing in the veterinary profession. He described these successive layers of defences
put in place to prevent adverse events. The problem is that they're inevitably designed by humans, and because humans are fallible, the defences themselves will be flawed. So you end up with the Swiss cheese model, where each layer has its inherent weaknesses. But usually they don't line up;
usually the weaknesses in one layer will be caught by another layer. There are lots of ways of imagining this, but you can picture the different layers representing different things, right from the organisation, in terms of the actual procedures and protocols put in place, through to the training, the leadership, the supervision, all the way down to the frontline professionals, who are ultimately sometimes the difference between a positive outcome and an adverse event.
But as we said, the holes are constantly moving, and it's only when they line up that we end up with an adverse event. The weaknesses could be due to organisational failures:
training, procedures, protocols, staffing. A really interesting one is the systemic issues that put people in compromised situations. They could be around supervision, the level of experience of the people working together, or the ability to perform double checks based on who's actually there, to use that medication example. Or they could be preconditions: factors around fatigue and stress, things that cause emotional states which may or may not be conducive to performing cognitively at a high level.
And then ultimately, at the end of all of that, you've got the last layer of defence: the people, the frontline team, the vets, the RVNs and everybody else who works in that frontline team. I don't always like to use models like this, but I do think this one is useful, because it helps us picture all the different things we're doing and gives us a way of imagining why we do so many things around protocols, training,
and designing tools like checklists: all the things that can create barriers and strengthen them. Having that visualisation of the layers lets us imagine which actions will cover up a hole and which might create one, while bearing in mind those frontline team members, the ones who can ultimately be the difference between a weakness becoming an event or not. And there are different ways we can look at this: we can look at active failures,
and we can look at what are called latent conditions. An active failure is something specific to that event, almost a one-off. It could be a mistake by an individual, or just something that happened on that day, and it's very hard to predict it recurring in the future.
The latent conditions, on the other hand, are often described as resident pathogens within a system. These are the things that arise from decisions made by designers, builders, procedure writers and managers, the top-level people who design things in the first place. These latent conditions are weaknesses that are there all the time. The problem is that it's sometimes very hard to identify what they are.
If you have an adverse event or a near miss, it can be hard to understand what the latent conditions were: the things that, if we address them, will reduce the likelihood of a similar event in the future. It's easier to address the active failures, because they're much easier to spot.
The problem with addressing only the active failures is that it doesn't actually reduce the likelihood of a recurrence very much. The active failure might be rushing and selecting an incorrect medication, whereas the latent condition might be the storage of the medication, or not having a protocol for double checking, and the difference there is huge.
Addressing only the fact that someone was rushing and selected an incorrect medication identifies the workaround and the migration of boundaries, which is important and may well be a learning point for that person, but it doesn't address the issue for the next person. It's the latent conditions, the underlying systemic factors, that are much more important. And ultimately what we're trying to do is look beyond the individual.
I'm going to talk later about how we can balance this, because it's not as simple as saying don't blame anyone and anything goes. What we want to do is look beyond the individual and get a balance between accountability and learning. Dr Hadiza Bawa-Garba, as you may remember from the news in 2015, was convicted of gross negligence manslaughter over the death of a little boy called Jack Adcock, who died of sepsis in 2011.
And yes, she made some mistakes, and they were of course factors in his death, but, without going into detail, it's a very tragic and very interesting example of systemic failures, and of what happens when you address only the active failure.
Her active failure led to the serious incident, but essentially it was the last hole in the cheese. She was the goalkeeper
on a football pitch who let the goal in. But the goal wasn't scored only because she let it in; the goal was scored because, just as in a football team, everyone else had their part to play in that goal going in: not just the players on the pitch, but everyone involved in that team on the day, and in the days, weeks, months and years leading up to that game.
And it's exactly the same in this situation: a very tragic example of how much harm can be done by addressing only the active failures. That leads us on to the topic of blame culture: why is it so common, and what can we do about it in practice? It's common because it seems to solve the problem.
Addressing active failures is an easy way of seeming to solve a problem. But what it actually does is lead to fear, and to a stifling of people's willingness to come up with ideas.
What we can do about it brings us to safety culture and its components, and to how we can come up with an alternative to a blame culture. This draws partly on the work of James Reason, who we just talked about, and partly on someone called Sidney Dekker, who has done a lot of work around something we'll come back to later called just culture. A safety culture has these components.
First, an informed culture: those who manage and operate the system have knowledge of the human, technical, organisational and environmental factors that determine the safety of the system as a whole. This basically means that those in decision-making roles need to know and understand work as done, which is what we talked about earlier. Then a flexible culture: an organisation is able to reconfigure itself in the face of high-tempo operations,
often shifting from a conventional hierarchical model to a flatter hierarchy. I'll come back to flexible culture in a second. Next, the reporting culture.
This is really relevant to what you're doing in practice, because the reporting culture is your opportunity to report adverse events and near misses. Whether it's your own system, a book, or an online system such as VetSafe, having a systematic way of reporting adverse events and near misses is an absolutely critical piece of the jigsaw when it comes to having a safety culture.
Again, this comes from a combination of the work of the two figures we're talking about here. Then a learning culture, and this is really about humility: an organisation must possess the willingness and the competence to draw the right conclusions from the information it's getting, and be willing to implement whatever changes are needed.
So it's that humility, that willingness to do it. And what else? The flexible culture, which, apologies, I mentioned earlier;
I got my bullet points the wrong way round. The flexible culture, as I mentioned, is that ability to flatten the hierarchy and to be prepared to reconfigure the organisation if it needs to.
So being informed, being flexible, having the ability to report, and having the humility to learn is fundamental. What we're trying to do is look at where we can learn from. This next model was originally known as Heinrich's pyramid
and it's been developed over the years; this version comes from more recent work on industrial accident prevention, but it applies to any safety-critical profession.
What it's basically showing is that at the top there's one serious incident (that's what SI stands for). The exact figures vary, whether it's 3, 300 and 3,000, or 5, 500 and 5,000, but the point is that for every serious incident there are going to be some minor events, for every minor event a lot more near misses, and beneath every near miss a whole load of stuff going on that we need to be learning about.
The problem is that it's easier to learn from the serious incidents and the events, the things that actually occur. It's much harder, and takes more effort, to learn from the no-harm learning opportunities.
Potential risks are really hard to spot, although sometimes we can do it proactively, by identifying medications that are stored next to each other and look the same, and things like that. But near misses, the things that nearly happened and that possibly only luck prevented from becoming a serious event,
are the real chances to learn, because nothing has actually happened, yet we can proactively identify the things we can change for the future. But we need a reporting system, and we also need a culture that supports safety and allows people to report without the fear of punishment.
What about when things go right? We've been talking a lot about adverse events and near misses, and preventing things from going wrong has to be the starting point when we talk about improving safety.
But we don't only want to learn by stopping things from going wrong. If we look at it statistically, in healthcare, for example, they reckon that across the whole of UK healthcare, which is incredibly broad, things go wrong about 1% of the time. Which means that 99% of the time things are going right.
There are all sorts of analogies we can use here; the sporting one is trying to improve your team's performance by studying the team that's about to get relegated rather than the team that's about to get promoted or win the championship. This is the idea of what we call Safety I and Safety II. Safety I is a concept built around studying failure, looking at accidents and incidents, which actually represent only a very small percentage of events. At the far end of the scale you've got excellence and exceptional events, where very difficult circumstances had a positive outcome.
But in the middle you've got a whole load of really valuable stuff: people just doing their job, work as done, what people are doing every day in that 99% which leads to expected or positive outcomes. It's much harder to learn from that than from the negative events. But by having a culture where it's normal to learn from positive and normal events, we create a much more psychologically safe environment.
That's something we'll come back to in a second. So Safety I and Safety II: Safety I being the tendency to look at accidents and incidents, Safety II being an alternative. A lot of this has come from the work of someone called Erik Hollnagel, a name I quite often pronounce incorrectly.
He was a senior professor of patient safety at a university in Sweden and has done some really interesting work on human reliability, cognitive systems engineering and all sorts of human factors topics. He talks a lot about
these two principles. Safety I is about the study of accident causation: why did something happen, and what can we do to prevent it happening again? Safety II is much more about the anticipation of events and problems based on what people are really doing right now.
That takes effort and time, but you're not waiting for something to happen. Safety I is about trying to understand what goes wrong, whereas Safety II looks at the events that go well and at how we can replicate them. Safety I is about avoiding errors;
Safety II is about repeating what goes right. And when I say what goes right, I don't just mean the excellent and the exceptional, I mean the normal, everyday stuff that goes fine, stuff that goes OK. Safety I is about reducing losses, Safety II about reinforcing successful behaviours.
This is really important: when people do things that are conducive to safety and successful outcomes, what are we doing to reinforce that? Safety I is reactive; Safety II is very proactive.
In Safety I, accidents are caused by failures; Safety II is about understanding the factors for success. And the last one is such an interesting and important one: Safety I tends to see staff and the team as a liability, Safety II sees staff as a resource. It's so easy to see people as a problem to solve rather than as the solution to achieving the outcomes we want to achieve.
It's a subtle change of language and focus that has an incredible impact on people's experience, and therefore their emotional state, and therefore their ability to think, perform, communicate and do everything we need them to do, because it has a direct impact on our cognitive abilities. So, the missing piece: we've talked about the informed culture, the flexible culture, the learning culture and the reporting culture. We want to report and learn from adverse events and near misses, but also from the positive and normal events that are happening all the time.
But such a key part of this is having an alternative to a blame culture. At the foundation of all of this is actually asking what that looks like. We need to be able to recognise that, regardless of skill or experience, well-meaning team members can make mistakes.
That's almost the foundation: accepting that it can happen. But we have to build on that and also acknowledge that we can all develop unhelpful norms in pursuit of our goals; that's the migration of boundaries, the tendency of human beings to push boundaries, to want to achieve, to want to move on to the next stage. It's a fundamental element; we're hardwired to do it.
But we also have to recognise that there's a line in the sand, and we have to address reckless and intentional harm accordingly. This is why we don't call it a no blame culture, but an alternative to a blame culture. What we're trying to achieve is psychological safety: the sense that people feel safe within their team, and feel safe that if they make a mistake they won't be punished,
and that if they unknowingly develop a workaround and don't follow a protocol because they're trying to get their job done, they won't be treated unfairly. But they also have to trust that if someone else in their team does do something knowingly to cause harm, it will be treated accordingly.
All of these things contribute towards psychological safety, that sense of feeling safe in your team, and it's incredibly important. Over a decade of work by Amy Edmondson at Harvard has really highlighted the link between psychological safety and patient safety.
So what we're talking about is something which has been given the name of a just culture: a culture that seeks to find a balance between learning and accountability. So what are we saying?
Achieving psychological safety is not about relaxing standards, feeling comfortable, being nice and agreeable, or giving unconditional praise. It's about trust, respect and openness, and about people being able to raise concerns and ideas without fear of what will happen. And that leads us to the link between Google and a sabre-toothed tiger, which you might remember me mentioning last time.
Google ran a huge project called Project Aristotle, which looked at the consistent factors behind high-performing teams. What they found, across thousands of teams around the world, was that the one thing most consistent across all of them was psychological safety. The reason that links to the sabre-toothed tiger is that if we feel safe,
we are in an emotional state that supports logical, rational thought processing and decision making. As soon as we feel threatened, the amygdala, the emotional part of our brain, doesn't necessarily know the difference between a perceived threat and an actual threat, so it responds in the same way: by shutting down our frontal cortex and stopping us from thinking, to prepare us to fight, flight or freeze.
It's something we often all know, but it's why psychological safety, and an alternative to a blame culture, a just culture, is so important: our brain doesn't know the difference between the sabre-toothed tiger and unrealistic workloads, lack of respect, unfair treatment, not being heard and being unappreciated. So it's really, really important. When we talk about a just culture, we are saying that there has to be a line in the sand, that some things aren't OK, and everyone needs to believe that, because I don't think anyone wants to work somewhere where those things are OK.
That's why a no blame culture isn't the expression we use. Ultimately we're saying that sometimes adverse events occur because of a lack of knowledge or skill, but that's actually quite rare.
In aviation we know that about 80% of accidents and incidents are caused by something to do with human factors, not by a lack of technical knowledge or skill. The research from Catherine Oxtoby's 2015 paper on errors in veterinary practice showed that only about 14% of the errors studied were the result of a lack of clinical knowledge or skill. But it does happen, and if it does, we need to know.
Most events, around 80% in the research from aviation, healthcare and the veterinary field, are honest mistakes or systemic failures. We need to get this balance right and respond accordingly, because that is what contributes towards psychological safety.
And we have to consider which is most likely. I like this expression: what we want to do is find out the why behind the what. Quite often it's easy to know what happened; really understanding why it happened is harder.
What was it that meant that person made that mistake or cut that corner? What element of their humanness was it, and what role did the system play? And how can we balance learning with accountability?
That's such an important question. A just culture leads to reduced anxiety and to psychological safety, which we've talked a lot about, and that leads to people being more willing to report and talk about things that go wrong,
things that nearly go wrong, the near misses, and also about what goes right. The more safe people feel in an environment, the more prepared they'll be to tell you openly what's going on. Learning and improvement is what we want, and if you combine all of that with the other elements of a safety culture we talked about, being flexible, adaptable and informed and having that learning culture, those are the ingredients of a safety culture.
But the great thing about a just culture as an alternative to a blame culture is that it brings other benefits too. People are more likely to demonstrate adaptive behaviours,
which basically means they're more likely to take necessary risks when they have to in order to achieve their outcomes, and we all know that in every profession people sometimes need to feel they have the autonomy to do that. We flatten hierarchies, which helps people feel safe to speak up if they have a concern.
It helps people come up with ideas to solve problems. And it helps second victims; I won't go into that in too much detail, but these are the people who suffer as a result of errors that are made: not the direct victims, the patients or the owners and carers of the patients, but the people involved in those errors, who can suffer too.
Having a just culture where good can come from adverse events is a big key to the recovery of these so-called second victims. To talk a bit more about culture at a more foundational level, there's a really interesting paper in healthcare from 2017 describing a two-year mixed-methods interventional study. It was known as Leadership Saves Lives, and it was designed to promote positive change in organisational culture across 10 hospitals in the United States over two years.
In the six hospitals that experienced substantial positive shifts in culture, the changes that were most prominent, the key areas they identified, were in these domains: a learning environment and learning and improvement,
which we've talked about in terms of a learning culture, reporting and humility; senior management support; and psychological safety. And the really interesting thing is that the hospitals with these marked positive shifts in culture experienced a significantly greater decrease in mortality rates;
the outcome they looked at was heart attacks in the patients across these hospitals. It's a really interesting study, and I'm happy to send the details if you'd like to have a look, but the point is the link between perceived changes in culture and the effect that can have on safety and outcomes, specifically around learning, support from management and psychological safety. So there's a lot there. Those are some of the core principles and some of the foundational
studies, some of the core material that supports these principles of just culture, of reporting, of psychological safety and of having a systems approach. But there's a lot more to it.
So hopefully this is a starting point, and although some of you may have heard some of it before, hopefully it's a trigger for changes, or at least for considerations, in what you're doing in practice. Human factors in veterinary practice covers so many different things. It's all about improving performance and patient safety, and today we've talked specifically about checklists, about systems, and about the impact of leadership. This topic of safety culture and systems thinking is
one element within human factors, but it's just one of many, and this is just to give you that bigger picture. That's what we focused on today: within the topic of human factors we talked about safety culture and everything it relates to.
As a concluding point: we cannot change the human condition, but we can change the conditions under which humans work. (There's an extra quotation mark on the slide there, for some reason.)
That's what it's about: understanding the conditions, the things that help make it easy for people to get things right and hard to get things wrong, whether that's working in an environment where they feel safe and are more likely to be in a cognitive state conducive to performing and delivering their clinical skills, or having tools such as checklists. Either way, it's about understanding our very humanness and creating systems and a culture that support them.
So what did we talk about? We talked about systems thinking, we talked a bit about safety culture in practise. We talked about psychological safety, talked about just culture as an alternative to a blame culture.
We talked about balancing learning with accountability, so it's not simply about saying. It's no blame. There are things that have to be agreed, aren't OK if someone sets out to cause harm, but ultimately, if something is, if it is if it's an honest mistake or a systemic failure, then we'll find out why it happened and we'll do something about it, not simply seek to find blame.
And learning beyond failure, so this is Safety-II, the principle that it's not only when things go wrong that we have the opportunity to learn, but also when things go right. Yes, it's about when things are exceptionally good, but it's also about the everyday. It's about understanding work as done.
It's about observing people, talking to people, doing surveys, gathering as much information as you can, and supporting people to make it easy to get it right and hard to get it wrong. So that's all I've got time for here. I'm very happy and interested to take questions for as long as anyone's able to stay, and otherwise that's it from me. Brilliant.
Thanks very much, Dan. We do have several questions for you that have been stacking up as you've been going through. So, first one: are there any systems-approach checklists from the BVA, BSAVA, or any other organisation that you're aware of?
Yeah, there are some templates out there. The most recent ones that I've seen and used are from RCVS Knowledge, who have developed quite a comprehensive guide for checklists in practise, particularly surgical checklists. It doesn't go into more in-depth checklists, such as an endoscopy checklist or anything like that.
But certainly for your more standard surgical checklist, the RCVS Knowledge guide, which I think is downloadable from their material online, is probably the most comprehensive resource; I'm happy to send a link if anyone needs it. I'm trying to remember now exactly what the BVA have got available, trying to picture it. But the one that comes to mind, as I said, is the RCVS Knowledge material, because I believe that's a combination of representatives from different places over the years who have worked together to put it together. So it is very useful.
OK, great. I think the Anaesthesia Association have a checklist as well, don't they? Yes, you're absolutely right, they do; that's quite a good one too.
Yeah. Next question then: does anyone else find that on quiet days we may be more likely to have a mistake or adverse event than on a very busy day?
When we're busy, we seem to be working at our best. How would you address that? Mm, it's interesting.
It's this proactive identification of the conditions which are likely, or unlikely, to allow people to perform at their best, and I think you're absolutely right to identify those quiet days where complacency creeps in, or where we're maybe not stimulated. We talk about this idea of having an optimal level of stress or stimulation. The more we normalise conversations around this sort of thing, and the more we normalise proactive steps such as briefings before surgery or procedures, things which have become very common in healthcare and in aviation, the more likely we are to actually recognise those situations.
That might even mean actually saying, look, we've all sat around all day and we've suddenly had an emergency come in; we're probably not at our best right now. It's about normalising the ability to talk about those sorts of things, rather than one person bringing it up, it seeming a bit weird, and maybe someone making a slightly dismissive remark. Again, the word normalise is so important. I'm not sure that quite answered the question exactly, but I certainly think that's relevant to it.
I don't know if you've got any thoughts. Yeah, well, I think you talked about checklists and things, so even on a quiet day, if you're going through a checklist, you're doing that systematically. And you talked about Atul Gawande; he's written lots of great books, but The Checklist Manifesto is a brilliant one.
He talks about the concept of the pre-mortem, doesn't he, which I think is an army thing originally. But, like you say, at the beginning of the day you just spend five minutes: right, what have we got on today?
What do we anticipate, etc. Yeah, and that brings to mind someone I mentioned briefly, Martin Bromiley, who started the Clinical Human Factors Group after the very tragic death of his wife under anaesthesia in 2005.
So he started the Clinical Human Factors Group, and he's made a very emotive video, for anyone that's not seen it. The reason I mention it here is that it's called Just a Routine Operation, because it was supposed to be a very normal, very everyday event.
It was anaesthesia for a sinus operation, elective surgery on a healthy patient, and she very sadly passed away. It made me think of what you were saying there: sometimes when things appear to be very normal and very easy, and everything's going well, we are at our most vulnerable.
Absolutely. And to name-check another book there, of course, Matthew Syed's Black Box Thinking has that case as the introduction to the book. I think we've talked about it before on one or several of these practise management webinars, but it's a great book as well.
Next question then, thank you for that. What do you do if you have one person who constantly tries to find a workaround? If that one's not allowed, they try another and another and another rather than following the checklist.
Yeah, honestly, and it's not to be defeatist, but the reality is that no matter where we get to as a profession, and of course it depends on the size of a practise, if we're talking about large numbers there are always going to be outliers. Even when we get to a place where these things are totally normal, where everyone is doing checklists, following protocols, or briefing before surgery, you're always going to get exceptions to that. And the problem is, all we know is that the more those people, and I say 'those people', but people with that inclination, are told what to do, the less likely they sometimes are to do it.
Where that leads from a human resources perspective isn't for me to comment on, if ultimately someone isn't doing what they're being asked to do. But all I do know, and this is my experience from aviation as well with just those characters: in aviation, human factors training has been normal now for 30 years.
But you still get people who don't see the value, don't engage, and don't appear to follow the guidance. You can't change their behaviour by telling them, but you can focus on everybody else. The more you focus on the people who are doing it and do get it, and you encourage and support that, the more likely you are to reach the people that don't see it or get it, because the more they're made to be the exception and the more other people demonstrate those behaviours, the more likely they are to follow. Or ultimately, if they feel left out because they're the only ones not doing it anymore, they might choose not to want to be a part of it anymore. But that is sometimes the most effective way with those outliers.
Because they're always going to be there, sadly. Is there any mileage in having the conversation around why they're not doing it, or is that, in your experience, a bit of a waste of time? I think there is.
It just depends very much on the situation. It's not to say there's no point at all in that; I think it just sometimes gets to a point where your energy is better spent on the people who are doing it. But I absolutely agree.
I think understanding why should always be the starting point: why might this person be finding it difficult? Why might it feel like a threat? A checklist, to some people, understandably is a threat if they've done something very successfully for 20 years without ever using one. The suggestion that they now, in inverted commas, 'need' a checklist is understandably likely to trigger the threat response in their brain, which shuts down the frontal cortex, the bit of the brain they need to think logically.
So if you can do something to understand why that might be the case, to empathise with it and have that discussion, you have a much better chance. The context I was discussing, I suppose, was where it had got to a stage where that wasn't working, but absolutely, I totally agree that would be a very helpful starting point.
Yeah, I think you've maybe covered this next one. It's linked to the first one; I don't know if it's from the same person because they're both anonymous, but it says: if the person constantly says they don't know why they do what they do, what can those in management do about that?
They don't know why, as in they don't know why they're being asked to do it? Is that the context? I'm not sure, to be honest; my interpretation is they're not following the checklist, but they don't really know why they're not following it. Whoever asked that, do clarify if I've got that wrong. I think that's a really good example of exactly what you said there in terms of the why. For some people it's about triggers: they have every intention to do it, but it's just been so ingrained not to.
It's about working with them to understand why the trigger isn't there and what can be done to improve it, so that it becomes a habit. So yeah, it would be interesting to understand the context of that question, because I think that can be one of the barriers. We've had some clarification: that's correct.
They do not follow the checklist and they cannot give a reason why. Yeah, so I think it starts with going through it and trying to understand. If it's not a productive conversation, as I said, it can feel tempting to pursue that for days, weeks, or months, and that's not to say it's OK to simply let it go; I'd keep offering feedback when it's available.
But all I would say is that if the behaviour is consistent and there appears to be no willingness to change it, you're more likely to change it through what you do with everyone else and what that person experiences from other people. That's not to say it's guaranteed to change their behaviour either, but my experience is that when people get to a stage where they're not even engaging with that conversation, they're more likely to shift almost subconsciously because of what everyone else is doing; if it becomes the norm, then eventually they may well start doing it too. But it isn't easy, and I do appreciate how difficult that situation can be. The challenge is reaching a tipping point where 90-plus per cent of people are doing it the way you want them to be doing it.
And that's much more likely to influence the people that aren't. OK, great. We've had a comment.
Thank you. Yes, the link or links will be most welcome. I'm not 100% sure which one, I know you mentioned offering links for something, but I can't remember what it was. Oh, it was the paper on culture in the American hospitals. Yes, I will; I'm just not sure how best to do that. What's the best way? Are you thinking of a PDF or just a link?
Yeah, I think I can either do it as a link, because I think it's openly available, so I'll have to try and remember whether I can send it as a link or as a PDF. I'll be perfectly honest, I don't know the best way to do this, but if you have a word with the admin people at the webinar, we can probably post it, because the recording of this will go up.
So we can probably get it posted in the blurb that goes with the recording. No problem at all; if that's helpful, I'll sort that out. And the same person says they're struggling to hear the book recommendations.
Again, do you want to repeat yours and I'll repeat mine? Yeah, so the book recommendations, well, the ones we both mentioned: you mentioned The Checklist Manifesto by Atul Gawande. I mentioned, and it wasn't a book, a video that's available on YouTube if you search for Just a Routine Operation. It's narrated and presented by someone called Martin Bromiley, so you should find it quite easily.
It's about 13 minutes long, and it's very powerful, very emotive. And, as you said, Andy, the story is described at the beginning of the book you mentioned, Black Box Thinking by Matthew Syed.
I think those were all of them. Were there any others? Off the top of my head I can't think of any, but again, whoever's asking that question, email the webinar if you're still not clear.
And the other one I would mention: there are several books around psychological safety by Amy Edmondson. She's written a few, and the exact titles escape me.
But there's one around organisational culture and psychological safety which is excellent, and it's very well written in terms of how easy it is to understand. So that I would recommend very much.
Great. OK, thank you very much. Well, you'll be pleased to know that's the end of the questions.
Which you've handled very well. You've obviously generated some thoughts amongst the audience. So it's really just to say thanks again, Dan.
That was an excellent presentation. I really enjoyed it, and, going off the questions, I'm sure lots of people found it very useful and very stimulating.
So thanks again. Brilliant. Thank you very much.
Thanks everyone for joining. OK, great. Thanks.
Good night, everyone.