Description

The study of human factors is an established discipline that uses scientific knowledge about the human body, mind and behaviour to better understand our fundamental capabilities and limitations. It is ultimately about creating the best possible fit between people and the environment in which they work. Our Introduction to Human Factors in Veterinary Practice course will help you and your teams to understand the components that make up consistently high-performing individuals and organisations. We will explore issues associated with non-technical skills, health and wellbeing, and tools such as checklists, all of which combine to enable your teams to reliably deliver their clinical skills.

Transcription

Good evening, everybody, and welcome to the latest in our practice management webinar series. My name is Andy Mee from Veterinary Management Consulting, and tonight our speaker is Dan Tipney.
Dan's role at VetLed primarily involves providing insights into human performance gained from his perspective across multiple fields: formerly as a semi-professional athlete, sports coach, pilot instructor and cabin crew member, and currently as a commercial pilot and human factors trainer. Across all of these fields, Dan has consistently observed positive change as a direct result of non-technical factors such as leadership, communication, wellbeing and, in particular, positive organisational culture.
He has written and delivered training courses for cabin crew and pilots, combining non-technical skills with systems and processes such as checklists and safety reporting. As such, the power of a just culture and the associated growth mindset forms the basis for much of his work. Dan greatly enjoys developing his knowledge across a wide range of subjects which he believes contribute towards human performance.
He has recently completed courses in mindfulness-based cognitive therapy, positive psychology and the Process Communication Model. This evening's webinar is sponsored by MWI Animal Health and VetLed. Over to you, Dan.
Thank you very much, Andy, and good evening everyone. For those watching live, thank you for joining at the end of what's been another gorgeous sunny day; I hope it's been the same for you wherever you're joining from. I'm here tonight to discuss human factors in veterinary practice.
I'm delivering this from a non-clinical perspective, as my introduction would suggest: from my experience as an athlete and a sports coach, currently as an airline pilot, and specifically as a human factors trainer for aviation crew and, more recently, within the healthcare sector. In the last three or four years I've been with VetLed, delivering and applying the principles of human factors from those other sectors, those other professions, within the veterinary sector.
And it's never as simple as just cut and paste. You can't take the principles as they're applied within aviation or within healthcare and simply apply them straight, as they're done elsewhere, but the principles themselves very much apply. So I'm here to talk about that.
I have a real interest in human performance at a very fundamental level, and that goes back to my sports days. The simplest way I would describe human factors is this: imagine someone's technical skills. In your profession, if you're in a clinical role, that would be your clinical skills and knowledge.
So imagine those in one hand, and then, in your other hand, some distance apart, imagine the outcomes that you want to achieve. For you in your clinical roles, that would be the outcomes for the patients, for example. In something like aviation, that would be flight safety: a safe and comfortable flight for the passengers.
But it's not as simple as assuming that having the skills will automatically result in achieving the outcomes. If you imagine those two hands, there's a gap that exists between the two.
And human factors is very much about exploring that gap, to help professionals in all sorts of roles more reliably deliver their technical skill set and achieve the outcomes they want to achieve. So that's the perspective I'm bringing you tonight, based upon the evidence, the learning and the insights from other professions, and applying it within the veterinary sector.
And a very good and interesting way of starting is to ask this question: what are the day-to-day aspects which may affect you or your team's performance? Put another way, if you were to compare a good day with a not-such-a-good day, what are the things that can vary from one day to another which make one day better than another, certainly from an outcome perspective? I've asked this question of dozens of professional teams from the aviation sector, from healthcare and from the veterinary sector.
And across one year, this is a summary of the sort of responses that I got. The bigger the words, the more often that word, or a variation of it, was said. And I've colour coded them.
So you can see that the words in purple relate specifically to clinical skills and knowledge, and the things in pink are things that, on the whole, are outside of our control.
And what you're left with, and this is very representative of every time I've asked this question, is that the majority of those things are in some way within our control, or at least we have some influence over them, and they're actually nothing to do with the technical skills of the people involved; in your case, your clinical skills. The thing is, when you see things such as teamwork, the way we look after ourselves and the way we support each other, it can sometimes seem really obvious, like common sense, things that any good professional will just get right. But the evidence we've gathered, particularly from when events don't go so well, for example within aviation, within healthcare and, more recently, within the veterinary sector, shows that it's very often something to do with these factors that determines the outcome. So we don't want to leave it to chance. We don't want to simply assume that good professionals will get this right; we want to take a very positive, intentional approach towards these areas.
The expression I sometimes use is being good by design rather than being good by luck. I want to pick up on that idea that sometimes you see these things and think, well, surely it's obvious. I want to use an example now from aviation: surely it's obvious.
Just imagine for a second that you're flying an aircraft. You're about to take off and it's very foggy. It sounds obvious that you should confirm whether or not the runway ahead is clear.
Now, this is an example from 1977, and I'd like to stress that nothing like this has happened since, but it is something that we learned a huge amount from. Although it might sound obvious, the captain of the aircraft on the right did start to accelerate down the runway, not knowing that there was an aircraft still on the runway further down. As a result, they collided, and there was a very serious accident.
Now, it would be very easy to just say, well, it's obvious that captain should have known to check that the runway was clear. He was very experienced, very highly regarded, very highly skilled. He should have known the answer to that question, therefore it was his fault, and that's the end of the investigation.
But luckily there was more information to find out. There were black box recorders, there was an accident investigation, and what they actually found was that it was nothing to do with his lack of skill or lack of experience, but a whole host of reasons that fill the gap that I talked about. One of the things was that there was a co-pilot in the cockpit as well who had a pretty good idea of what was going on.
But because of the culture in the airline at the time, it wasn't normal for junior crew members to speak up and raise those kinds of concerns, and you can hear him hinting at it. He hadn't been equipped with the assertiveness skills that he needed, and the culture within the profession didn't support that kind of behaviour. Equally, the stress and fatigue, and the impact of the foggy day on their wellbeing and their decision making, were a massive factor, but that wasn't something that was ever touched upon or explored or discussed within the sector at the time.
Another thing was the assumptions that were made. The air traffic controllers realised at the last minute that the runway wasn't clear, and they said to them, do not take off. We say again, you are not cleared for takeoff.
But part of the message was blocked, and the pilots only heard the word takeoff. Upon hearing that word, they assumed that they were therefore cleared for takeoff. It now sounds obvious to say, well, surely they should have explored and considered the fact that maybe they were being told you're not cleared.
But in that moment, that didn't seem obvious, because we all have cognitive biases, things that drive us to make decisions at a subconscious level. Another thing we learned about here was more about our cognitive biases, and that's been a big factor within this sort of training within aviation, and it's one of the reasons that an event like this hasn't happened since. So if we think about that as an example: nothing to do with the technical skills, nothing to do with anything that was wrong with the aeroplane, but very much to do with humans at a very fundamental level.
And if we think about this: highly skilled people with good intentions. An example from healthcare: in 2005, someone called Elaine Bromiley went into hospital for what should have been quite a routine operation on her sinuses. She was a healthy patient and it was elective surgery, but they were unable to intubate Elaine after she'd been anaesthetised, and they continued trying to intubate her for 20 minutes.
Now, this again... I'm very sorry about that, I'm just going to have to wait for my bell to stop ringing; I can't do anything about that. Anyway, I'll continue telling you the story. They continued trying to intubate Elaine for 20 minutes.
Like in the Tenerife example, there was a very experienced surgeon and a very experienced anaesthetist involved in this procedure. It wasn't through lack of knowing, but when we explore what happened in this event in more detail, a great deal was learnt about what happens when we become very, very focused on a particular task. And one of the things that happens when we become very focused is we lose track of time.
So it wasn't because they didn't know what they needed to do; it was because on that occasion, in that moment, they failed to deliver that knowledge. At the time, it wasn't deemed necessary to understand these kinds of things, teach these kinds of processes or develop awareness around these kinds of factors, because it had never been seen as necessary, and it's only through an understanding of events such as this that we've come to understand more about our human limitations and how we can mitigate them. Just like in Tenerife, one of the nurses involved had a pretty good idea what was going on but didn't feel that she could speak up.
Not necessarily because of what that team did on that day, but maybe because of how she'd been treated on previous days by other teams. So she didn't feel that she could speak up assertively, and she wasn't equipped and trained to speak up assertively. A lot of what has been learned from these kinds of events has equipped teams in the future to do exactly that.
Now, within the veterinary sector there isn't yet the same level of evidence within the human factors field. We do have some understanding, and the last five years have given us a lot more information within the field of patient safety and human factors, but we still don't have the same level of evidence and data that exists within aviation or within healthcare.
But what we are finding is that the evidence that is coming out suggests a very similar picture: when events don't go as planned, it's very rarely because people don't know what they're doing, and even more rarely because they didn't have the right intentions. It's because they're human beings.
So what we're trying to do is take information from other sectors and combine it with what we're learning from within this sector, to utilise these principles within the veterinary profession. I can talk about adverse events in practice from the research that was carried out in 2015, when Catherine Oxtoby carried out research on errors in veterinary practice.
And not surprisingly, it almost sounds too obvious to say when you see it like that, human error was the largest cause of adverse events in practice. But what that means is that, just as within aviation and just as in healthcare, it wasn't because people didn't know what they were doing, and it wasn't because of something totally unforeseen from a technical perspective. It was because humans are humans.
And an understanding of that is the thing that's likely to have the biggest impact on our ability to improve.
The majority of these errors occur when people are distracted or under stress, so an understanding of the impact that has on our cognitive ability is absolutely key. And bear in mind that inadequate care or negligence is very, very rare, and that's why we use the expression: it's highly skilled people with good intentions. And it all starts with awareness, as a starting point.
So the more we learn about our cognitive biases, the more we learn about the impact of stress on decision making, and the more we learn about the impact of culture on open communication and sharing concerns. That awareness is a starting point from which we can then develop ways of improving. So, accepting and understanding error, as a summary: a lack of technical knowledge across most professions, but particularly highly skilled, highly trained professions, is rare.
Inadequate care is very rare, and this is what we're seeing; again, it's a similar picture across all the professions that I'm talking about. So it's not necessarily a lack of technical skill, but the reliable delivery of those skills when things don't go as expected. So that's a very general introduction to what all this is about.
I want to talk now about some of the specific aspects of human factors, and this is about the cognitive aspects. We're only human after all, and our brain can do absolutely amazing things, but we need to understand a bit more about it at a fundamental level. Now, some of you might have seen this before; it's based on some research at Cambridge University some years ago.
For those of you that have seen it, I'll just let you read it again, and for those of you that haven't, I'll just let you have a read. But the really interesting thing about this is that it shows that when we are an expert at something, and as a user of the English language, if you are fluent, you are therefore an expert in that topic, we are very good at matching patterns and filling in the blanks. So although this is total nonsense and makes no sense at all from a technical perspective, you're actually able to make perfect sense of it, because with your experience in the language, your brain is able to make sense of things that don't make sense.
That can be really helpful. Certainly when it came to studying, even when the letters are in the right order, it's really helpful, because if you had to read every single letter consciously, it would have taken you a lifetime to gain a veterinary degree or any qualification at all. So the fact that we can process this information at a subconscious level and match these patterns is incredibly useful.
But does our amazing brain always give us the information we need? You might have seen something that popped up on the screen there, and if we had more time and the ability to talk about it in person, we'd maybe get an idea of what it is you saw. But I'll just let you have another look at it.
And again, try and recall to yourselves what it is that you saw come up on the screen there. Now, if we have another look at it, it resembles a very well known childhood phrase. But if we actually look at it again... when I did an exercise like this the first time, I had to read it about eight or nine times before I realised there was an extra word.
So, there is of course an extra 'the', and the thing is, your brain will almost certainly have seen the whole sentence, but it did you a favour. Our brain is constantly doing these things subconsciously. One of the things our brain is trying to do is solve problems for us, and most of the things that our brain is doing at a subconscious level are there for a very good reason, but they're usually there based upon existing in a world that we no longer live in.
We no longer live in a world where survival is our primary concern, but our brain still functions as if it were, and solving problems is a big factor there. What it will do is discard information that it doesn't consider relevant, and on this occasion, because you're familiar with the expression, it's probably the second 'the' that your brain discarded.
And on this occasion that did you a favour, because it meant that you were able to recognise a well-known childhood phrase without having seen it for long enough to consciously process it. However, if the situation was different and, metaphorically, that second 'the' was actually important, then you'd be missing something really critical, and your brain would have done you a massive disservice. And it's doing these sorts of things all the time.
The more we understand this sort of thing, the more we're able to mitigate the potential limitations. So, if we think about it from a practical perspective, sometimes we see what we expect to see, and this was a very useful image that was sent to us on our veterinary human factors group a few years ago. Some of you may be familiar with these syringes if you use them.
This arrived in a box that had mostly doxycycline in it, but it also had a couple of syringes of dexamethasone, and they arrived all loosely packaged in the box together. Now, especially if the person unpacking it had unpacked a box full of doxycycline every single one of the last 10, 12, 15 times they'd done it, it would be totally understandable why they might not spot the difference. Although logically it makes sense to check everything really carefully, it's not a lack of knowing; it's that the more we do something, the more we become conditioned to expect the situation to be the same, and it's happening at a subconscious level.
So we know that we can make mistakes, and we know that these sorts of things can happen, but what it does point out is that the more experienced we are at doing something, the less likely we are to notice if something's out of place, and this has really big implications for several reasons. A really important one here is that an inexperienced team member is actually really valuable, because they are much more likely to notice if something's out of place; their brain isn't in fully automatic mode, it's having to consciously process quite a lot. The other thing this relates to is the sort of things we can do to overcome these kinds of issues.
One of the things within the NHS that's become quite common is drug checks, which I'm sure some of you are very familiar with. But the interesting thing about drug checks is that it wasn't necessarily doing the checks that made the difference; it was how they were done.
So the difference between picking up a syringe and saying, this is 30 millilitres of dexamethasone, isn't it, versus, could you tell me what's in this syringe please, might sound pedantic and might sound like the same thing, but they potentially have entirely different consequences for our ability to pick up errors.
So open questions are a big factor here. Again, I'm just introducing some different topics. And, we're only human after all; this relates to something I'm sure we can all relate to.
If only I could remember what I came into the kitchen for. I've done it many times. We did count the swabs, didn't we?
And again, we know that we've got limitations with our working memory, but all too often, particularly when we are under pressure to perform in a professional role, the solution can be seen as simply saying, well, don't forget; or if we had an issue with our attention or we got distracted, the solution might seem to be to just say, well, pay attention. And we know deep down that these things don't work, because people don't forget because they mean to forget.
They forget because they're human. And this relates to a story from the 1930s. There was an aircraft called the B-17.
It was an American bomber, a very complicated aircraft, and there were a few accidents involving the test flights, because there was a safety mechanism built into the aircraft that needed to be removed before the aircraft was flown; it was designed to keep the aircraft safe on the ground. But the pilots were working in unusual circumstances: they were having to do demonstration flights and press interviews.
They were outside their normal flow of events. And so they were forgetting to take this safety feature out. And as a result, they were getting airborne without full control of the aeroplane, and they were having these accidents.
Now, it would have been very easy for Boeing, the manufacturer, to just say, well, we need better pilots, we just need to tell them not to forget. But fortunately they took the accountability to realise that they had some responsibility here. They had designed this aircraft, and it was partly up to them to help come up with a solution.
And as far as we know, this is the first checklist that existed within a professional setting. It was a recognition on their part to say, actually, maybe we should provide something that makes it easy to get it right and hard to get it wrong. And I like that expression, providing something that makes it easy to get it right and hard to get it wrong, because these people were really good at what they did. These pilots were really good, but they were getting it wrong, and the manufacturer took that accountability to make it easy to get it right and hard to get it wrong.
And it wasn't telling them how to do their job; it was just recognising that before a really critical thing happens, such as taking off, landing or starting the engines, let's just check we've done the really important bits, the sorts of things that could really ruin our day. They figured out a way of being able to draw a line under one part of a procedure before moving on to the next. And that's another really powerful and important aspect of using a checklist: it draws that line and helps us move on in a systematic way.
And some of you may be aware that the World Health Organization carried out research back in 2008. Atul Gawande was someone who was heavily behind this; he's written a book called The Checklist Manifesto and was very involved in the research around checklists.
So they basically used the same principles. It wasn't cut and paste, but it was the same principle of saying, well, before you continue on to the next part, let's just make sure that you've done the really, really important bits. And I'm sure some of you have seen, and many of you are possibly using, versions that are applicable within the veterinary sector.
In fact, we have got a very brief poll, Andy, that we could use here, which might just give us an idea, out of interest, because there's always a variety of responses. You want me to launch the poll now? Yes, please launch the poll. Thanks, Andy.
OK, so you should be able to see it there. The question is: how would you describe your use of surgical checklists in your practice? No surgical checklists; aware of a checklist but never or rarely used; checklists used on the days when we perceive there to be additional risk; we regularly use surgical checklists; or we use surgical checklists for every procedure. OK, there are still answers coming in.
And this is really just to get an idea; it's always interesting because there are different needs within different practices, and there is no one size fits all. OK, I'll end the poll there.
Looks like we're just about done. Thank you. So the most popular is we regularly use surgical checklists, at 46%.
0% for every procedure, 23% used when there's perceived to be additional risk, 8% aware of a checklist but rarely used, and 23% no surgical checklist. Yeah, and it's interesting; again, every practice has different needs, and it's just interesting to see where everyone is on this. What we found with these, and what the World Health Organization also found, was that it wasn't necessarily having a checklist that mattered.
There were a few key points that determined whether or not they really made a difference, and it was how they were used. Were they involving everyone in the room? Was there really a time when everyone could engage with the checklist, so that everyone could be involved? Were they read in their entirety every single time? That was the point.
Sometimes I've seen the seatbelt analogy used: if we only use the seatbelt on the day we plan to have a crash, it is unlikely to do its job on the day we need it. That was the sort of analogy used with the checklist. And again, checklists do need to be adapted for each practice, but it's not necessarily having them; it's how they're used and how often they're used that will indicate whether they are likely to make a big difference to patient safety.
So again, a huge point within the patient safety and human factors field. I'm just introducing and going over different areas really, as opposed to going into anything in loads of detail.
There's so much more we could say about checklists, but they're an example of a simple systematic change, something which is given to people to make it easier to get it right and harder to get it wrong. And this is what Atul Gawande said: they provided no new equipment, no new staff or clinical resources during their 18-month trial at eight hospitals around the world.
And they found that, on average across the eight hospitals, they had a 47% reduction in death rates in surgery and a 35% reduction in major surgical complications. Now, of course there were other variables, but the written feedback that came back was equally interesting in terms of people's perceptions and how they changed during the study.
So again, really interesting; The Checklist Manifesto, if anyone hasn't heard of it or read it, gives you a lot more information, and obviously there's a lot more we could say. Other simple systematic changes are communication structures: communication structures for critical moments such as handovers, ward rounds or patient rounds, or, for those of you in referral centres, ICU rounds, different forms of round process, and emergencies. And having a communication structure can be really, really powerful.
There's something you might have heard of called SBAR. SBAR stands for Situation, Background, Assessment and Request, and it was actually developed by the American Navy for teams on submarines, to help them communicate in critical moments. And the reason they found it so useful wasn't because the people involved didn't know what to say, but that in these critical moments it gave them a higher likelihood of covering the really essential points.
As well as that, it also gave them the ability to chunk information into pieces, which gives the receiver a much better chance of remembering the really important parts. It also gave them a standardised way of doing it, so that although there were lots of ways they could have communicated this critical information, it was communicated the same way every time, and that was actually a really big factor. There's a version that, again, you may have heard of, that's been used across different healthcare professions and is used in some veterinary organisations.
They've added the identity, just because, of course, when you're talking about patients, knowing who you're talking to and who you're talking about is really important. But really it's the same thing; it's just providing this structure that makes it easy to get it right and harder to get it wrong. And what we're finding is that when people are under pressure to do these things quickly, it's easy to cover the identity, and maybe the situation and a bit about the background, but quite often the assessment and the request were possibly missed out.
By having this standardised structure, it just means that in those pressurised moments you're more likely to remember the things you need to remember, not because you don't know what you're doing, but because you're human. It's a recognition of that, and it's coming up with something to resolve it. So that's just another example.
Another one that I'm really interested in is patient safety briefings. Briefings are something else that relate very much to the aviation sector: a briefing will occur between half an hour and an hour before takeoff and before landing.
It isn't the chance to go through every last detail of what's about to happen, where you're flying to and how long it's going to take, and all the details that you should already know. It's the chance to cover the really important things that you might have overlooked. And there isn't necessarily a structure that works for everyone here, but if nothing else, have a moment that you agree as a team, whenever you can find the opportunity to do this.
So, for example, before clinical procedures or before surgery: whoever's about to go and scrub in, that's a trigger for everyone who's involved to just spend two or three minutes. And if you want a very basic way of running it, you ask this question: what's different about today?
And literally give everyone the chance to just think about it for a second. What could be different about today? It might seem like the most routine thing, apart from that one thing we forgot to mention, which could make all the difference.
Or: what challenges could this procedure present, and how could we make it safer? And if no one can think of anything, then if nothing else, the most senior person in the room is able to say, well, that's great, thanks for considering it.
And if anyone sees anything out of place or has any concerns at any point, please speak up. Even if you said it last time and the time before, saying it explicitly every single time will give you a better chance of getting the information you need if there's a concern. And as we've seen from previous examples, it can make all the difference.
So again, another little structure that has been seen to make big differences in different sectors. Now I want to talk about systems thinking, and this idea that some of you may have seen, the Swiss cheese model: a model which indicates that when adverse events occur, it's because a number of different threats and hazards line up. They exist in lots of sectors, and veterinary is no different.
There are so many different things at any one moment that could present a risk to the outcome, but not all of them result in adverse events. It's only when loads of different aspects line up at once.
And if we think about those layers, those slices of cheese almost, what they represent is different barriers. They might be the training that you've got in place, they might be things like checklists, they might be the leadership, they might be the experience of your team, they might be the depth of your team. And what happens is we have moments of weakness, where maybe there is a weakness within a procedure or a protocol, or a weakness because we didn't have enough people one day.
We can split those into latent conditions and active failures. Latent conditions are things that are there all the time; they're there permanently.
So, for example, take staff levels, something that could lead to an adverse event if you didn't have enough people on a particular day, putting people under pressure and increasing the chance of error. If every Wednesday evening there was a shortage of staff, which led to a higher risk every single week, well, that's probably a latent condition, because it's something which is there all the time.
If on one particular day several people called in sick and it wasn't foreseeable, well, that's actually an active failure. Or if someone makes a mistake, and this is often the case, the very last slice of cheese, the last line of defence, is the people involved. Quite often, if they make a mistake, that's the thing that is seen to have caused the failure, but actually we're missing all the different latent conditions that were pre-existing. It's easy to identify the active failures and the individual errors. It's harder to identify the latent conditions, but if we can identify them, we're likely to achieve a better, more reliable result.
So we need to look beyond the individual whenever we can. And this is linked to the topic of blame culture, because blame culture exists in so many professions for a very understandable reason: it's seen to solve the problem. And it's not to be critical of that; it's to understand why this perception of blame is so common.
But what we're trying to do is come up with an alternative, to say that actually we recognise that, regardless of skill or experience, well-meaning team members can make mistakes. Beyond that, we also acknowledge that we can all develop unhelpful norms in the pursuit of achieving goals, which means that some people develop shortcuts. If we need to meet a goal, sometimes good professionals will do that, and it's important to be able to accept why they might be doing it.
But equally, we have to know that there's a line in the sand, whereby reckless and intentional harm must be treated accordingly. So it's balancing this accountability with learning. And what we're trying to do is create this concept of psychological safety.
We're going to talk about that a bit more in a minute, but what we're really talking about is a just culture, and the reason we haven't called it a no blame culture is because we do need to have that balance. There needs to be that line in the sand, but we still need to help people to feel safe. Psychological safety is something which has been looked at by lots of different people, particularly someone called Amy Edmondson from Harvard; she's written a really interesting book.
Adam Grant is a professor of management and psychology at Wharton in America, and he says this: psychological safety is not about relaxing standards or just feeling comfortable and being nice. It's about a culture of respect, trust and openness, where it's not risky to raise concerns and ideas, and I'm sure you can see the overlaps there with the examples I used earlier. Another question is: what's the link between Google and a sabre-toothed tiger?
It doesn't immediately appear obvious, other than that you could of course Google sabre-toothed tiger and get an image probably quite like that one. But as well as that, our brain's response to a threat is the same whether it's a real threat or a perceived threat.
The emotional part of our brain, our amygdala, our limbic system, is there to get our attention, and one of the main reasons it gets our attention is to survive in a primitive world that we no longer live in. Survival is very rarely a factor for us anymore.
But our brain, our amygdala specifically, doesn't know the difference between a perceived threat, the way someone speaks to us, or being blamed, for example, and an actual threat. Both of them shut our brain down in the same way, because they bypass our frontal cortex, the bit of our brain we need in order to think, and they just totally take control.
And a study at Google found that the key differentiator between higher and lower performing teams, across the thousands of employees and hundreds of teams they have across the world, was this concept of psychological safety. The teams that felt safe performed the best, because they were in a better place to perform cognitively. Because our brain, our amygdala specifically, doesn't know the difference between a sabre-toothed tiger and all these different things that it perceives as threats: unrealistic workload, lack of respect, unfair treatment.
And, particularly pertinent to adverse or near miss events, not being heard and appreciated. So it's really interesting what we're trying to achieve by developing this just culture. Following an adverse event, what we're saying is that there have to be things that aren't OK.
And we have to have ways of ascertaining whether or not any sort of behaviour sits in that category. But the reality is that's very, very rare. Sometimes it's a lack of knowledge or skill.
The research I mentioned earlier in the veterinary sector indicates that about 14% of the errors they researched were because of a lack of technical knowledge or skill. But that leaves the vast majority elsewhere, down here, and they're honest mistakes or systemic failures. We need to get this balance, and we need to be able to react accordingly.
So it's really just building on things that I'm absolutely sure you already discuss and think about, and deepening the understanding around them, because that's really what we're talking about. And what we really want to do is help to understand, and this is the expression I like, the why behind the what.
Quite often we know what happens. And it can be easy to want to know who was involved, which of course we need to know, but the really important question is why. Why did it make sense to them at the time?
And the more we can understand, the more likely we are to make systemic changes to help make it easy to get it right and hard to get it wrong. Just culture is what we're talking about, and to build on that a little bit, there are links between a just culture and reduced anxiety and psychological safety.
If you've got a reporting system, whether you use something like VetSafe or you have your own near miss reporting system, then you're going to get much more honest, open reports, and that's a key ingredient of a safety culture in general. And again, there's lots of information on that available if you'd like to read more about it. Also, when people feel safe, they're more likely to demonstrate adaptive behaviours, i.e. they're more likely to take necessary risks for the benefit of the patients if they feel safe to do so.
You're more likely to have a flatter hierarchy, so people will speak up and raise concerns, and it's much, much easier for second victims. If someone does make a mistake and feels that sense of guilt afterwards, which is so common within professions like this, and something good comes from it, and we can learn and move on positively, it's a really helpful process for that second victim. So again, just touching on another really important aspect within human factors.
Thinking about another one now, we're only human after all. Now this is very much about our physical and mental state and the impact on our performance. So this is something that some of you may have heard of.
It's an acronym which I've used before: HALT. It's been around for several decades and used across various care professions.
It stands for hungry, thirsty, angry, anxious, late, lonely and tired. Now, when I've brought that up, a few people have said, well, that just sounds like a normal day to me. And the reality is, of course, that these factors will be involved.
It's not a lack of knowing. Particularly people like yourselves, who have a greater depth of knowledge around physiology than most, will know more than anyone that this stuff matters and that, if we're compromised in some of these areas, it can impact our ability to perform and look after patients.
But interestingly, when the Medical Protection Society looked into doctors under investigation following complaints and events in healthcare, they found that these were common factors that led to the mistakes. So the people who know about it best are often less likely to actually pay attention to these things.
So it's not a lack of knowing; it's sometimes a lack of paying attention to these things, and so much of that is down to the culture that we function in. We can sometimes just assume that people will get this stuff right and will figure out a way of looking after themselves, and sometimes it's really not that simple. So it's really about creating a common language around this.
I really like this expression: secure your own oxygen mask first. You may well have heard that in a safety demonstration on an aircraft, but there's a very good reason for it. If your oxygen fails on the aircraft, you've only got a matter of 12 to 15 seconds at 38,000 feet to get your mask on; you'll still be conscious, but your useful consciousness will have expired.
And so you're no good then to help yourself, and you're no good to help the people next to you. If you get your own mask on, you're then in a place to help others, and that's the message. Someone called Mike Farquhar is a consultant in sleep medicine at Guy's and St Thomas'.
He said this: unless critically ill patients require your immediate attention, patients are always better served by clinicians who have had appropriate periods of rest during their shift. As an expert in sleep, rest and wellbeing, he was really concerned that people in the NHS trusts he worked in were not looking after themselves. They were doing it because they really cared about their patients, but he was seeing that it was having a detrimental impact on them.
So he was the first to start the HALT campaign: a campaign to raise awareness, develop a common language and change the culture around these seemingly obvious areas of hunger, thirst, tiredness, and our emotional aspects as well. And it was by shifting the mentality around these areas that people were able to find time to have a little bit more rest and be in a better state to look after their patients.
Not by having anything changed around their workload, but just through a change of culture and the language they used. He also talked a little bit about the superhero mentality.
And it's not meant in a negative way; he was using it in a very credible, complimentary way, saying that people who give care, and this applies to veterinary professionals just as much as it does to healthcare professionals, are statistically less likely to look after themselves. And that's just because of the way our brains can work.
If we see ourselves as the givers of care, we can be less likely to look after ourselves. It's incredibly courageous and honourable, but it's actually not doing the patients any favours, so the more we can do to be open about this and to create a culture where we do prioritise it, the more likely we are to deliver those consistent levels of care that we want to. So we at VetLed actually ran a HALT campaign ourselves, which is open to the entire profession.
You can download it from the website free of charge, so it's not too much of a shameless plug, honestly, it's totally free, and we've actually just updated it this year. It's a way of trying to achieve the same thing that Dr Farquhar achieved, and we did work in collaboration with him and Guy's and St Thomas' to develop something that can hopefully create that language and that focus. And certainly there has been some positive feedback from some of the practices that have adopted this campaign.
So the question is, why is self-care so easy to ignore? We might know the importance of addressing our own wellbeing, and as I said, yourselves included, healthcare professionals will know this more than anyone. But it's not necessarily that easy.
Our brain's evolution hasn't caught up with our working environment, and what I mean by that is this: in the last 10,000 years the world has changed immeasurably, but in evolutionary terms, 10,000 years is just the blink of an eye.
So what's happened is that our brain and our mind have barely changed at all, but the world has changed enormously. We evolved to live in a world of hunting and gathering, and you now function in this world. So it's not just a different world: we've still got the same biological bit of kit, but we're asking it to do an entirely different job.
And we can do it; we just need to be acutely and consciously aware of the differences between the two worlds. One of those differences, one of many, is that in that world we were hardwired to seek immediate returns. An immediate return means we need food because we're getting hungry now or in the next few hours, we need water because we need to drink at some point in the next few hours, and if we need to fight a predator for survival, we need to do it now.
What we weren't particularly hardwired to do was think: well, I've got a 12-hour shift, I've got a procedure and consult list as long as my arm, and if I don't make sure I stop at that time, I won't have had a break for 10 hours by the time I do that procedure, and I'm actually not going to be in a very good state to offer the level of care that I need to.
It's not to be critical of that; it's to acknowledge that that is our tendency, based upon the evolution of the human race. So it's just about being more aware of that. And that kind of neatly leads us into what human factors really is.
Human factors is scientific knowledge about the human body, about our physiology, deepening our knowledge of what it is we need at a very fundamental level; about our minds, and that's things like the cognitive biases I mentioned that led to the assumptions in the aircraft event in Tenerife, and how we match patterns as experts and how that can lead to incorrect decisions; and about our behaviour. Assertive behaviour, of course, is a skill, something that we can learn, and something that we now understand we need to learn because of events such as the one in Tenerife and the one involving Elaine Bromiley.
And the use of open questions in a drug check is a behaviour that we're learning about to mitigate the impact, as we learn more about our limitations. So it's understanding what we are good at and what we are not so good at, at a really fundamental level, so that we can create the best possible fit between the people, that bit of kit that you've got, your body and your mind, and the environment that we're working in, bearing in mind that it isn't necessarily the environment that you evolved to function in.
And the more we can understand that, the better you can apply your technical, your clinical skills to achieve the outcomes you want to achieve. Martin Bromiley is the husband of Elaine Bromiley, who I mentioned earlier. He's also from the aviation sector and was very interested in human factors himself when he very tragically went through that experience and was then left with his two young kids, having lost his wife.
I think the reason it's such a remarkable story is that he was really, really keen to make sure that something good came from it. He started something called the Clinical Human Factors Group, and it's largely because of that group and the work he's done that a lot of change has occurred in this field within healthcare over the last 15 years; that's actually what he was awarded his OBE for. And this is something he said very soon after his wife passed away in 2005: we know that 75 to 80% of accidents and incidents in aviation are caused by human factors.
In healthcare, and this was back in 2005, he asked, what's that statistic? No one really knows. Now, actually, the science since then has shown that it is about the same; not surprisingly, the same human beings are involved, and therefore it's a very similar number. But at the time he didn't know that number, and he said, well, even if it's only 45%, that's still an awful lot of lives we could be saving. And this was his approach to it.
He was saying, yes, of course their clinical skills are fundamental to what they do. But if we're saying that the vast majority of adverse events are occurring for reasons that have got nothing to do with those skills, what can we do to improve outcomes beyond that?
And my question is, what does this mean for us, and what does this mean for this profession? I think that's a really important question as our knowledge in this field continues to evolve. But my last question is this: is simply knowing always enough?
We've talked a bit about some of the content of human factors, around what we know about the mind and the body, our assumptions and the skills relating to that. But is knowing enough? A good example of this is drink driving.
In 1966, we were given the knowledge that, not surprisingly, it's not a brilliant idea to be under the influence of alcohol and in charge of a vehicle. As a result of that awareness, a law was made in 1966 that meant that if you were caught driving a car under the influence of alcohol, you would get punished in much the same way as now; obviously the laws have changed slightly, but fundamentally the knowledge in 1966 was the same as it is now. We're telling you why it's not a good idea, and we're telling you what the punishment will be if you break that rule. Thirteen years later, they did some research, and there were over 31,000 deaths and injuries due to drink driving, 13 years after the law was passed.
So it was still happening a lot, even though people knew they shouldn't and there was actually a rule that told them what would happen if they did. In 2014, 35 years later, they repeated that research, and that figure had dropped from just over 31,000 to just over 8,000, so to just over a quarter.
Now despite there being a lot more cars on the road, that figure had dropped. Not because people knew anything different. People still knew the same stuff, they still knew that it was not a good idea, and the laws were still largely the same.
But the question is, why did the behaviour change? And this is why the link with culture is so important. We can know things and we can understand things, but time and time again, research is showing us that the biggest influence on our behaviour in most settings, whether it's at a national level as road users, or within a practice, or within a profession, is that culture, in inverted commas, the way things are done around here. It influences what we do more than anything else.
Being acutely aware of the impact of our behaviour on the subsequent culture, all the time, is so important, and so is empowering everyone to value the impact that their behaviour has on other people's behaviour, and the impact that has on other people's behaviour in turn, and the fact that, cumulatively, that is the culture within the profession. Of course, there is more to culture, and understanding culture, than that, but at a very fundamental level, that's why it's so important. So if I was to summarise, I would say that performance is what I'm really interested in at a very fundamental level.
I like to understand how people can achieve the outcomes they want and perform at their best. And what that's determined by is what we do in a given moment, either collectively or individually; what we do, how we respond to a given set of circumstances, will determine how we perform, our outcome.
And that's determined by what we know. That could be your clinical skills and knowledge, but it could also be your non-clinical skills and knowledge, the non-technical skills that we've discussed, some of which we've touched upon just now, and of course there are a whole load more beyond the ones we mentioned today. It's also based around how we feel and function: our health and wellbeing very strongly influence what we do, our behaviour, and therefore how we perform.
And it's based on what we're provided with: the stuff that makes it easy to get it right and hard to get it wrong, systems and processes. That's why we need to take that systems approach, but also bear in mind that it's all these things that collectively impact what we do and therefore how we perform.
But on top of all of that, the way things are done around here, the culture that surrounds you in your practice, within your team and within the profession, affects what we do more than anything else in those different moments. So there's this link between culture and behaviour, really bearing in mind that culture is the sum total of what everybody does; this idea that we are what we repeatedly do.
And really, really believing that, particularly with more junior members of the team, because I think sometimes there's a perception that only leaders can influence culture. Of course, senior figures within veterinary teams sometimes have more of an influence on the way things are done, but actually everyone has to deeply believe that everything they do in every moment contributes towards the culture, because it will influence what other people do, and what they in turn do. And that is what we need to believe.
So again, quick summary here, human factors is a huge topic. We had a chance today to talk a bit about some of those things. We mentioned wellbeing and health, and I made the links to patient safety.
But we didn't have a chance to talk about a lot of other things that exist within this field. We didn't talk a huge amount about communication other than open questions. We mentioned assertiveness.
We didn't talk much about hierarchies, although we touched on them. We did mention checklists. We didn't mention personalities, or much around cognition, attitudes, conflict management or safety reporting.
So the point here is that we've touched upon some of the issues within the topic. I hope I've given you an overview of what it is and why it's so important, with a few takeaway points, but essentially it's a big topic with a lot that can be discussed, and for different people, different elements within it may well resonate differently.
So, as a final summary, what are we talking about? We're talking about looking beyond clinical skills, not devaluing them in any way, in the same way that I would never devalue my flying skills. I need those skills to be able to fly an aeroplane, and because I don't have the clinical skills that you all have, I couldn't do the jobs you do.
But we're looking beyond them, developing non-technical skills with an awareness that they have as big an impact on safety as the clinical skills, if not, at times, a bigger one. We're raising awareness of our limitations to encourage curiosity over criticism. Why did something make sense to someone at the time?
Why did the decisions and behaviours of that captain at Tenerife make sense to him at the time? Because they did. He was very skilled, he was very good at what he did, he was very experienced, and there were factors that made something seemingly obvious not obvious to him in that moment.
We're looking beyond individuals, taking a systems-thinking approach, providing tools and systems that make it easy to get it right and hard to get it wrong, such a powerful phrase that I really, really believe in. And there's the critical link between performance, behaviour and workplace culture, this idea that behaviour breeds behaviour. Leaders of teams, practise managers and clinical leaders can absolutely start that wave, but it's up to everyone else to keep it going, and it's a continuous cycle. Sometimes it takes a generation to change culture in certain professions, but it can happen quicker than that if there's a real belief and a real buy-in at all levels that everything we do, all the time, contributes to the culture.
And certainly, in some settings, the culture towards self-care has changed significantly in 4 years, so it can happen much, much quicker. So, thank you very much for your time.
I'm really interested to know what has particularly resonated, and whether you've got any thoughts or questions. I think we've got some time for that now.
But otherwise, thank you very much. Brilliant. Thank you very much, Dan.
A real tour de force there. We have some questions that have come in. This one came in quite early on.
The question is: does this explain why, every time I go into the pharmacy, I always notice things are incorrect, as you cannot physically put anything else on the shelf? So it says, I always notice things are incorrect, and just to make sure I've read it the right way,
it's because you cannot physically put anything else on the shelf. I'm guessing they're saying that other people are putting things back in the wrong place and all the shelves are full, or something along those lines. Yeah, I think I can imagine what's behind that, but please do elaborate if there's anything to add, because I think this came in quite early on. Essentially, though, we do go into that sort of autopilot mode, and when everyone is very busy and distracted it's very easy not to have conscious awareness of these kinds of limitations.
I'd be interested for them to elaborate on that a bit more, because I don't want to answer it in the wrong way. OK, we'll give them an opportunity to come back. Yeah, I just want to make sure I'm answering it in the right way.
So I'll just go on to the next one. If, having talked about it a bit more, anyone wants to elaborate, I'm really happy to come back to it, and I'm also happy to talk about it afterwards if anyone wants to send me an email. So: we have checklists.
I was off work for 2 months and went back, and the one at reception I picked up had not been used for 3 months. Consequently, pharmacy stock was in a muddle and automatic ordering wasn't working.
OK, so obviously checklists can exist in other areas of the practise too. And I think this is one of the biggest challenges: you can have something that exists, but it's about understanding why it can be a struggle for people to adopt new processes. You hear the expression, we've always done it that way, because it can be very hard to change habits.
That's the biggest challenge: really helping to build understanding, so that everyone believes in the benefits of using these processes and has a genuine buy-in, because simply having a system in place doesn't guarantee it will be used, of course. Do you have any top tips for that? Because that was one of the questions I had.
I was speaking to a friend recently, and particularly in these current times checklists are becoming more prevalent, you know, when you're treating clients and so on. And he was saying that one of his receptionists just consistently does not fill out the checklist properly.
Half of it is not filled in, and he's talked it through with her, he's coached her through it, and she's still just not doing it. Have you got any top tips for that?
Yeah, honestly, I think it comes down to real understanding. The one thing they did as part of the World Health Organisation research, that big project that ran over 18 months, was really, really spend time educating and getting a genuine buy-in from the people involved, because initially, understandably, there was resistance. It can feel like a criticism; it can feel like a suggestion that we're not good enough at what we do.
So it's really about trying to be empathetic towards that. Why might people be finding this difficult? Understanding why they might be finding it difficult, empathising with that, and really helping to explain why.
There's also this idea that we're only going to know if it works if we do it. And then once we do it, if we need to review it, of course we will, but we're only going to know if it works if we do it. And really spending time on it, so it might mean actually having an hour with a team to really go through the details of why
checklists are so important and how they can be used, and helping to have that discussion, because simply saying they should be used isn't enough. The NHS found this: once the World Health Organisation research came out, an email was sent to lots of trusts simply saying, here's a checklist, everyone should use it. And it didn't impact patient safety one bit, because it isn't as simple as that. So I think it's just time, real understanding and patience, because if it's done purely because people are being told they have to, there won't be a real buy-in.
So: time, patience, helping with understanding, and empathising with why people might be struggling with the idea that they should use a checklist, because it could be perceived as a criticism. It's not a simple answer, but honestly, time and empathy. Great, I think that leads on to the next one, which is an observation. Sorry, would you rather read these to me?
I wasn't sure. No, you go ahead, as long as you read them out, because not everybody can see them. No, that's fine.
So the question is an observation: those in the medical profession seem to think that when you try to come up with systems and checklists, you are criticising their dedication to their job, as opposed to trying to help them. Why is this?
Pilots seem happy with checklists. Yeah, and you're absolutely right, it does touch on what I was just talking about.
Honestly, the reason pilots are so happy with checklists is because they've been around since the 1930s. In the 1930s it was relatively easy to get buy-in, because nobody's behaviour was that well ingrained at the time; no one had, on the whole, been flying for very long, and everyone was pretty open to new ideas.
And so checklists became part of the norm very early on. But when you introduce them to professions that have been doing a very good job and achieving great outcomes for a long time without them, it's about understanding why that person might be finding it difficult, and really going in with that empathy, thinking, OK, how can I speak to this person in a way that doesn't just say, well, you shouldn't feel criticised, but instead tries to understand why maybe they do? And just having that patience, helping them to understand, and supporting the process.
So it's about supporting rather than, obviously, punishing. And one of the really important things here is that I think checklists sometimes get used as audit tools. The one problem I sometimes have with them having physical tick boxes and being signed, dated and stored is that people can feel like they are being watched and checked up on, when of course the checklist is there as a patient safety tool.
In aviation, they're not used like that. They're referred to in the moment: they're a laminated bit of paper, or on modern aircraft they're electronic. They're used in the moment and then they're put away.
They're not marked, they're not stored. The only motivation pilots have to use checklists is that they believe it's going to lead to a better outcome. So one way of trialling it, if you're struggling, might be to take the tick boxes away, not ask people to sign, date and store it, and just say, look, use it because you want to use it, and help me understand what the barrier is.
So, yeah, really good question. If you do not stock the shelves regularly, you just notice when the stocking is wrong. This is not my routine job, so when I go in, it's glaring at me.
It's the follow-up to the first question, I think. Ah, right, OK, that's a really good point. So you noticing that, I guess, relates to the issue of
incorrectly identifying medications: you may not be aware that something's been put in a different order or in a different place, and our habitual processes, where we go and expect something to be where it was last time, mean that of course our habits can get the better of us.
And I think part of it is just an awareness of our tendency to match patterns and go into autopilot. But part of it is also about having processes that help to prevent that. Without investigating and understanding the situation a bit more, it's hard to know exactly how to resolve that specifically.
But sometimes just helping other people to be aware of these limitations is valuable in itself. There's one here that says: basically, I feel that what is important is not what we do. Again, is it possible to elaborate a little bit?
I think it then follows up with: but why we do it. OK, right, OK, thank you. Basically, I feel that what's important is not what we do, but why we do it.
So, yeah, I think the why is so important. It's about understanding, as human beings, why we behave the way we do, why we react the way we do to situations, and having the curiosity to ask that question. The more we can understand why we behave in certain ways, why we make assumptions, why some people struggle to raise concerns... why is always the question, and once we understand why, we're in a better place to resolve the issue.
But yeah, I've got a comment and a question, if you don't mind. Obviously your story about Martin and Elaine Bromiley is fantastically laid out in Matthew Syed's brilliant book, Black Box Thinking, which I'm sure you've read, and which is a strong recommendation for everybody listening if you want to know a little bit more about this sort of thing. And then the other thing that struck me, as you talked about this just culture, and actually as I was reading your bio:
you talked about the associated growth mindset. I presume you're talking about Carol Dweck's work on mindset. I just wondered if you want to briefly, maybe in 30 seconds, elaborate on that.
Yeah, it is. I mean, again, it's a big topic in itself. Growth mindset is really that openness to see things that maybe don't go as we expected as an opportunity,
not as a threat, while understanding that our brain may well perceive them as a threat, and being able to take conscious action to channel those moments and see the opportunities that arise when things don't go as we expect. And having a just culture is such an important element there, so that as an organisation you have a growth mindset that sees these opportunities to grow and improve, and is grateful for them.
Because that is how almost all professional groups have improved: through understanding when things don't go well. Now, there is a whole other topic here that I haven't touched on, simply because there wasn't time.
And that's comparing something called Safety-I to Safety-II. A lot of the more recent research within human factors is saying that we can't only learn by investigating when things go wrong, because the majority of the time things go right. So we need to understand why things go right as much as we understand why things go wrong.
So it's not all about understanding failure, but as a starting point we have to embrace those opportunities. That's not an entire description of growth mindset, but I would say that's the link to Carol Dweck's work on it. Brilliant.
Thank you. Great. Thank you for making those links.
You're absolutely right. Black Box Thinking is a brilliant summary: if you want one book to read to summarise most of this, that is definitely the one to go for. Brilliant.
OK, well, thank you very much again, Dan, that was a fantastic talk. Ooh, we've got another question possibly coming in. Yeah, we've got a few more. Oh, no, actually it's not popped up on my list.
Can you see it? It says we had 6 before; it says 7 now, but I can't see a seventh one.
Anyway, never mind. I'm probably imagining it. So again, thank you very much, and thank you also to our sponsors, MWI Animal Health and VetLed.
The recording will be uploaded in a day or two if you want to listen again, and I look forward to seeing you again next month on the practise management webinar. Thanks again, Dan. Thank you very much.
Thanks everyone for coming. Cheers.
