Wilderness First Responder - Lectures & Pre-Course Learning
The pre-course learning can be found in 'catch up content'
Warning!
The following transcript was generated automatically from the content and has not been checked or corrected manually.
Welcome to your Endeavor Medical Course. I'm Alex, one of the emergency medicine registrars based in the southern part of the UK, and I'm going to give you a presentation today on human factors. I'm going to start with a well-known case that's often referenced in human factors talks, and that is the case of US Airways flight 1549. This flight was captained by Chesley Sullenberger, casually known as Sully. After the plane was hit by a flock of birds, causing both engines to fail, he was able to land it in the Hudson River. Subsequently, and unfortunately for Sully, there was a lot of scrutiny about this. Although he managed to rescue all of the people on board the plane, and there were no deaths, there were suggestions that maybe he should have landed the plane at a nearby airport. When this was looked at, it took multiple simulations, possibly even as many as 17, for pilots to land the plane at a nearby airport, so that was not in fact the best option. Sully was able to highlight that, when you took human factors into account, what he had done at the time was the best option and allowed all the people on the plane to survive. So what are human factors? Well, these are the environmental, organizational and job factors, and the human and individual characteristics, which influence behavior at work in a way that can affect health and safety. Now, I would argue that these are applicable in many of our life circumstances and not just at work, and I think we can think about them in a recreational setting as well. I'll go on to show you how, from a healthcare perspective and from a first responder perspective, these have a direct impact on patient safety. We know that two thirds of all clinical errors involve human error, and approximately 50% of clinical errors are preventable. Now, these can be latent.
So they may be built into a system and have no actual connection with the individuals involved in the incident; they exist just because of how the system is set up, and I'll go on to show you some examples of that. A case that is particularly poignant in medicine and the healthcare industry is the Elaine Bromiley case. It holds real power in the anesthetic world, but it's transferable into other specialties as well. Now, Elaine Bromiley was a lady who had gone in for a routine operation; there weren't intended to be any complications or any issues with the operation. The anesthetists were looking to put her off to sleep, and the intent was that she'd have a very straightforward procedure. Now, her husband, Martin Bromiley, works in the airline industry, which has a real focus on human factors and what can be done to mitigate risk. When Elaine was given the drugs to put her off to sleep, the anesthetists found that they struggled to pass the tube that's required to hook her up to the ventilator, to keep her breathing during her anesthetic, and they became task focused. There were multiple expert anesthetists in the room who kept looking to pass the tube. Despite this, we know from our own guidance that if you're struggling to pass a tube, what you should do ultimately is aim to ventilate with a mask, and then, if that fails, try to get an airway through the front of the neck. This is what should ideally have happened in Elaine's case. But it didn't: despite the fact that a nursing member of staff highlighted the problem and brought the kit into the room, the team at the top of Elaine's head were so task focused that they didn't take a step back and think, from a human factors perspective, about how they could address the problem that they had. And ultimately, unfortunately, as a result of this, Elaine died.
But because of this case, and because of Martin Bromiley's experience in the airline industry, he has been able to highlight this to the healthcare industry and allow us to reassess, with new focus, human factors and how we can mitigate them to protect patients. So why do errors happen? What human factors are feeding in that put people within our teams at risk? One of the ways we can think about this is the Dirty Dozen. The Dirty Dozen are 12 human factors that can increase the risk to our team members, and they range from poor communication, to complacency, to a lack of knowledge, to distraction, which is probably quite pertinent in some wilderness environments. People may be stressed. There's often a lack of resources if you're out in the wilderness. They may be under time pressure, or pressure of other sorts from management teams. There may be a lack of teamwork, particularly if your team is quite junior, or there may be loss of awareness: people may be task focused, or looking solely at the goal ahead. At times, we may become accepting of norms, so things that we previously wouldn't have tolerated have now become the norm. We may be tired, or we may have a lack of assertiveness, a lack of willingness to raise issues that we've encountered with our management team. Another way to think about human factors is with each of the letters of the word FACTORS itself. I've put up a picture of an avalanche here because this is completely relevant to something called avalanche heuristics, and these seven factors hold relevance in an avalanche environment, which is obviously not a workplace, but it is a wilderness environment. So, familiarity: we've always skied that line, why would we not ski that line? We know it so well. Acceptance: the group we're with, we don't want them not to invite us next time; we want them to approve of us and to know that we're a part of the team. Consistency:
we committed to this line, we know this slope, so we stick with the plan rather than reassessing and skiing a different way. Technology: the weather reports all said it was going to be fine; it doesn't look fine, but technology says it is. Omnipotence: this is sometimes referred to in avalanche heuristics as the expert halo. We have a leader, we have a guide, he's an expert, he knows what he's doing. Rarity: it's just snowed and we want to make the most of that opportunity, because this is the right opportunity; maybe it doesn't look like it should, but if we don't go now we'll miss the chance. And then social facilitation: it looks like somebody's just gone down that line, gone down that slope; they were fine, we'll be fine too. These are all the aspects and thoughts that might feed into the processing that can then lead to issues if there is an incident, and many of these have been identified as contributing factors in avalanche fatalities. So, to take this forward into an example, I've picked a local one. A scout group is climbing Scafell Pike, a route they have done before, and they are accompanied by a leader who knows the route well and has lots of outdoor experience. They've chosen today to climb, as the weather forecast suggests that the wind should die down at 11 a.m. and they will get sunshine. It is the only day in the trip on which they may be able to summit. It is now 12 p.m. and the winds have increased. A young member of the group, who aspires to be a leader in future, has fallen and twisted his ankle. He continues to walk on it, even though it is uncomfortable, as he does not want to let the team down. So, if we think about this from a FACTORS approach, can you see how there are aspects of this that fit in with some of those seven pitfalls we mentioned? Firstly, we have a senior leader who knows the route well; he's already familiar with it, and he's carrying his expert halo, and the team with him.
Secondly, we have technology that suggested to us that the weather should be fine, and although it doesn't look like it's fine, and the winds haven't died down, technology says it will. We have a member of the group who's looking to be accepted: he'd like to continue to work with the scout group, he aspires to be a leader in future, and he doesn't want to let the team down. So these are some of the aspects that we can look at when we think about human factors. I've missed one: scarcity. It's their only chance to make it to the top of the mountain today; they don't think they could do it another day. It just shows you how easily this can creep into some of our encounters in wilderness environments. To take this forward, what is the role of us as individuals? How do we play into this? I've tried to break this down in a simple way so that you can think about your own role in each of these cases. So when we think about decisions, I heard somewhere that an emergency medicine doctor does 800 tasks per shift, and that can be anything from prescribing something, to signing something, to taking a form out and filling it in. I'm not talking about our big decisions; I'm talking about the smaller decisions. But then you're also making background decisions: do I need to eat something? Am I hungry? How will I get home later on? Am I going to cycle? Is it too wet? And so, in total, they think we make around about 35,000 decisions per day. And I don't think doctors are alone in that; I think we as humans deal with a lot. And so our brains have developed a way to reduce the complexity and the burden: they take shortcuts where they can, and they try to make fast decisions based on patterns. So, in many ways, this is us. If I show you this paragraph, try and read it.
You'll find that although all of the letters in the middle of these words are completely rearranged, you can read this, because our brains have developed shortcuts that allow us to continue to function with that level of decision making. But these shortcuts aren't always to our benefit, and from a human factors perspective, sometimes things need a bit more thought. I'll give you another example. Hopefully you found that an interesting test of your attention; I know I certainly did the first time that I did it. I'm going to move on to highlight bias, which I think has real prominence when we look at human factors from an individual perspective, and I'm going to start with an example. Here's an elephant, and here are some elephant jokes, which were a trend not that long ago. Why do ducks have flat feet? To stamp out burning fires. And why do elephants have flat feet? Do you see the pattern? To stamp out burning ducks. If you were able to predict the answer, then you may be a victim of the clustering illusion: our brains look for patterns in random events. There's no pattern to the joke, but your brain may have tried to predict the answer. Now, I'm going to give you three slides that have further examples of biases. I'm not going to go into them in detail; they're just for your own information if you'd like to revisit them, because I find this quite interesting, so feel free to take a picture and I'll pause on each of them. A further example: you're out walking and a member of your group falls and twists their ankle. But you have a friend who recently fell and twisted their ankle, and it was pretty swollen, right? But it was just a sprain. So you may be at risk of something called recency: your most recent information, your most recent exposure, was to somebody with a sprained ankle, and it looked swollen but it wasn't broken. But that doesn't mean that this one isn't. And if you fell into this pitfall, then that would be recency. A further example.
So there's something called confirmation bias: our brains look to analyze the facts and the evidence that are in keeping with the beliefs we already hold, and that may mean we ignore lots of evidence that's relevant to our cases just because it doesn't fit with our beliefs. We call this confirmation bias; it's reinforcing our own preconceptions. So these are all examples of bias and how it might influence things from a human factors perspective. But there's also something called the Dunning-Kruger effect, which examines how our level of confidence varies with our level of experience. Often, when we initially embark on learning something new, we actually have quite high confidence levels: we're unconsciously incompetent. Then, as we start to realize that we don't know very much about what we're learning, we become consciously incompetent, and we reach the point that is kindly called the valley of despair on this graph. Then, as we start to learn more about our topic, we become more conscious of what we're learning, and we drift towards the plateau of being consciously competent, which will eventually become unconsciously competent. Now, this graph cleverly highlights that we can forget things that we've learned, which has definitely happened to all of us. But it's just a good example of how our confidence varies, and may not always match our experience or our knowledge. Some of you may have come across HALT in human factors before, and I've definitely been victim to this in the past. If you are Hungry, if you are Angry, if you're Lonely, if you're Tired, then these needs must be met for you to make decisions in a way that is safe and reduces risk. If you're any of these things, then you're already subject to human factors, and it may impact your decision making. So what about things from an organizational perspective? Now, some of you will be familiar with this: the Swiss Cheese model.
So the Swiss Cheese model is the idea that a series of human factors or organizational aspects leave holes in the system that let risk through, and if those holes line up, then sometimes you might get an outcome that you didn't expect, like this. We can break this down further: in this example, the layers may be your culture, your leadership and your organization; your technical support; your training; or your clinical support, if you work in healthcare. So what would this look like as an example? In a healthcare setting, it might look like this: we have staff shortages, so we're often doing more than we would intend to because we're trying to make up for the staff that are missing. We may have inexperienced team members, because we're looking to train new people to try and help us. Because of all that additional pressure, we may have failed to monitor some vital signs, the observations and patients' heart rates and so on, as regularly as we'd intended to. And then, because we're short of staff and we're busy, we might have struggled to communicate our findings to our team. All of those lined up increase the risk to the people that we're looking after. But what about the environment? How does that interact with us as individuals and with our organization? What human factors exist within the environment that impact our decision making process and may alter risk? What human factors may exist in the environment or the workplace where you're working? Now, these are quite diverse: some may be relevant and some may not, and some will be relevant in some circumstances and not others. It may be to do with tasks; there may be workloads or work patterns that make things difficult. Often in an outdoor environment, we have to be up quite early to do things and get things done.
Your work environment and workplace design may not incorporate factors that optimize risk reduction; an example in my workplace would be that the scanner we often need in emergency situations is a bit further from our A&E than is ideal. Workplace culture and communication may not be open; they may not be inviting enough to allow you to raise issues, and leadership may not be supportive of that. Or you may not have the right resources; policies, programs and procedures may not be available, or may not be appropriate for the circumstances you're encountering. Worker competency and skill are very relevant, and employee attitude, personality and risk tolerance can also have an impact. When we think back to our recent pandemic, there were lots of human factors at play, and we all struggled to adapt to how to survive in that new environment: how to adapt culturally, how to keep ourselves safe, and how to make decisions in the context of what we needed to do, what we still had to do, and what was safest for us and our families. We did our own risk assessments, and this will have relevance going forward as well. This is an important part of human factors mitigation, and of how we can analyze and make plans that are going to reduce risk to members of our team and to our patients. An additional thing to think about is systemic migration of boundaries, which I would put under environment and culture as well. How many of us are guilty of going just above the speed limit? We think there's a 10% rule, so 75 is OK, right? And it's this systemic migration of boundaries, this sort of normalization of what isn't the norm, that can allow creep into some organizations. Frequent violations can become more severe over time, and the whole culture of an organization may slowly migrate towards an accident or an adverse event. Another example of an environmental or cultural factor that feeds into human factors is civility, and this is gaining prominence and real emphasis in medicine and healthcare.
So we know that when somebody is rude, 80% of recipients lose time worrying about the rudeness, 38% reduce the quality of their work, 48% reduce their time at work, and 25% take it out on service users. Even if you only witness incivility or rudeness, you will have a 20% decrease in your performance and a 50% decrease in your willingness to help others. And for service users, those accessing our healthcare systems, 75% have less enthusiasm for the organization. So it affects everybody, and we know that civility, and incivility, is a real key human factor in how we look after our patients. I think it has relevance to all environments in which we practice healthcare, and that goes for a wilderness environment as well. So how can we approach human factors in a way that allows us to mitigate some of these issues? Firstly, we can collect data and risk assess. If you're going on any trip, there should be a risk assessment of the things that you may encounter. I know for me that has ranged all the way to having coconuts removed from trees on an expedition I was going on, so that they didn't fall on our tents or on participants, which, never having been to the island, I wouldn't even have thought of. But this is part of your risk assessment. We also collect data because, if we do, we can look to figure out where the holes are and where we can reduce risk. Sometimes in healthcare we do something like a root cause analysis process, but this could be applied to any environment: any form of data collection about what happened in an incident, or just collecting generally, will allow us to identify where systems are falling down and where we can mitigate. So there are five domains listed that are areas where we could all look to mitigate risk, and I've put up a picture of a checklist deliberately, because we often use checklists as reminders to make sure that we haven't missed anything.
And I'm a real advocate for using them in each of these five areas. We often do this just before we do high risk procedures in healthcare, but I think that doing it in a wilderness setting or an environmental setting could be really important and really helpful as well. Another thing we can do is build systems to reduce risk and to prevent errors, and this is actually one of my favorite pictures to demonstrate that. I'm hoping that you can see that the word hidden behind my head currently says 'design'. We can design something, but if that isn't the experience that the user wants, or the way that people will actually use it, then it's wasted effort. And so, if people are using things differently, i.e. the rules in the system are wrong, then we should look to change them, because that hole leaves a gap for error from a human factors perspective. This further illustrates that: we have been guilty in healthcare settings of often having drugs that are different, but in very similar packaging. If we can identify that and address it early, then it means that the patient doesn't get the wrong medication. Another key aspect of reducing risk from human factors is to create an open environment that allows people to raise problems, because getting angry with people for making mistakes doesn't teach them not to make mistakes; it teaches them to hide them. So we have to have a no-blame culture that allows people to raise problems in a way that is constructive and protective, so that, as an organization, we find problems that we can address. Again, this goes for a wilderness environment as well. And in terms of personally making mistakes: if you've made a mistake, what do you do? Well, firstly, don't panic. Make sure the patient or your team is safe. Be honest and open about it, try to talk to a trusted member or to discuss it with your team in a safe space, and report it and pool knowledge on how you can reduce the risk in future.
So we're coming shortly to the end of our human factors talk, but I want to leave you with a couple of quotes before we go. 'Success is not just dependent on before-the-event reasoning; it is also about the after-the-trigger adaptation.' It's what we do afterwards that will protect our patients and our team in future. And I bring you back to the story of Captain Sully when he landed in the Hudson River. He gives us this quote, which I think is particularly poignant when we think about human factors: 'For 42 years, I've been making small, regular deposits in this bank of experience, education and training. And on January the 15th, the balance was sufficient so that I could make a very large withdrawal.' If you have any further questions, please don't hesitate to get in touch. You can contact Lucy or the Endeavor team and they will send them through to me as needed. I hope you enjoy your course.