Computer generated transcript
Warning!
The following transcript was generated automatically from the content and has not been checked or corrected manually.
Thank you very much for attending this short webinar on human factors and for giving up an hour of your evening; it's very much appreciated. We've got people from the UK and indeed across the world, so many, many thanks. My name is Peter Brennan. I'm a maxillofacial surgeon based in Portsmouth, I'm an elected member of the college council, and I have an interest in human factors and safety. We've got a diverse panel this evening and I'd just like them to introduce themselves. Then we're going to have a presentation by Niall Downey, who was a cardiothoracic surgical trainee and then became an airline pilot, followed by Sadia, who is a human factors expert. And we have Ellie, a medical student who has had both good and bad experiences, both in outpatients and in the operating theatre. So, ladies first: Sadia, would you like to come in and introduce yourself, please?

Hi, I'm Sadia. I'm a healthcare human factors lecturer at university. I'm also an operating department practitioner, still clinical on the bank when I get a minute, and I worked in research for about seven years, which I really enjoyed. Thank you for having me, and welcome to the webinar.

Thanks very much. Just before we went live we were saying that we make roughly five to seven mistakes every single day, and in fact there was a mistake with your name, wasn't there, which has now been corrected. So thanks, and you're very welcome. Ellie, would you like to come in?

Hi, I'm Ellie. I'm a fourth-year medical student at the University of Bristol, and I just have an interest in human factors; I think it's a very interesting area alongside patient safety. So yeah, thank you for having me.

No, thank you very much. Ellie has actually written a couple of papers on briefings and culture, and you're an author of the new human factors chapter in Gray's Surgical Anatomy as well, so you're very welcome. And Niall, last but not least.

Thanks, Peter. My name is Niall Downey. I have a number of hats. My main hat, first of all, is that I'm an airline captain with Aer Lingus; I fly A330s on our long-haul fleet. Previous to that I used to be a cardiothoracic surgery registrar back in the nineties, but that was back when there weren't enough SpR jobs, so I fell between the stools and jumped ship. I also teach our human factors approach from aviation back into healthcare, so that takes up quite a bit of time. And last but not least, I'm an author now as well, because during COVID, when I volunteered to come back into the NHS, I was told they weren't short of doctors, thanks for offering. So I went and wrote a book instead.

Good, and very, very well worth reading, your book; isn't it excellent? Really enjoyed that. So, what we're going to do: Niall has very kindly agreed to kick off and is going to give us a brief whistle-stop tour of some of the human factors that we think are probably relevant. Then we're going to have a short discussion, and then Sadia will follow up, talking about resilience and a little bit about systems and processes as well, which I think will be really interesting. So, Niall, if I can hand over to you, and can we have Niall's slides up, please? There we go.
Yeah, I can see the slides. So that's us; we've already introduced ourselves. Here we are. Right, OK, thank you very much.

So one of the things we're going to be talking about is: where can this go wrong, and what's Plan B? Myself and Antonia had a practice run on this this morning and I couldn't actually control the slides properly from my side, so Plan B already is that Ria is going to change the slides for me. So again, thanks for having me. If you just go to the next slide there, please, Ria. This will give us an idea of what the problem basically is in healthcare with human factors. There we have two totally different steroid injections. Now picture yourself at four o'clock in the morning in the side room, tired, on a 24-hour shift. You haven't had your breaks, you haven't had your dinner, and you're trying to pick the right steroid in a hurry. What are the chances of you picking the wrong one? I would guess fairly high.

Look at the next slide: another example. I was a cardiothoracic surgeon, so we'd have a fair few patients going into renal failure post-op, and we'd have them on dialysis on the intensive care unit. One of the things we had to be very careful to keep under control was their potassium level; if it went too high, they would arrest on us. So here we have two bags of renal dialysis fluid, and the one on the left, would you believe, has twice as much potassium as the one on the right. Again, picture yourself in the side room at four in the morning and think: what are your chances of lifting the wrong one? Pretty much guaranteed.

Next example, something similar; there are endless examples of these, unfortunately, in healthcare. Just go to the next slide there. This is one from the hospital I'm working in at the minute. It's a cardiac catheterisation lab, so they have to heparinise the patients before they put the cardiac catheter into the artery and feed it up into the heart, so that it doesn't have any clots forming on it which could then shoot off. They normally use 5,000 units per mL, and that means for the 10,000 units they need, they draw up two mLs of heparin. This week the pharmacy department didn't have the normal one, so they sent this one up instead. And again, where can this go wrong? They're used to seeing 5,000 units per mL. Most people, when we look at things, tend to be in a hurry; we take shortcuts. That's how the world works; that's how we manage to achieve most of what we achieve. If you're expecting to see 5,000 units per mL, then when you glance at that top corner you see "5,000 units" in brown and "per mL" in green, and you're going to see what you expect to see. I would suggest a better idea would be to scrap one of those dosages and just put one or the other, because I can count: I can work out how many mLs I need for 10,000 units. And it almost caused problems; people almost got a fifth of the heparin dose they were meant to have this week.

If you go to the next slide there, please: when we have people post-op in intensive care, they'd often be on various infusions, and three or four at a time wouldn't be unheard of. If you're changing the infusions overnight, chances are you're changing the bag a couple of times.
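As a quick numerical check of the heparin example just described, here is a minimal sketch. The concentration of the substitute vial is an assumption (1,000 units per mL), chosen only because it is consistent with the "a fifth of the dose" figure mentioned above; the talk itself doesn't state it.

```python
# Sketch of the heparin volume/units mix-up described above.
# ASSUMPTION: the substitute vial is 1,000 units/mL (not stated in the talk;
# chosen because it matches the "a fifth of the dose" outcome).

def volume_for_dose(dose_units: float, concentration_units_per_ml: float) -> float:
    """Volume in mL needed to deliver a dose at a given concentration."""
    return dose_units / concentration_units_per_ml

target_dose = 10_000          # units of heparin the cath lab needs
usual_conc = 5_000            # units/mL on the vial staff are used to seeing
assumed_new_conc = 1_000      # units/mL assumed for the substitute vial

usual_volume = volume_for_dose(target_dose, usual_conc)
print(usual_volume)           # 2.0 mL, the familiar draw

# If someone draws up the familiar 2 mL from the substitute vial out of habit:
delivered_units = usual_volume * assumed_new_conc
print(delivered_units, delivered_units / target_dose)   # 2000.0 units, 0.2 of the intended dose
```

The point of the sketch is simply that habit fixes the volume, not the dose, so a label change silently changes what the patient receives.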
Back to those infusion pumps. Think about unlocking your phone: you don't actually think of the numbers, you generally use muscle memory, you think of the pattern. So when you're changing the infusion a couple of times during the night, you're going to do the same: you'll type in the same flow rate using the pattern, using your muscle memory. The problem with that is that if you look at these two infusion pumps, which might be on the same drip stand, the one on the right has 1-2-3 across the top and the one on the left has 7-8-9 across the top. So if you're using muscle memory, you're going to type in a totally different flow rate, which could possibly kill the patient.

OK, so that's the sort of environment you're working in. Basically, you can see that human factors hasn't got a look-in. It's been designed by people who probably don't actually work on the shop floor at all; it's been designed in offices and labs, and often it's been designed to look nice. It hasn't taken the human into account at all, which is possibly why the adverse event rate is as high as it is.

So if you go to the next slide, please: this is my current office. It's an Airbus A330. I've been flying with Aer Lingus now for 25 years. For the first seven or eight years I did casualty registrar work as well, so I kept my hand in both camps for quite a while, and then basically just ran out of time and had to cut the medical work. For the last 13 years, though, I've been dipping my toe back in by teaching our aviation human factors approach back into healthcare. Now, some of you who have been about for a while will remember when CRM was introduced into healthcare about a decade ago, and it wasn't a success. I think the reason it wasn't a success is that it was done basically as a transplant, and it was a bad tissue match that got rejected. But having worked in both industries, I think the underlying DNA is very compatible and that we can genetically engineer it in, which is what I've been trying to do instead. The principles are very transferable: if we can apply our aviation framework and then you flesh it out according to what you need in your own environment, I think it could make a huge difference.

So that's my current office. If you go to the next slide, we'll see my previous office. As Peter said, I used to be a surgeon; I was a cardiothoracic surgery trainee in Belfast and in Dublin. Back then the system was that you had to get an SpR (specialist registrar) post to get onto consultant training, and there weren't enough to go around; there were three SpRs in my unit and 13 registrars. I ended up falling between the stools, like quite a few other people. The following week Aer Lingus had an ad in the paper. I had no interest in aviation, no background in it, no plans to become a pilot, but I needed a long-term career and Aer Lingus offered me one, so I've been there for the last 25 years.

Go to the next slide, please, Ria. So let's have a look at a few numbers. Myself and Peter differ slightly on the stats here, and you can make an argument for either side, so it's not really a big deal, but here are figures going back about 50 years looking at the number of adverse events in healthcare. We've got half a century's worth of figures.
We've got figures from the northern hemisphere, the southern hemisphere, first-world countries, and it's the same with developing countries. Basically, the adverse event rate in every country assessed has been anywhere between 5% and 10%. That's of all inpatient admissions: between 5% and 10% have an adverse event, and an adverse event is basically something that goes wrong with the patient unrelated to what they actually came in with; basically a mistake made by a staff member, a human error.

Next, please, Ria. Let's look at a few stats. This is Marty Makary. He's a professor of public health at Johns Hopkins in Baltimore, and he's also a professor of endocrine surgery over there. He's originally from Liverpool, but he's been in the States for a long time now. Marty wrote a paper back in 2016 looking at the adverse event rate in the States, and it was so controversial that none of the American journals would even publish it, so it ended up in the British Medical Journal instead. He showed that, if you look at the stats, medical error came out as the third leading cause of death in the US, behind cancer and heart disease.

So we'll go to the next slide, Ria, and have a look at figures a bit closer to home. Here are figures from around about the same time, 2015, from the NHS. Look at the big green box: a number of adverse events between 8% and 12%. So, ballpark figures, we'll call it 10% of all admissions having an adverse event.

If you move to the next one, please, to bring this closer to home for me: I'm a fellow of the Irish college, the Royal College of Surgeons in Ireland. They published a study along with the Royal College of Physicians of Ireland, the HSE (which is our version of the NHS) and the Health Research Board, back in 2016. It was the first time they had looked at adverse events in Irish hospitals, and they found that the stats were broadly consistent with baseline studies in other countries. So Ireland is not an outlier, the UK is not an outlier; everybody has the same stats worldwide and nobody has managed to crack this yet.

If you go to the next slide, Ria, there are a few details on it; we'll pick out the important bits. They found, again, 12% suffering adverse events, so again we'll call it 10%, ballpark. Further down we can see that 7% caused or contributed to the patient's death, and over 70% of the events were considered preventable using basically aviation-style safety and human factors. Now, they repeated the study five years later and got the exact same stats, funnily enough, possibly because they didn't actually change any of the procedures they were using. They found that 70% were considered preventable, but they didn't actually get around to trying to prevent any of them.

So if we go to the next slide, Ria, we'll work on a few numbers just to put it in context. Even if we stick with the Irish ones for a moment, we can then translate them into UK figures. This is an Airbus A320. We've got about 35 of these in Aer Lingus; we use them for flying around the UK and Europe, and there are about 174 seats on board. In the Irish health service we've got 650,000 inpatient admissions per year. If 10% have an adverse event, that's about 65,000 adverse events in a year. If 7% of those cause or contribute to a death, that's roughly 5,000 deaths per year. So that's about 100 per week.
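The ballpark arithmetic above, written out as a minimal sketch. It uses the rounded figures quoted in the talk; the exact products come out a little under the spoken "5,000 a year" and "100 a week", which are roundings, and the A320 comparison that follows is equally approximate.

```python
# Ballpark figures quoted in the talk, multiplied out.
admissions = 650_000        # inpatient admissions per year, Republic of Ireland
adverse_rate = 0.10         # ~10% of admissions have an adverse event
fatal_fraction = 0.07       # ~7% of adverse events cause or contribute to a death

adverse_events = admissions * adverse_rate          # ~65,000 per year
deaths = adverse_events * fatal_fraction            # ~4,550, spoken as "about 5,000" per year
per_week = deaths / 52                              # ~88, spoken as "about 100" per week
print(round(adverse_events), round(deaths), round(per_week))

# The A320 comparison that follows: ~174 seats per aircraft,
# and the UK population is roughly ten times that of the Republic.
a320_seats = 174
ireland_planes_per_year = deaths / a320_seats       # ~26 full aircraft, i.e. one every couple of weeks
uk_planes_per_year = ireland_planes_per_year * 10   # ~260, i.e. roughly one every day or so
print(round(ireland_planes_per_year), round(uk_planes_per_year))
```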
So, for ballpark figures, in the Republic of Ireland we're crashing one of these A320s roughly every ten days, and to bring it to UK figures: the UK population is about ten times that of the Republic of Ireland, so in the UK we're crashing an A320 every day. If an A320 went in at Heathrow every day and it never made the news and it never went to the regulators, I think people would be asking questions.

So, basically: does healthcare have a problem? Yes, it does; it hasn't addressed human factors. Is that problem causing any serious consequences? We're crashing a plane every day, so yes, it is. The third question, then, is: can we do anything about it? And I think yes, we can, by using our aviation safety and human factors approach.

So if you go to the next slide, please: our big take-home in aviation is that error is nothing to be ashamed of. If you remember nothing else from tonight's hour, remember this: error is nothing to be ashamed of. I know from healthcare, and from keeping an eye on the media, that I don't think it has changed that much. If you make a mistake in healthcare, it's name, blame, shame. When we train in aviation, we expect to make mistakes, so we don't see it as any big deal when we do.

So if you go to the next slide, please, Ria, we'll see a few examples. Ah, I've skipped on there; just go back to look at that one, and if you can play that video first, please, Ria. This is the biggest security event in Ireland this century: the Barack Obama visit in 2011. Here he is in the Beast, leaving the American embassy in Ballsbridge. I don't know if you can hear the sound; it's not coming across on my side. The Beast is bulletproof, it's bomb-proof, it's chemical-weapons-proof, and they have a fridge in the boot where they actually carry the President's blood, so if he gets shot they can take the blood to the hospital with him. One of my friends was head of the police unit involved at that stage. They had checked all the sewers under Ballsbridge that week for any suspect devices, they had welded the manhole covers shut, they had a helicopter above the embassy the whole time he was there, and they had cancelled all police leave for the duration of the visit. And now here he is, an absolute sitting duck in the middle of the road in Ballsbridge, because the car was so long it got beached on the ramp on the way out of the embassy. The car was there for four hours before they managed to get it moved. Now, one of the things we preach is the Plan B approach, which we're already using ourselves here, and it turned out, by pure luck, that Obama and Michelle were in the second car that day. So they were able to reverse back and go out a different exit, the way they came in, which is why it didn't happen on the way in.

Go to the next one, please, Ria. So, someone we all know and love and can always rely on to make mistakes. I'm not getting any sound on my side, I don't know if you are, but basically this is Boris Johnson at the start of the Ukraine war, about two years ago now, in Westminster, and he's just wanted to thank Vladimir Putin for his inspirational leadership so far... oh, sorry, Volodymyr Zelensky. So you can always rely on Boris not to let us down. So basically we can see that in security, in entertainment (if you can count Boris as entertainment), in politics, at all sorts of levels, people make mistakes all the time.
So why should healthcare workers be embarrassed when they make mistakes? Next slide, please. Let's look at what we do differently in aviation compared to healthcare: three big things. Basically, we have a framework approach.

The first leg of the framework is culture. We've got what's called a just culture. If I make a mistake tomorrow, I can put my hand up and report it, safe in the knowledge, and it's actually written into international law, that I can't be disciplined and I can't be dismissed, as long as it's a genuine mistake. Now, it doesn't cover gross negligence and it doesn't cover malice, so it's not a get-out-of-jail-free card. But if I make a genuine mistake, I can put my hand up.

Second thing: I know from healthcare that it tends to be about finger-pointing. A lot of the stuff in healthcare is about finger-pointing and being able to pin the blame on someone; we've now dealt with that person, so we can move on and not really address the real issues. So healthcare tends to ask who went wrong; in aviation we look at what went wrong. If I make a mistake tomorrow, Aer Lingus's mindset is basically: you've been flying for us for 25 years, and it took you 25 years to make that mistake; if you're a crap pilot, why did it take you so long to get it wrong? So we assume that something happened in the system. We dig into it and ask: what went wrong today, what was different? We go through the Swiss cheese model and see why you made that mistake. We usually find the tripwire that we fell over, and then we try to engineer it out of the system, and ideally we try to engineer in a safety net to protect us: basically another layer of cheese.

And lastly, the third leg of the stool is crew resource management. That's our underlying operating system. It's about avoid, trap, mitigate: we assume we're going to make mistakes, so we look out for them in the first place and try to avoid them, we try to trap the ones that get through, and we try to mitigate the ones that get further down the line, before we either crash a plane or crash a patient.

So the next slide there, please. Basically summarising: we accept that error is normal; healthcare hasn't got to that stage yet. We look at the system and see what went wrong; we find where the tripwire was today, we engineer it out, and we engineer in a safety net to change that. And then, lastly, the third leg of our framework: we train staff to manage error. Notice, not to stop error happening at all; no matter how hard we work there are going to be errors, so we need to learn how to manage them, to stop them turning into a major adverse event.

So next slide, please, Ria. We talked about the Swiss cheese model; this was James Reason, from the University of Manchester. Healthcare's problem is that when it digs into the system, it asks: who made the mistake? Peter Brennan did. Peter Brennan is a crap surgeon. Peter needs to work harder. Problem solved, let's move on. Whereas in aviation we look at what went wrong; we dig a wee bit deeper and ask, well, why did Peter make that mistake that day? Peter had been up operating all night doing an emergency. Peter's registrars are all off sick this week, so Peter's covering the registrar work as well. The hospital is actually on fire at the minute, so there was a fire alarm going off in the background while they were trying to concentrate.
So we usually find there was a whole series of other things that led to Peter making that mistake. We then ask: which of those can we address? We try to plug all the holes in the cheese, and that's where avoid-trap-mitigate comes in. The more layers of safety we have, the better our chance of trapping an error before it gets to the stage of killing the patient. And with us, we tend to have multiple layers of redundancy; that's why we have multiple computers and backup systems on the plane. We expect to lose some, and we can carry on regardless.

So the next slide, please, Ria. Basically summarising, and we've already discussed this: where can this go wrong? I compiled this presentation with Antonia earlier on and we couldn't get it to work from my side. So what's Plan B? Ria has very kindly stepped in and she's doing it all for me, so I've got the easier option here. It's the same in healthcare. Even if you're just communicating with a patient, and we'll look at that in a second, 70% of errors happen in communication. We've already seen that 10% of patients have an adverse event, that 70% of those adverse events are preventable, and that 70% of those are due to communication. So we can catch roughly half the adverse events by dealing with communication alone. So whenever you're talking to a patient, when you pass on a message, are you sure they actually received the message you think you transmitted? They might have heard something totally different, despite the fact that you've given a very clear message. So what's Plan B? From a communication point of view, we have closed-loop communication: we get the patient to respond to us and clarify, from their point of view, what they understood from it.

So the next slide there, please, Ria. The other thing as well is: what's in it for me? I trained an intensive care network team last year; there were about 400 of them, so we did about ten training sessions of about 40 each. One of the things that came up was medication errors, which is one of your biggest issues in healthcare. A nurse had given the wrong dose of a medication during the night shift. The management looked at it, did the root cause analysis, and of course the root cause was that the nurse had made a mistake and hadn't read the dose properly, so it's the nurse's fault. So what was the solution? She now has to cross-check the dose with one of her colleagues before she can actually give it. When you dig a wee bit deeper, from the aviation angle, when I looked at it I thought: what was actually wrong is that, post-COVID, they've lost a lot of staff, they have a lot of agency staff now who don't normally work on those wards, she was trying to mentor them, she was missing breaks; basically, she was too busy. So how did we resolve the problem? We made her busier. What's in it for me? Is that going to work? No, funnily enough, it didn't. So whenever we do make changes, we need to involve the staff and bring the staff in. You generally know where the mistakes and the problems are, so you need to be part of the solution; you need to help work out the solution, not have it imposed on you.
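Two quick bits of arithmetic sit behind this part of the talk, shown here as a minimal sketch. The first part reproduces the "roughly half of adverse events" figure from the percentages quoted above; the second illustrates why each extra layer of cheese matters, using made-up per-layer catch rates (not figures from the talk) and assuming the layers act independently.

```python
# 1) "Catch roughly half the adverse events through communication alone":
preventable = 0.70              # share of adverse events considered preventable
communication_related = 0.70    # share of those due to communication
print(round(preventable * communication_related, 2))   # 0.49, i.e. roughly half of all adverse events

# 2) Why extra layers of defence help (illustrative catch rates, assumed independent):
def p_error_gets_through(catch_rates):
    """Probability an error slips past every layer of the 'cheese'."""
    p = 1.0
    for catch in catch_rates:
        p *= (1.0 - catch)
    return p

print(round(p_error_gets_through([0.9]), 4))             # one layer:    ~0.1
print(round(p_error_gets_through([0.9, 0.9]), 4))        # two layers:   ~0.01
print(round(p_error_gets_through([0.9, 0.9, 0.9]), 4))   # three layers: ~0.001
```

In practice the layers are not fully independent (the same fatigue or understaffing can weaken several at once), which is exactly why the talk stresses fixing the system rather than the individual.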
Next slide, please, Ria. So, just briefly then, we'll finish up with crew resource management, CRM, which we've already mentioned. It can be broken down in different ways; we tend to break it down into about six big headings, and we've already said that 70% of issues are due to communication. Communication should be dead easy: you've got a communicator, you've got the message and you've got the receiver, so you just ask for what you want and you get it back. If you go to the next slide we'll see an example of that, although I think the sound might not work; I don't know if anybody's hearing it. Basically, it's the Two Ronnies, one of their most famous sketches. Ronnie Barker comes into a hardware shop. He's the communicator, he's got a message, he wants "four candles", and Ronnie Corbett is the receiver. He goes behind the counter and produces four candles, and Ronnie Barker says no, "four candles". Well, there you are: four candles. It turns out what he actually wants is fork handles, handles for his garden fork. So we took what we thought was a very straightforward message, very simple, passed on perfectly clearly, and the receiver heard something totally different. So whenever you're talking to a patient and explaining something: first of all, on a good day people only take in about 30% of a message, and on a bad day, when you're a patient, you're going to take in less. So despite the fact that you've given them a very clear message, they might have heard something totally different. What's Plan B? Again, closed loop: bring them into the conversation, get them to explain back from their side what they're expecting and what their figures are, and work out whether they've actually understood what you've told them.

So go to the next slide there, please, Ria; we're nearly done. This is just to compare my world with your world. We've seen your medication issues, the way your stuff is labelled, the ways we can genuinely make mistakes. Here's my world: this is a $240 million aeroplane. Any takers as to what that lever does? It's sticking out of the front of my dashboard, right in front of my face, just where my hand will reach automatically. If I'm coming in to land and I want my wheels down, I grab hold of the lever with the wheels on it and I put it down. That's as simple as we make it. We assume pilots are as thick as pig shit and everything's designed around that; we assume we're going to make mistakes, and the human factors approach is that we try to make it as easy as possible to do the right thing. For me, that's what human factors is: make it as easy as possible to do the right thing. So the final slide there, and that's me wrapped up. That will give you a rough outline of how aviation sees human factors. I think Peter and I agree on most of it, and we've already collaborated on work trying to bring the aviation approach across, and I'm sure he's going to elaborate on that now.

Well, thanks very much, Niall. We could spend hours and hours talking about this. In fact, Niall and I spoke at the Society for Cardiothoracic Surgery, when was it, a couple of years ago, back in Belfast, and that brings in teamwork, how teams interact with each other and the stress of the operating theatre. We staged an argument on the stage about the incidence of error, didn't we? The paper I use is the one quoted in the BMJ, and Niall and I had an argument on the stage that got progressively more heated. And I started off... yeah.
The audience was about three or four hundred cardiothoracic surgeons, trainees, medical students and so on, and they were absolutely shocked and horrified. Then Niall came back in and we just stopped and said: how does that make you feel? I think someone shouted out that it was a disgrace, and we said: no, that was actually planned, and this is actually what happens in your theatre occasionally: teamwork problems and toxicity, which are a real barrier to human factors, to safe communication, to empowering the team to speak up if they have any concerns, to valuing the team. So, Ellie, I'm just going to bring you in as a medical student. Can you give us that example of what happened to you in the operating theatre and how it made you feel?

Yeah. So, first of all, I've had lots of fantastic experiences in theatre, but I had one occasion where, just to start, I wasn't introduced and I was kind of ignored throughout the procedure, which, as a student, you do get used to. And then there was an incident where I got a swab thrown at me, which made me feel quite upset; it belittles you quite a lot. In that moment I just wanted to leave the theatre, and obviously I was not going to ask any questions or want to get involved at all. So yeah, it was the whole thing: being ignored from the beginning and then that act.

So I guess the question, and this touches on hierarchy a little as well, is this: if that surgeon was going to operate on the wrong side, for example, or was doing something that you thought was unsafe, would you feel able to speak out and challenge them?

No, definitely not. Probably not even just from being ignored at the beginning of the procedure; definitely not. And then with the swab being thrown at me, absolutely not.

I'm really sorry to hear that. So, for most of us on this webinar this evening, what we want are the practical elements of human factors: how it relates to me and you, what we can do to make things safer, effective communication. Niall has touched upon that: having the eyes and ears around you from the rest of the team, empowering the most junior person in that operating theatre, from a healthcare support worker to a medical student, right up to the most senior person. They're your eyes and ears; they're looking out for you and you're looking out for them, and you're building situational awareness, as we call it. So what we're going to do now is move on, and Sadia is going to talk a little bit about systems, which is a really important part of human factors, and give us an overview of that. Then we should have 15 to 20 minutes for discussion at the end, so if you've got questions, please hold them and we'll pick those up after Sadia's talk.

Thank you very much. So can we move on with the slides for the next part? Oh, there it is. We did practise this earlier, of course. Are we able to see that? Sorry, I can't see anyone.

Yes, we can see that.

I'll share it again. Okey-dokes. Hi again.
So I will be talking a little bit about systems and resilience on the front line, in this case in surgery. A quick overview: I'll start with a little about myself; I want to speak a little bit about COVID and how it brought the term resilience to the forefront for many of us; what resilience is in healthcare and how human factors can be the resilience solution; I'll touch upon a few theories, which Niall has kindly already covered, and how they're relevant to resilience in surgery; and then of course I'll finish off with teamwork and safety in surgery and how it all integrates and intertwines with systems thinking and resilience.

I was going to do a long introduction, but just a brief background on how I got into human factors. I've always been interested in human behaviour and interactions, so I went off and did an undergrad in psychology and sociology; I absolutely loved psychology, especially the research component. I also wanted to give back to the NHS, because I'm very grateful to it: I have psoriasis, and as a child and a teenager I spent a lot of time in the NHS, and I think the nurses, doctors, healthcare assistants, medical students, everybody, are just fantastic. So I decided to train to become an operating department practitioner. While I was a student, working in orthopaedic theatres, the WHO surgical safety checklist was introduced, and that's when I thought: oh, this is good, because now the consultant actually knows my name, which never actually happened before. So yes, I did think the surgical safety checklist empowered everybody in theatres; it is a bit of a shame listening to Ellie's story. I then moved into research, worked in a mental health trust for a bit, did a study called RESPOND at the University of Oxford, which I will talk about later, and I'm currently working at Loughborough University.

So, my animations are not working properly, but that's OK. Human factors: I always start with definitions. Human factors is a scientific discipline that uses theory, principles, data and methods to enhance human wellbeing and system performance. When we say human wellbeing, we mean both patients and staff, and families and carers, because we like having everybody involved. And in terms of system performance, we mean the systems that you work in, the tasks, the processes, the day-to-day tasks that you carry out as surgeons or healthcare professionals. How can we improve that using human factors?

COVID-19 was very important for everybody; it changed things for everyone. I was still working in resus, and I still remember we didn't know it was airborne, so I was stressed, and it was during my PhD time as well. We were learning, and we were looking at what Italy was doing, and what we're learning now is that the mental state of the Italian population was really affected, because the whole country was going through grief and mourning; many people died. Many, many staff had mental health concerns related to anxiety, depression and distress, and previous studies have suggested that pandemics do leave a long-term effect on mental health.
Now, in the United Kingdom (Boris Johnson seems to be popular today), we did go into lockdown; just under 230,000 people died, and there were more than 1.1 million people in hospital needing care for COVID. We also saw the strain it took on everyone: it was emotionally, mentally and physically demanding. We had loads of changes, we had positive things, but most of all there was extra strain put on specific services. There was an increase in elective surgery of about 25%, there was a higher workload, there wasn't enough staff, and people were getting sick. In terms of surgeons, the Royal College of Surgeons did of course publish some guidance, there was guidance from the Chartered Institute of Ergonomics and Human Factors, there was guidance to support surgeons, but I think mental health was really affected. There were constant issues with personal protective equipment, there was grief around family members, I've got a lot of friends who are registrars and they would be working continuous nights because their colleagues were sick, and there was always that fear of infection. That is what led to those kinds of mental health problems. There was also the stress and anxiety, and symptoms now referred to as PTSD, affecting nurses, healthcare assistants, the whole diverse range of healthcare professionals.

Now, look at what resilience actually is. I know that when I've got my clinical hat on, or when I used to coordinate research, safety was at the forefront of everything, but it wasn't my personal or psychological resilience that got me through a shift. It was the number of staff I had, the skills of the staff I had, whether I had all my equipment, whether my rapid intubation trolley was set up. Resilience, in the everyday sense, is the ability to be happy and successful again after something difficult or bad has happened, and COVID was very, very heavy. Just recently one of my colleagues put up a post saying that one of our other colleagues had passed away two years ago, so it is still quite raw. So we have to establish that psychological resilience is different from what people talk about in human factors and in healthcare. Psychological resilience, yes, safeguards against burnout and mental health challenges, and it must be a number one priority, but it is not the responsibility of staff to have to build up psychological resilience when they've been through a global pandemic. That is a really important message that human factors, systems thinking and resilience engineering try to home in on.

So I will introduce resilience engineering. It's based not just on preventing things from going wrong, but also on ensuring that things go right; we're looking at the things that go right. For example, in the NHS I used to work as part of the patient safety team, and we have far more successes, such as effective discharges, than we have incidents. There has been far more full patient care delivered in resus, with patients going on to the ward or to ITU, than there have been incidents. We want to focus on the normal outcomes, the positives, the things that go right.
And when we say the positives, we mean the adaptations that frontline staff make, because essentially the more likely something is to go right, the less likely it is to go wrong. So with resilience engineering we look at what you do, the adaptations you make, and we see how they work in the overall system. Essentially, we use systems models. For example, many of you might have heard of SEIPS, the Systems Engineering Initiative for Patient Safety. This model essentially breaks the work system down into tools and technology, the organisation, and the person, which sits in the centre; again, that means patients and staff. It shows you how complicated delivering healthcare is. We've been through a pandemic, things are tough, and we just need to realise that resilience in healthcare is about redesigning your work system to make the way you work easier, safer and more effective.

Now, this is a very basic example, but it's an example nonetheless. I'm very bad at DIY. If I'm trying to unscrew something and I can't find the correct screwdriver, I have guiltily used a knife. That's an effective yet dangerous adaptation; please do not do this. But it is an adaptation. In healthcare, for example in resus, many of you will have heard of the two handovers: the paramedics will give a standard ABCD handover, and then a few minutes later they'll come back and say, well, actually, I'm concerned about the patient's eating, or whatever it might be; they always have a second handover. That's an adaptation that's effective.

I'm going to touch on some resilience theories, just to home in on the fact that resilience is based on systems, not on you as healthcare professionals or surgeons or registrars or medical students. There is a view in safety and resilience engineering of Safety-I and Safety-II. Safety-I is where we focus on what goes wrong, so it's reactive: we respond when something happens. When somebody made a drug error, which I did make, unfortunately, during my time in resus, what would happen is that we would be banned from administering any drugs. That put a toll on the rest of the team, which caused a negative culture of feeling you'd let the team down. And the further reactive response was that you couldn't do any bank shifts, so you would suffer financially if you made a drug error, which again causes a lot of negative culture. In Safety-I, accidents are caused by failures, and the purpose of investigation is to identify the causes. Safety-II, by contrast, looks at what goes right, at what frontline workers do as effective adaptations. Humans are seen as a resource, necessary for system flexibility and resilience, and the purpose of investigation is to understand how things usually go right, as a basis for explaining how things can occasionally go wrong.

Now, this is a theory I've spent many years working with, and one I really resonate with. When I was clinical, I did get told off: "you're not following policy". And often I'd say, yes, absolutely, but the policy would be 33 pages long, and at four o'clock in the morning, looking it up online when my login isn't working, I would think: what do I need to do? Do I need to cannulate this patient, do the ECG, get everything done, or should I go looking for a policy?
So then I spent five, probably eight, years, including my PhD, working on work-as-imagined versus work-as-done. As Niall said earlier, you are the experts: people on the front line know what happens on the front line. In the work-as-imagined versus work-as-done paradigm, the two are separated by time and space. As you can see, you've got the regulators and the managers on the work-as-imagined side; this is what managers, designers and seniors think should happen on the front line. Work-as-done is what actually happens at the sharp end, where you carry out those emergency procedures. They're also separated by years, because it takes years to write a policy, months to get it reviewed and then approved, and then minutes to carry out the actions. I've written a few policies and I know how long it takes, and in that time we could have a pandemic, so practice will change again. So we really need to understand the implementation gap: the difference between work-as-imagined (policy) and practice. And we often use workarounds to fill in those gaps.

In terms of resilience engineering in surgery, work has started. There have been qualitative and quantitative studies working together, mapping out how the preoperative and perioperative phases work together, using a systems perspective. In this paper they've also used work-as-imagined versus work-as-done, so it's a multifaceted approach to enhancing surgical practice.

Finally, I'm just going to touch on understanding ways to improve patient safety and teamwork. The illustration at the back is SEIPS in anaesthetics. What I've done is literally map out all the tools and technology, the tasks, the people, and the internal and external environment, and I've included people like the technicians, the medical reps and the students under "person", because all of these things are occurring, interacting with new technology, internal environments, medical reps and students, whilst trying to deliver the safest and most effective care. So there's a lot happening in theatres, in surgical systems. In terms of the complexity of theatres, and I still do work in theatres, what I've found is that there are always multidisciplinary teams. Even within the anaesthetic, scrub and recovery teams, although we're all working together, we have different skills. There's different equipment and technology: plastics and eye lists are so different compared to orthopaedics. There are students, and students are very important, because they are the future of the NHS, the future of healthcare, so I do suggest that they are treated with respect and that the WHO checklist is used. There are also different levels of competence; I might be more confident in anaesthetics than in recovery, so we need to have that overview too. And there are ways of measuring patient safety and teamwork in theatres, such as the NOTECHS system.

I'm just going to quickly go over the RESPOND project.

We're going to have to wrap up very soon because we need some time for questions, so I don't know if you can just finish in the next minute or so, please?

Yes, absolutely.
So the RESPOND project combined systems thinking and resilience engineering, and we came up with four interventions aimed at reducing failure-to-rescue rates. There are other studies, such as anaesthetic room studies, where we've used human factors systems thinking and methods to enhance anaesthetic practice. Essentially, the take-home message is that resilience in healthcare is a systems issue, not a person issue. COVID-19 was the wake-up call for what resilience actually is, and we're just trying to make your work system easier, safer and more effective. Thank you very much.

Thank you very much indeed. Two diverse talks, really showing the huge scope of human factors; it's an enormous subject, isn't it? But for me the clue is in the title: as you said, Niall, we're human, we make errors regularly, and it's about minimising error as best we can and changing culture. Collette, you've put a question in the chat: any recommendation for helping change cultures to accept that error is nothing to be ashamed of? That's a difficult one, I think. I think you're absolutely right, and for effective human factors integration, I think you'd agree, you need a top-down approach as well as a bottom-up approach coming from students, and you somehow equilibrate somewhere in the middle. What do you think, panel?

That's why I cut straight to the chase of what the whole problem is. Unfortunately we have no quick answer for it. As you say, though, that bit is going to have to come from the top down. Until senior management accept that their staff are going to make mistakes, and stand behind them rather than scapegoating them and turning them into whistleblowers, nothing is going to change. Basically, we need someone at the top to stick their head over the parapet and say: right, I'm going to stand behind someone who's made a mess of something in my trust or my hospital, and treat them as the most valuable player on the team that day, as opposed to the problem on the team. If you're the chief executive, you've a better chance of putting your hand up without it being shot off than the more junior staff have. So we need someone who is basically prepared to do that. Now, I'm working with a big private hospital group in Ireland at the minute, and that's exactly what has happened: they got in touch and said, we've looked at a lot of the human factors stuff, we agree this is where we need to go, and we want you to implement it across the whole network. We're in the process of that at the minute. Once we get our first big event, we'll find out whether they're going to walk the walk as well as talk the talk, but I'd be optimistic, and I find that once one hospital does it, everybody else will want it as well. So we basically need someone to get the ball rolling, and I think it will start falling into place beyond that. But I can't see that happening very quickly in the NHS.

I think I can say that the UK regulator, the General Medical Council, now does recognise the importance of human factors; indeed, it's one of the generic professional capabilities, and I've published with the medical director of the GMC, Colin Melville, around fatigue and a few other things. So the regulator does recognise it.
Can I just ask you, Ellie: do you have any human factors training or education at uni?

Interestingly, in our interview we got asked about human factors and there was a picture of a plane. But no; we have a little bit on teamwork and a little bit on error, but it's not really a big part of the curriculum.

Sadia, what do you think?

Thank you. Absolutely. In a perfect world, having the concept of respectful challenge would be great. I worked at a trust in Birmingham and we had a massive poster up saying "we encourage respectful challenge". So when I was working in A&E, I would feel comfortable enough to go to a consultant and say, look, I don't agree with that decision, and I would do it respectfully, so not in front of people, not in front of the patient. But what I've seen is that that's not always possible. There is always a hierarchy, and people often say we do need a hierarchy, especially in surgery. But as Niall said, we need a top-down approach. Ideally I would like it to be a bottom-up and top-down approach, but I don't think we're there yet. We don't have that voice for your band 5 nurses, your band 6 nurses, your junior doctors. I think that will take a few years of having human factors practitioners, and of having more human factors in curricula: medical, nursing and ODP curricula. We're a few years away, but we will get there eventually, to having a bottom-up and top-down approach.

Yeah. And I guess that partially answers Victoria's question about concepts of resilience, and the support offered by trusts focusing on individuals coping with stress; but resilience is not infinite and will fail if the environment, staffing gaps or whatever continue to deteriorate. What approaches do trusts, colleges and associations take to address that? Again, a really difficult question to answer. Sadia, do you want to say anything about that? Actually, could you put the questions in the Q&A rather than the messages? That's purely because I only just clicked on the messages, otherwise we would have missed it. So please can you put any questions in the Q&A.

Thank you, Victoria, for your question. It's very interesting, because I try to think of it as an academic-slash-expert and as a clinician. When I have worked in private and NHS institutions there are always support cafes, talking cafes. I think trusts are trying to do their best for psychological resilience, but what they're not doing is saying: the support that we're providing you is for work-related stress, for the pandemic. And even things like social events, which they say help. In terms of trusts, I think education about resilient systems needs to be highlighted. I think the term "human factors" in the NHS is quite blurred, and what we need to do is say: human factors is a science that helps with systems, which will help with resilience, because we'll have the right tools and the right people to support us in our day-to-day tasks. I hope that answers your question, and I understand that personal resilience is not infinite. I think working together, learning from each other, perhaps learning from other disciplines (aviation, for instance: pilots need resilience too), perhaps we need to learn from other organisations and disciplines.
But I don't think the NHS has quite separated out systems resilience and personal resilience, and that's where the problem lies, because it can come across as quite blaming.

Yeah. I mean, that's another thing, isn't it: us all looking out for each other at the team briefing, which I certainly do, and I often get students to lead the briefing as well, which empowers them. They come in and go away thinking, wow, what an amazing day they've had, before they've even started. And I say: please do look out for me. If we're doing a really long, complex operation, we will set a clock at about three hours, and while you're concentrating the staff nurse, the anaesthetist, whoever it is, will call the time. As long as it's safe to do so, we then stop operating, we go away, we have a cup of tea or coffee, have something to eat, and then we come back, and you actually catch that time up as well. So the power of teamwork is really, really critical.

David Parry has put in the chat that human factors is not in most curricula at the moment, which is obviously a concern, especially as the GMC says that it should be. Time is actually running out, so I'm going to put in the messages a link to the Civil Aviation Authority human factors handbook. Now, Niall, I think you would agree with me that we shouldn't compare aviation to healthcare or surgery; the two are totally different, but you can certainly learn lessons from other so-called high-reliability organisations. This is a free-to-download resource, and I'm just looking for it now, written by a chap called Steve Jarvis, who is a Civil Aviation Authority expert adviser. Steve and I have written a course for the RCS about human factors; it's an online course, and we're currently writing a face-to-face course that should be available soon. So please download that and have a look if you're interested. If you take out the aviation examples, there's lots in there about your medication issues, threat and error management as you mentioned, resilience, all sorts of things. Well worth a read, and free to download.

Emma asks: do you think something like a short course... oh gosh, it's just jumped off my screen... here we are: do you think something like a mandatory short training course in this domain could build interest and awareness? Yes, I think the answer to that is most definitely. As I say, we've written a short online course for the RCS, which is available. I don't have the link to hand, but it's certainly generated a lot of interest already.

Richard asks: if we think about high- versus low-performing teams, do the speakers think there are differences, maybe indicative of whole-system processes, that make for entirely different experiences and functioning, where care, maturity, trust and response are treated in entirely different ways? What practical interventions would the speakers suggest to identify, understand and improve those problematic indicators and improve teamwork? A really long question, and we've got about two minutes left, so I don't know, team, if you want to try to answer that. Niall, final thoughts from you? I'll go across the panel.

I suppose final thoughts, then.
A lot of people have asked, again, how we actually get the concept across. As we've already discussed, management need to take the bull by the horns. A lot has been done from the top end, from yourself and from the college, and I'm starting to get some buy-in in Ireland now as well; people are starting to take more interest. Sadia is right: we need a top-down and a bottom-up approach, but we need the top-down first, or else the people at the bottom aren't prepared to put their hand up, and with good reason: whistleblowers get their hands blown off, and we've seen that repeatedly. But we're talking about evidence-based practice, and all the evidence I've seen is that what you're doing at the minute is not working. So I think we need to try something different, and I think I've shown some of the evidence that in aviation our system (it's not perfect either, it's not Disneyland) is better than yours, and I think it's worth trying.

Yeah, thanks very much, Niall. Ellie, any final thoughts from you, as a medical student, on what you'd like to see changed?

Yeah, I think just being welcomed and introduced, and feeling a part of the team, has such a massive impact on your experience in theatre, or wherever it is, and on how well you learn. So yeah, just being introduced honestly has such an impact. That's what I'd say.

Thanks. So, everyone listening in on this call: please, please empower your team, value your team, respect your team. Respect is earned; if you give respect to anyone, you'll get respect back. Really important. Sadia, last thoughts from you?

Yeah. In terms of systems, am I answering the question, Peter?

Well, we've got about a minute left, so answer the question or whatever you'd like to do.

There's some research being done at Oxford with the RESPOND team: they looked at military teams and sporting teams, and one of the interventions that we took away was away days, where we take staff away. It was a lot of work, but it was a form of team strengthening and of enhancing culture. So things like that could possibly happen, but as I said, they do require a lot of work. I think those are my final thoughts. Thank you for having me.

Thank you very much. And someone has asked us to email the link to the human factors course again; this is multitasking, I'm trying to chair the session so I can't look it up, but you've got the internet, so please do look it up. I'd just like to once again thank everyone for attending. Thank you for giving up your time; I hope it was useful, and certainly we can do more of these events. It's exactly seven o'clock, so we're going to end now, because, with human factors, people want to go away, spend time with the family, and have something to eat and drink. So can I thank Niall, Ellie and Sadia so much for coming. I'd also like to thank the team behind the scenes, Antonia and Sean Healy, for all the work that you've done, and Phoebe as well. I don't know if you can put your cameras on; if you can't, not to worry.
But thank you so much to everyone and I wish you a very nice evening. Thanks very much for attending.