Summary

This on-demand teaching session focuses on the impact of unconscious bias on medical professionals. It will explore how our own unconscious beliefs and assumptions shape the care we provide and affect our patients. Through visualisation and data-driven models, attendees will gain insights into their own biases and learn strategies to mitigate them. Real-world examples and case studies will be used to illustrate the topics discussed, offering a unique opportunity for medical professionals to gain invaluable insights into combatting implicit bias.

Generated by MedBot

Description

An introduction to Diversity, Equity and Inclusion... and how you can be an effective ally!

Learning objectives

  1. Identify and distinguish between the various types of unconscious bias.
  2. Analyze the ways in which unconscious bias can affect medical treatment.
  3. Understand the psychological process behind perceiving an “in group” and an “out group” when forming biases.
  4. Recognize the implications of unconscious biases on recruitment and selection in the medical field.
  5. Assess individual scenarios to detect potential unconscious biases.


Computer generated transcript

Warning!
The following transcript was generated automatically from the content and has not been checked or corrected manually.

Thank you. Lovely. Thank you very much. I can't see anybody, and I also can't tell if anybody's put their hand up or put any questions in the chat, so please do stop me and ask questions. I'm talking about unconscious bias. I've got to say this is probably one of the talks that stressed me out the most, so thank you, Cate, for volunteering me to do this. It's certainly been a learning curve for me. So, I can't see anybody and I can't tell if you're doing this, but what I want everybody to do is a little bit of a visualization. Okay? I want you to close your eyes. I want you to imagine that off you go, you're flying off to an international conference. You get in your car and you set off. While you're driving, the car in front of you is going so slowly. They're pausing at every junction when they don't need to, there are queues of traffic behind, and you're running a bit late. It goes on for ages. You're behind this car, trying to work out if you could overtake, and you can't, and eventually they turn off and you glare at the driver as you drive on. While you're awaiting your flight, you decide to grab a drink at the bar. It's really busy, absolutely packed, but there's a couple who move their stuff so that you can sit next to them. So you grab a seat there, drinking champagne. They tell you that they're just about to go off on honeymoon, and you wish them a lovely trip. You finally get on the plane, and as you climb on and sit down, you see the pilot boarding. They turn around and wave at the passengers, and you feel quite reassured. Okay, I want you to open your eyes, and I want you to be very honest with yourself when you're thinking about this. In that visualization, is there anybody who didn't imagine that slow driver to be either elderly or female, or both? The honeymooning couple: did you imagine them to be the same sex?
And if I tell you the pilot was black, does that fit with anybody's visualization? So, unconscious bias. What is it? Well, it's judgments based on previous experiences; personal, deep-seated thought patterns, assumptions and interpretations. Which is a long way of saying it's things that we're not really aware of doing, but we are set up in a way that means we do have those biases. So, many of you know me. I'm an orthopedic surgeon at the Norfolk and Norwich, specializing in pediatrics. I've been a consultant for 5.5 years, and I trained at UCL. I've pretty much just run straight through my career; I haven't taken career breaks. And during my entire working life, I have been the only financial support for myself; I haven't depended on anybody for finances. My career has been incredibly important, and it's certainly taken a highlight in my life. My sister is a corporate lawyer; she moves millions without batting an eyelid. Of my very close friends, I have a friend who's head of finance for a large swathe of London trusts, and another friend who leads a think tank on where we put international aid, or at least guiding people on it. I have many female consultant colleagues and friends who are extremely high-powered and take their careers very seriously. And yet I did the IAT, and I have a moderate association with thinking that if you're male, you're professional, your career, and if you're female, you are family, you are the caregiver at home. So it's really interesting, isn't it? Why did I unconsciously come up with that? So, unconscious bias is primordial. It's actually lifesaving. We have these assumptions that we make in order for us to respond without thinking, so we can respond quickly.
So, for example, if you know that a snake is potentially poisonous, you see any snake and you immediately think dangerous, terrible, awful, even though you've never been bitten by a snake, and actually many of them aren't problematic. So it helps us make those fast decisions. There's also a degree to which familiarity gives us comfort, so we like to revert to what we understand, what we know. If there's any difference, it makes us feel out of that comfort zone; it's really jarring. And what we do is we create this sort of in-group of characteristics that we understand, that fit with us, and we respond positively to those characteristics. So that's our in-group. And then we've got our out-group. That's a different group, a group where we don't necessarily feel the characteristics fit with us, and just left unchecked, that's negative. So without intervention, those unconscious biases can become explicit; we can become prejudicial or discriminatory without there actually being a check on what those in and out groups are that we've intrinsically, or implicitly, come up with. So, how can we work with this? I haven't actually read this book; apparently it's a very famous business book, but I quite liked the model when I was doing research for this talk. What this model is talking about is how you create a set of beliefs and behaviors. So what you have is your observation or data: something happens to you, the event happens. You then pull data from that, so you take whatever experience or observation you've had and select data from it. You then interpret that data through a lens, meaning you're putting meanings to it, and those can be cultural or personal meanings. That then makes you make assumptions: based on that data and how you've interpreted it, you make assumptions.
And then you draw conclusions from those assumptions, and those conclusions become beliefs, what your beliefs are about the world. And then you have your actions, which are at the top of the slide that's been cut off from this figure. So you have actions, which are based on that belief structure that you've created, and whatever our beliefs are, that directly impacts what data we take from the observations. So it's a reflexive loop. It means that, depending on our own biases, we are actually selecting the data that we want in order to help create those actions. But that can then mean you get into a cycle of believing negatively about something, and you therefore pull data from it in a biased manner, to create beliefs that fit with that structure you've created. So for me, I have grown up with, but also clearly intrinsically hold, an understanding of the world in which men are the leaders, the providers; they're strong and take charge. Whereas the women are much more supportive; they're the emotional, helpful, fragile caregivers. And that's what's reflected in that implicit bias test that I did, and hopefully you guys did as well. So, bias happens in medicine. There's been evidence for bias affecting the treatment that we give patients, and that's in a whole range of different groups, and we've talked about quite a lot of them today. So: race and ethnicity, gender, socioeconomic status. And then there's things such as age, mental illness, weight. There was that lovely talk earlier, wasn't there, showing that if you're overweight, that's seen as a negative trait, that people make assumptions. If you have AIDS or HIV; if you're a brain-injured patient who's perceived to have contributed to the injury yourself; IV drug users; people with disabilities. So how does bias affect our patients? Well, there have been some really quite powerful papers.
So there's one from 2011, and this is in America, where they looked at non-Hispanic whites compared to other racial and ethnic minorities. They looked at pain scores, and they controlled for illnesses, and they found that both in acute and in chronic pain, there was consistently less analgesia given to patients who were not non-Hispanic whites. Then there's a paper from 20 years ago where women were three times less likely to receive knee replacements than men, despite all the other factors, so osteoarthritis severity, pain levels, etcetera, all controlled for. Three times less likely to be offered a knee replacement. And we think, oh well, that was 20 years ago, things are changing. Well, there's a paper from this year, from America: black and Hispanic patients were less likely to receive an HIV drug, and less quickly, than white patients, and actually the only intervention that worked was that a set of absolute guidelines was made, which then meant that those drugs were being offered to those patients. So even now, in 2023, or 2022 when the paper was written, there is bias within our healthcare system in the way we're treating our patients. And there are different types of ways that we can be biased. There's this thing called anchoring bias. That's essentially where we make a judgment when we first see a patient, when we first make an assumption or diagnosis, and we hold onto that, and because of that, despite there being a change in the information given, we're still holding onto it. The example that this particular paper used was an anesthetist saying that actually the patient had been easy to ventilate using a mask initially, had then been intubated, and had then developed airway swelling.
They were actually now quite difficult to ventilate with the mask, but because that assumption had been made at the beginning, that they were an easy person to ventilate with a mask, everyone carried on with that. And so that's anchoring bias, is what it's called. Then there's availability bias. Availability is basically: what do you know about, and what have you heard about most recently? So if you've read an article just recently about a particular condition, or you've had a lot of exposure to a particular thing, you're much more likely to make that your diagnosis. For example, if you've been doing TB clinics and seeing the small number of people with tuberculosis these days, then for every patient you see, you might come up with that as a diagnosis. And then finally there's confirmation bias. That's basically where you've made that snap decision, a little bit like with anchoring bias, but what you're doing is interpreting the data through a lens which is just looking for things that make that original diagnosis correct, rather than viewing the data with impartiality, so actually looking to see, well, is that diagnosis still appropriate? So those are the three major ways that bias can affect us clinically. And then when we look at it from an employment point of view, there was a set of experiments done in the 1970s where they looked at white interviewers interviewing black interviewees. They found that, compared with the white interviewees, the black interviewees received less immediacy, so less chat between questions, in the way that they were prompted; the interviewers made more speech errors when they were talking to these interviewees; and they provided them with less interview time overall than the white interviewees.
So as a consequence of those changed behaviors from the interviewers, the interview was shorter, and those interviewees had less opportunity to shine, and therefore came across less well than the white interviewees. And then there was the Boston orchestra. Prior to 1952, orchestras were entirely male. Then the Boston orchestra, in 1952, came up with the idea of putting up a screen to blind the auditions, because actually you don't really need to see your musician; you just need to hear the music. They did that, and it had a massive impact on the musicians that they employed, up to 50% more women getting through, so it suddenly became much, much more equal. I think for the Boston orchestra they only went up to the heady heights of 21% being female, but that was a huge change, and blind auditions were adopted more widely over the 1970s and eighties. So really big changes to orchestras just by blinding the auditions. And then, as has been alluded to already, there have actually been 27 studies which have looked at gender differences in orthopedics, and they've found overwhelmingly that there's been poorer workforce representation, lower salaries, and less career success, including in academia. So it's very relevant for now as well as before. So unconscious bias is when we unconsciously behave in a way where we're not true to ourselves. Essentially, we don't mean to behave this way, and we might not be aware that we're behaving this way. And actually, there's an interesting study from 2016 that says the majority of people can recognize bias in others, so you can say, oh yeah, I can see that person, they're very biased, but not actually recognize it within themselves. And so that's really difficult, isn't it?
And the answer, of course, as we've discussed, would be to just stop being unconsciously biased. But the problem is we might not ever be able to stop being unconsciously biased. There is something primordial about it; this is the way that we're designed. So we can be educated, but we can't actually obliterate every whisper of implicit bias. What we can do is change the environment that we're making these decisions in, to incorporate an ability to make decisions which are not so biased. So one thing is that we detect and call out bias in others. We've discussed that already, and I think being empowered to speak up is incredibly difficult. It's incredibly important, but it is also something we need to get better at: standing up and saying, actually, that's not right. We've got some bystander awareness training coming up in February, which I'm very much looking forward to. Some people are very good at calling things out, but the majority of us feel awkward, and it's about finding strategies to be able to do that. And we are better at seeing it in others, so we might not necessarily see it in ourselves. So it's really important that you do call out other people, because they may not see what decisions they're making without it being pointed out to them. Another thing is we have to realize that we are not more fair or less biased than others. We may feel we're better, but we're probably not as good as we think we are. And one thing which I thought was a really good strategy: if we say that unconscious bias is reflex thinking, it's actually designed to provide us with a sort of quick response.
You know, that lifesaving response. So if we're making important decisions, we need to slow down. We need to slow down the decision making, so it gives an opportunity for our conscious mind to come in and go: actually, are those attitudes, is that decision, based on good evidence? Or have I allowed some unconscious bias into that situation? So slowing down the speed of decisions is really important, especially if you feel like you're in situations where unconscious bias might be ruling the decision making. And it is really important, if you have something you feel strongly must be true, to just check that you're not going back to stereotypes, and that you are open to things that are new and unfamiliar. So that's a slightly whistle-stop tour of unconscious bias. Okay, stop sharing. Now, I got everybody to do the IAT, which is an incredibly powerful platform. I first did it last year when I was in Vancouver; a pediatric orthopedic surgeon had done a study using it. But actually I hadn't realized it's decades old and has been used in lots of different areas, and it has really changed the way that we can assess for implicit or unconscious bias. The way it works is that how long you pause, having to think before you put your answer in, reflects how much you're being driven by your implicit bias, as opposed to what you're actually consciously wanting to think or believe. Thank you, Helen. That was really good. When you mentioned the IAT scores to me a couple of months ago, I had a play around with it then, and it is really interesting. I love the way that they've set it up for you to be able to do it for so many different aspects of unconscious bias, so it's certainly something for everyone to explore if they've got some time, or if they're wanting to do research with it.
I think, actually, it's quite a good tool. There are no questions in the chat box, and I can't see anything that anyone would like to ask. Any training