and is the director of the Office for Women's Careers and the Center for Faculty Development and Diversity at Brigham and Women's Hospital. Since joining the faculty at Brigham and Women's, she has held leadership positions in research, education, and faculty development. Dr. Serekshtro graduated with a degree from Case Western University and came to Brigham in 1991 for a primary care internal medicine residency. Following residency, she stayed at Brigham and Women's to complete an epidemiology research fellowship as well as a Master of Public Health at the Harvard School of Public Health. She has deep research experience in women's health, with a particular interest in the role of obesity and hormonal factors in chronic disease, as well as in understanding sex-based differences. She's the author of more than 200 research publications, holds several large grants from the National Institutes of Health and the National Cancer Institute, and has been continuously funded by the NIH since 2002. Her current research focuses on the impact of metabolism on the risk of heart disease and stroke in women. She's also examining interventions that might reduce breast density in women, thereby reducing the risk of breast cancer. Dr. Serekshtro is also a committed teacher and mentor, serving as program director for the Harvard Vanguard Primary Care Residency Program from 2004 to 2006, and she taught psychosocial aspects of medicine from 2003 to 2017. She was honored with the James Munchall Award for Excellence in Ambulatory Teaching in 2009, and has been recognized as a mentor as well, receiving the A. Clifford Barger Excellence in Mentoring Award from Harvard Medical School in 2016. Please join me in welcoming her to the podium this morning. Good morning. It's really a pleasure to be here. I was at Boston Children's last fall for a diversity grand rounds. Our campuses are so close to one another, but I think there aren't nearly enough times that the Brigham and Children's cross-pollinate. 
So I hope we can find a way to bring you over to our campus sometime soon. I have no conflicts of interest to disclose. My goals today are really to start a conversation. This is not a definitive lecture, but the start of some reflections, hopefully: some examination of ourselves, and of the processes that are going on around us, to really look at unconscious bias and how it might shape our behaviors, and to start the reflection process for our own biases. Then I'll focus the second half of the talk on strategies that might mitigate the ability of our unconscious biases to filter through in our professional and personal interactions. Some of the things that I bring up may make us feel ashamed about episodes that we've witnessed or been a part of, so I want to start with some ground rules. First, recognize that differences exist in many different ways: in race, ethnicity, and gender, but also in job hierarchy, in physical ability and cultural background, in language and learning styles. It is often helpful to think about a time when you have felt other from the culture or the group that you're in, on the outside, as we're having these conversations about bias, and to recognize that all of us have had experiences of feeling that way. We may use those to help dig into this work. And I encourage you not to blame or shame others, but also not to blame or shame yourself. So I'm going to start with a quick imagery exercise. I'll read off a couple of attributes, and I want you to think of this person in your mind. The first: a person with an undergraduate degree in physics, chemistry, and mathematics, an MBA from an Ivy League university, and CEO of a Fortune 500 company. The person I'm describing is Indra Nooyi, who is the CEO of PepsiCo. 
I want you to imagine now the person I am describing: a white male, married, father of two. Imagine in your mind's eye what that looks like. The person I'm describing is Neil Patrick Harris, pictured here with his two children and his husband. The next: a white female physician, winner of multiple marathons. The person I'm describing is Cheri Blauwet, who's a physician in our physical medicine and rehabilitation hospital, and who I don't believe competed on Monday, but has competed in many Boston Marathons. And last: an African American woman, single mother, with a history of substance abuse. Many of these taglines are things that we hear bandied about in our hospital conversations. This is a description of Maya Angelou. When we think of those words and what they connote in our heads, they're linking into patterns that we've been exposed to, unconscious patterns. The impressions those words generate are some of the first clues to our unconscious associations. We're constantly filtering and putting people into buckets or categories, using labels. In general, in the cultural context that most of us in this room have grown up in, and in many other parts of the world as well: when we think of a leader, we're more likely to think of a man. When we think of an athlete, we think of someone without physical disabilities. When we think of marriage, we might more often think of a man and a woman. And when we think of addiction, we might assume limited potential. I've been fascinated by the ways in which the one-liners we use for identifying patients or clinical situations often come loaded with those associations. So that's just the start, the tipping point: where do our unconscious biases manifest? How might we react differently to a few different kinds of words about the same situation? This topic of unconscious bias is in the news right now. 
I'm sure many of you heard about a very recent episode that occurred at Starbucks, where two African-Americans were told they could not use the restroom and to leave the store because they weren't buying anything. And there was a police response, which is its own separate piece. Starbucks responded by saying they're going to close all of their stores for an entire day and do racial unconscious bias training. These moments of a differential response to race, or to another attribute, are embedded in our daily lives and are reinforced by cultural perceptions and biases that have been woven into the culture in a way that we'll discuss. The AAMC has taken this on. This is their definition, referring to social stereotypes about certain demographics or groups of people that form outside our own conscious awareness. None of us want our medical decision making or our professional interactions to be governed by these forces, and yet we need to become aware of what they are so that we can minimize that possibility. So what is bias? Bias in and of itself isn't a bad thing. It's an automatic response. It's a filter that we bring to sort the world, and we're constantly filtering information. This has been the topic of several recent books. We were just discussing Daniel Kahneman's book, Thinking, Fast and Slow; Malcolm Gladwell wrote Blink; and Mahzarin Banaji, who is really a leader in this work, recently published a book called Blindspot. So this is very much a topical conversation within our culture at the moment. If we look at this picture, some of you might see adventure, excitement, acceleration. As a mother of two boys, I might see danger and fear. But what is actually there are these things: silk, ropes, gear, a person. This is what we're doing all the time. We're looking at scenarios and bringing our perceptions to them, not just seeing the elements of what's there. That's the unconscious process. 
In terms of unconscious bias, we're talking about social stereotypes that form outside our own conscious awareness, and these are formed by our exposure to our external world. For the most part, I'm going to be talking about the experience that many of us have had by being raised in US culture, though within the US there is certainly variation across different parts of the country and different family contexts. Many of these biases are seen across multiple populations, but again, we have to define this within a cultural experience. These unconscious biases form because our brain needs to filter something like 4 million pieces of information every minute. So we're constantly dichotomizing and sorting the world into categories, and this efficient processing can come at the cost of accuracy. I want to emphasize that our unconscious biases are often at odds with our explicit beliefs. For instance, I'm somebody who has spent the last many years of my life working in the Office for Women's Careers. I have a very explicit bias towards reducing gender bias in many different ways. But when I took an implicit association test on certain domains, I still had implicit gender bias, which really is a result of my cultural background, and I need to understand how it might manifest in my own thinking. So these two things don't necessarily go together. Racism and unconscious race bias, implicit race bias, are not the same thing. There's a fascinating neuroscience behind this; I won't go into a lot of detail. These biases form through our need to use what some people call the fast brain, or System 1, which very efficiently handles information, and which in medicine we have to use all the time: that initial impression, is this patient sick? There are many pieces of data that we're assimilating and looking for patterns in so we know how to respond. 
This is very advantageous in a crisis and in the very repetitive tasks that we do all the time. It is tied into the limbic system, and so can be tied into emotion, which sometimes can be unhelpful. And at the cost of this speed, we can make false links and errors in our response. We also spend a lot of time in medicine talking about the slow brain: the differential diagnosis, the thoughtful weighing of pros and cons and consequences. The disadvantage of that system is that it takes energy and effort; we can't live in that space all the time. So we're constantly switching back and forth between these two, and we have to live in both spaces. We just need to understand what might be going on in our unconscious processing. There are many different kinds of assumptions that we make, and often these episodes are the thing that we remember, at least in my case, as I'm falling asleep at night and I think, I can't believe I said that: those foot-in-mouth moments where we realize, by what we've said, that we've made an assumption that wasn't true. That might be assuming somebody's gender or race on the basis of their name. It might be something about their sexual orientation. In my primary care practice, I had an episode where a patient I had met just once or twice before called me and said that she really needed to see me, that she didn't feel safe at home. I arranged to meet her in my clinic at the start of the day, and I started talking with her. I had only met her one other time, and not for my usual intake appointment; it was for an urgent care appointment. I made the assumption that the person she was afraid of was a man, and that she was straight. And that wasn't the case. She had a female partner who she was feeling unsafe about. We recovered that moment, and I learned to ask that question in a way that would make no assumptions. So there are ways we can defend against our own biases. 
It could have led to an outcome where she didn't feel comfortable disclosing to me, and I wouldn't have been able to help her. So these kinds of assumptions are embedded in our thinking and have very real consequences in our clinical lives. Again, there are ways to work around them, in how we talk, but also in examining our own processes. I'll start with some data on height implicit bias. I think nobody would say that there's a GWAS showing that height and intelligence are linked. But each inch of height is associated with a higher salary in this culture. In the general population, only about 14 and a half percent of men are taller than six feet, while more than 58% of Fortune 500 CEOs are taller than six feet. So how might this happen? Why is this? As Malcolm Gladwell says, no one ever says of a CEO candidate, he's too short. But we have linked physical stature and physical power with leadership in our culture, in a way that reinforces the leadership potential of that slightly taller person, probably starting very, very young and continuing on through later life. There are people who analyze presidential races, and the taller candidate almost always wins. So there's this connotation we have between physical size and leadership. And again, in our minds, we would say, I don't really believe that's true. I don't need a tall leader in our current context to protect us; this isn't a physical danger vote. But that link is still there, and it actually shows up in salary. So we can't even see these blind spots unless we start to look for them. I mentioned Dr. Banaji and her recent book Blindspot. She defines it this way: implicit biases come from the culture. I think of them as the thumbprint of the culture on our minds. Human beings have the ability to learn to associate two things together very quickly. That is innate. What we teach ourselves, what we choose to associate, is up to us. 
That last part gives me hope, because I do believe we can make new associations and find ways to mitigate this. But we have to recognize, again, this thumbprint of the culture to start. This has a lot of relevance for discussions going on on the front pages of the Boston Globe about healthcare access and race, and in our medical journals looking at outcomes and disparities. There are many things that drive health disparities, including access and trust. But there's some very challenging data showing that provider-held unconscious bias also plays a role, and again, it is one that we personally need to reckon with. In a landmark paper from 1999 in the New England Journal, more than 700 physicians at an annual meeting of the American College of Physicians were asked to watch a video, were given the same set of standardized data, EKGs, et cetera, and were asked to estimate the likelihood of coronary heart disease and whether or not they would recommend cardiac catheterization based on this interview and data. The scripts were entirely the same. The actors were male and female, black and white, younger and older. They analyzed the data to look at how likely a physician was to refer the patient for cardiac catheterization. Women were 40% less likely, and blacks were 40% less likely, to be referred for cardiac catheterization, respectively. And black women, the intersectionality of both of those attributes, were 60% less likely to be referred for cardiac catheterization with the identical EKG and clinical history. Now, we as a profession have been discussing whether or not our rates of cardiac catheterization are too high, and that's a whole separate conversation. But we don't want to be making that decision implicitly based on race or gender. We want to be making that decision based on data. And yet there are many, many studies that show similar results, that these unconscious biases might be influencing our decision making. 
In a study that made the next step, to our underlying implicit bias, pediatricians were given four treatment vignettes: one for pain management after open reduction of a fracture, one for UTI, one for ADHD, and another for asthma. The only difference in these scenarios was the race of the children. The pediatricians' recommendations for treatment were similar for UTI, ADHD, and asthma. But the recommendations for narcotics for black patients were significantly lower in postoperative care. And when the pediatricians' implicit attitudes were surveyed, they correlated with that recommendation, meaning if there was more implicit racial bias, that pediatrician was more likely to not give narcotics postoperatively. Again, we can have a whole discussion about whether our rates of postoperative narcotics are too high or too low. But I think we could all agree that they shouldn't be decided by an attribute such as race. This data has also been shown quite clearly in hiring, and we spend a lot of time hiring fellows, lab techs, and faculty members. In a 2003 study using identical resumes with different names, some with stereotypically African-American-sounding names and some with stereotypically white names, the same resume was about 50% more likely to get a callback for an interview if it had a white name. Same attributes, everything else the same on the resume; the only thing different is the name. More recently, and closer to home, in 2012 Jo Handelsman's group at Yale published in PNAS a study that showed similar findings for a lab manager position. CVs for lab managers were given either female or male names and circulated to division chairs and chiefs in the sciences across the country, to both male and female faculty raters. Male applicants were offered a 14% higher starting salary and a recommendation towards more career mentoring than the female applicants. 
And faculty gender did not affect responses. The women faculty were just as likely to make that difference in pay and mentoring for the male rather than the female candidate. Again, these weren't real male and female candidates; they were identical resumes with different names. So that's another example, along gender lines, in addition to race. So where can we find bias? Well, we can find it in lots of places, but we can't fix it until we see it, until we start to look for it. I spend a lot of time doing behavioral counseling in primary care, and I've come to think of unconscious bias, or implicit bias, as a habit. And like any habit, becoming aware of the habit and motivated to change it are the first necessary steps. That means we have to assess both ourselves and our processes. There are a number of tools for assessing implicit bias. How many people here have taken an implicit bias test at some point? Okay, so a smattering of people, great. There is a link, and I'm sure it'll be available through the department as well, to take an implicit bias test. Implicit bias testing basically measures response time and our ability to categorize different kinds of attributes. So I might have to categorize male and female, and work and home, and how quickly and accurately I can sort those into categories as the labels are switched has reproducibly been shown to link to our own implicit biases. In fact, when you do the test, you can sometimes even feel it: it's harder for me to connect certain attributes together. Other ways are to really start this process of reflection for ourselves and to get feedback from other people. And there's a tool that I sometimes use, which is to flip the situation and try to see whether I would have responded in the same way if I changed the gender, if I changed the race, if I changed the sexual orientation. 
So we might look at these lists of words and wonder, would we have the same response? Would we be equally likely to focus on supportive, emotional, helpful, sensitive, and fragile for a male candidate as for a female candidate? Flipping our language or our assumptions can sometimes reveal some of that bias. I also believe strongly that we have to assess institutional biases. Some of our best responses can be in looking at our processes and trying to find ways to minimize their impact. We really need to look through the full spectrum of what we do in an academic medical center: clinical care, selection, curricula, mentorship, and communication. Bias in clinical care I touched on in some of the examples that I gave. It's been shown that patients pick up on how a physician communicates with her or his body language and verbal cues, and it can have a big impact on how our patients respond to or adhere to treatment, and on their trust of the medical system. This is an example from the Joint Commission, where an X-ray was put up on rounds and the team was puzzling over what was going on with this child. Somebody just walking down the hall said, wow, that's a case of cystic fibrosis. The team had been puzzling because this was a black child, and they had been pulled into the heuristic that cystic fibrosis is predominantly a white disease. Faced with the X-ray data, knowing the patient's race was actually blinding them to perhaps the right diagnosis. And I feel like our board questions and our experience as medical students can all sometimes work to reinforce these kinds of heuristic dangers. A 24-year-old African-American woman with cough: I know I'm supposed to answer sarcoid, because that's what's going to get me the right answer on my boards. And now I've introduced that bias to a degree that I might not consider the diagnosis at other times, unless we break that bias. 
That's not necessarily an unconscious cultural bias, but it's actually a reinforced bias that can come out in our medical teaching, and one that obviously can influence care. Thinking about how bias could manifest in care, the first things we need to do are to look at outcomes stratified by these attributes, to look at language, to get patient feedback, and to think about how we can incorporate an evidence-based design that will minimize bias. I've given this talk to a lot of different audiences, and a year or two ago I was speaking with a group of OB-GYNs, and they had an instance where a patient came in late in pregnancy, I think about 36 weeks, with bleeding. The patient and her partner were both very well dressed and had the appearance of high social class. In this context, late-term bleeding is associated with drug use, but the doctor evaluating the patient looked at other attributes of that patient and did not send drug testing. They delivered the baby, and the baby went into withdrawal in the NICU, because they hadn't considered drug abuse in that patient because of that positive bias, perhaps identifying with the patient who came in. The unit instituted a policy that regardless of any other attribute, whether you're the department chair, whether you're someone they have taken care of before or not, everyone with that presentation would get a tox screen, to help undo that element of bias. So there are ways we can make our processes less influenced by bias, with policies and procedures that protect against the way it can creep in. We spend a lot of time interviewing candidates: thinking about staff positions, interviewing medical students for residencies and residents for fellowship, and faculty. It's important to start the process by recognizing what factors we are weighting and why, who's being included in the discussion, and what kinds of biases our interviewers might have. 
The more we've had these discussions, the more we see how far these biases can range: what are our biases about different kinds of schools, and are they right? Different kinds of pedigrees? How much time are we giving different candidates, and is there a differential there? Are we asking our questions differently of different candidates? There are significant studies showing that addressing implicit bias can improve our yield of more diverse applicants and better selection. Some strategies that might be used include thinking a lot about the selection committee and the perspectives on hiring and search committees; thinking about how the style or origin of references might influence us or foster bias; using structured interviews and evaluation and avoiding gut impressions, because those gut impressions are usually the signal from our unconscious bias; and then looking at data and creating accountability. In our teaching, we often reinforce some of these biases, and we can think about how we define normal, and how and when we use designations of race or sexual orientation or body mass index. What does that mean, and what are we reinforcing when we use it? We can also look for counter-stereotype examples, diversifying our standard, quote unquote, patients, so that we're not just describing, let's say, African American race for a condition with which it is stereotypically associated, but showing that race is a part of the presentation for any condition, and getting feedback from our trainees as well. We can exhibit bias in our mentorship and encouragement of others. Really, this shows up most in who we choose to help or mentor, and whether we are preferentially choosing to help people who are like us. 
This is a danger that I recognize both in mentorship and in interviewing. If I had a candidate who had a similar path to medicine as I had, or who came from a school that I knew well or one that I had attended, I would have these areas of resonance. That was great, it made things comfortable, but it might have influenced how much time I spent with that candidate, how much I got to know that candidate as opposed to another. Am I choosing to help people who seem more like me, or who resonate with attributes that I have, rather than across the spectrum? So we should look at our own outcomes in terms of whom we mentor, and ask why. There's been a lot written recently on bias in communication, with whole words like mansplaining generated to talk about it. Look at who is represented in meetings, where people sit, who speaks, and whose suggestions are seconded and endorsed: these are not immune from bias in any way. In fact, in time series where you look at them, race, gender, and age have a big impact, and to get the best outcomes from our diversity, we need to be able to actually engage the full spectrum of people in our midst. So I'm going to end with some perspectives that really are individual strategies, which play out perhaps more in our personal interactions as well as in our clinical interactions. The first is perspective taking. This seems kind of obvious in some ways; it's really just a conscious attempt to envision another person's viewpoint. We talked about how unconscious bias is this categorization, this sorting into different categories. Perspective taking is an attempt to dive below those categories. In one study, nurses were told to use their best judgment in pain management for their patients, and similar to the study we saw about the pediatric patients, they recommended pain medication significantly more often for white than for black patients. 
However, with a small intervention, the nurse being instructed to imagine how the patient felt, to assess that, and then make the judgment, they recommended equal treatment regardless of race. So there's this moment of moving beyond what was perhaps the implicit bias that the nurses themselves were not aware of, and really focusing on that individual patient. If we take on their perspective and immerse ourselves in the individual, we're less likely to fall back on these unconscious biases. Empathy, the most basic of our emotional tools, can be a powerful tool against unconscious bias. This is similar to, or might be called in a different way, individuating. If you think of a clinical encounter, sitting down and talking with a patient, by consciously focusing on the unique traits rather than the social categories to which a patient belongs, we can move beyond these biases. This is where I'm particularly wary of the one-liners, which can be really packed with the kinds of general statements that trigger all kinds of responses in us. When we move beyond that to something more granular, more individuated, we're less likely to pull in whatever unconscious bias might come with those target words, and we minimize the impact of something like race or gender. So that could mean redesigning our short phrase to include something more salient with regard to that individual. It might still include identifiers that deal with race or gender, but it counters the generalizing impulse. Maybe that's their profession; maybe that's another piece of information about that individual. It's a few moments of focusing on the individual. If you meet an Irish guy from South Boston, you have met one Irish guy from South Boston. We might have an idea of what an Irish guy from South Boston is like, but that's the group. If we know one, we know that one person; we can't let him stand in for the whole group. We have to go beyond that to the individual. 
So as we are in a situation, we can think about: why am I responding this way? Am I thinking or reacting? Is this a bias? And we can think about how to transition to the slow brain. In my outpatient practice, it's a way of monitoring how I am responding. Do I have a negative feeling, for whatever reason, towards this situation or this patient? I try to understand where that is coming from. It's not wrong to have the feeling, but if I don't note it, and I don't get to this process of thinking about it and reflecting on it, it might interfere with my care. Concrete, standardized data assessment can help reduce bias. Again, if we have a clinical decision tool that's based on data, it can help. An example is in COPD: accurate diagnosis in women was much lower than in men when based on symptoms, but when providers were given spirometry data, concrete data, rates were the same. So data can prevent us from filling in the partial, stereotype-based assumptions that can otherwise lead us astray. Counter-stereotype messages also can have a huge impact: imagining in detail people who violate expected stereotypes in a positive way, and thinking about those positive examples. Seeing them, and seeing them called out, can help combat this culturally reinforced bias. Dr. Banaji, in a talk, described how her screen saver has pictures of powerful African-American leaders, of women in science, of caretaking men. By seeing these examples, we start to ever so slightly undo whatever category we might otherwise default to: the dominant pictures that we are seeing all the time in our culture, on our walls, in our media, in what we see from day to day. When we do exhibit or see an episode of unconscious bias, and this comes out very much in the episode being discussed at Starbucks, we can think about: why did you say that? Why did you think that? Being curious and open-ended in our questions. 
If we're on the receiving end of that bias, it can be really painful. And we can also be on the side of causing that discomfort, having made an unconscious assumption that could offend someone else. It's also important to know that there are places to go. If you feel that you personally have been on the receiving end, you can talk with mentors, chief residents, the Diversity and Cultural Competency Council, or Melissa Broderick, who's an HMS ombudsperson. And you can consider filing a safety report. In our teams, we can work to minimize how this affects us as teams. As we create diverse teams and committees, the real key is to get to know people individually. There was a fascinating study of freshmen paired with someone of a different race as their freshman dorm roommate. Implicit bias was measured before and after, and the experience of living and working closely with someone of a different race reduced implicit bias among those students. Exposure crossing the boundaries of any category can help; getting to know each other across differences can make a difference in reducing our unconscious bias. Focus on concrete positive and negative factors and on data, rather than relying on gut feelings. And we can be an ally and call out bias and exclusion when we see it, and advocate for ways to reduce its impact. So in closing, I want to emphasize that we all have unconscious bias, and that despite our best intentions, unconscious bias can have negative effects on our personal and professional interactions. A starting place is to look at our own biases and processes and to consider how to mitigate them. I'd encourage you to pause and think about two things you might do in your own life to address your biases in the coming week. If you haven't taken an implicit bias test, that can be one of them. And I thank you for your attention. Thank you for an excellent talk. 
In terms of implicit bias, do certain implicit biases also result in overtreatment rather than undertreatment? That's a great question. There's actually a lot of data to show that positive bias can have a negative effect as well. That tends to be in situations such as when we treat fellow physicians, which can go both ways: we might want to spare them embarrassing or uncomfortable testing, but we might also overtest. And we actually know that there are a lot of dangers in overtesting. So positive and negative biases can be equally damaging in different ways. But absolutely, there are ones that result in overtreatment. And in that catheterization example, I don't know what the right number is. It may be that white men were being overtreated. So it can go in both directions, and most often overtreatment might occur in the categories that carry these positive biases, which in our culture would be things associated with higher status and power. That's a good question. Thank you very much. OK, thank you.