
An interview with David Dylan Thomas

Written by Geoffrey Daniel

June 7, 2020

Hi there, I’m Geoffrey Daniel, producer for the Design and Content Conference.

This week’s subject matter is not only timely; it’s something we need to continue to address long after the protests end. We’ve spoken with David Dylan Thomas, a content strategist and creator of the Cognitive Bias podcast. David has dedicated 103 episodes to the subject of cognitive bias, including how it structurally disadvantages people of colour.

The change that needs to happen goes beyond cops seeing Black people as criminals. It’s also that doctors see Black people as more capable of enduring pain. Teachers see Black children as older, less disciplined, and more worthy of harsh penalties. And banks see Black people as high risk and less worthy of credit or home ownership.

While I am deeply heartened by the overwhelming support for Black life we’re seeing across the world, there’s more work to be done confronting bias in our hiring practices, our conversations with family members, and our day-to-day lives.

My hope is that this interview with David can help you start that process.

Summary

In this interview, Steve chats with David Dylan Thomas about the sway of cognitive bias over the unconscious decisions we make, the unexpected consequences those decisions have on people, and what we can do to limit bias's influence on the things we design.


Transcript

Steve: Welcome to the "Design and Content Conference" podcast. I'm excited to have David Dylan Thomas here with me today. David serves as principal content strategist at Think Company and has developed digital strategies for major clients in entertainment, healthcare, publishing, finance, and retail. He's the creator, director, and co-producer of "Developing Philly," a web series about the rise of the Philadelphia tech community, and has given standing-room-only presentations at TED NYC, South by Southwest Interactive, Confab, UX Copenhagen, Artifact, and the Wharton Web Conference. Well, we can't guarantee our audience will be standing, but we are so happy to have you here at DCC. Welcome, David.

David: Thanks, glad to be here.

Steve: You had a podcast, "Cognitive Bias," where you simply catalog all of them. Firstly, wow. And secondly, why? Why did you do this?

David: I saw a talk by Iris Bohnet called "Gender Equality by Design" at South by Southwest maybe two, three years ago, and it blew me away. One of the main points she was making is that a lot of gender bias, and bias in general, comes down to pattern recognition. Someone might be hiring a web developer, and in their head they might be thinking, "Skinny white dude." That's just the unbidden image that pops into their head when you say web developer, even if they genuinely believe that anybody can do good web development. So if they see a name at the top of a resume that doesn't match a skinny white dude, they might start to give that resume the side eye, even without intending to.

When I saw that so much pernicious bias could come down to something as simple and human as pattern recognition, I decided I needed to learn everything I possibly could about cognitive bias. So I went to RationalWiki, which has a page listing hundreds of cognitive biases, and I realized I'm not gonna learn that in a day. So one day at a time I took one bias, the bias of the day, and I would learn about it, and move on to the next one, and the next one, to the point where I became the guy who wouldn't shut up about cognitive bias. My friends were eventually like, "Dave, please, just get a podcast." So that's how I went down that path.

Steve: Thinking about cognitive bias, how do you feel that affects our industry when we're looking at designing content?

David: Primarily in two ways. There are the biases that our users have, and depending on how careful we are when we design for our users, we could be either amping up that bias and making it worse, or we could be mitigating it. And if we're really thoughtful, sometimes we can actually use that bias for good. But even more important are the biases that we ourselves as designers bring to that experience, and those are the most dangerous biases, 'cause we don't even realize we have them. We can perpetuate them and pass them on to our users without even knowing it.

Steve: Can you tell us how users typically make decisions on the web and how bias plays into their experience?

David: Sure, so something to understand about bias is it's not something you can just get rid of. Bias is just a fancy word for the fact that we make maybe a trillion decisions throughout the day, like even now I'm deciding where to avert my gaze, or how fast to talk, or what to do with my hands. And if I thought carefully about every one of those decisions, I would never get anything done.

So most of our lives are on autopilot, and that's generally a good thing. But that autopilot sometimes makes mistakes, and those mistakes are what we call biases. Where that can get us into trouble is when those biases lead us to harm someone else.

So going back to that example of the web developer, like if I have a shortcut around the pattern of who gets to be a web developer, who I think of when I see web developer, that can actually hurt somebody if I pass over their resume because they don't quickly fit the pattern I have in my head. So those are the sorts of things that we have to be aware of when we're designing to make sure that we're not making that shortcut worse.

Steve: Do you find that, as people think about the work they're doing and become more aware of their own biases, that changes things for them?

David: So unfortunately, and this is one of the sad truths I learned by doing like 100 episodes of a podcast about bias, almost none of them can be countered just by knowing about them. There are maybe one or two where that helps, but for the most part, even if you know about a bias, you still commit it. So it becomes more about being aware that you have the bias and then building in safeguards to make sure that you don't hurt anyone with it.

A good example might be red team, blue team, which is a design practice that the military uses and that journalists use, which is kind of a weird intersection. The basic idea is you have a blue team; if this were product design, they'd get just about to the point where they're ready to start prototyping. But then for one day the red team would come in, and their whole drive would be to go to war with the blue team and offer a different perspective. They would find all the holes and all the potential harm in that design that the blue team just couldn't see, 'cause they were so locked into their own biases.

So a lot of the time it's about inviting diverse perspectives to the table so that even if they're also biased, they're biased in a different way that allows insight into the thing you're making.

Steve: Ah, I love that, building structures in around you to support you. And yeah, I've heard that as the 10th person philosophy too.

David: Yes.

Steve: Something like that, where someone comes in and says, "Hey, okay, it's my job to look at this in a different way."

David: Absolutely.

Steve: In your talk you allude to some nasty biases that lead users to make bad decisions. Can you give us an example of something like that?

David: Bandwagon effect is a really easy one, right? A lot of us have been in a position where we have to get honest feedback from a client, and it might be a bunch of different people in the room. There's an experiment where, if you have a bunch of people in a room, you show them three different lines plus one reference line, and they have to say which of those lines is most like the reference line. If it's one-on-one and the line that looks the most like the reference is A, they'll say A. But what you do is you have everybody else in the room go first and have them say B. So by the time it gets to you, you say "B," right? 'Cause you've been influenced by all these other people. And so when you are in a room with a lot of people, a lot of stakeholders, it's very easy for maybe the most powerful person in the room to say they love something or they hate something, and then everybody else just shuts up and goes along with it, because of that bandwagon bias.

That's why, and a lot of us know this already, it's a good idea to sometimes get feedback anonymously and say, "Hey, everybody write down on a sticky what you think of this, or what idea you have for that," so that the people who actually disagree about something can know that there are other people in the room who also disagree. That is actually a good way to counter the bandwagon effect: in that experiment, if even one other person in the room says A, you now have the confidence to also say A. So that's just one example of the many that plague our users and our stakeholders when we're trying to design.

Steve: That's so true, and how we bring our biases in as teams is a good thing to think about. So, how do these biases hurt people as we're building our experiences or doing our work?

David: Say we're designing for a very vulnerable audience. I had a friend who told me a story; he's in the tech-for-good space. He had a team that was designing an app that was supposed to allow sex workers to report bad customers who were abusive, or let somebody know. And one of the design elements was a very bright, colorful screen, and someone was able to point out, "Well, this is an app that's gonna be used in spaces where people don't wanna be seen." A big, illuminating screen could actually put a person's life in danger, even if your intent is to save lives. So the bias of the person who's designing that, of simply not knowing and not having had that experience, can actually be very, very harmful. And that's one of those red team, blue team situations where you want a red team to come in and say, "Hey, have you considered what happens if someone uses it in this really bad scenario?"

Steve: Yep, that is so true, that there's a thin layer of experience that we all stand on, and we rely on so many assumptions and biases outside of that, that bringing in other perspectives is critical. How can we architect for bias in a way that helps our users?

David: There are a couple different tactics here that sometimes go against our instincts as designers, but sometimes it's about concealing information. We like to think of design as the artful reveal of information, but sometimes it's better to hold things back. Going back to the resume example, with anonymous resumes you're saying, "I'm gonna remove the design element of the name field," along with any other identifying details that can create bias, like what college you went to, or even the names of certain companies you worked at. And really focus the hiring manager on just the facts of the case, right? What skills do you have? What experience do you have? In a way that isn't gonna be biased, again not consciously but unconsciously, by juxtaposing men with design versus women with design, right? Just take that off the table.

And the really funny story about that: the City of Philadelphia actually created an anonymous hiring protocol for web developers as an experiment, and the first thing they found out was that if you want to anonymize a resume, the best thing to do is physically print it out and have an intern who has no stake in the process take a marker and redact it like a CIA document. The other trick they found was that even once they had an anonymized resume and they liked the qualifications, the natural instinct would be to go to GitHub to see that developer's profile, but as soon as you did that, you saw all their personal information and it ruined the experiment. So they created a Chrome plugin that would, as the page loaded, redact all the personally identifying information. And then they took that plugin code and put it back on GitHub, so it's actually there now if anyone ever wants to use it.
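If you're curious how a plugin like that might work under the hood, here's a minimal sketch of a Chrome extension content script in TypeScript. It's an illustration only: the selectors and placeholder text are assumptions made for this article, not the City of Philadelphia's actual implementation, which, as David mentions, lives on GitHub.

```typescript
// content-script.ts: a hypothetical sketch of the redaction idea above.
// The selectors are illustrative assumptions; real GitHub markup changes
// over time, and the City's actual plugin may work quite differently.

// Profile elements that typically expose personally identifying details.
const IDENTIFYING_SELECTORS = [".p-name", ".p-nickname", ".avatar", ".p-label"];

// Replace each matching element with a neutral placeholder. The
// data-redacted flag stops us from reprocessing elements we've already
// handled, which would otherwise retrigger the MutationObserver forever.
function redact(root: ParentNode): void {
  for (const selector of IDENTIFYING_SELECTORS) {
    root
      .querySelectorAll<HTMLElement>(`${selector}:not([data-redacted])`)
      .forEach((el) => {
        el.setAttribute("data-redacted", "true");
        el.textContent = "[REDACTED]";
        if (el instanceof HTMLImageElement) el.removeAttribute("src");
      });
  }
}

// Redact what's on the page now, then watch for anything the site
// injects later, so no identifying detail flashes into view.
redact(document);
new MutationObserver(() => redact(document)).observe(document.body, {
  childList: true,
  subtree: true,
});
```

The extension's manifest would register this as a content script on profile pages, so the redaction runs before the reviewer's pattern-matching brain ever gets a look.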

Steve: That's a great place to put effort in a process like that, and a fantastic example. Thinking about the talk you'll be giving, are there any key little bits, sort of gold-star moments, that you'd like to hint at in the lead-up to the event?

David: I think my favorite part of the talk is at the end when I really focus in on our biases as designers. I like to think of it as a secret design ethics talk. It's like I lured you in with the promise of bias for users, but actually I'm gonna talk about design ethics, gotcha.

Steve: Nice.

David: Because that's ultimately where it goes. If we're talking about our own biases and wanting them not to scale in a way that's gonna hurt users, we do have to start talking about the process of design and the questions that we ask ourselves when we design. 'Cause at the end of the day, ethics is a lot about what questions you're asking yourself about the things you make. And it turns out there are tons of resources, and I'm gonna have these available virtually for all the attendees, however we wanna distribute them. Lots of people have been working on this problem for a while now, and there are all sorts of great questions to ask yourself and processes to instigate, right? And budget for, right?

Steve: Yes, yes.

David: Because that's the only way things get done, so that you're consistently checking your assumptions. At the end of the day, that's why I feel like this isn't gonna be strange for designers, because our job is to help people question assumptions. This is just another layer of that. Instead of the assumptions we're used to thinking about, like "What do I think my users need versus what they really need?", it's also assumptions like, "What am I even assuming that I don't know I'm assuming, and how do I question that?" So I think it's just applying that at a deeper level. I'm really looking forward to that part of it; that's when it really gets fun for me.

Steve: Oh, that's great, and it sounds really good, and I can tell that it gets fun for you there by the expression on your face.

You've got a book that's gonna be coming out this summer, and it's called "Design for Cognitive Bias." What could you tell us about that? Any kind of preview that you'd like to give?

David: Sure, I mean the talk itself is kind of a microcosm of the book and it was sort of the impetus for the book, but the book really talks about bias and design in three ways.

First it talks about the biases that our users have and the design choices we can make to make their lives better and help them make better decisions. Then I focus a lot on, and I think this is gonna be a pretty popular section 'cause people keep asking about it, the biases that our stakeholders have. It tries to get at why we sometimes have difficulty getting our stakeholders to see things that seem obvious to us, so it has a lot of strategies in there around that. And then the final, and I'd say perhaps most challenging, section at the end is around our own biases and what we can do to make sure that the assumptions we don't even realize we're making don't end up becoming problematic for our users. That's coming out from A Book Apart this summer; I'm really looking forward to people getting a chance to see it.

Steve: Oh, that's excellent. I love the A Book Apart series, and I'm really happy that you're going to be an addition to it.

David: Thanks.

Steve: Thinking about the event too, I know that you haven't been to DCC before. Of course we were hoping to have you in Vancouver, but with the global pandemic this is a much better way of doing this; it keeps everyone healthy and safe. What are some things, or a thing, that you're looking forward to at this event?

David: I mean, I already knew even before we went virtual that there was gonna be a lot of just really awesome people at the event just from looking at the speaker list, so I'm just looking forward to getting to know a lot of people, or getting to know a lot of these people better. I know that Sarah Richards and I have already been chatting about, "Oh we need to meet up." Now it's gonna be a virtual meetup, but still, you know? So I'm just excited to meet a lot of cool people.

Steve: That's great. Well, thank you for taking the time to talk with me today, and we're very excited for your talk this summer.

David: Thanks so much, looking forward to it.
