July 21-22, 2021 / Online

Designing Backwards

Written by Sheryl Cababa

March 1, 2021

Transcript

Hi, thank you for joining me for my talk. I'm going to be talking about outcomes, and how we as designers can be more intentional about using outcomes as a lens through which we design.

My name is Sheryl Cababa, and I'm a VP of Strategy at a consultancy in Seattle called Substantial. These are our beautiful offices that I am not working out of right now; I'm working out of my home. And yeah, I miss my colleagues terribly, and we are all just trying to deal with our experiences in the pandemic right now.

I'm also a board member of Design in Public, which is an organization in Seattle. And I run and teach workshops all over the world, in person before the pandemic hit and now virtually, that are oriented around systems thinking: helping designers integrate systems thinking and use outcome-centered design as a framework for their design work.

So at Substantial, we believe in the power of technological innovation to improve our lives and society. This isn't disingenuous, even though it sounds like every mission statement from every company. I've worked in tech for more than 20 years, part of it as a product designer, and most recently as a consultant. I've seen it all, and I genuinely believe this. I also see the problems that are happening in tech today, and part of what I feel my job is, is to help designers and technologists better equip themselves to think about the potential side effects of our work.

And what that makes me think about all the time is one of my favorite classic movies, Jurassic Park. Which, by the way, I heard that in June, Jurassic Park was the highest-grossing movie in the United States, because of all the drive-in movie theaters that are making a comeback during the pandemic. I like to think it's maybe also because people are thinking critically about the unintended consequences of things. But even if they're not, it's still a great movie. And for those of you who haven't seen it, I am going to try to sum it up in 60 seconds. We're talking about the first Jurassic Park, not any of the sequels, which I don't know if I've actually even seen.

So it starts out with a rich old white man who, as you can see here, dresses like a colonizer. He gets his hands on some dinosaur DNA in the form of a mosquito trapped in amber. And because he has narrow capitalist thinking, it leads him to think that a theme park would be an awesome use of the power he just discovered. Before he opens the theme park, he decides to stress-test the situation by getting together a bunch of experts, like paleontologists and a chaotician. I'll get to him a little bit more later.

Some good things happen: they watch gentle dinosaurs munching on leaves in the trees. They watch dinosaur babies being hatched, with a very young BD Wong looking on. And then some terrible things happen too, of course, like a T-Rex on the attack. Of course they added a T-Rex to the situation. And Velociraptors in the kitchen with the old colonizer's grandchildren, who he somehow decides to bring on this adventure, which is completely wild to me. And of course, in the end, the humans lose control of the dinosaurs and that's the end of the theme park, spoiler alert. But then again, of course, they do this over and over again for like five other movies. But that's a talk for another day; they didn't learn anything.

So what does this have to do with tech? Well, I'm really interested in Jeff Goldblum's character in this movie, a mathematician called Dr. Ian Malcolm, who, aside from looking like a terrible Renaissance painting, makes it clear he's the only one thinking about how things could possibly go wrong. So he says things like, "Your scientists were so preoccupied with whether they could, they didn't stop to think if they should."

I see this quoted all the time these days, and I can see why, based on the world that we're in. He also says, "Oh God, help us. We're in the hands of engineers," which is a little bit of how I feel after having worked in the tech industry for decades. He was the only one thinking about the possibility of unintended consequences to what they were doing, and the fact that not everything plays out the way we want it to. And so we can just replace this with Silicon Valley, and talk about how we've created our own Jurassic Parks in today's tech environment.

You know, what are some of the things we've wrought? For example, just thinking about social media: companies like Facebook failed to foresee their role as a vehicle for disinformation and extremism, resulting in the most efficient way to connect extremist groups. And societal polarization is linked to those of us in tech who are facilitating it. Even if this man on the left's actions are not specifically due to his interactions with technology, we're making it easier for people like him to rely on disinformation.

I think a lot about this war room that Facebook had in 2018 during the election. And it feels like they're trying to respond, but in these super reactionary ways, like they're playing Whack-A-Mole. Reacting is basically like the kid in Jurassic Park: "It's a Unix system, I know this." They're trying to respond by throwing resources at the problem, but they would probably be better off if they could anticipate it in the first place.

And so if we think again about Dr. Ian Malcolm, let's just replace "life finds a way" with "negative externalities find a way": racism, trolls, addiction, rich people, terrible companies, bad governments. The list is endless. And I think one thing for us to understand is that these things are not inevitable. Digital technology does not need to be a garbage fire of polarization, violation of privacy, addiction, and inequality.

We need to better anticipate and respond to the ramifications of our work. We can't just put our heads in the sand in the name of "technology needs to move fast and break things." Because we're actually helping to break people's lives right now, breaking democracy as part of our work, and that's not inevitable.

So just in my years of trying to focus on more responsible applications of technology, I've thought about three principles for avoiding unintended consequences of our design work. One of the things that has helped me in this endeavor of baking these ideas into our practices is that I've worked on and contributed to toolkits for engaging in more responsible tech, including the Tarot Cards of Tech, on the left, for my previous company, and Omidyar Network's Ethical Explorer, on the right, which is a really great resource that just came out last month. I've also helped Microsoft with some tools for their responsible AI efforts.

And as a design consultant, I think tools such as these are ways to have concrete conversations about ethics and responsible technology with organizations. They're a framework for when organizations don't know where to begin, for how to start those conversations and make them part of our work.

So the first principle to remember in working in tech: acknowledge that your tech is not neutral.

Look at that, Jeff Goldblum just ages so well.

So recently, just last month, Mark Zuckerberg said, "I just believe strongly that Facebook shouldn't be the arbiter of truth of everything that people say online." He said this on Fox News, and I think he has a very hard-line idea of what free speech means, without understanding that he's the one who is in charge of creating the square in which speech happens.

But it's interesting, because Facebook's own research has been revealed to show that more than 60% of people who belong to Facebook groups that researchers categorize as extremist found each other by way of Facebook's own recommendation algorithm.

So Facebook is actually driving extremists to each other. And not only that, but there's been this thing happening in the past several weeks where, in many rural towns in the US, people think that Antifa is coming to their town. And you can see the trajectory of how misinformation like this spreads: it starts with maybe a single user sharing it in a group, and if you look at a chart like this, it ends up in the posts of more and more extremist groups.
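To make that trajectory concrete, here's a toy model of the dynamic: a post starts with one user in one group and hops along group-recommendation links. This is purely illustrative, with invented group names and edges; it is a sketch of the spread pattern described above, not Facebook's actual systems.

```python
# Toy model of the spread trajectory described above. Group names and
# recommendation edges are invented for illustration only.
from collections import deque

# Assumed edges: "members of group A get recommended group B"
RECOMMENDED = {
    "local news group": ["concerned citizens"],
    "concerned citizens": ["militia watch", "patriot patrol"],
    "militia watch": ["extremist cell"],
    "patriot patrol": [],
    "extremist cell": [],
}

def spread(seed_group: str) -> list[str]:
    """Breadth-first walk over recommendation edges: every group a
    post can reach once one user shares it in the seed group."""
    reached, queue = {seed_group}, deque([seed_group])
    while queue:
        for nxt in RECOMMENDED[queue.popleft()]:
            if nxt not in reached:
                reached.add(nxt)
                queue.append(nxt)
    return sorted(reached)

# One share in a mainstream group becomes reachable by the fringe.
print(spread("local news group"))
# ['concerned citizens', 'extremist cell', 'local news group',
#  'militia watch', 'patriot patrol']
```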

And so this is part of the systemic impact of what's happening, and it's not neutral. It's us directing the information. And this piece that I'm pointing to has basically become an evergreen piece by Zeynep Tufekci in The New York Times from two years ago, about YouTube and its algorithms. There's basically a perspective that on YouTube, and this is a fact, not a perspective, people stay on the platform longer if they're fed more extreme content.

That's not a neutral platform.

If you think about the concept in behavioral economics that there's no neutral default: YouTube is like the junk food of information. It feeds us content that is worse and worse for us, especially if you look at the recommendation engine. Why? Because it keeps us on the platform longer. And we need to keep in mind that YouTube is the most popular social network for those below the age of 18. I think TikTok is quickly catching up.

So we are actually shaping future ideas and opinions by way of these algorithms. And what it boils down to is that by focusing on a single metric, engagement, we're shaping people's behaviors in unintended ways.
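A minimal sketch can show what single-metric optimization looks like in practice. Everything here is hypothetical: the field names, numbers, and weights are assumptions, not any platform's real ranking code. It simply contrasts ranking purely by predicted engagement with ranking that also weighs content quality.

```python
# Hypothetical sketch: ranking by a single engagement metric versus
# ranking that also weighs content quality. All values are assumed.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # the engagement signal
    quality_score: float            # 0.0 (junk) to 1.0 (reputable)

def rank_by_engagement(candidates: list[Video]) -> list[Video]:
    """Single-metric ranking: whatever keeps people watching wins."""
    return sorted(candidates,
                  key=lambda v: v.predicted_watch_minutes, reverse=True)

def rank_with_quality(candidates: list[Video],
                      quality_weight: float = 0.5) -> list[Video]:
    """Blended ranking: trades away some predicted watch time to
    surface higher-quality content."""
    def score(v: Video) -> float:
        # Scale quality to roughly the same range as watch minutes.
        return ((1 - quality_weight) * v.predicted_watch_minutes
                + quality_weight * v.quality_score * 60)
    return sorted(candidates, key=score, reverse=True)

feed = [
    Video("Outrage conspiracy clip", 42.0, 0.1),
    Video("Public-health explainer", 18.0, 0.9),
]
print([v.title for v in rank_by_engagement(feed)])  # conspiracy clip first
print([v.title for v in rank_with_quality(feed)])   # explainer first
```

The engagement-only ranking surfaces the outrage clip first; the blended ranking deliberately gives up some predicted watch time to surface the explainer. That trade is the decision platforms are making, whether or not they make it explicitly.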

When I was doing research the other year, I was talking to somebody who's an expert in social networks and their impact. And she told me about a YouTube engineer she knows who left YouTube. He told them, "It just feels like we're not being careful about the content that people are actually seeing. My only metrics for success are that more and more people stay on the platform longer, watching more and more videos, and watching longer and longer videos." And he ended up leaving because he felt like they didn't take this seriously.

It doesn't have to be that way. If you're willing to give up some traffic, you can make better decisions.

So take Pinterest, in contrast to Facebook or YouTube. They're giving up traffic in exchange for a platform that actively tries to prohibit disinformation. This is their page if you look up the word vaccine. And this is super relevant today, because there's so much disinformation and misinformation out there about the coronavirus vaccine that even if you search "coronavirus" on Pinterest, they're only putting up information from reputable sources. So this is an active decision they're making to keep their platform from being a hotbed of disinformation.
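Pinterest hasn't published its implementation, but the underlying pattern is simple enough to sketch. In this hypothetical example, the term list, domains, and documents are all assumptions: sensitive queries are gated so that only allowlisted sources come back, accepting the lost traffic as the cost.

```python
# Hypothetical sketch of Pinterest-style gating: for queries on topics
# prone to misinformation, return only results from vetted sources.
# The term list, domains, and documents are assumptions.
SENSITIVE_TERMS = {"vaccine", "coronavirus", "covid"}
REPUTABLE_SOURCES = {"who.int", "cdc.gov"}

def is_sensitive(query: str) -> bool:
    """Flag queries that touch misinformation-prone topics."""
    return any(term in query.lower().split() for term in SENSITIVE_TERMS)

def search(query: str, index: list[dict]) -> list[dict]:
    """Match documents; for sensitive queries, keep only allowlisted
    domains, deliberately accepting the lost traffic."""
    matches = [d for d in index if query.lower() in d["text"].lower()]
    if is_sensitive(query):
        matches = [d for d in matches if d["domain"] in REPUTABLE_SOURCES]
    return matches

index = [
    {"domain": "who.int", "text": "Vaccine safety facts"},
    {"domain": "conspiracies.example", "text": "Vaccine microchip hoax"},
]
for doc in search("vaccine", index):
    print(doc["domain"], "-", doc["text"])  # only who.int survives
```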

So some of the things you might want to ask yourself, especially if you're working on huge technology products at scale: Does your tech have the ability to influence people's worldviews? And do you have systems in place to prevent the dissemination of falsehoods?

So the second principle is to know your values and stick to them.

I think one of the things I think about specifically is that oftentimes we get caught up in trading away our own values for the work we're doing, without really realizing that that's what we're doing. This could apply to building things in such a way that our products can only be successful if people use them in ways that you understand to be dangerous or destructive or unhealthy.

So I have kids, and for the past several years I have been fascinated by this ongoing story about how tech moguls send their kids to schools that forbid technology in the classroom. Like, wow, they're in Montessori schools. Not sure what's happening with these schools now during COVID, but I'm very curious. And you have major technology leaders who have said things like, "We limit how much technology our kids use at home." The person who said this is Steve Jobs, and he's not the only one.

Bill Gates didn't let his kids use mobile phones until they were 14. And last year, there was this story in The New York Times about how parents are policing their nannies: not only do they not allow the kids to use digital technologies and have screen time, they also have the nannies sign agreements that they themselves won't use their phones. And I saw this quote in that story: "The people who are closest to tech are the most strict about it at home." Which makes you consider, well, what do they know that the rest of us don't? Well, they know how addictive their products are.

It's interesting that the companies these tech leaders are guiding continue to design products for other people's children, right? For schools, to be integrated within classroom experiences. And they understand what that is doing to us; that's why they require less engagement of their own children. When you understand the implications, you have a more clear-eyed view of what you're designing. So as technologists and designers, we should ask ourselves: do we want to be on the receiving end of this manipulation?

I really like this quote; I think Joe Biden said it a few years ago: "Don't tell me what you value. Show me your budget, and I'll tell you what you value."

And I think if you ask this of the organizations with which you're working, you can see how their decisions are made. How do things affect the bottom line? Do you feel like the work that you're doing has an impact on negative outcomes in our society? You see, for example, YouTube's inaction on racism and homophobia on their platform. They're slow to act when it affects their engagement and user levels, which tie to their bottom line.

And you see, for example, that right now Facebook is really suffering the repercussions of driving traffic in what they view as an agnostic way, because advertisers are now boycotting due to public pressure. So I think there are ways you can consider what is happening in your own organization, and what is driving the decision-making.

I think often, when I talk with designers about this type of values-driven decision-making, they tell me that it's hard to align their work with their values when decisions are made above their pay grade. So I try to use this example of one of my favorite designers.

This is Marine. She was an intern I worked with a couple of years ago, and she's a talented designer. Back then, she was an intern at a consultancy, which, when you're dealing with clients, is probably the lowest level of decision-making power you can have: you're not only not within the client organization, you're outside of it, and you're an intern there. But it was interesting. She believes a lot in inclusive design, and she was given these use cases oriented around making things easier for people who fall into the category of, I guess, I don't know, hotel memberships or frequent fliers. The kind of people who are already hugely privileged, and making their experiences easier.

And she basically worked with the client to say, I don't think this is a use case that I should be working on; instead, could we think about how to orient this experience around people with a specific type of disability? She made the case for it, and it ended up being a super engaging use case for them, because it solved other problems by way of that as well.

And so I think these are the ways that we as designers can use our own agency to help shape our work and its direction. We don't have to assume that no one will listen to our ideas when it comes to things like equity and inclusion. We can use our own powers of storytelling, our own powers of being able to make a case based on evidence, in order to shape the work that we want to be doing.

So some of the things we should be asking ourselves, especially when it comes to our values: Whose perspective is missing from our product design and development? How could our technology potentially be misused to harm or exclude certain populations? And how might high engagement levels negatively change people's habits or collective social norms?

Lastly, and I feel like most importantly, we should be designing for outcomes.

We should be thinking beyond the direct benefit of use. Those of you who know me know that I always include this slide when I give talks, because I think it captures our moment so well. Designing for outcomes means understanding that your goals and your impact are far broader than just the person who is momentarily using your product. We need to keep our eyes on the big picture. We need to think about societal outcomes, and not just outcomes for the business for which we're working.

So if we think about the history of how design has been perceived by practitioners, you can see how it's evolved throughout the 20th century. We started out orienting around the product: designing things that are appealing and easy to use. You can think about maybe a great piece of furniture. Then, especially with the emergence of digital technologies, we started thinking more broadly about experiences: designing touchpoints that create a desired experience. This maps to the tenets of service design, for example.

So you might have banks, for example, that have branches, call centers, a digital app, and websites. All of these connect to create a desired experience. I think what we're thinking about now is: how do we design experiences that contribute to positive societal outcomes? What it means to understand outcomes is understanding what happens when your design is out in the world and beyond your control. This forces you to think beyond the direct benefits of use and consider: what kind of societal impact is your work going to have, good or bad? And how do we anticipate what might happen with our products and services as a result?

So if you take something like ride sharing. This is from a while ago on the Lyft website. You can see that they've created a super efficient, super usable platform that results in really convenient experiences and corrects for a lot of the failings of other forms of shared transportation. So there's the product: many designers working on an application. There's the experience: connecting customers to drivers. But if we think more broadly about the outcomes, many of them end up being not what these companies have been promoting. There's no path forward from driving to ownership.

Studies show that ride-share drivers often make below minimum wage, and I think this has only been emphasized more since COVID. Ride-hailing services also add to traffic congestion. That may have gone down during COVID, but it isn't resolved, and in fact the lack of public transportation, or people's inability to use public transportation at this time, could put more pressure on congestion. And I think other companies that depend really heavily on what we've been calling the gig economy may fall into this category too.

So you can see this really affecting workers now who don't have a safety net, because they work as contract workers for many of these organizations. If we take something like Grubhub or DoorDash, for example, how would you think about the societal outcomes, which are very mixed right now? You have the product, a convenient digital service oriented around direct use by customers. But if you go one level up, there's a dependency on an exploited contract workforce, and on desperate independent businesses who are engaging in this.

For example, I think with DoorDash, upwards of 70% of the businesses are independent, rather than chains or other kinds of restaurants. And these companies sometimes take as much as a 30% cut off of every transaction. In terms of outcomes, I'm not even sure how this is benefiting some of these companies: DoorDash lost $450 million last year. It's a subsidized service that's not even profitable in and of itself, but it manages to consolidate wealth. And the big question about that is, well, how are we sustaining this?

Uber is the same way. Every ride is subsidized, so how is that a sustainable model, despite the disruptions that it's making?

If you want to think about making decisions for long-term outcomes, I was trying to think of some good examples as well, and I often point to the state that I'm in, Washington, and our vote-by-mail system. We're one of five US states that vote entirely by mail, and on top of that, it's postage-paid. Part of this is benefiting us now, because during COVID people don't want to vote in person, or they can't vote in person. But what's interesting is that this was designed for equitable outcomes all along. It was meant to increase access to voting, including access for marginalized groups. And it's actually a really low-tech solution, but it's the most equitable way that you can vote. And now it's benefiting us in a time when inequities are being so exacerbated.

So if we think about designing for outcomes and thinking beyond the direct benefit of use, we should ask ourselves: How can our technology be used to harm or exclude certain populations? I haven't met a designer yet who thinks that designing for more equitable outcomes is a bad idea, so we do need to think about who's being left out. How might people benefit if our design included people from historically marginalized populations? And do our growth targets result in us compromising our values or harming customers, suppliers, or employees? That last one is something we can ask of these gig economy-type companies that use digital products as a facilitation of convenience.

So in the end, good design begins with asking the right questions. What do we want to happen when our design is out in the world? And I think if we think more carefully about this, and we use the right tools and orient our work toward outcomes, then good design will find a way.

Thank you.
