Storytellers - Anil Dash

September 7, 2016

Speaker - Anil Dash

Anil challenges us as designers and content strategists to make sure we're using our gifts to tell stories that include the most vulnerable, and not stories that are false.

Transcription

I like Vancouver a lot and I like Canada a lot. As mentioned, I’m this person on Twitter. A couple of folks came up to ask, is anybody here my Twitter friend? That’s terrifying. I apologize for everything I’ve ever tweeted. I’d like to disclaim anything you didn’t like; everything that you did like, I did write myself. So I come from the ancient internet, long ago in the before time, when CMSs were even more primitive and hateful, and I’ve had the chance to watch over the last 15 or 20 years how the community that I come from, people that make products and that love the internet and the web, has evolved, and how the rest of the world has evolved along with us as the work we’ve done has become more and more relevant to more and more people. And a lot of what Donna said earlier today really resonated with me.

Because one of the things I realize is that the people who create or lead the creation of digital experiences are in so many ways storytellers, and I think this comes up a lot in your work. I’m sure a lot of you use this terminology when thinking about the work that you do. Obviously the content, the words, the, you know, the writing on the page, are stories that we’re telling users, or hopefully we’re leading them through a certain narrative, but the design and everything else that informs it, our processes themselves, are about creating narratives that people can latch onto. Stories that connect with them and hopefully have meaning to them. And I keep thinking about a certain story that I tell myself every time I’m out in the world, whether I’m using an app or on my computer or just walking down the street. I always come back to this idea of imagine the meeting, and this is a very helpful tool for consuming the world, because if you walk down the street and you see an advertisement and you’re like, that’s really thoughtless and offensive, I immediately go into imagine the meeting. And everybody in the room was like, fine, fine. I always picture, and if it helps you can close your eyes, the conference room. It’s the one that doesn’t have any external windows, so the only lighting is the fluorescent light that buzzes a little bit, gives you a headache. One end of the room has the TV; it’s hung a little crookedly, but you’re trying not to let it bother you, and anybody can use it if they want to plug in their laptop.

There’s a couple of markers on the table, and the ones that are permanent are really clearly marked, and the ones for whiteboards are really clearly marked, and there’s a whiteboard on the wall that shows that nobody knew which markers were which. There’s the water bottles, and you’re supposed to use the cups that they can wash. How many people have been in this room? A lot of people have been in that meeting, so you don’t really need to imagine the meeting. You’ve been there. I’ve been to that meeting. It’s grim sometimes. Sometimes the results are good. Sometimes they’re not. I like to picture this because I think those of us who have had the privilege and the good fortune to be in that meeting, and to see the dude who’s not the highest ranking who keeps sitting at the head of the table, it’s always a dude, we know that that is where a lot of these experiences and these stories are born. And that’s where the products come from, that’s where the apps come from, that is where, increasingly, culture is created. And that’s kind of a scary thing to think about. That room we’re in where, gosh, I wish I had some more caffeine so I could endure this meeting, I just don’t have it in me to fight everybody in this room. That’s where a lot of these decisions are being made. I’ll take one example. Amazon Alexa. It is a little speaker that Amazon sells. Well, Alexa is a cloud-based service for something-something, but it’s a service they offer, and they sell little speakers that you put in your house and you can give them orders. So you can say reorder paper towels, if you order stuff from Amazon and a lot of us do, and then it will show up in your house and Bob’s your uncle, it’s very exciting. And there are skills that you can add to it and there are features. We have one around the house because I like to try out new gadgets. So we were wondering what cool tricks it can do, and I’ve got a five-year-old son who’s really good at figuring these things out.
And he’ll go, play the Star Wars theme, and it will do that. Or I can tell him, you’ve got five minutes left to play with your Legos, and he’ll say, Alexa, set a timer for five minutes, and then he’ll stop playing and it’s not my fault.

[5:00]

That’s nice. And it can tell jokes. It’s got like an endless supply of dad jokes. And there’s a little app where you can see what commands have been given to your Alexa, even if you’re not there, so if I’m traveling I can tell what jokes it’s been telling my son. Plus, because it’s got a microphone that’s always on monitoring your home, it can enable surveillance of your home. Probably unencrypted, and who knows what agencies can monitor that. So that’s another one of the cool tricks that Alexa can do, because what could go wrong with that? Nothing.

So you imagine the meeting, right, where somebody in the back of the room is like, hey, this thing’s always recording and always sending audio, can you imagine anything that could go wrong with that? And everybody in the room is like, shut up. OK, all right, I’ll wait until next quarter and bring it up again. And then what does somebody say? You’ve got a product manager who really means well, and they’re like, why don’t we give it a voice? We’ve got to give it some personality. We know we’ve got a circuit board and speakers and a power supply, but what we really need is some humour. Let’s add some humour, sprinkle humour on top. And who’s going to fight that, right? We should do that; we want our products to delight. I love that delight has become a cliche. Delight used to be the greatest thing you could aspire to. Now it’s like, did we put the delight in? It’s in Trello, I don’t know if I have to check it off. So my son last year was four years old and had learned to use the Alexa we had around the house, and those of you who have young kids, like preschool age, you know they’re not always the most articulate about what they’re saying. He said “next,” I don’t remember what he was talking about, and the Alexa heard this and thought he said, Alexa, I’m dead.

So this robot speaker in our house said out loud to my four-year-old child, you’re pretty talkative for a dead person. Unprompted and unbidden, while I was speaking to my son and he was talking to me, we had a disembodied robot voice of the cloud tell us, you’re pretty talkative for a dead person. I did what any normal person would do: I tweeted about it, in hopes that Amazon’s cloud servers would respond to me directly. They didn’t. It was a little creepy, and then I remembered I had that app that shows all the jokes it had been telling, and it said, well, Alexa heard “I’m dead.” That’s what it heard your kid say. He didn’t say that. And then I started to imagine the meeting. Well, what are going to be the phrases that trigger the humour? We’re going to sprinkle the humour on — “I’m dead.” And another product genius there said, why don’t we Bing it for “I’m dead.”

That wasn’t actually the issue that I had, that I needed my child to Bing “I’m dead.” It’s hard; I’m beating up on one team, and they made a very good product that I like to use. And you realize that we try to do the right thing, right, and end up with unintended consequences. We didn’t anticipate all of these stress cases. And then you realize, well, when we’re making these products, what is the way that we determine what we actually care about? How do we say what really matters to us? And I find the things we care about are the things that we assign a plan, a timeline and a budget. And I joke about your bullet point saying humour, but it’s pretty rare to say we’re going to assign a plan, a timeline and a budget to adding humour to this product. You sort of leave it catch-as-catch-can in the process and hope that whoever is responsible for doing that is able to do it, and then of course we fail, and we end up saying creepy, weird things to kids.

And I thought, well, maybe they didn’t have the resources to get it right. But this is Amazon, right? Their CEO’s side project is the Washington Post. What he does on nights and weekends is build a rocket ship that can go into space and land right where it took off. That’s his hobby. So I think he has the resources to make a device that isn’t a Skynet-style creepy robot talking to your kid. I think he’s got the resources lying around. All I could do is imagine the meeting where someone said, could we redirect just a little tiny bit of rocket ship money towards not sending threatening messages to children? And you can put that in, you could totally put that in a project plan. It’s very hard to say no.

[10:00]

But they didn’t have the storyteller articulating, in context, what the implications of this are. And what I realize is that all of us who build products, what we’re doing every day is translating technology into everyday experiences. That happens both within our organizations, to people up the chain or in the nontechnical departments or in the other parts of the organization, and then of course with our users, our customers, the people we represent, the people we’re trying to serve, our communities. Those are all the targets of this translation we’re doing; we’re trying to speak all their languages and understand all their contexts and say things in a language they will understand. And the burden becomes, as has come up I think so many times in the last two days (and I find it really reassuring that everybody is reaching the same points, although I find it really stressful that, by the last talk of the event, everybody has already said the stuff I was trying to say, you’ve covered everything, but thanks, folks), what we keep coming up to is, you know, we need to be thinking about the worst-case scenario, because it’s very easy to imagine what’s awesome.

We’re making this product because we want it to be awesome. We’re creating this app or experience and we want it to be great for people. It’s very rare that we have deliberate malicious intent, and so we think about the good things, and we should, but we also constantly have to be telling a story about what the worst-case scenario would be. Because we start from our assumption that we have good intentions, and that matters, but it’s not enough. And the interesting thing here is that the entire industry is actually shifting. Look at the rising popularity of bots and all the voice recognition stuff, the Siris and the Cortanas and the Alexas of the world; what they’re all moving towards is using much more human language.

What’s really becoming increasingly important is writing. Voice, writing, tone, all these things, and it’s a move to make tech more human. For all of us who’ve grown up with technology, this was the promise. This was science fiction on Star Trek decades ago; we were told that tech was going to become more human and more connected to us and easier to understand, less about learning the interface and more about just accomplishing your tasks. This was the promise, and we should be excited about that. I don’t want to diminish that, and I don’t want to slight the importance of that. But I also want to point out, and this is particularly relevant here in Vancouver: for the consumer web, the breakthrough moment in user interfaces happened here, with the very first versions of Flickr. They were the first consumer web applications that used that sort of friendly copy and a human tone of voice in the service they were creating for people. We’re in the place where that happened, right? And that came from a community that expected that that would be what they would do, and of course some of those folks have gone on to Slack and other experiences that are also defined by a great voice. But what we also learned is that we made a lot of mistakes along the way, and it was a careful learning process to be able to use voice in that way, and we’ve lost a lot of the lessons they learned, even as that practice of using a human voice has become commonplace. Almost every app on our phones is trying to mimic things that were novel 10, 12, 14 years ago, and I point that out because maybe some folks here, because this is a community that has this tradition, might be familiar with this, but most of the industry doesn’t know that there was a time and place where these were new ideas, and maybe we could learn from the people who created them. Because being human is hard. It’s hard for people to do.
It’s really hard for institutions to do, and almost impossibly hard for technologies to do.

And what we keep doing is amping up the significance, and the importance, the stakes of the technologies that we’re trying to humanize, without necessarily assigning the resources and the plan and the timeline and the budget to meeting that challenge that’s getting harder and harder. The biggest cause of failings in making our technology human is tech culture.  

Tech culture has not gotten appreciably more human. In some ways I think it’s trending the wrong way.  

And so I wonder how did we get here?

If we have these flaws, which I think so many speakers here have articulated so well, of what’s wrong in the industry, yet the importance of how we create technologies is rising, how do we get to that disconnect? Well, I think it’s a reflection of what it means to become a mature industry. There are disciplines, fields, areas of study, areas of work that are well-known, mature, global industries that people work in, and they have a certain set of traits. We’ll take one: medicine. We all interact with doctors. Obviously, as an American, our health care system is a little different than yours here; yours kind of works, ours kind of doesn’t. But these are still relatively mature systems. And they share a couple of traits we can learn from. There’s a rigorous education.

[15:00]

There’s a long history, all the way back to Hippocrates: thousands of years of studying the human body and how medicine works. There’s a set of ethical standards. You take an oath when you become a doctor, and people are expected to uphold that oath. There’s mentorship, in the form of residency: at the beginning of their careers, people are expected to work with experienced doctors, learn from them, study under them. There’s community service, so that nobody will be denied medical care: you’re expected to give back, to provide care, and to serve if you’re called upon in an emergency.

And accountability: you can lose a medical license for malpractice, you have to carry insurance against it, and you have to be ready to surrender your credentials if you’re somebody who’s not meeting the standards they expect. And that’s just in medicine. What if you look at law? Same kind of thing. It’s a mature industry. It’s been around for thousands of years. In Canada, just as in the US, there are ordinary cases that go all the way back to British common law hundreds of years ago, because there’s a history of what they rely upon for how they develop their knowledge. Ethical standards, again.

You have to be certified if you want to practice law. You have to work under other lawyers. You’re expected to take on pro bono cases. You can be disbarred if you transgress. Journalism has the same stuff. Engineering has the same stuff. My father is a civil engineer. They take oaths, basically saying: people are going to walk on this bridge, and if it falls down, that’s on you. Even business schools have ethics classes. I mean, Trump went to business school, so we know it’s not a perfect system.

These are a set of principles that have evolved in industry after industry, as each rises in importance in culture, to put together a system where people know they can trust what that industry is going to do, and that if someone transgresses or crosses the line, there’s going to be some accountability. Do we have those in tech, if you look at that list? Well, we have education. There’s certainly computer science, there’s design. These are disciplines with a robust and broad set of educational platforms you can learn from, and increasingly they’re getting more and more accessible.

There’s a huge push to teach kids to code, right? History is hard, though. I think about a young person working on the Google Docs team right now building spreadsheets: how much time do they spend talking about Lotus 1-2-3? I don’t think much. And the interesting thing is that the people who created those first products are still alive, still accessible. You can just email them and be like, “when you invented PowerPoint, what were you thinking?” and the guy who made it will be like, listen, listen, we were designing for a different time. Even the guy who invented Comic Sans is still alive.

You can still email him. And there’s a thing there where, in any other industry, it would be bizarre not to have taught history that is ten or 20 or 30 years old, very recent history, made by people who are still alive. You can get all the way through some of the best computer science or design courses and learn about every single line that was in the original C compiler, but not know how the software industry itself was created. And then ethics: the same thing applies. It’s very possible in the technical disciplines to get through advanced degrees that are very, very robust in what they teach on the technical level without ever talking about the ethical implications of what you’re creating, and this is especially important when we think about who gets funded to create companies. Almost every founder who gets funded these days comes from a CS background. If you can get all the way through the very best CS programs without any ethical training and be the one who gets the most funding to create a company, what are we selecting for in our leaders? Mentorship?

[20:00]

Again, this is where the technical half of our industry could learn from the design and content half of our industry, because people say, well, mentorship is really important; it’s a way we determine what a lot of our values are. Community service: think about it, it is commonplace for even the most rapacious law firms to have a pro bono practice and an expectation that their lawyers will participate in it. How many technology firms have anything like that, where you’re expected to contribute your skills for the greater good, for social good, to your community? And then accountability. We can all think of transgressors in the communities we’re part of, and certainly in the technology industry at large, and what happens to them?

Is there any way to get disbarred?

Or do you even more likely just get funded?

And we make a lot of excuses for this, and the biggest one is relying on this idea that those of us who work in technology are underdogs. This is a mythology I grew up with. I’m a little bit older than a lot of folks here, but there was this idea that the nerds were underdogs, that tech was the thing you did off to the side, and we weren’t cool. Well, guess what? We won. The geeks won. We have built the largest and wealthiest and most powerful industry probably in the history of the world, and we have usurped almost all those other industries I mentioned in terms of our central importance in influencing culture and influencing policy and where we’re directing both our educational and economic systems.

Funding more and more and more tech. There is no underdog anymore, and this is really important, because a lot of people in tech still see themselves that way, and I get it. It’s because it was hard. It was not a safe choice to choose technology at the time when a lot of you came into the industry. It was a brave choice, and I respect that, but the truth is that we won, and we should not be sore winners.

Because there’s this refrain (I don’t watch HBO’s Silicon Valley, because it’s just a little too disturbing to actually sit through), that refrain about changing the world. We’re building an app to change the world. And the thing is, tech, you know, broadly did. The world switched to it, right? The whole world’s way of experiencing each other, connecting to each other, did shift. The average smartphone user spends three hours a day with their thumb on the glass of their phone.

Three hours a day. And that’s only happened in the last ten years. That is a massive shift. And there are soon going to be a billion people with a smartphone or advanced feature phone in their hands, right? So that is what winning looks like. Inarguably, there’s no way to argue otherwise: tech won. And the apps we create, the experiences we create, are the majority of people’s experience for most of the things they do in the world. Already, the majority of conversation that happens in the world happens through social media, social networking and messaging apps, more than face to face, more than any other medium, and I think we’re only a few years away from the point where the majority of all conversation between all humans happens through digital apps.

I take a lot of responsibility for this. I came from the community that created some of the first social media and social networking tools, and I was so excited and so proud of what we had made. We would say, look, we made these blogging tools, and these people met each other, and that couple got married, and these people started a company, and we took all the responsibility for all these great things that happened. And then when people started harassing each other and abusing each other and doxing each other, and all these negative things, we were like, well, people are going to be people, that’s not our fault. And you can’t have one without the other. You can’t say, I caused this, but I didn’t cause this.

And there’s this tendency for us to want to think of ourselves as the “good guys,” and I say that deliberately, because leaders of the industry still see the industry ******. It is not an accident that it is so male-dominated; it is a strategy, and that’s something that has increasingly become untenable. Now, there’s been a lot of conversation about inclusion at this event, and I think everybody is well-intentioned. We all want to do the right thing, but we also have to talk about the industry as it is, and the rate of change. Because for most of the problems we talk about, we see these incremental 1% improvements each year, right? If that.

[25:00]

If that happens at that rate, it’s going to be too late. Change has to accelerate a lot in order to remedy the imbalances in the industry. And the other defense we rely on is to point at the broad political and economic support for what the technology industry does. In the States, you can look from one end of the political spectrum to the other, from Bernie Sanders to Donald Trump, and they will all tell you tech is the future. They will all tell you that’s where the jobs are coming from, that’s what we’ve got to teach our kids to get into.

That’s really weird, because they will also say, well, yeah, Wall Street, they do a lot of valuable things, but we’ve got to watch them, they’ll screw us. And they’ll be critical about medicine: it’s really critically important, but we’ve also got to watch and make sure it doesn’t cost too much, that they’re being responsible, that they’re not pushing medications on us we don’t need. There’s always this balance, but not with tech, and I think it’s partially because they’re illiterate about technology. There’s no nuance. It’s just presented as positive. And the very few people who are critical are typically dismissed: it’s academics, and they don’t count, they’re not in the real world; or it’s media, it’s press, and they don’t count. Or if they do count, then maybe Peter Thiel will carry on a decade-long, secret, multimillion-dollar campaign to destroy them, because that’s normal. That’s James Bond villain stuff! Am I the only one who caught that? That’s really weird. Like, that happened. That’s really weird, and then they put him on stage. Anyway.

It is a problem when everybody likes us. It is a problem that everybody likes those of us who create technology. We don’t have enough criticism; we don’t have enough effective criticism. We are not self-critical enough, and I know the last two days have been full of us saying, this is what we’re getting wrong, this is what we’re doing wrong, but some of what we do is acknowledge the problem in order to avoid addressing it. I really believe that. I know I’ve been guilty of it.

And so I wonder about what defines us. What is the technology community? Is there such a thing? Is there a technology industry? Is that even a meaningful term? I think about Uber and Etsy: what do they have in common? They’ve got some Swift coders. I can’t think of much else. Right? I like Etsy. They’re pretty civic-minded; they’re mostly about enabling entrepreneurship for a user base that is majority women. And Uber is like, well, we like to go to new cities and break laws and put cars on the road. They don’t really have much in common, but we’ve grouped them together, and that’s part of the technique of making ourselves impervious to criticism.

Because it’s nonsensical to criticize those things in the same way. They’re not the same thing. And when we include people doing technology in other industries (if you make apps for insurance companies, for banking, for government, for all their agencies and industries), then we become a sort of mishmash of things that are inarguably good, inarguably bad, and everything in between, and it makes it really, really hard to criticize. And so we fall back on jargon, and all of this is about avoiding responsibility for when the worst-case-scenario things happen. This is a technique that works, and I think it comes back to imagining the meeting. We’ve got to think about how these things happen and how they came together, because the biggest thing that’s happening, more and more frequently, is we are accepting good intentions in lieu of good actions.

What I mean by this is when a company, when an organization, when a community, says I meant to do the right thing, we’re treating that as if they did the right thing. But a person or a company that has good intentions and doesn’t follow them up with good actions is lying.

We need to start calling it what it is. It’s lying. And this was a hard thing for me to arrive at, because for many, many years I was a person who wanted to see himself as good, right? We all want to think, I’m a good person, and I’m evolved, and I want this company to say they care about diversity, so I’m going to tweet at them, and I’ve got a lot of Twitter followers so they reply sometimes, and I’m going to tell them they should care about diversity, and they’d reply back, we care about diversity, and I’m like, well, that was solved. I fixed that problem! And yeah, you know, you just check it off. You did it.

[30:00]

And what I realized was that it was a tactic, and I was asking the wrong questions. Because we actually have a way of knowing what our values are, and it’s by what we measure. Right? We measure uptime. Those of you who remember Twitter’s fail whale knew for a long time that it would fall over when it got big enough. And Pokémon Go right now: sorry, we’re not available right now, we launched in a new country and, well, screw you. Then there’s a contrite blog post. But we measure it; we measure uptime. We measure vanity metrics, right? How many downloads did we get, how many page views did we get? All of you have had to roll your eyes at some VP who cared about one of these metrics, but they wanted to see it, so you had to give it to them in a report. They cared about it, and so we did it. And we measure revenues. We have dashboards to tell us in real time which products are converting and which funnels are actually working for us, right? And why do we measure that? Because it really does matter, and it’s something we care about. And what are the things we don’t measure?

Well, inclusion is one of them. Right? Well, there are diversity stats from Google and Facebook now, right, and you think about what it took to get there. In the States we have the Equal Employment Opportunity Commission, and they have a set of regulations that require the reporting of diversity stats from publicly traded companies, and for years Google, Facebook and others just violated those rules. They didn’t report the stats that the federal government required, preferring instead to just pay fines rather than report the numbers. What finally changed it? Two things. One: a lot of you probably saw Tracy Chou’s initiative, where companies self-reported their diversity numbers: who worked for them in their engineering departments.

But also, shareholders sued the companies to get them to start reporting the numbers they were legally required to report. That’s what actually broke the dam on Google starting to report its diversity numbers, and of course they wrote their blog post: “good news, we wanted to share something with you because we’re not evil. We have this spreadsheet and some little charts, the numbers are terrible, but look at us, we’re nice because we reported the numbers.” It’s a strange thing that it took a lawsuit for them to do the right thing that was legally obliged of them.

We also don’t talk about why inclusion matters. One of the things that is really striking is that in so many other areas of what we consume and participate in, we care about who made something. I’m one of those people who reads the liner notes on an album or the credits after a movie. And certainly there’s a long tradition of communities saying, we want to support businesses that are from our community, and we want to make sure that we’re supporting creators who are like us, but the biggest reason people care about who made something is that they want to see themselves reflected in the experience that they’re having. That’s why representation in movies and television and music matters, and why it’s going to matter in our technologies as well. We want to see ourselves reflected in the apps and websites we’re using. It’s the point we were discussing earlier: if the emotional reaction to seeing yourself in a movie is, I feel like I’m up there on the screen and I’m feeling what the protagonist feels, the same is true for our apps and our other experiences. If this wasn’t made by a team that included somebody like me, or that was concerned with my concerns, I can’t see myself in it. I can’t identify with it. And in fact, it will heighten that sense of disconnect: this was not made for me or with concern for my feelings.

And then there’s that surveillance point I brought up earlier. It’s a real thing. By putting all this data onto centralized infrastructure, we are enabling surveillance. There is certainly government surveillance, particularly abusive in the States but a global phenomenon. There are law enforcement agencies in countries where it is illegal to be publicly LGBT, and they are monitoring social media for people being outed in order to target them, right? That is something our platforms are enabling. That is not one of those happy stories we tell about how social media is bringing the world together. There’s also corporate surveillance, just the awareness of what we’re doing that companies can take advantage of in ways we didn’t anticipate or expect. And we need to think about how we’re architecting vulnerability for people, just through the ordinary tools we use in building our apps and our services.

[35:00]

What’s happened is a redefinition of what’s considered public. Right? It used to be there were these bright-line boundaries around declaring something public versus it happening to be public. But certainly in the States, and in most parts of the world, it’s not illegal to record anything that happens outdoors. Think about how inexpensive it would be to record it all, index it all, assign a person to it, and think about what it would mean when that’s published or visible or being harvested by big companies, every time we walk outside, every time we walk in front of a security camera, all the places where it might be. These are big issues that most affect the most vulnerable in society, and we casually enable them every time we say, well, we’ll just use Facebook sign-in on our app. Because do we know, in our jurisdiction, what it takes for law enforcement to get a subpoena for the login information from Facebook when they decide to target someone using your app? In a lot of parts of the world they don’t need a warrant or a subpoena at all. They can just request it from Facebook, or from other services that are used in a similar way, and it can be handed over.

What are the implications of that for people being able to, one, simply not be monitored, but two, control the ways that their life and their experiences and their actions are represented out in the world? There is a danger of real harm. So whenever we in the industry use words like disruption, I don’t think people outside of technology see this as a positive word. I just don’t. I think it feels like displacement to them. In the US alone, there are 3 million truck drivers driving tractor trailers, and I would guess it’s not more than 10 or 15 years before the majority of them are replaced by self-driving tractor trailers, for a lot of economic reasons. They’re not going to say, that’s some other company that did that, not this company that I like on my phone. They’re going to say, these are both companies that are on my phone, they’re using the same design patterns, the same UI patterns, they must be similar, they’re right here on the glass of my phone, and this one destroyed my job, so this one I probably don’t trust either. We’re going to be lumped in with the worst actors, or the worst fears of our users, based on every other app they have on their phone. Fairly or unfairly, that’s what’s going to happen.

And I think the biggest catalyst for mistrust in technology broadly is going to be when people see the impact it has on ordinary people’s work, and we see this all the time. I’m sure all of you who do user research have seen this. People say, well, they changed my whole job and now I have to do this on computers and it’s worse, even in the example we talked about earlier with something like permits, where I want to talk to a real person because it’s more efficient but I have to use this app I don’t want to use. That’s one kind of stress from technology transitions. But that’s minor compared to people feeling like they’re going to lose their entire way of earning a living to technology transformations that are getting more and more extreme and more and more aggressive. And one of the things that’s really striking to me is how we treat people already in tech. We’ve talked about how we’re not inclusive enough, how we don’t include all the different people in society who could contribute to the tech industry. But think about the big players, Google, Facebook, those kinds of companies, and it’s kind of shocking to me that this has already slipped from memory.

But just a few years ago, Steve Jobs, Eric Schmidt and others illegally colluded between Apple, Google, Pixar and many of the other biggest tech companies to keep their employees from being able to apply to the other companies for jobs and get a better wage. They did this to artificially keep downward pressure on the salaries of their workers. This is documented. This is public. This isn’t some conspiracy theory; this actually happened, and the net effect was they cut billions of dollars in potential salary increases from their workers by making it impossible for Apple to poach from Google or vice versa. And the striking thing, no pun intended, is there were no strikes. There was no labor response to this. The lawyers sued, they got a settlement. Most of the money went to the lawyers, I’m sure some little bit of money went to the workers, but you would think with billions of dollars at stake, there would have been some larger complaint other than, well, you shouldn’t have done that. But they still revered Steve Jobs, they still revered the CEOs of their companies, and that was a striking thing to me. What more would it take, other than actually denying you potential wages, for you to say your company doesn’t have your best interests in mind? That’s extraordinary. That wouldn’t have happened in the auto industry.

[40:00]

That wouldn’t have happened in any industry that I can think of, and that’s how we treat the people who are already in. And we’re talking about building a pipeline to bring in all the women and the people of color and other minorities that we’ve excluded into that industry. That’s the one. That’s where your future is. That scares the hell out of me. Because that’s a radical change that has to happen and nobody’s talking about it. And you can talk to people who, again, graduated from great CS programs and go to Google today, go to Facebook today. And Facebook to its credit did not participate in this collusion, so I should give them credit. That’s how low the bar is. Way to go, you didn’t break the law to screw your own employees for years, yay!

What’s extraordinary is that employees can come in and go to these companies and have no idea of this history that is only a couple of years old. It’s all been disappeared; it doesn’t come up. And yet the mainstream narrative is, this is the future, these are the good guys, this is where you want your kids to work. And that pipeline we’re building to teach your little kids to code, well, your daughters, when they learn to code, are going to go to those companies, which have no women in senior management or on their boards and already treat their employees this way, and we trust that in a few years they’re going to start treating people well? What would cause that to change? They’re already the richest companies in the world, so what’s the incentive? I think the default course the conventional tech industry is on is to be looked at as the new robber barons. There was a time when industries like rail and oil were seen as high-tech job creators, and then after a while people were like, we want to murder these people, and they said, what if we build a bunch of libraries? Then would you leave us alone? And we were like, okay, we’ll trade some libraries and universities for it. And we haven’t yet compelled the tech companies to build the libraries and universities.

But this is the default path for these kinds of things: a massive, massive backlash. And the thing about it is, I know it doesn’t sound like it, but I’m an optimist. I really am, and I have a ton of empathy for these people. I’ve had the chance to watch some of these companies be born, and I still think, despite it all, that they want to do the right thing. They just don’t see the path that they’re on, because everybody keeps telling them, you’re doing the right thing, you’re the good guys, and so they’ve fallen back on these avoidance tactics, right? They say, we care. We care about diversity and inclusion, we care about all these other issues. And we can ask them, well, what do you measure? If you care about it, what are you measuring?

And we all have to do this. Right? This is an obligation we have, because we’re all fortunate. We get to come here to an event like this with amazing speakers and learn to do work that is rewarding and challenging and pays us well, so the least we can do is say, well, if you say you care about this, can you tell us what you’re measuring and show us what you’re doing? Can we have that accountability from the companies that we work with, that we rely on? When they tell us, we’re trying to do better, we can say, what’s the budget for that? What are you spending? The R&D budget is all public. And if you asked them and they said, nothing, we’re just hoping, and that’s your strategy for inclusion, that makes me think maybe you’re not really serious about it. All we’re asking them to do is take responsibility.

All we’re asking them to do is set meaningful, tangible goals about what they’re doing. And the thing about it is, everybody here has that experience. Everybody here has done that: to say, we’re going to get a product out. Because shipping is the hardest thing in the world, and you all do it. We’re going to get something out in the market, we’re going to get it in front of users and try to get it to really serve their needs, and we’re going to set a process to iterate. Those are the kinds of goals you set when you care about something.

And I keep coming back to imagining the meeting, what it looks like, and I want to close with one more example that really stuck with me, because it highlighted the difference between what companies say and what they do, and how we can tell the difference, if we just imagine the meeting about what the intention of an organization is. Keep in mind, we know what it takes to get something done: you’ve got to have a plan, you’ve got to have a timeline, you’ve got to have a budget. Otherwise you’re lying.

[45:00]

125th Street is the southern boundary of Harlem in Manhattan, in NYC where I live. It’s historically very important, one of the hearts of African American culture in the United States. It’s also very important to South Asians; it’s one of the first neighborhoods we were allowed to live in. It’s an incredibly rich, historic and important neighborhood. And it has been a victim of redlining, which has already come up once today. I was certain nobody else would talk about redlining today, but of course every good idea has been covered. Redlining is the encoding in policy of systematic injustices, whether in transit, in housing, in lending, or in many other areas: biases were built into the systems about who was afforded access, in this case specifically denying African Americans access and opportunity. And this was mirrored in transit and even in areas like police violence, where there’s been massive over-policing in Harlem for years, so residents were disproportionately the victims of police violence. And then last year, Uber did a press event. They put out a press release where one of their executives went to Sylvia’s, one of the soul food restaurants, one of the most famous bastions of Harlem and Harlem culture, and sat down with Al Sharpton and talked about the very real issue of African Americans not being able to hail taxis in NYC, largely because of historically anti-Black attitudes and racism from predominantly South Asian drivers. It’s a real issue, and Uber did raise a valid point that using an app to hail could help address this massive social issue that had existed for decades. They did this event, they got a lot of press, and they talked about this essentially as the moral underpinning for why Uber should get to release their app in New York City even though it was in violation of the laws of the city.

Then a couple of weeks ago, they launched an unlimited pass, an all-you-can-ride Uber pass to attract new customers, and its requirements were explicitly that your ride must both start and end south of 125th Street in Manhattan. That’s who the customers were that they wanted. Those are the rules. That’s interesting. And then of course, as I’m sure all of you know, we had another outbreak of the sporadic police violence that pops up in the US regularly. These were particularly outrageous and very visible incidents that we were able to see on social media, and there was, I think, a moment of pause nationwide, reflecting on what had happened. And Uber responded to the moment by putting peace signs in their app to represent the cars, instead of the regular car icon, when you hailed a car.

This is what it looks like when you care about an issue. And this is what they did. I want you all to imagine the meeting where somebody said, we care about police violence. We care about a centuries-long pattern of disproportionate targeting for brutality and even killing of a certain community, a community that we used as the moral basis for our expansion into a city, as an excuse for violating the policies of that city. We relied on that as our entire argument for why we should be allowed to do business, and we did a press release around it and a press event around it, and we had the cameras and the photo op. Now we need to follow through on the concerns of that community, to deliver what we promised for the reasons we said we would. And somebody at the table, with some Post-it notes in front of them, said, what about a peace sign in the app? A peace sign in the app? Huh?

I want you all to be in that room with me for a moment. What would you write on the whiteboard when somebody says, peace sign in the app? You might put it on the whiteboard, but it’s at the bottom, right? There are like 100 other things you would come up with first. If only they had tens of billions of dollars in resources with which they could address other issues. And they do. They totally do. They’re Uber, they have tens of billions of dollars in resources. All of you right now are thinking of better things they could do if they put a plan together, if they put their resources together, if they really cared about the issue.

And the thing is, they think they care. They said the right thing. They even changed an icon in their app. And I used to be the kind of person who would push for those kinds of solutions. It’s been a long, long time of realizing the icon doesn’t matter. It doesn’t mean shit. It didn’t change anything. And when I was in those meetings, I would go along, because I felt like, at least we did something. Or, well, who am I to push? Or, am I still going to be welcome in this meeting next week if I’m the one who raises my hand and points out that we’ve done nothing but make ourselves feel good without ever having any impact? And I’m done with it. That was the conclusion I came to. I’m done with it. I’m done with excusing it.

I’m done with putting up with it, and I am increasingly trying to commit myself to pointing out when they’re lying. And these people are lying. You all are storytellers. The challenge we have is making sure we have not taken our storytelling gifts and put them in service of stories that are false. This is a heavy responsibility. Should it be yours? No, it should belong to the people who are getting paid more than you, the people who hire you, the people who ask you to work for them. It should be their responsibility. But increasingly they’re not doing it, or they think they’re doing the right thing but it’s not meaningful or substantive enough. And all we have to do is go back to first principles, to what you all know every single time you ship something, every single time you put something out in the world. You know what it looks like when it counts. You know what it looks like when you care about it. And all we have to do is prioritize one simple thing: that the story we tell includes the most vulnerable people. If we do that much, then we can be worthy of all the things that people say about the transformative potential technology has on the world. Thank you.