
The AI Placed You at the Crime Scene, but You Weren’t There

Like a lot of tech solutions to complex problems, facial recognition algorithms aren't perfect. But when the technology is used to identify suspects in criminal cases, those flaws in the system can have catastrophic, life-changing consequences. People can be wrongly identified, arrested, and convicted, often without ever being told they were ID’d by a computer. It’s especially troubling when you consider false identifications disproportionately affect women, young people, and people with dark skin—basically everyone other than white men.

This week on Gadget Lab, WIRED senior writer Khari Johnson joins us to talk about the limits of facial recognition tech, and what happens to the people who get misidentified.

Show Notes

Read Khari’s stories about how facial recognition tech has led to wrongful arrests that derailed people’s lives. Here’s Lauren’s story about Garmin’s Fenix smartwatch. (And here’s WIRED’s review of the latest model.) Arielle’s story about the wave of shows about Silicon Valley tech founders is here.

Recommendations

Khari recommends hoagies. Lauren recommends Garmin smartwatches. Mike recommends the show The Dropout on Hulu.

Khari Johnson can be found on Twitter @kharijohnson. Lauren Goode is @LaurenGoode. Michael Calore is @snackfight. Bling the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our theme music is by Solar Keys.

How to Listen

You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how:

If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for Gadget Lab. If you use Android, you can find us in the Google Podcasts app just by tapping here. We’re on Spotify too. And in case you really need it, here's the RSS feed.

Transcript

Michael Calore: Lauren.

Lauren Goode: Mike.

MC: Lauren, have you ever been correctly identified by facial recognition technology?


LG: I think I have been, because I use an iPhone, and when I go to the airport, I use a service like Clear, which zeros in on your irises, which is bizarre—but I think what you're asking is, have I been identified by some governing body or agency by my face for something?

MC: Yes.

LG: Not that I'm aware of. Maybe I have, but not that I'm aware of.

MC: Has anybody ever knocked on your door and said, “We know you were here because our computer vision saw you there?”

LG: No. Has that happened to you?

MC: It has not yet.

LG: OK.

MC: I'm sure it will soon. It's also increasingly common because law enforcement is ramping up its use of facial recognition technology to ID suspects around the United States, and it is not always accurate.

LG: Yeah, we should talk about that.

MC: We will.

[Gadget Lab intro theme music plays]

MC: Hi, everyone. Welcome to Gadget Lab. I am Michael Calore, I'm a senior editor at WIRED.

LG: And I'm Lauren Goode, I'm a senior writer at WIRED.

MC: We're also joined this week by WIRED senior writer Khari Johnson. Khari, welcome to the show.

Khari Johnson: Thank you for having me.

MC: Of course, in person no less.

KJ: Oh my God.

MC: You flew through the night to be with us today.

KJ: I did.

LG: Khari's first Gadget Lab appearance, and hopefully the start of many.

MC: So you're probably all aware that facial recognition technology is becoming more widespread and is being used in a variety of applications. Today, we're going to talk about facial recognition technology and how it's being used in law enforcement. Just like a lot of tech solutions to complex problems, facial recognition algorithms are not perfect. But when the technology is used to identify suspects in criminal cases, those flaws in the system can have catastrophic, life-changing consequences. People can get wrongly identified, arrested, and convicted, often without ever being told that they were ID'd by a computer. And it's especially troubling when you consider that false identification disproportionately affects women, young people, and people with dark skin, basically everyone other than white men. Khari, in your recent reporting, you've talked to three people, actually all three are Black men and all three are fathers. These guys were all arrested and falsely accused of serious crimes because of the police's overreliance on facial recognition technology. What were the issues that stood out to you the most in these three men's cases that you wrote about?

KJ: I guess I would start with Robert Williams—both of his young daughters watched their dad get arrested on their front lawn, and that negatively impacted them in a number of ways that are detailed in the article. Nijeer Parks had to talk with his son for the first time about how to act around police, the sort of “talk,” this negative tradition that exists in this country for people of African descent, of how to conduct yourself around police. He had to have that talk with his son at the same time as talking about his false arrest. You think of the personal relationships that it touched, the interpersonal relationships in their lives, how these arrests rippled through their lives in different ways, which I think is really important to note, and part of looking at artificial intelligence and the use of algorithms beyond maybe what you see on paper.


If you look at any of these cases, I think it's easy to see that there is some violation of constitutional rights, and that's of course very serious. Denying people their freedom is definitely high on the list of the most harmful things an algorithm can do to a person, but that violation also touched their lives in different ways. Nijeer Parks was transferring money, I think to his fiancée at the time, when police said that he was stealing something from a hotel gift shop in New Jersey, and he's no longer with his fiancée in part due to some of the issues they had surrounding these accusations. I just think it's really important, in learning about each of their lives after the arrest, to note that it touched the people around them and their relationships with people close to them, as well as each of these things related to their civil rights and their rights as citizens in the United States.

LG: Your story not only highlights some of the catastrophic consequences that these men have suffered in their lives and some of the personal issues that have come up as a result of these false arrests, but it ultimately raises a question about this crash, this collision of what happens when humans use new or emerging technology to do their jobs, and in this case, it's law enforcement. I'm wondering what these stories tell you, or tell us, about who is ultimately liable when something like this goes wrong. Is it the tech itself? Take the case of Nijeer Parks, who, as your story details, spent 10 days in jail after being wrongfully accused of shoplifting from a gift shop in New Jersey. He eventually filed a lawsuit in federal court in New Jersey against the director of the police department, local officials, and IDEMIA, the company that made the facial recognition system. So my question is, is it IDEMIA that will ultimately end up paying damages in this case, if it goes that way? Or what happens to the people who are deploying the tech?

KJ: I don't have a full answer to that; I don't think my reporting goes deeply enough into that question. IDEMIA didn't respond to any questions—these are the first cases of their kind, so I think the outcome of the cases will in some way determine who's held responsible or who's considered liable. I believe the Michael Oliver complaint requests that a judge rule that the technology not be used until some of the issues around its ability to discriminate against certain groups are resolved. That could slow its use, for example, but it wouldn't necessarily assign blame.

LG: Right. It seems like it's a bad idea, if you work in law enforcement, to deploy tech that we know is flawed. And as a result of it being flawed, people are going to be wrongfully accused of crimes. It seems like those are humans making those decisions and it's humans building the AI, right? That powers these systems.


KJ: Humans building the AI, and humans making the policy for how the AI should be used and applied in investigations, and where you set the quality standards. You can set the confidence threshold for a facial recognition system's search results at 98 or 99 percent, or you can drop it down to 50 and see what happens. In some jurisdictions, lawyers we've spoken with, and researchers who have tried to follow and document the use of the technology by law enforcement, have documented that in some instances, if a search at, let's say, a 90-percent confidence threshold returns no results, investigators are then allowed to drop it to 80 percent and see what comes back.

That seems like the sort of thing that, if I were a criminal defense attorney, I'd really want to know, and I'd really want to make sure it was something you could argue about in court, or at least bring in, let's say, an expert to challenge. But you can't do that, because facial recognition is treated as an investigative tool instead of evidence. And since it's treated as an investigative tool, the photo of the person that they believe committed the crime, the one identified by the facial recognition, is then shown to an eyewitness. So the eyewitness is relied on at trial, and the entire accusation can go forward without ever mentioning facial recognition.
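To make that threshold mechanic concrete, here is a minimal sketch in Python. It is not any vendor's actual API; the search function, the candidate names, and the scores are all hypothetical. But it shows how lowering the confidence threshold can turn an empty result set into a list of "matches":

def search(candidates, threshold):
    """Return every enrolled face whose match score clears the threshold."""
    # candidates: (name, score) pairs from a one-to-many comparison.
    return [(name, score) for name, score in candidates if score >= threshold]

# Hypothetical scores from comparing one probe photo against a gallery.
candidates = [("person_a", 0.87), ("person_b", 0.82), ("person_c", 0.55)]

print(search(candidates, threshold=0.90))  # [] -- nothing clears 90 percent
print(search(candidates, threshold=0.80))  # two "matches" appear at 80 percent

Nothing about the underlying comparison changes between those two calls; only the cutoff moves, which is why defense attorneys say the chosen threshold is worth arguing about in court.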

MC: Yeah. This is one of the shocking things that I learned from your story—that police and prosecutors, they're not required to tell a suspect when facial recognition tech was used to ID them. So essentially the use of the technology is being papered over by hiding it behind a person who looked at a photo that was picked out from a database or maybe a photo that was generated by an AI in some cases, right?

KJ: Yeah. So, the process for facial recognition use: as you were saying, there's the one-to-one type of matching that you get with ID.me or with your iPhone. That's a different type of facial recognition. The kind that's used in criminal investigations is referred to as a one-to-many system. It takes a single photo, which in an investigation typically comes from, let's say, security camera footage. Investigators go to a business, let's say one that's been robbed, take a photo from that security camera footage, and then run it through a facial recognition system, and that can return hundreds of results. Those results are then shown to a facial recognition analyst who works with police, and at that point the human takes over the process and decides who's identified.

And that can present some challenges as well, because misidentification is considered one of the primary issues that leads to wrongful convictions, according to the Innocence Project. People are also less adept at identifying people from other racial groups. So there are different parts of the process of using facial recognition in a criminal investigation that a lot of people who are concerned with it say can introduce challenges or bias.
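Here is a minimal sketch, in Python, of the one-to-many pipeline Khari describes, under stated assumptions: the embeddings below are random stand-ins rather than the output of a real face recognition network, and the gallery size is arbitrary. The shape of the process (one probe vector scored against many enrolled vectors, with the top candidates handed to a human analyst) is the relevant part:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 128-dimensional embeddings for 1,000 enrolled faces,
# normalized so a dot product equals cosine similarity.
gallery = rng.normal(size=(1000, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

# The probe: one photo, say a security camera still, reduced to a vector.
probe = rng.normal(size=128)
probe /= np.linalg.norm(probe)

# Score the probe against every enrolled face at once.
scores = gallery @ probe

# Surface the 10 highest-scoring candidates; an analyst takes it from here.
for idx in np.argsort(scores)[::-1][:10]:
    print(f"candidate {idx}: similarity {scores[idx]:.3f}")

Note that a system like this always returns its best guesses, even when the person in the probe photo is not in the gallery at all, which is exactly the failure mode at issue in the wrongful arrest cases.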


MC: All right, let's take a break and come right back.

[Break]

MC: As we've been learning, facial recognition technology is being used more and more to assist law enforcement in the United States, even though those systems are saddled with imperfections that can lead them to falsely identify suspects. Now, Khari, you briefly referenced a report from the Georgetown Center on Privacy and Technology, and among the many things that group at Georgetown Law has uncovered is the fact that half of US adults are now in a facial recognition database. That's a stat they surfaced in 2016, so five or six years ago. Where are we now with the number of faces that are in these databases that are used to identify people?

KJ: Yeah. So I think it really depends on how you measure that. There's the federal approach: the FBI has agreements with, I think, 20-something different states, and that allows it access to driver's license photos. That system, I believe, makes up a large part of what they're talking about with the one in two. But different states have their own systems as well. In a place like, let's say, Pennsylvania, there are, I think, roughly 13 million people, but the photo database the state uses for facial recognition includes 38 million faces.

MC: How does that work? Like glasses on glasses off, beard, no beard. I dyed my hair brown, I dyed my hair blonde.

KJ: Yeah. Fake mustaches. Yeah. I don't really know, but Michigan has a similar thing where there's, I think, 10 million people and 53 million photos in their database.
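As an aside, the arithmetic implied by those figures (Khari's approximations from the conversation, not official counts) is easy to check in a few lines of Python:

# Database sizes cited above: (photos in database, state population).
databases = {
    "Pennsylvania": (38_000_000, 13_000_000),
    "Michigan": (53_000_000, 10_000_000),
}

for state, (photos, residents) in databases.items():
    print(f"{state}: ~{photos / residents:.1f} photos per resident")

That works out to roughly 2.9 photos per resident in Pennsylvania and 5.3 in Michigan, which is why glasses-on, glasses-off variations of the same face can pile up.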

MC: Wow.

KJ: And this is an updated database. So different states have large photo databases to scan. And then there's Clearview, which has a database of, it's my understanding, about 3 billion photos. That's what prompted the accusations that Clearview's emergence was in effect the end of privacy. I think it's Drew Harwell at the Washington Post who has done some good reporting in the last month or so, pointing to a pitch deck from Clearview saying basically that it wants to include photos of effectively everybody on the planet. So there's that ambition, I guess. Certainly, they're banned in some places.

MC: But this is largely unregulated right now.

KJ: Yeah.

MC: It's wild to think about.

KJ: There were some efforts to try to regulate it in Congress. I think dating back to 2018, the House Oversight and Reform Committee picked it up and was looking at putting some limits on law enforcement use of facial recognition technology. It actually brought together fervent Trump supporter Jim Jordan and Elijah Cummings, the congressman from Maryland, who was inspired to begin looking at this in part, I believe, due to the use of facial recognition technology in protests against the death of Freddie Gray, a Black man who was killed by police in Baltimore, I think in 2015. Unfortunately, Elijah Cummings passed away, I think a year or two ago, and a lot of that work hasn't made progress; neither that effort nor other efforts by Congress to regulate facial recognition technology have really gone anywhere. So most of the policy solutions to this are being approached at the state level.


LG: So, to bring it back to what we talked about in the first half of this episode: the problem is that this technology doesn't work as intended, right? And that has real-life, terrible consequences for people. How does that really work? If this were to be regulated in some way, is it possible that it would be regulated not only at the level of how it's deployed, but also with rules around how it's built? Because with AI and machine learning, there's that phrase: garbage in, garbage out. If you give it bad data, the output could be wrong too, right? Talk about how that actually works technically, explain it for the people in our audience, and how that should maybe change so that these things don't happen.

KJ: Well, I should say that we don't really know how accurate it is in practice. I think there's a comparison to be made between the real-world performance of this technology and what we know from laboratory settings. We know that in laboratory settings, its ability to identify people has improved greatly within the last couple of years. It's even apparently, for some reason, better at identifying people from the side, which I didn't know we were doing.

LG: Oh, that's interesting.

KJ: Yeah. That's a thing. I learned that. But the difference between real-world use and deployment in a laboratory is, I think, important to note. The other really important thing is that even with the best facial recognition algorithm, using a low-quality image can greatly reduce the accuracy of the results. Even the best algorithm is going to be diminished by a poor photo used as the input, no matter how the artificial intelligence was trained or what data was used for it. That's something I think a lot of makers of this technology are still struggling with and haven't been able to address. But as far as garbage in, garbage out goes: every artificial intelligence model uses training data to make a "smart decision" or determination. If the training data is representative, it can accurately identify people, for example.

But if the training data doesn't have people from different walks of life, then you are going to have biased results. This was demonstrated most notably by the Gender Shades project in 2018. It's extremely important to note that we probably wouldn't be having this conversation if it weren't for that particular work by two Black women, Timnit Gebru and Joy Buolamwini. They identified a clear overrepresentation of white men, and I believe studies by the National Institute of Standards and Technology, part of the Department of Commerce, which has also studied this, have found similar results.
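The heart of an audit like Gender Shades is disaggregation: rather than one overall accuracy number, you compute accuracy separately for each demographic group. Here is a minimal sketch of that bookkeeping in Python; the group labels and outcomes are fabricated purely for illustration:

from collections import defaultdict

# Hypothetical audit records: (group, was_the_prediction_correct).
results = [
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("lighter-skinned men", True), ("darker-skinned women", True),
    ("darker-skinned women", False), ("darker-skinned women", False),
]

totals = defaultdict(lambda: [0, 0])  # group -> [correct, total seen]
for group, correct in results:
    totals[group][0] += int(correct)
    totals[group][1] += 1

for group, (correct, seen) in totals.items():
    print(f"{group}: {correct}/{seen} correct ({correct / seen:.0%})")

A model can post a high overall accuracy while one of these per-group numbers sits far lower, which is the gap that work exposed.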

LG: So basically, if these systems are trained predominantly on images of white men, then that is going to create a higher accuracy rate for images of white men versus people of color, women, children, underrepresented groups in the training data sets.


KJ: That's the idea. But, as I was saying, the quality of a photo can impact it. The other thing is that people's faces change as they age, so when the photo was taken can affect the results. There are different factors that can contribute as well.

MC: You could make the argument that no matter how good the technology gets, no matter how much attention we pay to having proper societal representation in the data set, training the AI to recognize people better, it's probably not going to change a lot of the problems that we're talking about because we have a cultural resistance, a political resistance toward holding police accountable for their decisions. We have legal protections where the police don't have to tell you that facial recognition was used to ID you so you can't properly defend yourself in court. We have humans verifying the accuracy and asking human witnesses. And those humans have historically demonstrated that they have a harder time recognizing people who have darker skin. And then we also just have the preponderance of cases that we see where Black and brown people are prosecuted for crimes at higher rates and arrested at higher rates than non-Black and brown people, right? So it's like all of those things aren't going to change. And if the technology is "perfect" or it's perceived as perfect, then it becomes possibly even a deadlier weapon in that regard. Right?

KJ: Yeah. Yep.

MC: I'm sorry, I don't mean to show up with all the negativity, but this is a very complex problem. And it's something that I think, when we talk about these things, being a technology publication, we always tend to lean on: well, it's the technology that's bad and it's leading to these problems. But in this particular case, we've made it very clear, you've made it very clear at the start, that it's not that the technology is bad and we have these problems; it's that the whole system is bad. The system is relying on this imperfect technology, which just strengthens the power of the system. It strengthens all of the problems that are already there.

KJ: Yeah. One of the conversations that I had was with somebody who helped create facial recognition policy for police departments in Virginia—Christopher Quinn. He has a background in supervising investigators in criminal investigations. He believes that it can be used effectively in cases, but also that the disclosure needs to be there. But he talked about the idea of a CSI effect for facial recognition: the idea that investigators now expect the technology to be there, the same way that juries or other people involved with investigations 20 years ago might have felt about DNA evidence. The difference is that forensic evidence might be a bit more difficult to collect than a screenshot from a security camera.

MC: Right.

KJ: So the prevalence of it in people's lives can be much higher. And to your point, all of these other things in society mean that even if the technology were absolutely perfect, it could still have negative or disproportionate outcomes in certain communities.


MC: Well, Khari, thank you for being here and for walking us through all of this. It's very heady, so thanks.

KJ: Yeah. Thank you.

MC: And thanks for writing those stories, they're really great. Everybody should go to WIRED.com and read them. We will, of course, link to them in the show notes. We're going to take a break and then we're going to come back and do our recommendations.

[Break]

MC: All right. So we're going to switch gears a little bit. Khari, this is your first time on the show.

KJ: Indeed.

MC: So you are new to this and we apologize for springing this on you, but at the end of the show we do recommendations, where we ask our guests and then we, as the hosts also recommend things that our listeners might be into. They can be something that is in the WIRED world. Often, there are things that are outside of the WIRED world. So as our guest, you get to go first. What is your recommendation of a thing that our listeners should check out?

KJ: I'm going to be honest with you. I got pretty deep into a hoagie on the way over here. And it's been a while and we're not talking about fancy, no aioli.

MC: Top nod.

KJ: No, I don't think so. We're talking about lettuce, meat.

LG: Is it turkey?

KJ: There was turkey, it was peppered turkey basic. It was a basic chic sandwich.

LG: Where did you get it?

KJ: Fred Meyer's.

MC: Fred Meyer's. And you say on the way over here, you mean walking down Third Street?

KJ: Yeah.

LG: Oh, I wasn't sure if you meant on your flight this morning, I'm like, "Wow. Sandwiches are back on planes."

KJ: No.

MC: There is a great pleasure in a giant sandwich.

LG: Yeah.

KJ: It had been a while. So I'm going to confidently recommend hoagies.

LG: I respect that you called out aioli in particular. There was this tweet that went viral earlier this week where someone said they really just took some mayonnaise and added some flavor to it and called it aioli, huh? And everyone's into it.

KJ: It's 100 percent true.

MC: It's like calling the sauce curry, right? It just means you paid an extra dollar for it really. Also, you chose the word hoagie.

LG: Yeah. Let's talk about this. It's not a “sub.”

KJ: I feel like that's not part of my regional dialect because I'm from California, but also I like saying hoagie. I'm a wordsmith and I like …

LG: They call it grinder sometimes on the East Coast.

MC: Nope.

LG: Yes.

MC: Nope. I know they do, but my answer is no.

LG: Have you ever had a meatball grinder?


MC: No. Grinder? Who eats grinders?

LG: I do.

MC: I eat subs, hoagies, and sandwiches and tortas. Yeah. OK. Respect the hoagie, people. Lauren, what is your recommendation?

LG: My recommendation is a little bit in line with Kate Knibbs's recommendation last week. So for those of you who listened, we had Kate Knibbs on the show very briefly at the end of the program to talk about her great experiment with Apple Music and how after a little while she gave up Apple Music and she went back to Spotify. My recommendation this week is to go back to a Garmin watch if you've been wearing an Apple Watch. Now these are fighting words, I understand. And I do still really like the Apple Watch, which is different from what Kate had to say about Apple Music last week. I like the Apple Watch, I think it is incredibly thoughtfully designed. I think if you're living in the Apple universe there's nothing like it in terms of the integration and how you can see your messages and respond to messages and control your apps and all of that.

I had this great hack going for a while: when I was in the shower, I'd wear the Apple Watch and control my music from the watch, and it was pretty great, right? But battery life, it always comes back to battery life. And I've been doing a lot of outdoor stuff this winter and back to traveling a little bit again, and this Garmin, this Fenix 5S, which I had written about years ago—I gave it an Editor's Choice Award when I was at The Verge—I own it. It's great. The battery lasts for five days easily. I went back to wearing it at the start of the year and I haven't looked back. I really, really like it. And I've heard this from other people too. I know Adrienne on our team is a huge Garmin watch fan. You, Michael, you would never wear smartwatches, and then you started running and you got the Garmin.

MC: Right. Well, I have a Withings Activité Steel, which is a smartwatch, but not what most people would point at and say, “That's a smartwatch.” But yeah, I wear a Garmin 735XT.

LG: Yeah. And it's great. It does everything, I'm sure, that you need it to do. And also they have these transflective LCD displays, so they're easier to see in sunlight, and they're not so power-hungry. I don't know, you can configure the screen so that you can see things like elevation, and you can configure it to track all kinds of activities: snowshoeing, backcountry skiing, yoga, sailing, whatever the hell you want. It's pretty … I don't know, I'm just a really big fan of Garmin watches. They have a sliver of the smartwatch market these days, Apple is clearly dominating, but I'm a fan.

MC: So here's my big question about your Garmin sports watch usage. Do you have the notifications turned on?


LG: I do. And then I can't do anything with them. It's just a dumb watch at that point. I see it come in. I see a Slack notification or an iMessage come in and I'm like, “OK, I need to address that.” There's no way to interact with it on this watch. I will also say, I find myself taking this watch off more frequently when I'm headed into certain social situations, which I never did with my Apple Watch because the Garmin watch is bulky and it's …

MC: You can say ugly.

LG: It's not the most … It's not the most attractive watch.

MC: I don't think it's ugly. I think most people in our social circle, I assume your social circle, would see it and be like, “Oh, it's a Garmin watch.” And that says so much about you as person, right?

LG: Right. And that's interesting too, that the Apple Watch does not say so much about you, if you're wearing an Apple Watch anymore. In the early days it did, it was like, oh, you were a nerd, you were really into it. Or you were a tech reviewer. But now everyone has an Apple Watch.

MC: Or you worked at Apple.

LG: Right.

MC: In San Francisco, I just assumed that.

LG: Exactly. I don't even know how much this thing costs anymore. There are so many Garmin watches. Let's see, I'm looking this up right now. Boone, you can keep the typing in. Typing, typing, typing. The Fenix 5S, all right, it's still not cheap. It looks like you can pick it up for as low as 400 bucks, but most places are listing it for 500. You get the sapphire glass too.

MC: You can buy two Apple Watches for the price of the Fenix 5S.

LG: Oh, I love when this happens. I Googled this, and the first thing that came up in the news results was “The Garmin Fenix 5S Is the Fitness Watch I Don't Want to Take Off,” dated March 30, 2017, written by yours truly. So, yeah. Anyway, that's my recommendation.

MC: Nice.

LG: A very old recycled recommendation.

MC: Nice.

LG: Mike, what's yours?

MC: Thank you for telling us what time it is. My recommendation is a show called The Dropout and it's on Hulu, you've probably heard of it. It's the story of Elizabeth Holmes and Theranos, and it is a fictionalized episodic account of that story. I was really on the fence about watching it, but I noticed that there were all of these shows coming out that have to do with things that were big in our world. There's Super Pumped, the Uber show. There's WeCrashed, which is the WeWork story. Then there's this one, The Dropout. I don't know why I selected this one first, just because I was in Hulu for something else, and I just clicked on it. It's trashy, it's not fantastic, but it is very funny. I really just like how over-the-top it is. There's one scene, in particular, I've been talking about this for the last couple of days.


So excuse me if you've heard this, but there's a scene at the beginning of one of the episodes where Elizabeth Holmes, she's so enamored by Steve Jobs and by Apple and by the whole way that they present that company that she goes to one of the Apple stores the day that the first iPhone is released and you can buy it. And there's this giant crowd of people outside the Apple store, and they're all crying with joy and jumping up and down and screaming and cheering that the iPhone is finally on sale. And people are coming out of the store and they're showing their iPhone or holding it up and everybody's applauding. It seems ridiculous, but it's easy to forget that was what it was actually like when the iPhone went on sale 14, 15 years ago.

She's one of those people that we all looked at and we're like, “Wow, these people really like the iPhone.” So it gives us this fun little snapshot of what the rosiness around technology was at the time, and also just that halo of Apple and Steve Jobs that she fell under, which was really recognizable to me as somebody who lived through it, anyway. I think new episodes come out every week now, and it's a miniseries. So I think you can pop in and watch the first four or five right now, and then it's going to conclude, I think at the end of this month, beginning of April.

LG: How do you feel about Elizabeth Holmes as a character and as a person as you're watching this?

MC: It's difficult because I identify with her because it's Amanda Seyfried.

LG: All right. OK. You're not seeing Elizabeth Holmes, you're seeing Amanda Seyfried.

MC: Right. So it's not an immersive show. It's not like I only see the people on the show as the characters. It's Amanda Seyfried and the guy from Lost, Sayid from Lost. So I can't really get into it and really feel them as characters; it's more just about the spectacle and the fun of it, because everybody in the show is a buffoon. You know that they all get duped.

KJ: Do they address the blinking?

MC: Not yet.

LG: The blinking as in the lack of blinking?

KJ: Right.

MC: Yeah.

KJ: Yeah.

MC: There are subtle things about the voice.

KJ: Right.

MC: We see her rehearsing the voice in the show.

KJ: I had a feeling.

MC: We see her rehearsing the wardrobe, we see her rehearsing the hair. So I'm sure at some point it happens.

KJ: I'd watch just for that.

MC: Yeah.

LG: I wonder about these scripted tech series that are coming out now, the ones that you just listed, because, and I'm probably overthinking this, I think there is a danger of turning these people into caricatures of themselves, or of feeling a lack of attachment to them, a distance from them, as people who in many cases did real harm to their customers. And they've suffered consequences, right? Like Travis Kalanick was booted from Uber, and that was his company and his baby.


MC: He's still filthy rich.

LG: Right. And now whatever he is doing.

MC: Adam Neumann, filthy rich.

LG: Yeah.

MC: Elizabeth Holmes is probably going to spend 20 years in jail.

LG: Right. Right.

MC: Maybe I shouldn't laugh.

LG: Right. And now there's entertainment value in these stories, which is just bizarre to me.

MC: It's slow-motion car crashes.

LG: Yeah.

MC: Watching these shows, that's what it is.

KJ: And occasionally breaking the law.

MC: Yes.

KJ: Did you ever break the law?

MC: That is the car crash. Did I ever break the law?

KJ: No, not you. WeWork.

LG: Yeah. Mike, did you?

MC: You'll have to speak with my lawyer.

LG: Yeah. Did WeWork break a law? That's a great … no, just over-leveraged. Yeah. And now it's crazy, because everyone's going back to WeWorks or WeWork-like spaces.

MC: Yeah.

LG: Yeah.

MC: The great RTW—the great return to WeWork.

LG: Arielle Pardes wrote about this for WIRED this week as well. She wrote about these three shows, and she liked WeCrashed the best.

MC: Yeah. She did not have kind words for Super Pumped or The Dropout, but that's fine. I still think there's plenty of bad television out there that's way worse.

LG: We should also mention that, of course, Super Pumped … the author of Super Pumped, the book, who I believe is also a consulting producer or co-executive producer—he's a producer—is Mike Isaac, who was a member of WIRED for a long time and was a cohost of the Gadget Lab podcast.

MC: Cohost on this show.

LG: Of this very show.

MC: Yeah.

LG: Yeah.

MC: I elbowed him out. It's a good book though. It's a great book.

LG: The book is really good.

MC: Great book. Read the source material, everybody. Read John Carreyrou. Read the two folks who wrote the WeWork book.

LG: Eliot and Maureen.

MC: Eliot and Maureen.

LG: And they came on the show as well.

MC: Read Mike Isaac.

LG: Yeah. Good recommendation, Snack.

MC: Thanks. All right. Well, that is our show for this week. Thanks for joining us, Khari. Thanks for being here.

KJ: Thanks for having me, of course.

MC: And thank you all for listening. If you have feedback, you can find all of us on Twitter, just check the show notes. This show is produced by the indubitable Boone Ashworth. Goodbye, and we will be back next week.

[Gadget Lab outro theme music plays]

