Our co-host this week is veteran tech journalist and entrepreneur Laurie Segall. This episode focuses on all the ways in which tech has sped up and slowed down our world, from television series to cellular phones. Lily Collins joins us to discuss her Instagram-influenced hit series Emily in Paris and the award-contending film Mank. Segall also speaks with Rashida Richardson, a leading Black tech ethics lawyer featured in The Social Dilemma, about all the detrimental "-isms" plaguing the tech industry right now and where they may lead.
Krista Smith: [00:00:00] Welcome to More Like This, a podcast from Netflix Queue, the journal that celebrates the people, ideas and process of creating great entertainment. I'm Krista Smith. I've spent over 20 years interviewing some of the biggest names in Hollywood. And on this show, I'm bringing you fresh new perspectives from across the entertainment industry, with the kind of access only Netflix can offer, but I won't be doing it alone.
I get to collaborate with some of the best writers, interviewers, [00:00:30] and experts in the business. My co-host this week is Laurie Segall: veteran tech journalist and entrepreneur, former CNN senior tech correspondent, current 60 Minutes contributor, and CEO of Dot Dot Dot Media. Shall I continue? Hi, Laurie.
Laurie Segall: [00:00:49] Hi, thanks for having me.
Krista Smith: [00:00:51] Oh, it's so great to see you. So let me just ask you, as someone that is consumed with tech and has been covering it your whole [00:01:00] career: how are you finding the technical side of working from home?
Laurie Segall: [00:01:04] It's been interesting. I think it's accelerated quite a bit what was already probably going to happen in the next, I would say, five years. I remember at first, when we all started having to work from home, I called up Matt Mullenweg, who's the founder of WordPress. And WordPress, which powers a third of the internet, has a distributed workforce; this is something they've done for a long time. And I remember him saying, Laurie, make sure [00:01:30] you over-communicate when you're working from home. Send more text messages than you would, include emojis.
Like, this is a multimillionaire tech founder saying, send emojis, because things will get lost in context. And I've always looked at this intersection of technology and culture. And I think, all of a sudden, we were starving for humanity through the lens of tech. And it happened so abruptly, and I broke everything immediately, as soon as I started working from home. As [00:02:00] someone who's covered tech my whole career, I broke Zoom, I broke my computer.
I just broke everything. Speaking of tech things, my mic just fell over right when you said that. [Laughs]
Krista Smith: [00:02:11] Right. It's so hard. Well, if someone doesn't say to me, Krista, you're on mute, then I'm not worried. I mean, that happens at least once a day. It's gotten a little bit better, but oh my God.
We've all been dealing with it. Well, we have such a great, fun show and I have so many questions for you, [00:02:30] but the first one I want to know is: what have you been doing in your downtime? What have you been watching? What have you been reading? How have you been relaxing?
Laurie Segall: [00:02:37] Oh my God. I watch Law & Order: SVU. This is how I relax: I watch escapism-type television. I just watched Bridgerton; I binge-watched it in, like, two days. So yeah, I know, I have a massive problem when I like something, and I do it all between the hours of, like, 9:00 PM and 1:00 AM. It's super unhealthy. I just finished writing a book, which I cannot say is the [00:03:00] most relaxing process.
So I think for me, it's honestly just turning off for a little bit. And I live right near the West Side Highway, so I try to go for walks near the water for my head, and listen to music. I think that's kind of how I try to deal with a lot of this. And I try to stay off Instagram as much as I possibly can, for humanity.
Krista Smith: [00:03:18] All right, well, let me just ask: is the book an autobiography? Or is it fiction? What is it?
Laurie Segall: [00:03:24] It's a memoir, non-fiction, and it's about my time climbing the [00:03:30] ladder at CNN. You know, I started off in the breaking newsroom, micing up guests at the, you know, beginning of the recession. And I stumbled into one of the, I would say, most important beats of our time, which is technology.
And I covered the minnows before they turned into sharks. I would say, we've got to pay attention to these guys who started this little company called Instagram, when there were three people there. I became our senior technology correspondent at CNN, but I watched these small [00:04:00] entrepreneurs turn into these huge tech power brokers that changed the world, and changed democracy and society.
Krista Smith: [00:04:06] Well, you mention Instagram, and one of the things we're going to talk about here is this great interview I have with Lily Collins. I'm sure if you like escapism, you've seen Emily in Paris. And she's also in Mank, which is the complete opposite: this brilliant black-and-white film by David Fincher about the writing of Citizen Kane.
But let's talk about influencer culture, because [00:04:30] that's going to lead into our other piece, on The Social Dilemma, where you're featured, and your interview with Rashida Richardson. But let's talk about the Instagram influencer. Are they really moving the needle that much? Are they really influencing, or are they just creating a lot of noise?
Laurie Segall: [00:04:49] I think there was a moment in time, and you saw it in technology, where there was so much weight behind influencers, behind what they [00:05:00] represented and what they could do. And I think there's really been a push towards authenticity and mental health. And so the meaning of influencer, I hope, will change.
I think you can have influence and not have a negative connotation, but the narrative has to change. I think we'll see a whole new generation of influencers that are going to be young, bad-ass, interesting folks, and it won't be as transactional. I think people have a visceral reaction to things that just [00:05:30] feel that transactional, and I think you can sense that happening.
I am an optimist to some degree, but I think there is a real dark side to all of this that we have to pay attention to. And it's a cultural conversation that we started having years ago, when I was at CNN, about the implications of this. But it's a national narrative right now, which they talk about in The Social Dilemma, that parents are having with their children. So, you know, I hope it evolves.
[00:06:00] Krista Smith: [00:05:59] Lily Collins absolutely shines in David Fincher's Mank as Rita Alexander, Herman J. Mankiewicz's secretary, but it's her lead role in Emily in Paris, as an Instagram influencer, that really brings forth her humor and versatility on screen. I talked to Lily about both of these completely opposite characters and a whole lot more.
First of all, I'm so happy to see you.
Lily Collins: [00:06:28] So good to see you too, [00:06:30] as always, I know in whatever capacity that is.
Krista Smith: [00:06:33] Right, and I was just thinking back about first meeting you when you were 15, I believe, and working for Nickelodeon.
Lily Collins: [00:06:42] Yup. I walked into your office. I remember exactly how your office looked; there was a photo on your desk. God, that's so long ago. I was a journalist, and I was coming to you, and you were an editor, and I was asking for a job.
[00:07:00] Krista Smith: [00:07:00] I want to start with Mank. David Fincher, Gary Oldman. You play Rita Alexander, and you are, basically, a secretary. You're taking notes, you're helping him finish this script. And I have so many questions, because one of the most interesting things about your character, and Gary's character as Mank, is that he's in a bed most of the time. And your performance is so still, and at the same time it's [00:07:30] so active. And I love that.
I loved watching you in this. And I was, like, oh my God, kind of blown away by what you were able to do in this movie. So I want to talk to you about when you first got that script. When it came to you, what were your initial thoughts?
Lily Collins: [00:07:48] Um, well, thank you. I very much appreciate that. I got the script, I think, about a week and a half before I went to Paris to film Emily in Paris. And [00:08:00] it was a very intense audition process. I had to put myself on tape, but I only had, like, three days to do it before I left. And I sent it in the day before I left and thought, there's just no way this is going to go anywhere, because I'm not going to be in town. And it's David Fincher, so there's just no way. So I went to Paris, and found out there that he responded positively to the tape and wanted to Zoom with me, at which point I did not know what Zoom was, because it was not a household name [00:08:30] yet. So I finished a whole day of Emily, and then Zoomed with David from Paris. Of course my wifi was acting up, and the Zoom didn't work properly.
So I did my sides and my takes, and then David would give me notes, except I never heard a single note, because the Zoom stopped working. All I kept hearing was him say, okay, now go, now try it. So I have no idea what his notes were, and I just had to keep trying whatever it was that I thought he [00:09:00] wanted.
And I remember getting off that and thinking, there's no way I got it, but what an amazing experience. Cut to two weeks later: I found out that I got it, and then had to fly back to LA twice, for 24 hours each, during Emily. I never had a day off on that set, because I'm in every scene, so that had to be figured out around weekends. And I was in the air longer than I was on the ground.
And I had to treat myself kind of like a robot. But little did I know that in a matter of eight months, we'd be in a quarantine phase where you couldn't travel at [00:09:30] all. So, needless to say, I'm so happy that I did both at the same time. But I love stillness.
I love being given parameters in which I can then play. And that can be said for the type of director: if someone knows exactly what they want, they can give me boundaries and say, this is the shot, this is what I'm looking for, but then you can run wild and [00:10:00] insert different ideas here or there.
Um, I like that because I feel safe. I feel secure. I feel nurtured, but I also feel trusted to play and collaborate and try. And David allows you that as an actor. He gives you such a safe place, because you just instinctively trust David, because he's a genius. I've admired his work ever since I can remember watching movies, and he has such a distinct [00:10:30] vision and point of view, and a way in which he articulates that to every actor so that they understand. And not even just actors, but everyone behind the camera.
And even though my scenes with Gary are very still sometimes, and literally seated, there's so much to play with in that one-camera setup. I love old movies, and I love watching all the Hollywood actresses speak without speaking; it's in their eyes, it's in [00:11:00] their reactions. And, you know, you're taught early on in acting that it's just as much reactionary as it is actionary. And in a movie like this, in black and white, where you're not, as a viewer, distracted by color, which I'd never even thought of before, you're really forced to look at light, texture, and people: what the actual people's faces are doing.
As an actor you're like, wow, there's so much more for me to play with here. [00:11:30] And God knows, Gary was giving me so much to react to. I found myself so present, and not thinking as much on this set, because it was all just so there. And I was able to just let go and be so in the moment.
So I treasure those scripts, which are pretty rare, where you know that the stillness is going to speak such volumes, and [00:12:00] you know that the camera is going to be able to sit and not have fast cuts. Because I love that about old movies too, where an audience can make decisions for themselves about how they feel.
They're not being told how to feel through an editor's cut, you know, and sound design and really fast edits. People's attention spans seemed to be longer back then, and so the lingering shots allowed you to breathe and emote and feel. And David leaned into everything that an old movie of [00:12:30] that time would have.
Krista Smith: [00:12:30] Mm, well, I interviewed Gary earlier, and he called your performance a revelation, and that me-
Lily Collins: [00:12:41] He didn't. No.
Krista Smith: [00:12:42] He did, he did. And that made me really proud vicariously. I loved hearing that from him. And I'd love for you to talk a little bit about working with him.
Lily Collins: [00:12:54] So I met Gary when I was about two years old, on the set of Dracula. I came on set with my [00:13:00] family, because our family friends had written Dracula and Hook, and we were shooting on the same lot; my dad was in Hook. And I met him when I was two. That's important in this story only because there's that distinct memory as a kid of having this connection with someone, so there's a familiarity and a respect there, and kind of this nostalgia. Rita has never met Mankiewicz, [00:13:30] but she has to feel this kind of deep-rooted connection with Mankiewicz, because you want to believe, as a viewer, that she truly respects and loves and admires this person in a nostalgic, familial way.
And that was already kind of in me with Gary. Cut to: I'm in a room with him, rehearsing for Mank. And these little tidbits of [00:14:00] history that we had allowed me to draw a familiarity with Gary into Rita and Mank, but also just this deep admiration of him as a human being that Rita has for Mank. And when I stepped into that room for the first time after being cast as Rita, it's like his [00:14:30] smile and his laugh light up a room. Truly, those best moments of Mankiewicz that you see on screen, of him filled with light and love and hope: that's Gary on a daily basis. As an actor, he is so present. He's so giving. He's so kind. He's very nurturing, and he's super playful. It's fascinating.
So I play British in this, [00:15:00] and I know I grew up with a British accent, but I do sound like this every day. And I don't stay in accent throughout my experience; I dip in and out, and Gary is the same. Gary has his accent on screen, and then he's back to, you know, spitting out British jokes and humor, and we were laughing so much. To be able to watch someone like Gary, who's just so incredible and so in it, at the drop of a hat go straight back to telling a joke and being Gary, and then go straight back into it when they [00:15:30] yell action: he doesn't miss a beat. We had the gift of time with David; sometimes a scene would take a week, sometimes a couple of days, because of the different camera angles and whatnot.
And Gary, like, never had a day off. It's Mank, you know? And he just never gave less than a hundred percent, and was just the kindest to the crew. My level of expectation for a co-star has now shifted, because Gary just gives [00:16:00] everything, and he's so wonderful. I feel like I've been so spoiled.
Krista Smith: [00:16:05] Yeah, it was interesting. You said with Fincher you get the gift of time, and we all know part of his excellence is doing as many takes as you need to get it, and that attention to detail. Did you respond in a way that you didn't anticipate? How did you react to that?
Lily Collins: [00:16:23] Yeah. I didn't know what to expect, other than things you hear. I'm a slight [00:16:30] sucker for playing around with things anyway, so I love it. And I know that some people won't believe me when I say that, but genuinely, if I'm going to be playing with the best of the best, I want to play as long as I'm allowed to. I'm going to come to set even when I'm not working, because God knows I'm going to learn so much by watching.
And if they're willing to take the time to play with me, I'm going to soak that up. [00:17:00] And the thing is also, you know, I would have ideas, Gary would have ideas, everyone else would have ideas, and David would go, great, show me. And you're given the trust to also use the time you have to try something of your own.
That's an amazing collaboration. There's one scene in particular where I've read the letter, that my husband's presumed lost at sea. Gary makes a joke and I run out, and we don't know for how long I've been out, but I've had to gather [00:17:30] myself, think about what he said, and kind of have a good cry.
And I come back and I walk up to the doorway to say that he was right, but also to give him a bit of a berating. And we don't know it at the time, but Gary has, you know, drunk some of the alcohol, and he's passed out. But I come to the door and I just kind of start into a monologue. And that, for me, I think, was the most we did of any one scene of mine.
[00:18:00] But it's because we're like, well, I don't know how long she's been outside. So how much anger is she still bringing in? How much sadness is she bringing in? Do we think she wants to be a bit meaner here? There are so many ways to go, and we want to have the ability to have all the options in the edit. And also, then, he has to drop the bottle, and the lighting has to be perfect.
So it really is a game, and it's a creative game that I love to be a part of. And if someone's willing to bet on me to play [00:18:30] that game, I'm going to play it. So if I ever were to get to work with David again, I'd be so excited to do that.
Krista Smith: [00:18:36] Let's talk about Emily in Paris. I didn't realize you were going back and forth and shooting both at the same time.
People have loved this show, watched all of it in one weekend. I mean, the memes, all of it, exploded on social media. Were you surprised at all by the juggernaut that it became?
Lily Collins: [00:18:54] We felt like it was something special when we were shooting it. I know that when I first read the pilot, which was the first two [00:19:00] episodes combined, it had that classic feel of the romcoms from the early nineties that I just devour, but that don't really get made anymore.
And I never expected it to become memes and gifs and Halloween costumes. Like, Emily was a Halloween costume; dogs and cats were Emily for Halloween. I was like, what is going on here? I just think people are craving escapism and travel. I mean, at least my [00:19:30] Pinterest feed, as well as my Instagram feed, are just pictures of pretty places that I want to travel to one day. [Laughs] Like, let's go here when we can again, you know?
Krista Smith: [00:19:38] Well, I look forward to your extended story, and to season two of Emily in Paris. I'm excited to see what's going to happen to Ms. Cooper.
Lily Collins: [00:19:50] Me too. I have no idea.
Krista Smith: [00:19:53] I expect we'll see more berets and hats, and key chains, and, you know, lots of [00:20:00] primary colors, and all of that stuff. It's just so fun. Thank you so much, Lily.
Lily Collins: [00:20:05] Thank you, Krista. It's always so nice to see you. I hope one day soon, I get to see you in person.
Krista Smith: [00:20:11] Please.
What did you think of Lily?
Laurie Segall: [00:20:18] First of all, I love that she has this journalistic background, and I love that about her. And I also think there is something about Emily in Paris, and a lot of folks have talked about this, that kind of struck a [00:20:30] chord, right? There was something about sitting at home in this pandemic, for those of us who enjoy dressing up and going out: that specific escapism, at this specific moment, struck a chord.
Krista Smith: [00:20:45] Well, I'm dying to talk to you about, obviously, The Social Dilemma, which, you know, came out at Sundance. But when it hit on Netflix, it was one of those things that didn't even take 48 hours, and it was bouncing everywhere. [00:21:00] So I would be remiss if I didn't ask you about the giant event that happened post-January 6th, when Twitter and Facebook made the huge decisions to cut off the feed of the former president. So what do you think about those decisions, not personally, but from covering tech, from your perspective? Is it about time? Is it the worst [00:21:30] decision? Was it a long time coming? What are your thoughts on it?
Laurie Segall: [00:21:33] I think it's a complicated decision, right? I think it's easy for us to say, well, it's about time. I keep going back to this one thing that someone I interviewed said to me back in 2017, and it's a line that still haunts me, because it's happening. He said, our world is becoming a chat room, and we're becoming our own avatars. And why I think that haunts me is that I remember him saying it when we were talking about hate speech moving offline and the [00:22:00] dangers behind it. I think we can look back over the last couple of years at how much power and control these tech companies have to shape democracy, to shape our mental health, to shape the narrative.
The skeptic in me, as someone who's known a lot of these leaders and has followed this for a long time, says it was an easier decision to ban then-President Trump two weeks before he was out of office, because the stakes weren't as high. But it's going to be a complicated one, because it opens up all sorts of issues.
And it [00:22:30] shows us the power of these big tech companies, and I don't think it's a power that they necessarily want. So I think the thing that everyone should be paying attention to next is the regulation coming down the pipeline, and in particular something called Section 230, which is, you know, the law that basically wrote the internet: are these tech companies responsible for the content on their platforms? I think it'll be something that completely changes the landscape of technology.
Krista Smith: [00:22:58] So what were your thoughts when [00:23:00] you first saw The Social Dilemma?
Laurie Segall: [00:23:02] Well, by the way, I'm in The Social Dilemma, and I didn't realize it.
Krista Smith: [00:23:07] You are.
Laurie Segall: [00:23:07] When it first came out, I have a quick cameo, because I'm interviewing Zuckerberg during Cambridge Analytica. For folks who are listening and don't know, Cambridge Analytica was, like, the data privacy scandal that broke the internet. It was the moment for me when I realized Facebook had really, really screwed up.
And so I went out to interview Zuckerberg, and, [00:23:30] by the way, I booked that interview because I messaged Mark on Facebook. I remember it being a historic moment for tech. I was saying, knowing what you know now, would you have changed anything? Something like that. They took that line.
And so a lot of people were messaging me, and I watched it. And if I'm being completely honest, it was a little painful to watch, right? In a good way, in a good way, because it was all of these issues that we've been trying to get tech founders to talk [00:24:00] about, that we've been beating people over the head with for so many years, saying, guys, the business model of Silicon Valley is really problematic. If I could extract one theme from all of the founders and the biggest stories of my career, the biggest problem with all of these founders was that they forgot about humans.
And it was just watching everybody say it out loud in a way that finally resonated and went mainstream. It was exciting to see it out there, and I'm glad people [00:24:30] are paying attention now. And I hope we'll do something. I think for me, it's always, now what? What is the next conversation we need to have? But it was the moment that it went mainstream, and that makes me very happy. Because I do believe that if these companies had had more diversity, if they had answered these questions, if they had not surrounded themselves in these little filter bubbles, which I've seen firsthand, we wouldn't be in this position.
Krista Smith: [00:24:55] Well, speaking of that, you got to talk to Rashida Richardson, who's one [00:25:00] of the very few Black female activists working in and around the tech space. I mean, like you said, there's so little diversity there. It's like a sea of white dudes in hoodies, basically.
Laurie Segall: [00:25:10] Yeah.
Krista Smith: [00:25:11] But I want to set up that interview. We ran it in Netflix Queue a few weeks ago, and we're going to play the audio here. So can you just talk to me a little bit about what it was like and what you learned from her?
Laurie Segall: [00:25:22] Sure. She's a fascinating woman, and it's interesting, because her name comes up quite a bit in tech circles.
I think [00:25:30] one of the interesting things she told me in the interview was that she's kind of this force behind the scenes, almost a Bourne-like figure for tech companies, someone who gets in the way. She's an attorney who goes out and says, wait, this is going to be a problem, this is going to be a problem, you've got to pay attention to this.
And she actually started out at Facebook, you know, way back in the day, years and years ago, around the same time that I started covering technology. And what she told me, as a Black woman who went to [00:26:00] Facebook in Menlo Park, was that she experienced racism, that she could not fit there, that she had all these ideas, and that it was not a place she felt she could be in; she thought she could do better work for the tech community from outside of it. And I think that might be a good way to set up the kinds of issues that she fights for, which are civil rights, the future of technology, and ethics.
And needless to say, her hands are full right [00:26:30] now.
To start out, Rashida, how would you describe your role in the tech space?
Rashida Richardson: [00:26:40] So I always start by saying I'm a civil rights lawyer that works on technology. And the reason I start there is because many lawyers that work in this space, specifically AI and big tech platforms, tend to have more of an IP or even antitrust background, which is completely different.
And my [00:27:00] work in this space tends to examine very critically the intersections between technology, race, society, and the law. So within that very large umbrella, that means looking at issues of algorithmic bias, and trying to understand current implementations or deployments of data-driven technologies and data practices within government.
And that is the lens. And then I also do a lot of work around policy [00:27:30] development and just thinking through interventions to address a lot of the problems that we see.
Laurie Segall: [00:27:35] Well, it's a pretty tall order these days, given everything happening in tech, society, and culture in general.
Rashida Richardson: [00:27:42] Yeah, I think my special slice of this pie of tech has become more relevant over time, which is not necessarily a good thing. But in some ways I'm happy that my role exists, because I think a lot of the problems that we see [00:28:00] in tech are just amplifications of problems within society. So having someone with my background, or even someone looking at these problems through my lens, I think is quite important right now, and relevant.
Laurie Segall: [00:28:15] You're the director of policy at the AI Now Institute. Describe your role.
Rashida Richardson: [00:28:19] No, in the film I was still in that role, but right now I'm a visiting scholar at Rutgers Law School. So I'm doing very similar work to what I did at NYU with AI Now, [00:28:30] just at a different university, and in more of a research role.
Laurie Segall: [00:28:33] Describe your day-to-day now.
Rashida Richardson: [00:28:35] Well, both now and then, I'd say I wear two hats. One is that I'm a researcher with legal training, so I tend to interrogate and look at issues of how AI, big data, and data-driven technologies are implemented in society, and I tend to focus more on government use. And then the other hat is being actively engaged in policy [00:29:00] discussions, on the local, state, federal, and international level.
Laurie Segall: [00:29:05] You know, The Social Dilemma was interesting to me. These problems, the business model in tech, the addictive nature of algorithms, tech companies vying for our attention: we've been having this conversation for years, but it certainly seemed like when this came out, it struck a chord culturally. Why do you think that is?
Rashida Richardson: [00:29:23] I think society as a whole has been grappling with the relationship between [00:29:30] us as individuals and technology, but I think COVID has really lit a fire under this, in that all of our lives are mediated through some form of technology, whether it's Zoom right now, where most of our day is spent.
The fact is that kids with computers and internet are on them for most of the day for their education, where in a non-remote environment they would be interacting with people. So I think everyone is questioning their [00:30:00] relationship with technology in a new way, just because our lives have to be mediated through it.
In some ways the documentary helped humanize the issues for people, both in some of the dramatizations, which some people said freaked them out even more, but also in helping break down certain concepts that I don't think are typically covered, even in basic media coverage of these issues: like understanding how an optimization algorithm works.
You understand that a little [00:30:30] bit more, in that, oh, the reason why I get these alerts is for me to stay engaged, and that's what it's optimized for. And I don't think, when using many different technologies or even apps, that people take the time to think, why is it that I don't know how to get around places unless I have Google Maps or Waze or some type of application showing me?
Laurie Segall: [00:30:55] Yeah.
Rashida Richardson: [00:30:56] So I think it's both. It's striking that the movie [00:31:00] came out at a time when I think there's been more scrutiny towards technology, but then also during a global pandemic, when people's use of technology has probably become more complicated.
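[Editor's note: Richardson's point above, that alerts are "optimized" for you to stay engaged, can be made concrete with a toy model. The Python sketch below is a minimal epsilon-greedy bandit, a stand-in for real recommendation systems, which are vastly more complex; the notification types and tap-through rates are invented for illustration.]

    import random

    # Hypothetical true tap-through rates per notification type.
    # The algorithm never sees these; it only observes taps.
    TRUE_RATES = {"like_alert": 0.30, "friend_post": 0.15, "digest": 0.05}

    sends = {k: 0 for k in TRUE_RATES}  # notifications sent, per type
    taps = {k: 0 for k in TRUE_RATES}   # taps observed, per type

    def pick(epsilon=0.1):
        """Mostly exploit the best-performing alert so far; sometimes explore."""
        if random.random() < epsilon or not any(sends.values()):
            return random.choice(list(TRUE_RATES))
        return max(TRUE_RATES, key=lambda k: taps[k] / max(sends[k], 1))

    random.seed(0)
    for _ in range(10_000):
        alert = pick()
        sends[alert] += 1
        if random.random() < TRUE_RATES[alert]:  # the user taps the alert
            taps[alert] += 1

    # Whatever keeps you tapping ends up dominating what gets sent.
    for k in TRUE_RATES:
        print(f"{k}: sent {sends[k]}, tapped {taps[k]}")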
Laurie Segall: [00:31:12] We've all lived our lives, many of us, on social media, or we've had these devices; our data is out there, you know?
So when we look at a film like The Social Dilemma and people say, well, I'm going to delete Facebook: well, unfortunately for you, you might still have Instagram or [00:31:30] WhatsApp, which are owned by Facebook. So whether or not you like it, you've opted in. So as someone who lives and breathes data, and the bigger implications for society, what do you tell people?
Rashida Richardson: [00:31:41] I think people need to think more critically about their actions and understand data collection, in that you don't simply need a computer to have data collected about you. A credit card is another common way that a lot of information is obtained about us. Or discount cards, like the things at the grocery [00:32:00] store where you'll get 10 cents off, but they're also recording everything you purchase.
So if we had better public education for people to understand just the wide variety of ways that data is collected about us, then people could make better-informed choices about what they want to opt in and opt out of. And part of it is the point you raised, of just even understanding the mergers and acquisitions of the tech sector.
But I think it's also on a more [00:32:30] basic level: you're always trading off convenience for something else. And sometimes I think people need to question whether that convenience was worth it, or at least understand what those inherent trade-offs are with some technological applications.
Laurie Segall: [00:32:49] The doc talks about predictive data analytics. What is the danger of predictive data analytics getting better and better and better, and being able to understand such human traits [00:33:00] about us, like when we're lonely or depressed, or our mental health, or what type of person we are?
Rashida Richardson: [00:33:05] Well, first, I think it's important to understand that some of the inferences being made through these technologies are not always accurate, and the harm that can be caused when false inferences are made.
To give a good example, I'll take one from an actual thing I focus on and have written about in the government context: there are [00:33:30] technologies called predictive policing, which essentially try to predict who may be a victim or a perpetrator of a crime within a given window of time, or where a crime may occur.
And that's all based on historic crime data. And when you're dealing with a jurisdiction, let's say, like where I am, New York City and the NYPD, which was found to be in constitutional violation for disproportionately targeting Black and brown people for over a decade, then it's no surprise that those systems are going to predict that those same [00:34:00] communities that were subjected to discriminatory policing are inherently more criminal, because of that data.
And so that's just one specific example of data being used where the inference is wrong, or where mischaracterizations can be made from the data. And it can have very harmful outcomes for individuals. What if it's a job opportunity that you're now not being shown, or not even eligible for, because of false or misleading inferences in the data?
And I [00:34:30] think that's part of the problem. One, data is not objective and accurately capturing things about our lives; but two, when data, and the inferences being made from it, are used in many different ways that we're unaware of, we aren't as aware of the impact it can have on the different life opportunities available to us.
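[Editor's note: the feedback loop Richardson describes can be sketched in a few lines. In this toy simulation, with invented numbers and no relation to any vendor's actual model, two districts have identical true crime rates, patrols follow past recorded crime, and crime is only recorded where patrols go, so the initial recording bias is confirmed rather than corrected, year after year.]

    # Two districts with the same underlying crime rate...
    TRUE_RATE = {"A": 0.5, "B": 0.5}
    # ...but historic data that over-records district A.
    recorded = {"A": 60.0, "B": 40.0}

    for year in range(5):
        total = sum(recorded.values())
        # The "prediction": allocate 100 patrols where past data points.
        patrols = {d: 100 * recorded[d] / total for d in recorded}
        # Crime is only recorded where officers are sent to look.
        recorded = {d: patrols[d] * TRUE_RATE[d] for d in recorded}
        print(f"year {year}: district A gets {patrols['A']:.0f} of 100 patrols")

    # Prints 60 every year: the bias in the historic data is never
    # corrected, even though the two districts are identical.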
The predictive policing example is a bad one, but a good one is [00:35:00] radiology. If you feed a bunch of mammograms into a system, it can find a pattern and actually spot cancer better than the human eye can. And I see that as a good thing. But why it's complicated is that if that data set is not representative of our society, and this has been shown in research, that mammogram scanning system may not work well for me, a Black woman, where there's less data about Black people, because less data is collected, because of people misreading laws. So, [00:35:30] even though I'm hyper-critical of predictive analytics in these systems, it's not to say that I don't think they should be used, or that they cannot be potentially beneficial to us as a society in the future.
I think it's a question of: do we actually have data that's good enough to produce those outcomes?
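[Editor's note: Richardson's representativeness point, that a system tuned on data dominated by one group can quietly underperform for an underrepresented group, is easy to demonstrate on synthetic data. The sketch below is a deliberately simple one-threshold "detector," not a real radiology model; all of the numbers are invented.]

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic detector scores for healthy (neg) and sick (pos) cases.
    # Group X dominates the data; group Y's scores sit on a different scale.
    xn, xp = rng.normal(0.0, 1.0, 9500), rng.normal(2.0, 1.0, 9500)
    yn, yp = rng.normal(1.5, 1.0, 500), rng.normal(3.5, 1.0, 500)

    def accuracy(th, neg, pos):
        """Balanced accuracy of the rule: flag anything scoring above th."""
        return ((neg < th).mean() + (pos >= th).mean()) / 2

    # Tune a single threshold to maximize accuracy on the pooled data,
    # which is 95% group X, so the chosen threshold suits group X.
    pooled_neg = np.concatenate([xn, yn])
    pooled_pos = np.concatenate([xp, yp])
    best = max(np.linspace(-1, 5, 601),
               key=lambda t: accuracy(t, pooled_neg, pooled_pos))

    print(f"threshold tuned on pooled data: {best:.2f}")
    print(f"group X accuracy: {accuracy(best, xn, xp):.1%}")  # high
    print(f"group Y accuracy: {accuracy(best, yn, yp):.1%}")  # noticeably lower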
Laurie Segall: [00:35:48] What kind of regulations should we see?
Rashida Richardson: [00:35:51] A lot. I think there are certain technologies that just should be banned, or at least have a moratorium in place. [00:36:00] And I've brought up facial recognition a lot of times, because it's a perfect example of a technology that does not work at the accuracy levels that people think it does.
Yet it's used in so many aspects of society: policing, transportation, education, healthcare. So I think first we have to have a conversation about what technologies are already used in society and posing major harms or risks, and which ones maybe should go back into the lab and just be studied [00:36:30] so we can figure out their potential.
And then also, what are the right regulatory constraints? Another policy proposal, one that's been implemented, sort of, in Canada, and is being considered at the state, local, and federal levels here and in Europe, is algorithmic impact assessments. That's essentially a model similar to environmental impact assessments, in that if a government agency wants to use a technology, or if we as a society want to support a technology, let's at least try to understand its [00:37:00] impact before it's used.
I also think there need to be better best practices, or even incentivizing of better practices, in the development process. I've mentioned it once with facial recognition, but a common problem with many of these technologies is that you're not only dealing with data that's bad; the data sets also aren't representative.
So if you don't have data sets that look like our [00:37:30] society, then you're not going to create a technology that is going to work for everyone in our society. So I think there are also ideas around standards and development that should be in place already, that can help improve where innovation is going, and hopefully help stymie some of the criticisms I'm making. In some ways, I don't want to be right, but I am.
Laurie Segall: [00:37:54] I mean, something that struck me in the documentary: Tristan Harris said something like, never [00:38:00] before in history have 50 designers, 25-to-30-year-old white guys, made decisions that would impact 2 billion people. He was talking about product designers in Silicon Valley. Do you think that will change? Are you optimistic?
Rashida Richardson: [00:38:15] No. [Laughs] But here's the thing: it's because that's a societal problem.
I think there are major diversity problems within Silicon Valley; I've [00:38:30] experienced them. But I think you can make the same critique of finance, or the legal field; every field has a diversity problem, and that's why we have issues with dealing with issues of diversity. But the problem with the lack of diversity in the tech sector is that these technologies are world-shaping. They're global in their reach, and they're tackling questions within society that only a few people get to choose. In [00:39:00] that, what technologies are designed, and what actually gets the green light to be designed at scale and then released to the public, are decisions being made by only a handful of people in leadership, which tends to be more heavily white and male.
And I also think there are issues with design, in that you're less likely to notice the lack of diversity in a data set if your whole team looks like that data set. So there are [00:39:30] many challenges within the tech sector. I'm not very hopeful about it, just because I've been in a lot of these conversations, and the conversation has not changed.
There are still rooms I go into where I'm the only person of color. And that's a problem, in that I'm not the only person who focuses on these issues. I mean, even this year alone, our country is showing that we don't know how to grapple with difference, whether it's racial difference, gender difference, or socioeconomic difference, at all, as a [00:40:00] society.
But I'm hopeful. [Laughs] Not to start very dark, but I am hopeful that the sort of outrage and uprisings for racial justice that occurred continue, because I think it's only by recognizing that we have a problem that you can change that problem. And with the amount of scrutiny that is specifically directed at the tech sector, I hope they start to hear it, and don't try to just [00:40:30] do superficial fixes, like putting one token person of color or one woman on a board and thinking that's a fix, but actually try to look systematically through a company, or even through the R&D pipeline, to say, how do we integrate diversity, equity, and inclusion as a practice within all parts? Not just assign it to someone we hire to deal with those issues.
Laurie Segall: [00:40:53] Given your experience with data, do you think we have to re-envision privacy?
Rashida Richardson: [00:40:58] Oh, yeah. [Laughs] [00:41:00] I think in some ways privacy is a power-blind construct, and what I mean by that is that your ability to invoke it as a right really depends on who you are. I think the murder of Breonna Taylor is a great example of that, in that, in US privacy law, the home is a domicile, and it's typically a place that has heightened privacy rights. That's why police officers have to knock at your door before coming in.
Yet if you're a Black body in that home, you're not afforded the same rights that other [00:41:30] Americans are in having those heightened privacy rights. So in some ways, I don't think privacy is the correct framework for us to understand all technology issues. I think there are many privacy concerns embedded in technology, big data, and data practices.
But I see technology as a broader civil rights issue, one of equity, in that, even with the digital divide: how do we talk about these issues if, let's say, 20% of our country [00:42:00] still doesn't have access to reliable internet, yet so many services are being shifted to the internet? That's not a privacy issue; that's an equity issue. So I think in some ways it's on lawyers and advocates like myself to do a better job at re-articulating the legal and social concerns related to technology, so the people on the Hill and elsewhere can get a better understanding of what's necessary. Because I don't think a new privacy law is going to [00:42:30] fix this; if we came up with a new comprehensive privacy law, it wouldn't necessarily fix all of the problems that emerge from new technologies.
Laurie Segall: [00:42:38] The Social Dilemma got folks' attention. People are paying attention; some folks are paying attention, right? It became a cultural conversation. Now what?
Rashida Richardson: [00:42:47] I tell people to keep reading, because there's so much more. It was a feature film, and I think Jeff was intentional that this wasn't going to be, like, a Ken Burns 10-hour multi-part series, [00:43:00] because they could have done that and covered a lot more topics. It's a feature film that I think whet people's palates for the diversity of issues that are implicated by big tech and their business model, but there's so much more to the problem, and to the solutions, and to all of the issues there.
So I just encourage people to not stop at the movie, but to read a lot of the research. Some of that research is distilled down into shorter articles, [00:43:30] but there are also public-facing reports and books that people are writing for the public to engage with. So I just encourage people to keep reading, and don't stop with the movie.
Krista Smith: [00:43:45] I know we don't have that much more time, but I have a couple more things for you. One: in the tech world, every studio now is streaming; you know, Netflix is no longer the unicorn. [00:44:00] Are we ever going to go back into a movie theater? What do you think the future is for the entertainment business, now that tech has fully arrived?
Laurie Segall: [00:44:08] Everything is new now; nothing will be the same again. And let me just say this, as someone who has covered tech: anyone who fights against this will be on the wrong side of history. Tech and Hollywood have now merged.
And if you want to go with the times, you've got to go fast. So I think this was going to [00:44:30] happen eventually, and the pandemic was just an accelerator for what was inevitable, and it changed user behavior. So I think the smartest Hollywood executives will be like entrepreneurs, and the smartest entrepreneurs will pay attention to the trends in Hollywood.
Krista Smith: [00:44:47] Did you turn off all the notifications on your phone? And do you limit your screen time in terms of Instagram and TikTok and all that other stuff?
Laurie Segall: [00:44:55] Um, I am literally the worst. Honestly, I do not practice what I [00:45:00] preach; I am awful. I turned off my notifications for Instagram, I turned them off for Facebook, but I have to do a better job. And I remember interviewing Tim Cook around when they launched their tech-addiction tool, to tell us how many hours we were spending on our phones, because, like, we all need to know how horrifically addicted to our devices we are. And I don't know if the shame really works with me, but I was interviewing him about it.
And I asked him, who's in control, Tim: man or machine? [00:45:30] And he got so animated, and he was like, we as human beings will always be in control. I was like, I don't know if I believe that. I really don't. When I was writing my book, I would delete Instagram for a day or two, just if I knew I really had to get things done. And I find that my mental health meter goes up quite a bit, and I like people a little bit more when I do that. And then I can come back to it, you know?
Krista Smith: [00:45:52] Okay, so Laurie, my final question. What kind of advice do you have for [00:46:00] people coming up who want a profession in journalism, either in print or in front of the camera, or in tech?
Because you certainly pivoted within your own career over the last decade, but I'd love to hear your thoughts on that, and what advice you would have for younger people.
Laurie Segall: [00:46:20] You know, none of my jobs were ever on-paper jobs, right? I know it all looked like they were by-the-book jobs, but no one ever [00:46:30] walked over to me, lovely Laurie, as, like, a 23-year-old micing up guests at CNN, and was like, we think you'd be amazing on camera. Are you kidding me? I was, like, a mess, you know? I could barely brush my hair, and I think I was still wearing Vans to work or something. But I applied for a job and I got rejected.
I did. And I thought this was, like, the worst thing to happen to me. I remember just walking down the streets of New York and being like, I've totally just lost. It ended up being the best rejection to happen to me, because then [00:47:00] I got placed as, like, a weird production assistant at what I will refer to as the bad wedding table of CNN. But I was done every day at 4:00 PM, and I was paying attention to things happening in New York.
And something was happening back in 2009, 2010 in New York, and that thing was called technology. So in my free time I got so involved in technology, and my job became a placeholder for me to explore it. And I paid my own way to South by Southwest. I was a production assistant [00:47:30] pretending to be a producer.
I booked interviews with every major tech founder, and then convinced someone to let me do something, and somehow I was just there at the beginning, and I just kept fighting for it. And then I wrote my own job on a piece of paper and said, we should have a multimedia reporter; that didn't exist at the time.
You know, I went into a corner office and I gave this piece of paper to someone, and I was terrified [00:48:00] that they would say no. And they didn't say no. And by the way, so many people said no to me in so many other ways. The only thing I can say, from having interviewed the most successful entrepreneurs in the world, is that it's never just the smartest person in the room that makes it; it's the most resilient. Out of all that, I would say my advice is: be curious, fail a lot, and be resilient.
Krista Smith: [00:48:16] Thank you so much, Laurie, I love having you on the show. You have to come back.
Laurie Segall: [00:48:20] I'd love to.
Krista Smith: [00:48:22] And as more news breaks, we'll have more interviews for you. But thank you so much for coming on today.
Laurie Segall: [00:48:27] Thank you.
[00:48:30] Krista Smith: [00:48:32] All the films and series we discussed today are streaming on Netflix. For more, head over to NetflixQueue.com. That's Netflix, Q-U-E-U-E, dot com. And follow us on Instagram and Twitter. Don't forget to subscribe, rate, review, and share. Listen in next time for More Like This. [00:49:00]