July 13, 2021
EP. 113 — Your Ring Doorbell is Working with the Cops with Johana Bhuiyan
LA Times reporter Johana Bhuiyan joins Adam to explain how Ring built a private surveillance empire by promising kickbacks to an unlikely accomplice: the cops.
Speaker 1 [00:00:22] Hello and welcome to Factually, I’m Adam Conover. Thank you for joining us once again to talk to some of the most fascinating experts on the planet about all the amazing mind blowing things that they know and you don’t. We have an incredible conversation for you today. Let’s jump into it. The tech industry makes a huge amount of money off of creating and collecting vast amounts of our private data and then selling it to other companies to feed you ads, or figuring out how to otherwise monetize it. That business model, I’m not going to say it’s all bad. However, the creation of all that data has risks and consequences that those companies rarely think through. In fact, sometimes they drive headfirst straight into those chasms, consequences be damned. Let me give you an example: take the home security company, Ring. You might remember Ring from its star turn on Shark Tank a couple of years ago. I love Shark Tank. I don’t care how communist you are, Shark Tank is a great show. Bernie Sanders himself sits at home yelling at the TV on a Friday night. ‘No, no, no, Sharon, don’t go with Mr. Wonderful. He’s going to screw you. Get a deal with Cuban. You got to work with Cuban.’ OK, that was my horrible little Bernie Sanders impression. I apologize for it. Look, Ring is now a massive success, one of the biggest successes in Shark Tank history. I’m sure you know them, your neighbors have them. They make indoor and outdoor security cameras and smart doorbells. That’s right, that little blue circle you see winking at you from all the doorsteps as you’re on your afternoon walk; that’s Ring. Amazon bought the company a few years ago for a billion dollars and there are now millions and millions of Ring devices on doorbells and on the outside of buildings and inside our own homes, constantly recording video of us and our communities. Now, that is kind of dangerous.
If you get together that amount of private footage in one place, it becomes a pretty big target for some bad actors. So you need the people who are in charge of keeping all that data to be very, very careful about it and Ring (let’s just say) has not been. First of all, a couple of years ago, they had to fire some of their own employees after they were found accessing customer video feeds that they shouldn’t have been. And then, of course, Ring’s entire security system has been compromised again and again, allowing hackers to look in on people’s private lives and in some cases shout abuse at people who are watching TV in their living rooms. Now, all those stories are well covered, but it gets a lot worse because Ring has also partnered with law enforcement agencies around the country, giving them easy access to users’ videos as part of investigations. If the cops want to go search a private residence or private physical records of some sort, they have to get a warrant. But Ring has created a system in which the bar is much, much lower. They have, in effect, created the private surveillance network cops could only dream of. That’s not just dangerous to the people who own the cameras. It’s even more dangerous to the people on the footage who might be wrongly picked up or targeted by the police. As we know from pretty much all of American history, when police get extra powers to wield, they usually wield them at black and brown people first. And recently, there was a blockbuster story in The L.A. Times that exposed how Ring has been working with police departments, not just here where I live in Los Angeles, but across the country in a truly disturbing way. I’m not going to reveal it to you now. I’m going to let the reporter who broke that story tell you herself. It’s her second time on the show. She’s one of our very favorite guests and I am thrilled to have her back. She writes about technology and accountability for The Los Angeles Times.
Please welcome Johana Bhuiyan. Johana, thank you for joining us on the show once again for your second time.
Speaker 2 [00:04:10] Thanks for having me. I’m so excited to be here.
Speaker 1 [00:04:12] So last time you were on the show, we talked about the gig economy. We’re not going to talk about the gig economy today.
Speaker 2 [00:04:18] Maybe talk about it a bit, yeah.
Speaker 1 [00:04:19] Yeah, the gig economy pervades our entire society so maybe it’ll come up. But the reason we started talking again was that you had a really blockbuster piece in The L.A. Times about Ring doorbells, which I find to be a constant source of concern. They’re a plague upon the lands. I see them everywhere I go. I’m being recorded by a little camera and I’ve always been concerned about it. So tell me, what is the issue that you are covering with Ring doorbells?
Speaker 2 [00:04:46] Yeah, so this was an issue specific to the LAPD. But to be clear, it’s happening at police departments across the country. But basically, Ring was trying to get a bigger foothold in L.A. and they also wanted to prove that they were an effective crime fighting tool. So they worked directly with LAPD and they gave the LAPD (at least 100 officers) free Ring doorbells or these other cameras that they have and solar panels for those cameras. They would give these free devices to them, which at the time retailed for like 200 dollars a pop, or coupon codes, so discounts for these devices. Then they would encourage them or ask them to promote Ring not just to other officers in their station, but to officers at other stations, neighborhood watch associations, community members. So basically, at the end of the day, they were trying to get officers to lend their credibility to this claim that their Ring device will help stop burglaries or theft. At least 15 of them did. We got a hold of three thousand or so emails between the LAPD and Ring, so 15 is a very conservative number. But in terms of being able to definitively say, ‘These officers, after receiving a Ring device, promoted Ring to other people,’ there are a little over 15 people who in their emails said, ‘Oh, I did this.’ And so at least 15 officers did promote Ring after receiving a free device. More than 100, though, received free Ring devices, which in and of itself could run afoul of LAPD rules.
Speaker 1 [00:06:21] Yeah, my first question was going to be, ‘Is this in any way legal?’ A private company just giving away free stuff. Free, expensive electronics or discounts (which are not quite cash equivalent but are getting closer to it) in exchange for services. What was their request specifically to the police officers? ‘Hey, we’d love to give you a free doorbell. In exchange, we’d love ____.’ What? Or was it that much of a tit for tat?
Speaker 2 [00:06:56] Yeah, it was. That was the thing that really blew me away; the very direct requests of, ‘We want to give you this Ring device so that you can get a feel for how good of a crime fighting tool it is. And can you please tell these community members how good of a tool it is? Can you tell other officers how good of a tool it is?’ In some cases for the officers, it was kind of MLM style: they would get a discount code and then send it out to a bunch of officers and for every 15 uses of their discount code, they’d get another free device. There were some situations like that where, again, it’s unclear whether they pushed the message. But they did get people to purchase Ring devices using their personal discount code and then got free devices in exchange for that. So there was a little bit of officers pushing the message there. There were emails of officers saying, ‘I so believe in this product. I’m telling everyone I know about it.’ There was one email where an officer said, ‘I really love this product. I’ve been using it. I recommended it recently to people who were burglarized in the last few months. I said, “You should get this because it’ll stop that.”‘ So there are people pushing the message, but there are also just people pushing the product.
Speaker 1 [00:08:09] You’re making the LAPD sound like they’re going around like Girl Scouts selling cookies, but they’re civil servants armed with guns who have a really strong position of authority in the community. And this is a national program, they’re doing this in other cities, too?
Speaker 2 [00:08:22] So this is a program that Ring said, in response to our story, that they stopped in 2019. So they no longer donate products to officers and request them to promote it in exchange, and that’s their words. So they admit that’s what they were doing. But what’s happening now instead is potentially more nefarious if you have concerns about surveillance. The officers now have access to Ring’s ‘Neighbors’ platform, which is basically a social network like NextDoor for Ring and other security cameras. So NextDoor plus surveillance footage. So users can share surveillance footage, but there’s a law enforcement portal that gives law enforcement direct access to this feed of videos. They could request the videos directly from the users on the platform and if you don’t really know much about the process, it doesn’t sound like that big of a deal. But typically, in order for an officer to get that footage, they’d have to go through Ring, go through the company and then get a subpoena or get a warrant and there’d be a paper trail. You’d be able to say, ‘Here’s how many times officers requested this information from Ring,’ and there might be a right of refusal or something like that from the user. There’s just a much more judicial process there and you have to get a judge to sign off on it. Whereas this doesn’t require any of that. You can go directly to the user and if they say yes then you can get that footage. There’s this concept of consent. People are like, ‘Oh, if they say yes, then it’s not a big deal.’ But if an officer asks you for footage, would you feel comfortable saying, ‘No’? Not necessarily. Up until literally two or three days after we reached out to Ring for comment on this story, Ring did not have any way of tracking how many requests they got from different police departments.
A few days before we actually published, they announced that they were finally allowing users to go into the app and see under each police department how many times they requested video footage. So this is a new level of transparency that they hadn’t had in the four or five years that they’ve had this platform. When we asked LAPD (just to give you an idea of how opaque it is) how many requests they had sent to Ring for this footage, they were like, ‘We don’t keep track of it. It would be too much of a burden to count how many times we’ve asked for it.’
Speaker 1 [00:10:50] I was going to ask, ‘What are the police getting out of this?’ If I’ve got some product, if I’m hawking my new kombucha energy drink, can I just offer free discounts to cops and they’ll start selling it for me on the street? Are they really that hard up for money that any company can do this? And no, the answer is that in exchange for this, Ring was also basically building a back door that allowed them access to surveillance footage without having to go through the legal process that would result in a paper trail and transparency and reporting and all those sorts of things. Instead, they just have a fast lane to getting vast amounts of user footage. I can understand why the police would want that.
Speaker 2 [00:11:33] Yeah, I think genuinely that the short term perk for a lot of officers was this free, high tech device. This started in 2016; the emails we have are from 2016 to 2019, and 2016 is when they were getting the products, and at that time Ring was fairly new. It was the hot product, it was three years after they were on Shark Tank. It was an expensive product, it was a cool product and they were going to get it for free. They wouldn’t have to spend two hundred dollars on it. So I think for a lot of officers it definitely was this short term perk and it’s unclear at that time how many officers were thinking, ‘Oh, well, we’re going to be able to get access to this easier, less transparent way of accessing surveillance footage.’ But certainly in the end, the thing that Ring did give them was this much easier, low barrier access to personal surveillance footage.
Speaker 1 [00:12:25] Well, let me just ask: I understand why Ring would want to promote their product as being helpful for law enforcement. I’ve certainly seen tons of Ring videos on NextDoor. The most popular form of NextDoor content is like, ‘My package was stolen from my front porch and here’s footage of the package thief.’ I had a friend who bought a Ring doorbell specifically because his packages kept getting stolen. What ended up happening was that now he had footage of the packages getting stolen and nothing changed. He told the police, ‘I’ve got footage.’ And they were like, ‘Wow. Yeah, that is a guy stealing your packages,’ and that was it. It did nothing. He eventually took it down. So my question is, are Ring doorbells actually useful for stopping crime in any way that we’d be interested in?
Speaker 2 [00:13:17] Well, there are no studies that prove it is, and that’s the issue with surveillance footage generally. There are so many studies that are like, ‘I don’t know if this really does anything.’ And when it does, there’s the added risk of actually misidentifying people, because that keeps happening to black and brown people. But Ring at the time (and this is kind of another part of the story) did a pilot with the LAPD, and the numbers that came out of that pilot, that study, were used in marketing materials all across the country for months, if not more than a year. But we saw the email that showed that Ring exaggerated the number. So they basically looked at – They themselves say it was a randomly determined geographical zone that they called Wilshire Park. Then they said, ‘This is where we’re going to give people a bunch of free Rings,’ and so they took the number of break ins in that area during those months (I think it was like a six month period) where they gave people a bunch of Rings and they compared it to the vicinity. Just nearby neighborhoods that they didn’t give free Rings to. So rather than just doing percent difference, the very straightforward – There still would have been a reduction, if you did that math. They did this really wild thing where they did the percent difference between what the actual number was in the region where they had the pilot and what the number would have been otherwise. It was just not the way that you do math. And the LAPD was like, ‘Oh, yeah, that’s fine. Sure, go ahead.’ And so what they found was that there was a 55 percent decrease in burglaries in the area that they had the pilot in. But the numbers were 9 down to 5, and so that’s a 44 percent difference if you actually do the math. The way that they did the math was just very weird. But even that is not statistically significant.
Speaker 1 [00:15:17] They say fifty five percent, but really there were four fewer, where there were nine previously. That’s very clearly an extremely small sample size.
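[Editor's note: the straightforward percent-decrease arithmetic discussed here can be checked directly. This is a minimal sketch using only the figures quoted in the interview (9 burglaries before the pilot, 5 after, versus the 55 percent figure Ring's marketing claimed):]

```python
# Burglary counts in the Wilshire Park pilot zone, as quoted in the interview.
before, after = 9, 5

# Plain percent decrease: (old - new) / old * 100.
decrease = (before - after) / before * 100
print(f"{decrease:.1f}%")  # prints 44.4% -- not the 55% Ring claimed
```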
Speaker 2 [00:15:26] And then also there’s no indication of what other correlations there were. What if some of the nearby places also had Rings? Nothing like that. But that statistic was used everywhere else. It was used within emails. Officers who were promoting Ring, at the behest of Ring in some cases, were using that statistic. They were saying, ‘There was a 55 percent decrease in this place, blah, blah, blah,’ and it just shows you why a relationship like that can also have downsides, because you’re giving credibility to this claim of crime reduction that isn’t really true. It’s not true or, at the very, very least, exaggerated.
Speaker 1 [00:16:09] Yeah. Let me say, first of all, I don’t have an enormous problem with people simply having a security camera. I understand a product that gives you a security camera with your doorbell especially; it’s useful to see who’s at your door if you’re in a multistory home, stuff like that. But I’ve also noticed there’s been an immense proliferation of Ring doorbells, at least where I live in Los Angeles. I walk down the street and I just see those glowing blue lights pointed at me and I’m aware that I’m on camera. It gives a different feeling than one of crime stopping. Here in Los Angeles there are a lot of Scientology buildings, and they’re all covered with security cameras. When I walk by those, I feel intimidated and I don’t think those cameras are up there to stop crime. It’s creating a sort of fortress mentality. That’s my personal experience of them. I’m curious, in your reporting, what have you seen as the consequences of this pervasive amount of surveillance that we now have on our streets?
Speaker 2 [00:17:20] Sure, yeah. For this specific situation, the issue with the cops accepting free Ring devices, specifically a personal security device, and then selling it and encouraging people to buy it, is the concern that they’re going to use fear of crime to help sales. To essentially act as the long arm of a corporation; and we want officers to be ethical. We want to make sure that we can trust their advice. We want to make sure that when they say, ‘Oh, this will help prevent burglaries,’ they’re saying it because it actually will and not because they’re acting on behalf of a corporation. That’s the concern with the way that the officers were acting. But I think on the personal end, for individuals, this goes back to what I keep harping on in all of my reporting. Something that I’m really trying to focus on is the stakes of surveillance. Of course, you want to be safe. Of course, you want to make sure that your personal belongings or your packages get to your home and nobody is stealing them. One hundred percent, that makes sense. But the way that these systems often disproportionately target black and brown people can be really dangerous. It’s about balancing personal safety with personal responsibility to people who are disproportionately targeted by law enforcement and surveillance systems, and understanding how those two things interact. Neighborhood watch, period, without technology, has historically criminalized black and brown people. There are people who will see a black person or a brown person walking down the street and either immediately assume something or think, ‘You know what, maybe I’ll keep an eye on them just in case they do something.’
Speaker 1 [00:19:04] Half of the Ring doorbell videos that you see people posting on NextDoor are literally just someone saying, ‘I saw a black person in my neighborhood.’ We could have a whole other episode about NextDoor and their problems with this issue. But I’ve seen those posts myself, I’ve seen my neighbors post such things. If you have your eyes open, you see it happen.
Speaker 2 [00:19:26] Yeah, and the thing with this partnership between Ring and the LAPD and other police departments is, not only are you going to say, ‘I’m Mary-Jo from a small town in Minnesota and I see a black person in my neighborhood and I’m like, “You know what? That person looks suspicious. I don’t know. I’ve never seen them before.”‘ Now that the police have direct access, you have a very easy way of going to the police and being like, ‘Hey, this person is suspicious.’ The police might then go after this person or keep an eye on this person after they’ve literally done nothing. They’re literally being criminalized and watched and surveilled simply because they looked suspicious, the basis of their suspiciousness being that they’re black or brown. You’re adding convenience. The way that tech democratizes everything and makes everything more convenient, it has also made targeting vulnerable groups much, much more convenient.
Speaker 1 [00:20:18] Yeah. Wow. I just want to return again to the point that you made, because I asked you a question and then you were like, ‘I just got to make this point about the police.’ I want to emphasize it: I think we all know intuitively that using fear to sell a product is wrong. You see one of those commercials that has a little old lady saying, ‘I don’t want my family to be saddled with my funeral expenses when I die. So I bought this rip off life insurance.’ This is exploiting an old person’s fear of being a burden. We don’t like this kind of ad. We have a revulsion against that kind of thing. So first of all, Ring has been selling their products avowedly using that. But then to employ public servants who are in a position of privilege and power over the issue of public safety and crime, people who are in a prime position to exploit people’s fear, is deeply unethical and really, really concerning. Sorry, I could go on about it for a while, but thank you for reminding me why to be mad about that.
Speaker 2 [00:21:27] Yeah. You asked earlier whether it’s illegal, and the police, the LAPD in particular, have a code of ethics that specifically prohibits them from accepting gifts that would even give the appearance of impacting any sort of city business. There’s a separate rule that says that you can’t use your position to ingratiate yourself, essentially. This situation is both those things, in my mind. When they first responded, they were like, ‘We haven’t looked through all three thousand emails.’ These emails came out of a public records request. So someone looked through it. But they said, ‘We haven’t gotten a chance to look through all of it. But upon preliminary review, it doesn’t look too bad.’ But then there are emails where an officer is literally communicating with a Ring representative and is like, ‘Oh, yeah, I haven’t been able to convince my neighbors to get a Ring yet. They’re elderly and they are really scared of technology. So I’ve just been watching them to see when their adult children come by so I can convince their adult children to convince them.’ And I’m like, ‘That’s not unethical?’ There were also LAPD officers who had already gotten Ring devices who emailed Ring and were like, ‘Hey, we’re having a family picnic. We usually scrounge up some money to get some measly raffle gifts or whatever. Do you see where I’m going with this?’ And then the representative is like, ‘Oh, yeah, I can get you a free device to raffle off to your family members.’ Wow. About a week after we published, the LAPD police chief did talk to the police commission and they were like, ‘We are investigating this and we want to make sure that officers actually really do realize that you can’t accept gratuities that compromise your position as a civil servant,’ which is probably the right move. There’s a lot of people being like, ‘Well, the LAPD can investigate themselves and they’re not going to find any wrongdoing.’ You know what?
At the very least, they’re giving the impression of some sort of accountability.
Speaker 1 [00:23:33] God, thank you. Thank you for doing the investigation to actually spur any kind of action on this. But the baldness of the transactional nature between Ring and the cops is truly shocking to me. That it would be that blatant. But let’s return to the point of black and brown folks and other marginalized communities being affected by this. I’ve seen it in my own community: a couple of months ago, there was a whole series of events that happened on my street. In my little complex, one of my neighbors was like, ‘Hey, I saw kind of a weird guy on my porch.’ There was a person of color and he didn’t do anything, he was just kind of being weird, kind of erratic on the porch. We live in the city, there are unhoused folks around. It’s Los Angeles, you know what I mean? I was like, ‘Hey, this looks like no cause for concern. I hope this guy gets some help.’ Then like half an hour later, we hear helicopters around, and it turns out a bunch of cops roll up and pull out their guns on this guy. A bunch of cops all at once, helicopters, the whole nine yards, because I think someone further down the street had seen him on their Ring doorbell and called the cops. Then there were multiple people on the street filming the whole thing on Citizen. This was just a guy walking around with a stick; nothing actually happened on the street. Nobody’s house was broken into. He didn’t try to jump a fence. He was just being weird on the street. This was on my street. All of this was spurred by people sitting inside their houses, looking at these cameras, jumping to conclusions and having fear based reactions. But those fear based reactions were given to them by the technology.
Speaker 2 [00:25:27] Yeah. It’s wild to look at the history of Ring and use it as a mirror of where society has gone, because Ring started as purely a convenience thing. Jamie Siminoff, the CEO of Ring, went on Shark Tank and was like, ‘I was always working in my garage and I couldn’t see who was at my front door. So I made this Ring doorbell and also it actually helps me watch my packages.’ But it was purely convenience. Then they made this really, really sharp turn into crime and made it all about how you can protect your community and protect your own home. And I think they capitalized on a growing fear of black and brown people, honestly, just of other people, right? You’re seeing this so much in tech. You mentioned Citizen. There are so many other companies that are capitalizing on this real fear of crime. Oftentimes you can trace it back to one or two things, like counterterrorism stuff and anti-Muslim stuff post 9/11. There’s a moment of crisis that spurs a lot of this stuff. The insurrection, for example. People are calling for all this facial recognition technology in response to this crisis. Even people that are against facial recognition technology are like, ‘Let’s use it on the bad guys.’ The issue is, we have seen historically that when the government and law enforcement get to choose and tell us who the bad guys are, oftentimes they’re black and brown people. Oftentimes that infrastructure that was introduced to respond to a very specific moment or a very specific crisis is just used disproportionately against black and brown people somewhere down the line. There’s just this real fear of crime, even though in a lot of cities there is actual crime reduction happening right now. It’s not really based in logic. It’s just a great selling point. How do you sell a security camera not just for your packages but also for safety generally, for burglary? When Ring was promoting its devices in L.A. or to the LAPD, it wasn’t just package theft.
They were specifically talking about burglary and home break ins, which is a very different thing than what they launched as. They’re selling crime, they’re selling crime prevention without any real – It’s not clear that there was any real basis for that fear in the first place, or an increased fear in the first place.
Speaker 1 [00:27:59] The weirdest thing is, I’ve always felt that with these products, the purchase is fear based. It’s like, ‘Hey, aren’t you afraid of your home getting broken into? So buy this product.’ But the products almost always create more fear in the people using them. Everyone I know who has a Ring doorbell is constantly like, ‘There’s someone on my porch. What is it? I got an alert,’ whereas if some weird guy wanders by my front door, I don’t know about it, and if they try to break in, well, my door is locked. The locks work pretty well. And the constant vigilance that it gives you – Citizen is another example of this, where there’s been plenty of reporting about how Citizen specifically juices their notifications to constantly keep people addicted to the phone and they come up with things to make alerts: suspicious person seen in ___. Just to make people go, ‘Oh, watch out, watch out. Oh, there’s a suspicious person a couple blocks away, watch out,’ to make them feel like they need the app. We didn’t need to know about these non events happening around our community, and having them presented in this way just hooks us on this fear receptor. It’s like the cure is worse than the disease, you know?
Speaker 2 [00:29:17] Yeah, I haven’t reported a ton on Citizen, but the reporting on it is a great, great example of the extreme version of this. When I did have Citizen, because I was testing it when it first launched, I thought, ‘Oh my God, San Francisco and New York are just trash cities.’ I’m from New York and I was visiting San Francisco all the time, so I had alerts for both of them on. And it’s like, ‘There’s a man taking his dick out in the parking lot of McDonald’s? That’s horrible.’ But again, like you said, why the hell do I need to know that? How does that impact me literally at all? They have my location, I don’t need to know that. The reason why Citizen was in the news recently was that the CEO saw this man who supposedly could have committed a crime and was basically foaming at the mouth to catch him, put all of their resources into catching this man, and then publicized it. It ended up not being the guy who committed the crime.
Speaker 1 [00:30:20] They put out a fake APB to everyone on Citizen, like, ‘We’re looking for this man.’ They literally had news anchors on the app talking about it, like, ‘Our manhunt for so-and-so.’ And it wasn’t the right person. It’s despicable. Vice did a lot of great reporting on this, about how the people who run the notification system there are specifically encouraged to juice the notification reports and all that stuff. The original name of the app was Vigilante. The whole thing is, again, we could go on about it forever. This is really new; this is not a thing that the tech industry was doing five years ago, or that I thought they were doing in the early days: really exploiting people’s fears of crime, and specifically people’s false fears of crime. There was a crime spike during the pandemic, but compared to the ’70s and ’80s, every city in America is vastly safer than it once was. If you’ve got good locks on your door, you’re pretty much good. I mean, if anyone listening has been a victim of crime, I’m not discounting that experience. But if we look at the numbers, our fear as a society is not in step with the actual reality on the ground, and these products are specifically trying to create more fear in people and using the cops to do it.
Speaker 2 [00:31:50] Yeah. To be fair, again, like you were saying, if you have been burglarized or if you have been a victim of a crime or anything like that, of course you’re going to try to do things to ensure that it doesn’t happen again. I totally respect that, and I think there are a lot of products on the market that do help you do that. All it is, is that we don’t want to make it easier for the officers. We don’t want to create a system where officers are freely able to access all of that information and data with no transparency, no bureaucracy, no clear due process. It’s not that we’re saying, ‘If you don’t feel safe, you shouldn’t do anything about it.’ We’re saying, ‘Well, should we have more guardrails for how officers (law enforcement, the government, whoever else, other private companies) can access that information?’ That’s the big thing, because a lot of people’s response will be, ‘Well, on the one end, we want to make sure that we’re safe. On the other end, if you’re not doing anything wrong then why does it matter if people are watching you?’ Both those arguments, to an extent, make sense. But on the second part of it, it’s like, ‘Well, do you want to feel like you’re constantly being watched because you are brown or black?’ If we decided that – well, actually, not ‘if’ we decided. We have decided; the government has decided that domestic terrorism is actually one of the biggest threats to the country. The FBI has released reports about this. And so if we decided to broadly discriminate against people who look like the insurrectionists, and we were constantly watching them based on just what they look like, then at a certain point you’re going to be like, ‘You know what? This is actually bad. It’s not great to just watch people based on what they look like.’ I think those are the two important points. But you mentioned tech companies weren’t really doing this five years ago.
It’s similar to what I was saying about the Ring pivot. A lot of companies kind of backed into it; they were subtly responding to that demand, subtly creating services around it. NextDoor is a great example, right? They weren’t actually launching as a crime fighting thing. It turned into that organically; people turned it into this neighborhood watch crime thing. I think what’s happening right now is a lot of companies are responding to that desire to make sure that all crime in their neighborhood is stopped, and Citizen for sure is an example of that. Again, I haven’t done reporting on it; this is literally based on Motherboard reporting. Kudos to them, amazing publication. But this is what happens when you privatize law enforcement. This is what happens when you create these really weird private police that have private and business incentives and motivations. They are going to try to catch the criminal to prove that their service is amazing, to prove that law enforcement should work with them or that their weird little network of – I mean, they basically have private police cars.
Speaker 1 [00:34:51] Citizen is doing that as pilot programs. I don’t know the current status of them, but they at least had been running them and had bigger plans to roll them out. Basically a private police force in Los Angeles, maybe other cities: private security forces driving around all the time, armed to various degrees. But I guess on Citizen, you could summon an Uber fake cop. If you saw a scary person, you press the button and they show up if you pay a fee. It looks like that was the business model they were heading toward, and there doesn’t seem to be anything illegal about it. But the idea is deeply frightening. Especially because, as you say, these companies saw that it was people in the neighborhoods who were using the products this way, and it’s certain types of people. It’s fearful, paranoid, often very comfortable people, the sort of folks who peek out from behind the curtains and say, ‘There’s a man I don’t recognize.’ Except now they have technology to record those people, blast it out on social media, maybe summon a fake cop one day. A lot of times these products are being marketed to paranoid racists. What else can we say about it than that? It’s deeply weird.
Speaker 2 [00:36:19] I live in the Bay Area, I’m from New York; largely diverse cities. Obviously this is a podcast, so I need to say: I wear hijab. I am very vividly and explicitly Muslim. In those cities, I don’t really feel that targeted. I don’t feel really ostracized or anything like that. Of course I get stares. Of course there’s been situations, but much less than in other places. I’m on vacation in Minnesota right now, and downtown Minneapolis and that whole area is very diverse. But I’m in the suburbs, in northern Minnesota, and never have I felt so fearful of the way that people are reacting to me. Imagine that constantly, and imagine someone being like, ‘Oh, I actually now can call via an app. I can Uber a cop right now to follow this person and see what she’s up to.’ I went for a run in the neighborhood and I was worried what people might think, worried people might call the cops on me because I’m not from that neighborhood and I’m wearing a hijab. It’s just something that they’re not used to. It’s a very real fear. If I’m in a neighborhood with tons of security cameras, honestly, I’m going to run through the middle of the street. I’m not going to get too close to the houses, because I know that a lot of people do have this very real fear of anyone who doesn’t look like them. They’re just constantly on alert about anyone from outside their community. It’s a real fear to know that you are being watched, and one wrong move, one thing, could make you look like a threat. I literally don’t even look at the homes, because I’m like, ‘I don’t want them to think that I’m scoping out their homes.’ I don’t want them to think that I’m going to come back later. I’m an American. I was born in New York. I was born in Queens. Why should I be walking in any part of America and feel that way?
That’s the situation, of course, for black people even more than for me. That’s the environment that we’ve created, and that tech – I’m not going to say it’s the core of it and the root of it and the cause of it, but – some tech companies have profited off of it, and some tech companies have enhanced and enabled this fear-based environment.
Speaker 1 [00:38:43] And when you’re jogging on that street and you look up at the houses and you see that row of little blue circles staring at you, that’s all the more intimidating, and it’s all the more a marker of ‘these people are afraid of people outside and they’re putting up a threatening front.’ That’s, in fact, part of the point, because that’s mostly the point of any security camera. Well, look, I really want to ask you: we’ve talked a lot about law enforcement. I want to talk more about the privacy implications of this for ourselves as a society. But we gotta take a really quick break. We’ll be right back with more Johana Bhuiyan. OK, we’re back with Johana Bhuiyan. We’ve talked extensively about the discriminatory potential and actual reality of this wide scale consumer surveillance. I want to talk about the privacy implications of it. Ring has gotten a lot of press over the last couple of years for the fact that they’re generating this enormous amount of surveillance footage that is stored on their servers, but they (apparently, in the past) have had very lax security protocols for what to do with it. They’re basically creating huge amounts of very, very volatile data that they seem to be very bad at protecting. I remember there being a lot of news about security holes. Do you share those concerns? That’s a way in which they’re dangerous even to the people who own the devices.
Speaker 2 [00:40:20] Yeah, I think there aren’t enough regulations and policies about how long you can store data and personal information. I am really concerned about the security of all of that. I do also think that – we talked a lot about law enforcement, but law enforcement access to things is also a privacy issue. Not necessarily just for the user (though sometimes for the user too) but for the people walking around. You’re like, ‘I’m just going to walk around,’ and now my image is going to be in some police database, or it’s going to be in a Ring database that police can access at any point. There’s privacy implications on all sides of this. Hacking anything is just about how much money you have and how motivated you are; any cybersecurity firm will tell you that. It’s not a matter of whether they are going to be able to hack it. It’s about how much time they have, how many resources they have and if they really, really want to hack something. So it is really important for there to be policy or privacy regulations that specifically address how you can store information and how long you can store it. Going back to what I was saying before, part of this is personal consumer willingness to give your information to companies, and then just kind of deciding that it’s a cost of doing business with these tech companies for them to take my information and do whatever the hell they want with it. We’re only seeing (in places like California) the CCPA and stuff like that coming about right now. It’s still not perfect. Companies are still resisting it, but I think a lot of it falls back to the consumer. The consumer not really feeling like it’s that important anymore – or maybe not that they don’t feel it’s important; maybe they feel like it’s just impossible to reel it back in. The beast is out. There’s no way to put that privacy beast back in. We’re never going to have privacy again.
But my goal with my beat is just to consistently emphasize that it’s so important to at least try to regulate how our data is used, because the stakes are the highest for the most vulnerable communities. It always comes back to that. I did a story recently about ICE requesting information from Google. Basically what happens is, there’s a legal request process, and this is the same legal request process that the LAPD would have had to go through to get footage from Ring. If it’s a federal agency, there are national security requests; if it’s a local agency, they have to do subpoenas and warrants. But oftentimes, tech companies aren’t super incentivized to say no to these requests. Why would they be? And so in this case, ICE requested the information of a user. We got ahold of an email where Google reached out to the user and said, ‘The DHS requested your information. You have seven days to get a court-ordered motion to quash this subpoena, or else we’re probably going to give up all of your information’ – and this is their Google account. So not Gmail, not one specific Google service: Google. Everything: maps and whatever.
Speaker 1 [00:43:34] All their searches, all of their driving direction history. These are people’s entire lives.
Speaker 2 [00:43:41] Yeah. Google will say that they gave them the opportunity to fight it, but not everyone has a lawyer on hand. Not everyone knows what it means to get a court-ordered motion to quash. And I saw this email – if I saw this in my inbox, I would have been like, ‘Oh, some weird terms of service thing.’
Speaker 1 [00:43:56] Or it’s a phishing attack or something. Yeah.
Speaker 2 [00:43:58] Yeah, totally. There are just so many barriers to being able to fight that request. And again, it goes back to, ‘Oh, why wouldn’t I give Google all my information? Why does it matter how long Google stores that information?’ Why don’t we have rules about how much information Google is allowed to give to law enforcement, or why shouldn’t we incentivize tech companies to not give our information up? This was a very unique case where ICE was using an administrative subpoena, which is different from a regular subpoena because there’s no judge. It’s their own subpoena. They don’t go to a judge. They’re just like, ‘Hey, we want your information.’ Still, to any layperson, you’re like, ‘Oh, yeah, I’m being subpoenaed to give my information,’ but there’s actually no judicial oversight over it. It’s not self-enforcing. But in most cases, if it’s the federal government or a federal agency asking for your information, they do it through national security letters or other methods like that, which typically come with a gag order – typically a year-long gag order. So you actually never know, for a year or so (even more, because they might extend it), if your information has been given up. That’s why it’s so important to care that your personal information is being stored forever on the cloud servers of all of these major tech companies.
Speaker 1 [00:45:25] I read this piece by you, but hearing you describe it blows my mind even more. Because we have a presumption in America of being able to defend yourself from something along these lines. That’s why we have search warrants, so that there’s a process. If the government wants to come into my home, presumably they need to go to a judge, and they need to prove why they can come into my home, and go through the whole song and dance before they are actually able to. At least that’s what I understand from TV. It’s part of our civic education in America, knowing that’s required. But this is an agency coming to your digital home – coming, actually, to a place that holds a lot more private information than your actual home does. It has your entire search history. It has your entire email history. It has your entire life, your whereabouts, all those sorts of things; photos, potentially thousands and thousands of photos. And the government just comes and says they don’t need a warrant and they don’t need anything else. They just say, ‘Give us this.’ And Google literally says, ‘You have to go get a court order to stop this.’ How would you go about doing that? Like, if I go to the L.A. County Courthouse, is there a Google court order desk that I can go to? Who do I call? I have a lawyer, but he’s an entertainment lawyer and he doesn’t get back to me within a week about anything. So I think I would be up shit’s creek as well if this were to happen to me. There’s a presumption of innocence, and a presumption that people can’t just go through my shit in America, that this really appears to violate.
Speaker 2 [00:47:07] Yeah. The only real analogy to this is cops, some government agency or whatever, knocking on your door and presenting you with a subpoena, saying ‘I have to go through your home.’ But that – again, you can’t fight it, you don’t really get notice about it or whatever – at least you know what’s happening. You see them going through it. Whereas when it’s just your information, who the hell knows? With the CCPA, if you’re in California, you could be lucky enough to request your data and see what information they have on you. But you don’t really know what information they’re looking through. With an administrative subpoena, the one thing I should say is that they can only request a specific type of information: subscriber information. So they’re not really supposed to ask for location and stuff like that. We got ahold of the actual subpoena for the story, and they were asking for location and stuff like that. It’s just a matter of Google deciding whether or not to give it to them. So they’re not supposed to ask for it, but if Google gives it to them, then Google gives it to them. They didn’t do anything wrong. But with an administrative subpoena, the reason you’re able to get information about it at all is because it really should not involve anything beyond subscriber information, and there’s no gag order affiliated with it because it’s not reviewed by a judge at all. It’s such a black box of a situation, and you have very few remedies for it. Those two things really, really matter, and particularly here: this is ICE. Conceivably, they’re going after an immigrant. Conceivably, they’re trying to use this information in order to detain this person, or whatever it is, for some ICE investigation. That person is a vulnerable person.
That person is in a vulnerable situation, and this government agency, this law enforcement agency, is being given the tools to potentially detain, investigate or surveil this person by Google. Google is giving them the tools to do that. What do we expect tech companies to do in response to that? Especially if it’s a criminal situation, what are they going to do? If they get a subpoena, there’s not always a ton for them to do. But we have seen situations where Twitter has fought back. I forget what agency it was and all the details of it, but Twitter has fought requests for information about one Twitter account. I think it was the ‘alt DOJ’ or the ‘alt’ some government agency or something like that. So there have been instances where they will fight, and we want to make sure – again, this is about consumer behavior. We should make sure as consumers, as individuals, that we emphasize to these tech companies that our privacy is paramount, that our privacy really, really matters to us. That we’re not willing to just give up any semblance of privacy just to use your services. Because in those cases, the tech companies will have to be like, ‘Actually, in order to at least give the impression that we care about privacy, we’re protecting our users’ privacy. We should fight this.’ The response to me on that story was, ‘We really care about people’s privacy,’ blah, blah. I’m not sure if they were limited in what they were able to offer, and lawyers I spoke to also weren’t sure if they were able to give more time to file a motion to quash. But Google could ask for more time. We want to make sure that as consumers, we’re able to incentivize tech companies to fight for our privacy as well.
Speaker 1 [00:50:37] I completely agree with that, but we have limited powers as consumers. A couple of years ago I de-Googleified. I don’t use Google for anything. I have a separate email service, I use DuckDuckGo for my searches, and I try to keep stuff a little dispersed because I don’t like everything being in one place. But I know somewhere there’s some repository I’m not thinking about that has a lot of private information, and the problem is, I don’t know how all these different companies are storing this information. For instance, the company that has most of my shit is Apple. Apple has responded to the growing groundswell of desire for privacy by being really privacy forward in their marketing and saying ‘we’re the privacy company,’ and their privacy practices actually are better (to some extent) than other companies’. They have the most secure instant messaging and texting service. The iPhones are truly encrypted. Apple has refused in the past to do things; there was a big thing a couple of years ago about them refusing to unlock a phone or build a back door for the FBI and other federal agencies. It’s better than other companies. But to what extent? I don’t know. How many law enforcement requests have they complied with? I have no idea. And despite that, hey, that’s maybe 40 percent of my shit on Apple. A whole bunch of it is on companies whose data control policies I have no understanding of. Ring, again, at some point appeared to just be putting all the videos on some unencrypted server somewhere where anyone could grab them. I forget what the story was; it was like any Ring employee could look at any video from anyone at any time, because there was no encryption on Ring’s side. Which is an obviously massive, awful security hole, but there’s no laws around this. There’s no regulation. Any one of these companies can just keep a big, sloppy bucket of my data out on the shop floor, ready to get kicked over whenever any klutz walks by.
I have no control over it, nor do I even know which ones have good policies or not. This is highlighting for me how much we desperately need actual regulation around user data, because we’re basically allowing all these companies to stockpile large amounts of hazardous material. When you have enough of other people’s personal information in one place, it becomes a target: for hackers, for law enforcement, for bad actors of any kind. Not that law enforcement are always bad actors, but often we want to be protected from law enforcement. It becomes a target. It becomes a honeypot. It becomes something that those people want. So what is done with that information is really paramount. We need some basic standards in our fucking society around it.
Speaker 2 [00:53:30] Yeah, no, exactly. I just started reporting on surveillance at the beginning of the year, but that’s the thing that I want to get across constantly: we have to care. We have to care what companies are doing with our data. We have to care where they’re putting it, and we have to care who has access to it. Because it might seem like, ‘Oh, we have no way of bringing this back in, and we have no way of engaging in today’s society with technology without giving up that data.’ But there are regulatory methods to at least create guardrails; the companies may run afoul of them, but at the very least we’ll have a means to hold them accountable. Part of it is policy and part of it is just giving a fuck as a society, because I think that’s the real issue here. There is some movement right now with policy and stuff like that. But in response to almost every single one of my articles, I will always get people being like, ‘Why does it matter that they have our information? How else are you going to live? Why does it matter if people are watching if you’re not doing anything wrong?’ I’m like, ‘It matters. It matters. It matters. It may not matter for you today. It could matter for you in a few years, but it does matter for so many vulnerable people.’ I mentioned counterterrorism efforts after 9/11; one story that I’m looking at right now is how the surveillance playbook was really expanded post-9/11 and used disproportionately on Muslims for years and years and years. But so many of those tactics are now being used on black and brown people. So things that get introduced in moments of crisis will then be proliferated to the rest of society, and oftentimes disproportionately target black and brown people, immigrants, queer groups and other marginalized groups.
Speaker 1 [00:55:20] Yeah, the idea that we can’t do anything about it is so weirdly pervasive. It’s bizarre, because the only reason for it is that the tech industry, the Internet, everything that comes along with it has only been around on a consumer level for, what, 30 years? So there were no laws about it because it didn’t exist yet, and we just need to write some. We have them in other areas. In the medical field, we have HIPAA regulations that very carefully dictate under what conditions medical information can be taken. The group I do homelessness services with, we’re very cognizant of HIPAA regulations. Whenever we’re taking anyone’s medical information – we actually avoid taking it for that reason, because we know it’s a hot thing. Anyone who is dealing with that knows the same; talk to any social worker or anyone else. And that’s because, at some point, we in society were like, ‘Oh, yeah, this is something that we need to make sure everyone’s very careful with.’ And we passed a couple of laws, and I think we have a presumption about our doctor’s office: that they’re going to treat our medical records with confidentiality. Why can’t we have the same expectation around our doorbell footage or anything else?
Speaker 2 [00:56:37] I think the reality is people don’t know how high the stakes are; they don’t know how high the stakes are for a lot of people, and the stakes may not be high for them right now. My hope is that if I continue to highlight human stories and the human impact of all of these surveillance issues and privacy issues, people will start coming around to it. It’s not just me; there’s so many amazing reporters doing this work. You talk about surveillance and privacy and people’s eyes glaze over. So, we got a lot of work to do. But I think it’s just a matter of getting across really, really, really how much harm could be caused to so many people if we don’t start regulating the way our data is being used.
Speaker 1 [00:57:21] The point that you make about how it might not harm me that much to have the Ring doorbell out, until I use an unsafe service that is giving the footage to law enforcement and exposing my footage to hackers and all these sorts of things. Maybe I as an able bodied white guy will say, ‘Hey, what does it matter to me?’ But it harms others. It harms other people in my community that I should and do care about, even if it’s not always visible to me.
Speaker 2 [00:57:52] Yeah, exactly. It just goes back to balancing my personal need for safety and other people’s personal needs – it’s also a safety issue for them. It is a safety issue for a black man to be surveilled by cops, misidentified as a criminal and then be targeted in some way, shape or form. We’ve seen that it’s a safety issue. It’s not our safety versus theirs; it doesn’t have to be us or them. There just needs to be a better way to make sure that there are at least some sort of guardrails or regulations around both those things.
Speaker 1 [00:58:26] You know what it reminds me of? An issue we talked about on the show before in regards to car safety. We have NHTSA, we have the carmakers, we have everyone working on keeping the person inside the car safe. There’s much less attention paid to the safety of the person who is hit by the car. If we were paying attention to that, these big flat-fronted SUVs with the really flat grille – we wouldn’t have those, because they are much more dangerous to get hit by than a sloping-front car where you roll over the top, or the sort of pillars that prevent us from seeing, etc. If we were focusing on the actual vulnerable person who’s getting hit by the car, we would do things a little bit differently. I can grant, ‘Hey, maybe for where you are and the neighborhood you live in, for the history that you have, it makes you safer to have a camera on the outside of your home, pointed at the sidewalk.’ But we also need to consider how it makes the people walking down the sidewalk less safe. There is another person on the other end of that camera, and those people are never talked about by Ring, or by the LAPD or any police department really, or by the tech industry. It’s really only folks like you who are talking about those people’s safety.
Speaker 2 [00:59:37] Yeah. I’m not going to say – I don’t know, maybe they have talked about it. I can never say never.
Speaker 1 [00:59:44] Sure, I’m editorializing.
Speaker 2 [00:59:45] Yeah, I think people are sold on their own personal safety and they should be allowed to protect that. But we also have to be very, very aware of how much our consumer and our individual behavior actually impacts other people almost systemically.
Speaker 1 [01:00:04] Yeah. Well, I can’t thank you enough, Johana, for coming on the show, for doing this reporting and for coming on to talk to us about it. It’s been great to have you back, and we’ll have to have you on again next time you crack something huge like this.
Speaker 2 [01:00:14] Thanks. Thanks for having me. This was so fun. I am sure that I misspoke sometimes, but I just started reporting on this, so hopefully my reporting will prove it out.
Speaker 1 [01:00:24] Well, you’ve done a lot in a very short period of time. Where can people find out more about you and your work?
Speaker 2 [01:00:29] You can find me at latimes.com. I’m on the business section, so typically my stories are there. Hopefully my stories are also on the front page. But you can also find me on Twitter @JMBooyah.
Speaker 1 [01:00:46] Awesome. Thank you so much, Johana, for coming on the show.
Speaker 2 [01:00:47] Thank you. This was really fun.
Speaker 1 [01:00:54] Well, thank you once again to Johana Bhuiyan for coming on the show. If you want to check out her work, go to the L.A. Times. If you want to support all of the authors that you hear on this show, remember, you can access our special bookstore at factuallypod.com/books. That’s factuallypod.com/books. When you buy books there, you’ll be supporting not just this show, but your local bookstore. That is it for us this week on Factually. I want to thank our producers Chelsea Jacobson and Sam Roudman; Ryan Connor, our engineer; Andrew W.K. for our theme song; and the fine folks at Falcon Northwest for building me the incredible custom gaming PC that I’m recording this very episode for you on. You can find me online @AdamConover wherever you get your social media, or at AdamConover.net. Until next week, we’ll see you next time on Factually. Thank you so much for listening.