Alexa, Play My Alibi: The Smart Home Gets Taken to Court

This week on Get WIRED, a look at how data from smart speakers and other connected devices is used to solve crimes.
Photograph: Chloe Collyer/Bloomberg/Getty Images 

This week on Get WIRED, we're bringing you an episode from WIRED's other podcast, Gadget Lab. If you're not yet a subscriber to either terrific show, scroll down for links.

As smart speakers for the home continue to grow in popularity, police departments have started to take notice. Now, whenever attorneys and law enforcement officials are investigating a crime, they can put your virtual assistant in the hot seat. They can cross-reference a variety of information from smart devices, including location data, audio recordings, and biometric data. Together, it can paint a picture of where a suspect was and when, often far more reliably than any human witness.

This week on Gadget Lab, WIRED senior writer Sidney Fussell joins us to talk about the strange murder case where a smart speaker became the star witness. We also share tips about how to manage the privacy settings in your own smart tech.

Warning: This episode features a brief conversation about domestic violence and assault.

Show Notes

Read Sidney’s story about law enforcement collecting information from smart speakers here. Find more episodes of the Get WIRED podcast here.

Recommendations

Sidney recommends the show I May Destroy You on HBO. Lauren recommends Vanity Fair’s September issue, with a cover story about Breonna Taylor. Mike recommends the episode of the podcast Questlove Supreme with Bootsy Collins.

Sidney Fussell can be found on Twitter @SidneyFussell. Lauren Goode is @LaurenGoode. Michael Calore is @snackfight. Bling the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our executive producer is Alex Kapelman (@alexkapelman). Our theme music is by Solar Keys.

If you have feedback about the show, or just want to enter to win a $50 gift card, take our brief listener survey here.

How to Listen

You can listen to Get WIRED through the audio player on this page, and subscribe for free wherever you listen to podcasts.

You can listen to this week's Gadget Lab through the audio player on this page, but if you want to subscribe for free to get every episode, here's how:

If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts, and search for Gadget Lab. If you use Android, you can find us in the Google Play Music app just by tapping here. We’re on Spotify too. And in case you really need it, here's the RSS feed.

Transcript

Michael Calore: Lauren.

Lauren Goode: Mike.

MC: Lauren, do you ever wonder if the things that you say to Alexa might become evidence if you committed a crime?

LG: Well, I try very hard not to commit crimes, Mike.

MC: Alexa, do you send your recordings to the police?

Alexa: Sorry. I'm not sure about that.

MC: Alexa, do you send your recordings to the FBI?

Alexa: No, I work for Amazon.

MC: Alexa, does Amazon share your recordings with the FBI?

Alexa: Amazon takes privacy seriously. For more information, and to view Amazon's privacy notice, visit the Help section of your Alexa app.

LG: Yeah, I'm sure lots of people are going to do that.

MC: [Laughs]

[Gadget Lab intro theme music]

MC: Hi everyone. Welcome to Gadget Lab. I am Michael Calore, a senior editor at WIRED. I am joined remotely by my cohost, WIRED senior writer Lauren Goode, who is also the host of our other podcast, Get WIRED.

LG: Hi, everyone. And hi, Mike.

MC: Hi. Good to see you again. We are joined this week by WIRED senior writer Sidney Fussell. Sidney, welcome back to the show.

Sidney Fussell: Thank you for having me.

MC: Of course. Today, we're talking about smart speakers and the surprising role they might play in crime investigations. If you have an Amazon Echo or a Google Smart Display or some other voice-enabled device in your home, you probably know that they're listening much more than they let on. There have been stories about how they've picked up on threads of conversations, even when people haven't used the wake word or asked an explicit question. But it turns out that the voice data these devices collect can be used, either in your defense or against you, in a court of law. In fact, attorneys and detectives are growing more savvy about the smart home in general. They've been able to string together data gathered from multiple smart devices to pinpoint people's locations and activity, and possibly prove or disprove their alibis.

Now, Sidney, you wrote a story for WIRED.com this week, about how smart-home data can be used to try to exonerate a defendant. I would like to ask you to tell us about the crime that opens your story. And, before you do that, I should warn listeners that there are some graphic details in this story about an alleged murder.

SF: Right. So, in July 2019, police were called to a home in Hallandale Beach, Florida. Hallandale Beach is about 20 minutes outside of Miami, and they were called to the home of Adam Crespo and Sylvia Galva. There was … Galva's friend, who was also in the home at the time, called 911 because she overheard a very violent argument between Sylvia and her boyfriend, Adam.

When police arrived, Sylvia was dead. She had been stabbed through the chest with a 13-inch blade. The boyfriend, Crespo, was arrested and charged, but he told police that she grabbed the blade, the two of them were struggling, and she ended up accidentally stabbing herself. He's saying that he did not stab her. He was charged with second-degree murder, but that was his defense, that essentially she died accidentally.

And so, ultimately what happened was, police arrested Crespo and charged him with murder. And his lawyer, who I spoke with, his name is Christopher O’Toole. Christopher said that he actually believes that the smart speakers, the Amazon Echo speakers that were there that night, will actually prove Adam Crespo's version of events. Which is that the two of them were potentially arguing, but he didn't commit a crime that night. She died, accidentally.

When I spoke with Mr. O'Toole, he said, quote, "I actually thought of it as this being the first time an Amazon Alexa recording could be used to exonerate somebody and show that they're innocent."

LG: And is that what ultimately happened, with the recordings?

SF: So at this point it's unclear what exactly the voice recording is going to show, because the trial is on hold. All of this occurred last year. Because of Covid, there's a lot of … hearings and criminal proceedings were pushed back. So, as of right now, the trial is on hold.

LG: Sidney, in your story, you mentioned that requests from police for this kind of data have increased a lot since 2016. Why is that?

SF: There's a couple of different theories about why it is that police have been requesting this type of data more and more. In its most recent transparency report, Amazon said it fielded about 3,000 requests from police just in the first half of 2020. That's not specifically smart-speaker requests, but just in general, they received a huge increase in requests for user data. It's up about 70 percent from this same time period in 2016.

I talked to a few different forensic investigators about why that may be, and there's two main theories. The first theory is simply media attention. Whenever Amazon Echo data, or whenever smart-home or smart-device data, is used as part of the prosecution or the defense, it gets a lot of media coverage, because it's dark. It's interesting. And I think we also have this underlying fear that we're being listened to by our devices.

So the second theory, which complements the first, is this idea of search templates. And so whenever law enforcement requests data from Amazon—it's the same thing with Google and its speaker system—they have to send over either a search warrant or a subpoena. And they have to spell things out: the details of the crime, what it is they're looking for, the devices they'd look at, and what evidence they believe may be on the device. So, with the search warrant that was used in the earlier Florida case, the police said they believed that the devices may have overheard the argument or events leading up to it.

The forensic investigator I spoke with said that a lot of police departments are writing these search warrants, creating templates of them, and then sharing them among themselves. So instead of starting from scratch every single time a police department wants to request this type of data, going through the individual details of the data they want to request and the device they believe it's on, they can share these templates, which are prewritten to include … and it'll say things like, there will be a bracket that says, "Here's where you'll talk about the evidence that you want. Here's where you'll talk about the data that you think you'll find."

Basically, the template makes it much easier to write these things, and they're always based on previous search warrants that were successful, in which they did get the data. And so, as these search templates are being shared, it's becoming a little bit easier for police departments to write templates or, excuse me, to write search warrants and subpoenas that are tailored to what it is that they're trying to get from Amazon or Google.

MC: What kinds of things can the companies actually share with law enforcement when they request the data? Is it specific data? Is it transactional stuff? Is it metadata?

SF: I spoke to forensic investigators about the type of data that they can find on smart speakers, and although a lot of attention is paid to voice recordings (and yes, voice recordings do play a part in some of these investigations), one of the main things that these investigators are looking for is activity logs. And so, essentially, everything you do using a smart speaker has a time stamp next to it. So, it'll say 11:00 pm, you used your speaker to search for the name of an actor. And then, 30 minutes later, you used it to order a pizza. All this stuff is time-stamped. And, if you've taught your speaker to recognize you, it also identifies you as being the user who made whatever the request is, whatever the query was.

These time stamps can be used to create a timeline by investigators. So they can see, on Thursday between 11:00 pm and 1:00 am, you did this at 11:25, and then you did this at 11:45, and then you did this at 12:01. And so, these activity logs are very instrumental when you're trying to determine what happened around the time of a crime. So when you think about the Florida case, that activity log would have all different types of requests and pings during that time period. Even if it doesn't have those recordings that are triggered by a wake word. And that's one of the things I really want people to take away from this. It's not just when you use the wake word and ask something like, "Hey Alexa, what time does something start?" Everything you do gets recorded in these logs. And so investigators really want to know what were you doing in these specific time periods. And that data is top of mind, just as much as voice recordings are.
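The timeline reconstruction Sidney describes can be sketched in a few lines of code. This is purely illustrative: Amazon does not publish a log schema, and every field name, entry, and value below is invented for the example.

```python
from datetime import datetime

# Hypothetical activity-log entries. The "ts"/"user"/"event" fields are
# invented for illustration; real smart-speaker logs are not public.
log = [
    {"ts": "2019-07-12T23:45:00", "user": "resident", "event": "music playback started"},
    {"ts": "2019-07-12T23:00:00", "user": "resident", "event": "voice query: actor name"},
    {"ts": "2019-07-13T00:01:00", "user": "resident", "event": "volume changed"},
    {"ts": "2019-07-12T23:30:00", "user": "resident", "event": "pizza ordered via skill"},
]

def build_timeline(entries, start, end):
    """Return the entries that fall inside [start, end], sorted by
    timestamp -- the kind of window reconstruction investigators describe."""
    window = [
        e for e in entries
        if start <= datetime.fromisoformat(e["ts"]) <= end
    ]
    # ISO 8601 strings sort chronologically, so a plain string sort works.
    return sorted(window, key=lambda e: e["ts"])

timeline = build_timeline(
    log,
    datetime(2019, 7, 12, 23, 0),
    datetime(2019, 7, 13, 1, 0),
)
for e in timeline:
    print(e["ts"], e["user"], e["event"])
```

The point of the sketch is just that once every event carries a timestamp and a recognized user, ordering a window of events into a timeline is trivial, which is why these logs are so useful to both prosecution and defense.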

LG: And are they using this data in a digital triangulation? If they're requesting speaker data, for example, might they also cross-reference that with something that you're doing on your Fitbit? Or a message you're sending in Facebook Messenger at that time? Or any other service. Not to single those out, but any other service you might be using, where you're essentially leaving a digital footprint?

SF: That's exactly right. When I spoke to investigators, they said that data on one device typically leads to another device, leads to data on another device. And so, with this case in Florida, they first subpoenaed, or they first collected Adam's smartphone. And then, from there, they went back and a few weeks later, they did his … the speaker. And they also looked for social media data. And so the investigations expand over time, and they all ... The data from different devices correlate with each other, especially when you're trying to do things like figure out who people were meeting with, where they were going, what they were doing during particular times.

That was one of the most surprising things I learned when I was reporting this, that it's actually not extremely far-fetched for a smart speaker to potentially exonerate someone. Because, when you think about these time stamps, unique to the user activities, you could say, "I was not at that bar at 11:04. I was not in that house at 11:08, because my speaker shows that, at that exact time, I was in my apartment ordering a pizza." Or, "I was in my apartment, saying, 'Alexa, play this music.'" It's very unlikely, but it's completely possible that a potential defendant, or person of interest, could set forth the timeline established by their speaker as a defense. To say, "This is where I was. I was not where you think I was."

LG: I'm wondering if Amazon is able to outright reject these requests, if it chooses. And I'm thinking about Apple's refusal to share communication data with the FBI around the San Bernardino shooting. Which was at least partly because Apple said it didn't have a backdoor to that data. And I'm wondering where Amazon stands on that?

SF: Right. When I spoke to Amazon and Google, they both said that they have, essentially, a priority list when it comes to these types of requests. And the number one priority is going to be issues of homeland security and preventing terrorist acts. Those requests get answered, immediately, yes or no.

Then, under that, I think it's child abuse. Under that, murder. And it goes down the line, to other civil cases. There's actually, interestingly enough, a lot of divorce cases involving Amazon Echo data, because of the likelihood that you can maybe catch someone cheating or things like that, based on what the Echo overheard. But they do have the ability to say yes or no. The most high-profile case was in 2016—there was a murder case in which police requested Amazon speaker data. And Amazon initially said no. They issued a very long, I think 90-page, explanation of why it was they didn't want to hand over that data. But they ultimately did release it because the defendant consented and said, similar to this case, "I don't think that there's anything in those files that would prove that I did anything wrong."

And so, can they say no? Yes, they can. It would never be easy though, because they're going up against these police departments. And they're going up against ... They would need a very steadfast justification for why the answer would be no.

I will say, though, Amazon does sometimes do partial responses. So maybe you ask for 10 different things, and Amazon gives you seven. Or four or three. I'm not aware of too many cases where they outright reject and say no, but it is common for them to give partial responses. So maybe you'll get a little bit of the metadata but not the actual content of the voice recordings, or things like that.

MC: All right. Well, this is very fascinating, so far. We're going to take a break, and when we come back, we're going to talk more about smart home devices and what you should know about your own privacy.

[Break]

MC: Hey, we're back. OK. So even if you aren't plotting any crimes, it's a good idea to be aware of just how much information your smart devices are gathering from you. That's why we're going to offer some tips here for how to keep your private life, well, more private.

Sidney, you spend a lot of time researching these devices and reporting on all the ways they are being used and exploited. I'm curious, what are your own thoughts about having a smart speaker in your home?

SF: Well, first and foremost, if you want a Bluetooth speaker, you can get one that isn't an always-on, listening device. There's a lot of different options out there, just from a consumer perspective. But also, I'd say that, when I think about smart devices, not just smart speakers or smart devices in general, I really think about this idea from Jonathan Zittrain. He's a Harvard professor who writes a lot about privacy, and he coined this phrase, intellectual debt. And he explained it as, when you buy something, you don't necessarily know how it works. So you go into debt, but eventually how it works will affect you, in some way. And it's like the bill coming due.

And that's how I think about all these devices. You buy them, not really paying attention to the privacy policy, not really paying attention to the types of data they collect, and whether or not you're OK with that data being collected. And you're like, "Oh, this is fine." But eventually, it will affect you in some way that you may or may not even realize. So there's a debt coming due.

And so I think about the cumulative effect of buying all these different devices and not actually knowing whether or not that data goes to the police, whether or not that data goes to the insurance company, to see if you actually are healthy or not. And so, I just think not necessarily about … I don't think it's useful to pinpoint any individual device as, quote-unquote, "the worst." But for people who like to rack up these gadgets, I would ask them, "How much do you actually know about this stuff? And what's going to happen when you realize, oh, all of these or some of these or one of these collects my data or shares it?" Or "There are some aspects of it that I find unacceptable."

LG: Sidney, how do you weigh utility of these devices versus that intellectual debt? Do you ever have moments where you say, "OK, that was really useful. Now I have to consider how useful that was, relative to the amount of privacy that may be eroded."

SF: What I would say is that the utility is very individualized. So what I consider to be acceptable or reasonable for me may not work for anyone else. And I think that you have to remember your own individuality, so that when you look at these devices or see how they're marketed, you ask yourself, "Well, is this for me? Does this fit with what I find to be comfortable with?" And not just assume that it works for you.

I've written a lot about the downsides of speakers, but I know from an accessibility standpoint, they can be revolutionary. They can be very, very helpful for people who are dealing with different physical abilities or who are older. So, their calculus is very different from mine. So I would just say it's important to develop what your own calculus is and work from there. I would also send that message to critics of some of these devices. All the time, I hear the people who are like, "Oh, you hate privacy." Or "Oh, you're worried about privacy, but you have an iPhone, therefore you're a hypocrite."

And it's like, no, just remember that everybody's dealing with their own thing and has their own calculus. I think that people that are critics of the tech would do well to remember that as well, so that they don't come across as preachy.

MC: Yeah, I agree. And that's, actually, that's the advice that I give to a lot of people who I have discussions with about this. I'd say do an inventory of whatever privacy settings are on your device. They've actually made it a little bit easier for us to do, these companies, because whenever you buy a smart speaker or you have any sort of connected device in your home, it's usually controlled by an app on your phone or by a web interface. So you can open up that app, and you can just go through the privacy settings one by one. And for each one, ask yourself, do I really want this on? Or should I turn it off? And, you can answer that question for yourself on almost all of them.

The tricky thing that I found is that, when these devices first started shipping, you remember a lot of them didn't come with on/off switches, right? They didn't come with mute switches. They were saving everything, by default. So it's possible that when somebody first created their account, two years ago, three years ago, it was set up in a way where everything is just saved all the time, on default. Every Alexa command that you've given to your Echo speaker, over the last three or four years, has been stored indefinitely on a cloud server somewhere, by default.

The rules have changed recently, where companies are now only storing things for 90 days by default, or 18 months by default, various sets of time. And, you have control over this. You can go into your settings and see. So even though you feel as though you've already done that audit, it's really good to have good hygiene and go back and do it every six months or so. Because the settings change all the time.

LG: Yeah. That was actually going to be one of my tips, Mike, which is, if you feel like you want to welcome these devices into your home, make sure that you understand the controls or, better yet, that the company has designed them with very explicit controls. And that's not just about microphones; it's also about camera shutters. When we first started to see smart displays come onto the market, not all of them had camera closure buttons. So the camera would just always be, potentially, activated.

And then eventually, I think Lenovo was one of the first companies that came out with one of the smart displays that had a little, I think it's like a shutter, right? Am I describing that correctly?

MC: Yeah, that's it. It's a physical shutter.

LG: Like, a physical shutter. Yeah. I've read about products. I should know what this is. A physical shutter that you could slide to the side, to actually just ensure that there was a physical covering over that camera. So that even if something malicious were to happen with that device, you're not being filmed at home in your kitchen or wherever else you might have it. But I have to say that, over time, I have either removed all smart speakers from my home, or I've deactivated them.

I had a Google smart display for a while, which I liked initially because it served as a kind of digital photo album. But then, I have so many photos in my Google Photos, that it just got to be too much to manage. It's not that much, but it just was that little barrier to managing the album that I wanted to show on a smart display. And then, it was just showing all these randos from my camera roll, from over the years. And I was like, oh, this is annoying. So, I deactivated that.

And then I have a Sonos that has Alexa capabilities, because Sonos has made smart speakers in recent years that have both Alexa and Google Assistant capabilities, but I just haven't found the value to be there. And I think it was actually the Sonos CEO who said in a recent interview, and I'm paraphrasing, but that voice technology at some point would hit an inevitable plateau. And then it would have to make some kind of leap, to continue to prove its value, continue to prove its utility. And maybe that involves some verticalization, too, of the voice offerings. And personally, I have not seen that leap happen.

I'm mostly checking the weather now, or if you're in California, the air quality at this moment. And I'm doing that, just swiping on my phone, and it's pretty easy. Or if I need to adjust the EQ or something like that on my smart speakers, I actually like going into the app and doing that. I have just, yeah. I haven't really missed having a smart speaker in my home.

MC: Yeah. And people are pushing really hard to make the utility thing sing. One disturbing trend that I'm sure you've seen as well is that hotels are starting to put Echo speakers in their guest rooms, because they think that that provides an added service. When you check into the hotel, you can ask for things from the hotel using your voice, instead of picking up the phone. I have not really figured out a way out of that particular conundrum. I meticulously unplug everything, whenever I go into a hotel. I even unplug the little dock that they give you for charging your phone. And I'd just rather use my own. So.

LG: But that's a good point too, because for some people that utility of going into a hotel room, particularly in the time of Covid and not having to touch things? Would be a real value proposition, if you will. I'll put on my Silicon Valley hat. But for others, like yourself, you're like, "Yeah, no, it's not worth it just on the off chance that a query I make or something I said could potentially be sent to a server somewhere." And maybe not even having fully grasped whose account that is, exactly, on the hotel Echo that's sitting there or whatever it might be.

SF: I will say, I don't know if this actually happened, but I heard a funny story out of Japan, where the speaker was recognizing snoring as a query. So, people would fall asleep, snore. It would ping you awake. And then you were just stuck in this horrible cycle of your snore causing your speaker to wake you up.

MC: I love that.

LG: Well, and also parents often say that they love the smart speakers. Because their kids are asking to play the same song, over and over again, and it means they can just throw out a voice command rather than having to hand over their phone or something.

MC: Yeah. These are all futures that I don't want to live in. Let's take a quick break. And when we come back, we'll wrap up the show with our recommendations.

[Break]

MC: All right, welcome back. Sidney, you're a guest. You go first. What is your recommendation?

SF: My recommendation this week is for I May Destroy You. It's a drama on HBO Max. It is excellent. It is terrifying. It is supernatural. It's really amazing. It's a drama written and directed by Michaela Coel. She's a brilliant artist out of the UK. I really can't go into any plot details, except to say that it starts … It's a fiction based on a real-life traumatic experience that she had, and it really heightens every single episode to know that a lot of this was based on what happened to her.

But she takes that as the ground floor, and then from there really explores things like how we tell stories, how we heal. How complicated relationships are. And the finale just aired a few days ago, I think. And it really just blew my mind with a really meta, Kaufman-esque discussion of what the point of the endings of stories is, and what we … It was a finale that was about, What do we get out of finales? Especially knowing that, with traumatic experiences, we very rarely get just a neat little ending: "That's over with." So it really was an interrogation of how you close the drawer on trauma and move on. And she's a genius. It's incredible. It's hilarious. It's very dark. And it's on HBO Max.

LG: I think that Angela Watercutter may have also recommended that, when she was on the Gadget Lab. Mike, do you remember? A few, maybe a few weeks ago?

MC: I can't remember anything beyond two days before yesterday.

LG: Well, now even more reason to watch it.

MC: What's your recommendation, Lauren?

LG: My recommendation this week is the September issue of Vanity Fair, which is a sister publication of WIRED. For those of you who don't know, both Vanity Fair and WIRED are owned by the same parent company, Condé Nast. This issue in particular is special, because it is guest-edited by Ta-Nehisi Coates. And there's one article in particular—the whole thing you should read; you should pick it up; you should subscribe—but there's one article in particular that is an as-told-to by Tamika Palmer, who is the mother of Breonna Taylor. And Tamika talks about her own life, growing up. Her own upbringing, and how she came to be Breonna's mother. And she talks about Breonna and her spirit and her resolve in such a way that she really feels alive. Which makes the whole article feel even more heartbreaking, because we know of course that Breonna is not alive, due to horrific circumstances.

So my recommendation this week is to pick up the issue, or subscribe to Vanity Fair. And in particular, read this article, which we will link to in the show notes.

MC: Nice. Well, those are a couple of heavy recommendations. So I'm going to try and offer something that's a little sweeter. We all know Questlove, right? Questlove, the drummer for the Roots, one-time WIRED cover model, and just all-around cultural touchstone. He has a podcast, it's called Questlove Supreme, where Quest and a few of his friends interview celebrities from the worlds of music and television and movies. Every week, during quarantine, he's been playing some older interviews that he did a couple of years ago. And this week he dropped a two-parter where he interviews Bootsy Collins. And Bootsy Collins, if you're not familiar, is a bass player. And you may know him as the guy behind a few of James Brown's biggest hits. You may know him as the bass player in Parliament-Funkadelic. You may even know him as the guy with the crazy sunglasses who's in the Deee-Lite "Groove Is in the Heart" video.

Bootsy is one of the most important musicians in funk, and R&B, and in hip hop, because his work has been sampled so often. And he has stories. He has all kinds of stories. He tells you stories about how James Brown hired him, how James Brown fired him, which is an incredible story. About his years with George Clinton, about Bootzilla. And it's all just this rambling mess of fun conversation. Questlove Supreme has a very informal pattern of telling stories. They go back and ask questions. They jump around a lot, in timelines, and Bootsy just rolls with it. So, it's about three hours' worth of interview. And I wouldn't recommend that you listen to all of it, unless you're a big Bootsy Collins fan. But you should at least try it out. He's a lot of fun to listen to.

Also, Bootsy rarely curses. He is a man of faith. So he comes up with all kinds of creative ways to not curse and still curse. It's pretty awesome. So that's my recommendation, Bootsy Collins on Questlove Supreme. And those episodes dropped earlier this week. And as of this taping, he just dropped a new episode that is an interview with George Clinton, Bootsy's boss in Parliament-Funkadelic, which I have not listened to, but that's also something to look forward to.

LG: Nice.

MC: All right. That is our show. Sidney, thanks again for joining us.

SF: Thank you for having me.

MC: Of course. Thank you all for listening.

LG: And before we go, I wanted to share something about our other podcast, Get WIRED. And we actually need your help, for this.

In the coming weeks, we're going to be doing an episode all about back to school, where we're exploring how parents and teachers and kids are trying to adjust and reinvent education, in real time, because of the coronavirus pandemic. And we want to hear from you about your experiences, so far, in remote education. And what's working, and what's not during this especially challenging back-to-school season. And we want to hear about your frustrations and things that you have figured out that work, while you're trying to figure out homeschool over Zoom. So call us, we've actually set up a voicemail. It is not my personal number, but it is a voicemail that will take you to the Get WIRED podcast, nonetheless.

We want to hear your experiences, and please leave us your name and where you're from, and we really appreciate it. That number is 415-534-9498. And once again, 415-534-9498 to leave your voicemail with the Get WIRED podcast. And thank you.

MC: Would you also encourage parents to have their children speak into the voicemail?

LG: Sure. I would love to. And I think we can pretty much guarantee, we're not going to share those with law enforcement. Almost guarantee.

MC: All right, well bling that hotline, and you may hear yourself and your family on Get WIRED, possibly, in the future. If you have any feedback about this show, you can find all of us on Twitter. Just check the show notes. Gadget Lab is produced by Boone Ashworth. Our executive producer is Alex Kapelman. Goodbye, and we'll be back next week.

[Gadget Lab outro theme music]

