‘This is not an all-or-nothing zero-sum game.’
It was supposed to be an opportunity for lawmakers to better understand the debate over strong encryption. What actually happened may be more revealing.
With law-enforcement officials and cryptographers sparring over whether tech companies should retain a way to bypass their own products’ encryption, a House committee brought together representatives from both sides on April 19 for a hearing on the issue. But almost as soon as the law-enforcement witnesses—two police officers and an FBI official—started speaking, technologists on social media began slamming their testimony as technologically ignorant and dangerously misleading.
At the center of the social-media firestorm was one of the police witnesses, Indiana State Police Captain Charles Cohen, the head of the Indiana Internet Crimes Against Children Task Force. Asked about the dangers of tech companies weakening their encryption so they could provide investigators with unencrypted data, Cohen repeatedly stressed that the threat was minimal because companies protected their products with “firewalls.”
Security researchers and technologists were perplexed. “What he’s saying only shows he doesn’t understand the problem,” forensic scientist Jonathan Zdziarski said in an email to the Daily Dot, calling the idea “completely nonsensical.” Matt Blaze, an associate professor of computer science at the University of Pennsylvania, who testified on the technologist panel, told the Daily Dot afterward that he was “baffled” by Cohen’s remarks.
In an attempt to clarify what he said, as well as expand on the law-enforcement perspective in the heated encryption debate, the Daily Dot spoke to Captain Cohen about his safe-deposit-box analogy, his “firewalls” references, the tone of the public discourse, and what he thinks will happen next.
Tell me a little bit about your background in cybercrime investigations and how that led you to the issue of encryption.
Captain Charles Cohen: Right now, for State Police, I’m responsible for our Office of Intelligence and Investigative Technologies. Included in that is, I’ve been responsible for the Indiana Internet Crimes Against Children Task Force for about eight and a half years now. Crimes against children, [we’re] talking about child pornography, online child solicitation, online child sexual extortion. [I] have other areas of responsibility also, within the department, but [I] do spend a lot of my time working with Internet crimes against children.
Because it involves the Internet, because it involves people that are using encryption, whether it’s for data at rest or data in motion, it’s becoming an increasing concern for us.
What is your view of the cryptography experts and security researchers who say that technology companies shouldn’t be forced to design their systems so that they can provide unencrypted data in response to a warrant?
Well, we’re looking at, as an example—and this is just an example—looking at Apple. Apple previously had the technical ability to comply with legal process, meaning comply with a search warrant issued by a judge. I’m not aware of any instance where that created a potential or actual security vulnerability. They no longer have that capability.
“I can’t think of any risk to our ability to do these investigations that I’ve seen in the last 15 years that even holds a candle to this challenge of encryption.”
From my perspective, this is not a zero-sum game, meaning that it’s not an all-or-nothing [security situation].
The analogy I used as an example when I was testifying a couple weeks ago was that of a bank deposit box. To expand upon that, there is a technical possibility that that bank deposit box will be broken into—that someone [with a] safe deposit box will be the victim of a theft. It is possible to break into someone’s house and steal the key, then create a fake identity and use a fake identity to trick the bank into allowing them to use that key, plus the bank’s key, to open it up. Those things have happened in the past. Banks have been robbed and burgled [in situations] where bank deposit boxes were compromised.
But nonetheless, those bank deposit boxes offer enough reasonable security that people feel comfortable storing their most sensitive information, [and] the most valuable items that they hold, in those bank deposit boxes. Bank deposit boxes are not secure to a mathematically infinite [level], but nonetheless they are secure. But there still is a way for those banks to comply with legal process.
I’ve done it probably a dozen or more times in my career. We obtain a warrant signed by a judge. We call the bank manager. The bank manager, the lawyer, look at the warrant. They make sure it’s a legally proper format. They use their key to open up the [box], we drill the other lock, and we look to see what’s inside that and we search … the bank deposit box and seize the contents of it, if it’s in the scope of the warrant.
I am always concerned, from anyone, whether it’s about encryption or something else, that says that any situation is an all-or-nothing—that there are no alternatives, [that] it’s completely a zero-sum game, unless … we make it [so] there’s no theoretical possibility for any kind of encryption, and then we are also going to tie the physical data chip to the physical motherboard—do all these layers of things. That causes me concern, because it is always a balance between civil liberties, right to privacy, and the ability of government to protect citizens and solve crimes. And there’s always a balance between the two.
Computer scientists don’t like the safe-deposit-box analogy, because while a bank’s master key only works on its boxes, encryption vulnerabilities are more fundamental and more universal. A cryptographic algorithm might be deployed across a wide range of applications, including in national-security roles. What do you say to cryptographers who say that it is not just about privacy versus security—that there are security equities on both sides?
I would say, to anyone that says that, that there a number of potential solutions, including taking the key and putting it with … a trusted third party. There are a number of alternatives.
What I would say is, from an investigative standpoint, from a law-enforcement standpoint, there are a number of potential solutions, including the solution that, as an example, Apple had prior to [making] the change on where they held the key, for them to hold the key.
There are a number of different solutions. It does not have to be a technical dumbing-down … of the encryption, or a lower encryption threshold. But there needs to be a way for law-enforcement to be able to access this information, whether that is done directly by law-enforcement, whether that is done with the assistance of the manufacturer of the encryption scheme or device, whether that is done through a trusted third party.
I will tell you, and I cannot speak for [all] law enforcement, but for myself as an individual investigator, any of those options are completely fine. The only thing that causes me concern is default, ubiquitous hard encryption that customers can’t even choose to turn off if they would want to turn that encryption off—where there is no solution, whatever that solution is, for us to be able to look at this.
This creates, in my mind, personally, a real risk to public safety, a real risk to individual safety, that far outweighs the risk to personal privacy or safety from a theoretical possibility for … this encryption being broken.
I want to ask you about some of your specific comments at the April 19 House hearing. You repeatedly used the word “firewall” and distinguished it from encryption. You said, “If you think of an iPhone, or an Android OS phone, as a safety-deposit box, the key the bank holds, that’s the private-key encryption. The key the customer holds, that’s the public-key encryption. But what the bank does is it builds firewalls around that. There’s a difference between encryption and a firewall.” Now, I’ve talked to several cryptography experts, and they all said that these firewall comments made no sense. And to be clear, the way you described public and private encryption keys, that was not correct. So I guess I’m wondering, where did you hear about this notion of a firewall in this context?
That’s my—again, that is my understanding. The firewall—and I did talk with Professor Blaze afterwards, and I understand his position and the position he’s taken with relation to industry encryption and law enforcement.
It, again, is not a zero-sum game. This is not an all-or-nothing zero-sum game.
But again, firewalls have been around, going back … since before encryption, and [they] predate encryption. Firewalls are separate and different from encryption—the same way, again, using that analogy, which is not a perfect analogy but one that helps people understand the concept of encryption and firewalls [in a way that’s] very similar to what we’re seeing in that bank deposit box.
Again, I would stand behind that analogy, going back to that [key] vault being the firewall and … the locks being the encryption.
It sounds like you’re saying that, with a firewall, weakened encryption isn’t as big of a concern. Experts don’t agree with that. So just to be clear, can you tell me where you heard about the idea of a firewall being relevant here?
The firewall notion, again, is mine. This, again, is me trying to explain this concept to legislators.
There are no perfect analogies, but when you’re talking about … trying to explain to someone that doesn’t have that technical background, trying to explain to investigators, to legislators, you have to talk about things that are … not … specific to a cryptological standpoint, or strictly from a forensic standpoint.
When cryptographers are saying that they don’t know how to do this—that Apple abandoned its old system because it didn’t consider it to be secure—doesn’t that concern you? That people who are supposed to know how to design these systems don’t know how to give you what you want?
I think that maybe some of the challenge that … industry [faces] interacting with law enforcement is [that] law enforcement does not have that expertise when it comes to cryptography. They’re not looking at this from a strict academic standpoint. And those that are cryptographers don’t have the expertise in doing investigations and are not looking at it from a forensic standpoint—meaning, applying science to the criminal-justice system.
It may be the fact that you’re talking about, in some cases, people that have different areas of expertise. That tends to be a challenge.
Going back to the zero-sum game thing, there needs to be a way for law enforcement [and] industry to be able to work together to … solve these challenges, and that’s something I hope, if you … watched the testimony, that’s hopefully something that you heard from all three [witnesses] that were testifying from a law-enforcement perspective: the desire of criminal investigators to work with industry to try to find solutions, to not throw our hands up and say, ‘Well, this is a problem that can’t be solved,’ or, ‘There’s no way that industry and law enforcement can interact or work together,’ but that there has to be a way for us to work together to try to solve these challenges.
[We can’t] look at this as an all-or-nothing, complete lack of privacy or complete theoretical security—there needs to be a way to, frankly, serve the needs of crime victims at the same time [as we] serve the needs of privacy and civil liberties. Whether it’s looking at privacy from the government or whether it’s looking at the ability for people to have their information secure.
I’ll tell you, I don’t know of anyone that has ever come up with a perfectly secure alarm system in a house or business, [or] a perfectly secure bank deposit box. But nonetheless, people live in houses, people go to work, people [use] safety deposit boxes. If you’re talking about a mathematical, perfect security, that hypothetical, mathematical, perfect security, that’s something that doesn’t exist, at least [as far as] I’m aware of, anyplace.
Yes, a lock on my door might be imperfect, but breaking that lock only affects my house. A breakable cryptographic lock on a database containing classified information affects many, many more people and systems. That’s not theoretical, either. That’s been done. There’s a firewall made by a company called Juniper Networks that used code that was vulnerable to exploitation; any company using that firewall to protect their data could have been compromised. Can you talk about the national-security consequences of weakening encryption?
I really don’t work [in] national security. I work in domestic criminal investigations. So I really wouldn’t want to speak to a national security standpoint.
What I was really trying to testify about, when I was asked to testify by the committee, was the challenges faced by law enforcement related to, frankly, investigations other than terrorism—other than national-security investigations.
That committee, [based on my] talking to the staffers, had heard from the FBI … talking about a very specific national-security matter. But the reality is, the challenge of not being able to access encrypted devices, not being able to access encrypted communication, impacts criminal investigations far [more broadly] afield than counter-terrorism investigations [and] national-security-type investigations. The staffers asked that the members of that committee hear about those other types of investigations and what the impact is on those investigations.
What do you think of the security researchers who disagree with you and their suggestion that this simply cannot be done in a secure way? Do you think that they are being honest when they say that?
I would not want to attribute dishonesty to anyone that says that. I would not want to say they’re being disingenuous in any way in their belief. I would say that they are looking at this from a zero-sum-game [perspective]. They’re looking at this from an all-or-nothing [perspective]. They’re looking at it as a mathematician looks at a problem, where the standpoint is, ‘It is either all or nothing. I cannot think of a mathematically exact way to provide this for you.’ But there have to be ways to get access to that [encrypted data].
I do not think, in any way, that they are trying to put people at risk, that that is their intention. I have not gotten the impression, from anyone I’ve talked to on this issue, on any side of the equation, that there is an intent to put people at risk, that there is an intent to put children at risk. And I would not want to attribute that [to them] in any way.
But I will tell you, from my perspective and from my experience, this [status quo] does put people at risk. A lot of the investigations I do involve child victims. And I will tell you that there is a real and present risk to children related to this. And I’m always … using children as an example instead of adults [because] that’s the area where I spend a lot of my investigative efforts.
I understand the position that we can’t think of a mathematically exact way to do this—we can’t think of a way to do this and say with a mathematical certainty that this encryption scheme can’t be broken if we provide a method [of access]. But I can also tell you that there are active investigations where we can’t find child victims. There [are] active investigations where we can’t bring justice to those victims because of the current encryption that’s in use right now.
I worry going forward, as that encryption becomes increasingly ubiquitous; as it becomes [the] default that it can’t be turned off; as instant-messaging platforms … you know, WhatsApp said that they were encrypting their communications, Viber said they were encrypting their communications recently; as the encryption becomes more robust, more ubiquitous.
The FBI is not being [hyperbolic], they’re not exaggerating, when they say that investigators are going dark, that we increasingly are lacking the ability [to decrypt data]. I’ve been doing Internet-crimes-against-children investigations for the better part of 15 years. I’ve been responsible for lots of them for the better part of a decade. I can’t think of any risk to our ability to do these investigations that I’ve seen in the last 15 years that even holds a candle to this challenge of encryption.
We have to find a way to allow criminal investigators to access a child victim’s cellphone, access a murder victim’s cellphone, access a child sex offender’s cellphone, a suspect’s phone—not just for that incriminating evidence, but also for potential exculpatory evidence inside there.
What do you think of the rhetoric of some law-enforcement officials, including Manhattan District Attorney Cyrus Vance, who have accused tech companies of thinking of themselves as above the law, of setting their own rules, of aiding terrorists, and of deciding that they know better than voters and policymakers what the level of access should be?
I haven’t heard those comments, so I can’t speak to them. … What I can tell you is very similar to what I put in my written testimony: I do have concerns related to any private company making unilateral business decisions without checks and balances, without oversight, related to the ability of judges to be able to access information.
“It, again, is not a zero-sum game. This is not an all-or-nothing zero-sum game.”
The hard reality is, regardless of legal process, any search warrant issued by any judge, I lack [and] the rest of law enforcement lacks the technical ability to serve that search warrant. The Chief Justice of the U.S. Supreme Court can issue a search warrant, based on oath or affirmation, describing what [is] to be searched [and] where to be searched, completely within the letter of the law, and, for the first time in the history of the United States, the executive branch, law enforcement, lacks the technical ability to be able to comply with that search warrant. I can’t think of another parallel to this. That’s really never existed before in the history of the country.
If you gave law enforcement a search warrant—again, for a bank vault, or for a vault in someone’s home or business—it might take a couple of days, but we’ll be able to comply with that warrant. [But] for the first time in the history of the United States, many people—arguably most people now—are carrying around with them vaults that criminal investigators, law enforcement lacks the technical ability to access. And that is a challenge that we need either a technical or … legislative solution to.
And that really is where I started my testimony a couple weeks ago: If we are not going to have a technical solution, it’s becoming increasingly apparent that we need a legislative solution to that challenge.
This isn’t adversarial toward … industry or any particular company. This is a huge challenge for law enforcement and one that we don’t see a good technical solution [to]. That’s why we’re saying, increasingly, that there needs to be a legislative solution to this problem.
And I will tell you, the same way I said I would not want to apply malice to any of those cryptographers when you asked me that question … this is not a position where you see malice or from any criminal investigator. This is where there is this unmet need and we’re asking for help. We’re asking for help from industry, we’re asking for help from legislators or anyone else that can help us.
From my perspective, the solution is not to ad-hoc, individually, try to hire hackers to figure this out. Because that isn’t good for people’s privacy either.
As you look ahead and imagine where this debate could go, are you optimistic that the public will side with you and agree that there needs to be some sort of guaranteed-access mechanism in encryption? Or do you think people will lean more toward listening to the cryptographers?
I don’t have a crystal ball. If I was going to guess, though, what will happen, what I would guess would happen is that there will be some sort of unfortunate, very traumatic event. And I’m not talking about a national-security event. But there will be an event that helps shine a light on what that [investigatory] challenge is, and the public will start to see what that challenge is. I don’t want that to be the case, but that’s my concern, [that] that’s what’s going to happen. Meaning that something bad will happen to somebody prominent or to a child that grabs [the] media’s attention. Because I don’t think that, oftentimes, people, members of the public, necessarily will consider the issue one way or the other until they’re presented with that issue. So something will be brought [to bear] to present that issue to them.
Is that a good thing—that traumatic context for considering an issue?
I don’t think it is. I just have to look at myself: Would I, as a member of the public, think about that issue? If this was not part of what I do—trying to solve crimes [and] trying to help people by gathering evidence, [with] some of that evidence being encrypted—I have to think about, would I even consider this issue in my day-to-day life? If this was not something that I do as part of my career, would I think about that?
And likely [I would] not. Likely, all I would know is [what] I see in the media. I [as a member of the general public] may not even pay attention to a media story if it didn’t personally affect me, or if I didn’t have an example with which I could relate. As a father myself, if it didn’t relate to me and my family, would I consider this issue?
I’m kind of curious what your opinion is. … How much do you think this resonates with people? Do you think they consider this as they go about their day-to-day lives?
I don’t. Polling is inconsistent. It’s not something that people feel is very tangible. All of these questions are about digital things that you don’t grasp until it really matters—until there’s a terrorist incident or until you need to protect your data. I will be very curious to see if this becomes more real to people than it already is.
I agree with you, and that’s what I was trying to get across. You put it more eloquently than I did. I’m not sure that people even think about this.
In part, I measure it—because I haven’t done a poll, but after I testified, people in the community [started] saying, ‘Hey, I saw your testimony. I never thought about that before.’ Or, ‘I never even knew about encryption.’ Or, ‘I never realized my phone was encrypted.’ Or, ‘I never realized that you couldn’t get this stuff off this phone.’ And I also hear a lot of people say, ‘Oh, well, you guys really have a way of getting this, don’t you? I know you’re saying this to make it easy, but if you really wanted to open up a phone in a murder case, you really could do it, right?’
I’m not sure that the public has thought through this.
Well, I guess we’ll have to see where this debate goes. Because you’re right that we don’t know.
I’m curious, what do you think?
About what people expect?
I mean, where do you think people are going to come down?
It’s hard for me to predict, because if you had asked me what national-security policy would look like before 9/11, I don’t know that I would have said that the public would embrace the Patriot Act. I think it is very dependent on circumstances. You mentioned before that you suspect there will eventually be a big tragedy. If that happens, I think this debate will become even more chaotic and unpredictable. I don’t know that I’m equipped to predict that.
I’m the same way. [And] I’m not saying something big. I’m saying something that will resonate with people. It doesn’t have to be something big. [It could resonate] for whatever reason: major media market, attractive victim. There’s lots of reasons why a certain thing will stick in people’s minds, or why, frankly, the media will grab onto a particular incident.
[But] if I had to predict, [I’d say] some incident like that will occur, for whatever reason, [that either] garners a lot of media attention or resonates with a large number of people, that will help to refine this issue. Without knowing what that incident is—that incident could be one that … who knows?
This interview has been condensed and edited for clarity.
Eric Geller is a politics reporter who focuses on cybersecurity, surveillance, encryption, and privacy. A former staff writer at the Daily Dot, Geller joined Politico in June 2016, where he's focused on policymaking at the White House, the Justice Department, the State Department, and the Commerce Department.