Section: CSAD 2016

Interview with CSIS Fellow James Lewis

Conducted by Alex Pijanowski, News Editor

James A. Lewis is a senior fellow at the Center for Strategic and International Studies (CSIS), where he is also the director of the Strategic Technologies Program. His record includes extensive foreign service work, as well as advisory work for the United Nations and the U.S. government on political and military matters. His geographical regions of expertise are Asia and Central America.

 

Q: As I was reading your biography, it really occurred to me that your background is quite varied. You’ve done foreign service work, you’ve advised the United Nations, you’ve been an advisor to governments, and you’ve made policy yourself. Now that you’re in a more academic context, how do you marshal all of those experiences and sources of knowledge into analyzing things like cyber-security?

A: It’s useful to ask, ‘How do we know things?’ For me, the ability to draw on those experiences gives you a sense for whether something feels right or not, and I’m going to talk about that later in the day. The second thing is, it gives you a little bit of a sense of how things actually work. Take the question last night about how the FISA [Foreign Intelligence Surveillance] Court has rejected only 12 out of 40,000 requests—it doesn’t take into account the reality that a case has to go through multiple layers of approval before it ever gets to the FISA Court. So, knowing things like that is helpful; knowing how foreigners and non-U.S. persons think is helpful. It’s helpful knowing how to write in a policy context, which is very different from an academic context.

 

Q: A few years ago, Edward Snowden released certain documents, specifically relating to the National Security Agency [NSA]. That was kind of a watershed moment, when the public at large realized what some of the government’s policies were on information. I wonder if you think there’s a clear link between this event and the subsequent demands the public has made upon companies and governments to be transparent about how they’re using information. If so, does that make the job of the people who are collecting data harder?

A: The short answer to the second question is no. If you’re in law enforcement, you already have so many constraints, legal requirements, and technological limitations that your job will be made more difficult as people react to Snowden. On the foreign side, the main effect has been to constrain the relationships the U.S. had with other countries’ intelligence agencies. Partnerships are subject to more scrutiny overseas, which isn’t a bad thing. It’s too early to tell if Snowden has damaged collection. I would say we’ve seen a change in behavior in terrorist groups and foreign intelligence agencies, and presumably folks compensated for that. Snowden didn’t actually release any data himself. Snowden, a bit naively, ended up in Russia, and the Russians control him completely. He is limited in what he can release, and, talking to people at NSA, Snowden had a big set of data that included not only U.S. practices but foreign practices, and only the U.S. material has been released. So, it’s a distorted picture designed to damage the United States—it’s not an accurate picture. Again, that’s where it helps to have a little bit of experience, and to be able to say, ‘I know what the French, the Germans, the Russians, the Chinese, and the Indians do.’ That the focus is purely on the NSA is a bit naive.

 

Q: In one of your opinion pieces in The New York Times, published earlier this year, you said you hoped that governments and transnational tech companies could come to some kind of agreement on ways those governments could access encrypted devices and turn the information into plaintext. I wondered if you had any specific recommendations for those sorts of regulations, and whether you had any hope that they would be agreed upon.

A: I think that’s where Ben Wittes’ point from this morning—that the discussion is still at a very early stage—is exactly right. This is a political issue, and we don’t know what the contours of a solution would be until we have more discussion. But there are many countries in the world beyond the U.S., including democracies, that are concerned about encryption. There is some degree of risk from not letting people have encryption; there is some degree of risk from letting people have end-to-end encryption; and societies have to weigh how much risk they’re willing to take. I was in the first crypto-wars, which the privacy people would tell you they won. I would tell you that we won, because we set up the apparatus that Snowden disclosed. It was the golden age of signals intelligence. We could do that then because the Internet was small; it was largely American, and it didn’t do very much. Now, it has billions of users, many of whom don’t live in the United States, and we have to take their views into account. So, what other countries think will shape our views in some way, and in some ways we’re a bit of an outlier as a country. Things we tolerate here would not be tolerated anywhere else. I’m not talking about authoritarian states; other states say you can’t insult the king, or you can’t insult a religion, or you can’t promote Nazi memorabilia. We’re an outlier, and we’re going to have to deal with that when we deal with this problem. Other countries may have to deal with more restrictions than we want. To say, ‘We don’t have to do what you think is a good idea’ might have worked ten years ago, but I don’t think it would work now.

Q: One thing you mentioned was the first of the crypto-wars. What are the hallmarks of that time period?

A: You had the introduction of new communications technologies. There were two parts to this, one of which Director Comey referred to last night. You had the move to fiber-optic communications, which can’t be tapped the way copper wires can be tapped. So, you had a law called CALEA—the Communications Assistance for Law Enforcement Act—that says that, when you put in fiber-optic switches, you have to ensure there’s an ability to conduct wiretaps. At the same time, you had some people realizing that this thing coming along, the Internet, presented some real risks. Frankly, they were much more concerned about the risks we would now call cybersecurity—that foreign entities and criminals would take advantage of Americans online, which has turned out to be true. Their solution was to create something called the Clipper chip, which turned out to be a remarkably bad idea. It would have been a chip that encrypted all your communications but allowed law enforcement, with the proper authorities, to get access to them. Everyone hated the Clipper chip—the Clipper chip was probably about ’92—and everyone was on a crusade to find out what the solution was. At the end of the day, in about 2000, the government decided that it was better to let people have encryption, because the gains in security outweighed the losses to intelligence and law enforcement. Now we’re reconsidering that, because it’s a very different world. Russia’s no longer friendly, China’s aggressive; in this new environment, do we want to rethink that initial deal? Also, we’re not the only people on the planet; we will have to think about what other countries want as well. That was what the crypto-wars were: how do we make this Internet thing more secure? The theory was to let them have encryption. It didn’t work, and now we need to rethink it.

 

Q: It seemed like one of the themes running through Director Comey’s talk was that strong encryption is a double-edged sword. It’s one way that innocent citizens can protect their information, but it’s also one way that people who want to do us harm can plot and communicate with each other. Do you have any thoughts on whether that’s accurate?

A: A better way to think about it than has often been portrayed in the press is, ‘Under what conditions can the plaintext of the encrypted message be recovered, and who can recover it?’ What we’ve got with end-to-end encryption is that the sender and the recipient are the only two who can see the plaintext. Now, there are ways to defeat that—there’s no such thing as unbreakable encryption—but they’re more expensive, they’re more risky, and they’re harder to apply on a broad scale. The flip side of that is, suppose you used encryption that was really, really strong, but also recoverable, meaning there was some way for a third party to gain access to the text. Most companies use this kind of encryption, because they don’t want their employees running pornography rings, or starting terrorist cells, or selling real estate on a company computer; that creates liability. So, it’s possible we’ll have encryption that is strong but also meets the needs of law enforcement. That’s what often gets lost in this debate. The term “back door” is just silly. Nobody wants back doors; back doors have never worked, but it’s a good way to tar the practice and create a pejorative sense about it. It’s not the most honest debate you’ve seen in your life.

 

Q: Another essay you wrote recently, called “Posturing and Politics for Encryption,” seems to be mostly about the interaction between Apple and the FBI as they were trying to tease out the details of the San Bernardino case. You say that tech companies want to show their clients and customers up front that their information is safe with the company. Do you think that there’s anything short of a major catastrophe or tragedy that would cause those companies to rethink that strategy?

A: I think, in general, companies cooperate when there is a lawful request. You need to distinguish that from the actions of intelligence agencies, including Russia’s and China’s, who will access this information unlawfully. That’s not the case for the NSA—they cannot access the information of Americans unlawfully, but they can access foreign information. And that’s what I think the companies are trying to reverse, and this does get back to Snowden: ‘When the FBI shows up at our door, we don’t just give them everything they want.’ It’s an effort to preserve the sense in the global market that you can trust American products, and that’s a reasonable goal. The dilemma arises when you get to other countries that want access to the plaintext and require the American companies to cooperate with them. I think the point of that particular essay was, just, don’t treat the FBI worse than you would treat the People’s Liberation Army [PLA] or the Russian intelligence services; let’s treat everybody the same. So, if you cooperate with the Russians and the Chinese, maybe you could cooperate with the FBI—that doesn’t seem like such a big deal. The FBI is going to show up with a warrant that some judge has looked at; the FSB and the PLA are not. Snowden did this to American companies. People now distrust those products, because they think using them means the NSA is involved. Europeans in particular tend to be confused about the difference between NSA access and FBI access. The FBI doesn’t have access to European records; the NSA does. What was left out of the Edward Snowden disclosures is that the Russians and the Chinese do the same things. OPM was a good example of this: there was a mysterious accident where, I think, about a third of the world’s Internet traffic was accidentally routed through China. It was an ‘accident’; it was the Chinese intelligence services creating a global surveillance system, and they got it wrong. But you’re not going to read that from Snowden, because the people who control him have not released it. So, it’s a very difficult thing, and can we come to an agreement? I don’t know. Probably among like-minded nations, like Western democracies, agreement is possible.
This interview has been edited for length.
