TAG Cyber Law Journal

February 2019
A veteran lawyer recounts the challenges he’s experienced over the past quarter-century. 
Cybersecurity is protean. There are always new threats, always new scams. And lawyers are constantly grappling with the risks. Lost in this ceaseless flow is a sense of perspective. If only it were possible to step back and observe patterns over time. Actually, there are individuals who have been toiling long enough in the cybersecurity trenches to have a sense of history. That’s one reason a lengthy conversation with Stewart Baker, a longtime partner at Steptoe & Johnson, proved so valuable. Baker has written books on this subject and hosts the influential Cyberlaw Podcast. He has also viewed the field from the public and private sectors. And his government experience extended from a stint in the early 1990s as general counsel of the National Security Agency (NSA) to a top policy job, during which he literally helped invent the brand-new Department of Homeland Security (DHS). He recalled these experiences in a two-hour interview that was full of revelations.
     We are publishing the edited interview in two installments. In PART I, Baker explains how he came to a career that combines law and technology. The heart of his story begins when he joins the NSA, which he found was inextricably connected to World War II cryptography. And it was only after he landed there that he learned more about the U.S. Supreme Court justice for whom he had clerked, John Paul Stevens, who had been a cryptanalyst himself during the war. The encryption battles that broke out during his time at the NSA, pitting government against industry, were in many ways remarkably similar to recent skirmishes and tensions.

CyberInsecurity News: I want to begin by acknowledging something that struck me in my research. Your mother must have been an extraordinary woman.
Stewart Baker: Yeah, she was. My father died suddenly after a high fever when I was 3, and then she discovered that she was pregnant with my sister. And she set out to earn a living. She ended up at Ford, running some of their earliest computerization programs in the international division. I’ve always thought that my affinity for technology came from her.

CIN: Was she interested in technology before she arrived at Ford? And did she have skills in this area?
SB: No. As far as I can tell, she did not. Her work with Ford was what introduced her to technology. My sense about a technology career in a lot of circumstances—certainly in the 1950s—is that it was a matter of wanting to know something about a new field and being willing to learn. Very much like cybersecurity is today. I continually return to the advice I got from a Brown undergrad 10 years ago, who said, “In this field, you’re either self-taught or you aren’t any good.”  

CIN: Your mother also died when she was very young—of colon cancer. How old were you when she passed?
SB: I was 17. I was finishing my senior year in high school.

CIN: Your mother actually gave you your start working in tech, didn’t she?
SB: One of the last things that she did was to help me get a summer job at Ford’s Rouge plant, in their steel expediting department, which meant that I ran a bunch of punch cards through the sorter every day to figure out where our steel was, and how to get our steel to the places that we needed in time. I do have a long history, going back to the ’60s, of working with computers. But I don’t know how much that influenced my later career. Frankly, what influenced my career most was that for years—generations—people went to law school because they didn’t want to have to do any math. I was happy to do math. And so my career pushed me in the direction of doing the stuff that other lawyers weren’t doing.

CIN: What drew you to law in the first place?
SB: When I was 8 or 9, relatives and teachers and friends would look at me after something came out of my mouth and say, “What, are you going to be a lawyer when you grow up?” And my father, who I didn’t really know, had been a lawyer. And that probably played a role in my thinking, even though I had no idea what lawyers did.

CIN: What came out of your mouth when these people thought you were demonstrating a future career path in the law?
SB: If I’m being nice to myself, a feistiness, a willingness to challenge people by using words and my brain to try to get them to change their mind or do something I wanted. But part of it was just an early adolescent enthusiasm for talking back.   

CIN: You said that technology wasn’t something you were aiming for. When you were in law school, did you have a particular area that you thought you’d go into?
SB: No. I was enthusiastic about the law overall. Partly because I had done a few things before I went to law school. I vividly remember my first summer at O’Melveny & Myers as a summer law clerk, when I realized that this job consisted of sitting down and reading cases and then writing down what I thought they meant. There was no outdoor work. And there was air conditioning. These were not things I had previously experienced in my employment history.

CIN: I have to ask: What was it like clerking for Supreme Court Justice John Paul Stevens?
SB: There is another thread I should probably pull here. My father was in the Signal Corps, probably encrypting communications. Possibly intercepting and decrypting them. Something I didn’t really focus on until I went to NSA in the ’90s. But that wasn’t the only unexpected thread I discovered then. When I took the job at NSA and I told Justice Stevens, for whom I had clerked, he said, “Now, Stewart, let’s have lunch.” And over lunch he said, “I promised to take this secret to my grave, but since you’re going out to work there, I think I can tell you that what I did during World War II was signals intelligence of the sort that NSA currently does.” He told me a few stories—not many—and indeed his role and his unit were an important, or at least an intriguing, part of the signals intelligence effort by the United States against Japan. There were really two centers for that. One back in Washington and one at Pearl Harbor. He was at Pearl Harbor; he got the job because he was really good at bridge, I think. They competed with Washington to try and break the codes of the Japanese.

CIN: When do you think cybersecurity as we know it began? What do you date it by?
SB: I’d say 1989, which was when “The Cuckoo’s Egg,” by Clifford Stoll, was released. Stoll disclosed for the first time a foreign government’s use of hacking to steal secrets to facilitate espionage. There were certainly cybersecurity issues before that, but they were thought of as glitches and bugs that had to be patched and would be no problem. The idea of using those glitches and bugs systematically to steal secrets was not something that the U.S. government was especially focused on. But the East Germans sure were. And this book, which is one of the great books on cybersecurity, is about the first known use of computer hacking and attacks to serve intelligence purposes. Cliff Stoll was an astronomer. He was doing computers because he was self-taught, like everybody else in that field, and he was reconciling a billing, and he found a 50-cent discrepancy in the time online that they were being billed for. He started picking away at that, and he discovered that an East German hacker who had broken into the system was extracting a large amount of information. So it’s the story of his unraveling the attack and identifying the hacker. For years it was the best explanation and really the best wake-up call that you could get to the risks of cybersecurity.

CIN: When did your career in cybersecurity begin?
SB: It began with taking the general counsel job at NSA. It was about the same time that I got my first computer. My acquisition of the computer was in part driven by the fact that in my job I was required to understand the technology and its implications for law and government and society. And so I figured I’d better acquire one.

CIN: What about your background made you think: “Ah. The NSA. That’s where I belong”?
SB: It was nothing like that. I was happily practicing international trade law. I was working with a very eminent partner at Steptoe who had been the legal adviser to the State Department, and NSA’s previous general counsel had gone to the State Department and become the deputy legal adviser there. She was asked to help them find a new general counsel. She ended up talking to my patron in the firm, who later told me, “Well, Stewart, I gave them your name, because I was sure you wouldn’t do it.” I of course knew nothing about NSA. No one knew anything about NSA at the time. I went out to interview there partly on a whim. I believed there was more to life than the billable hour. I had been in government before—as deputy general counsel of the Education Department—and I was ripe for going back into government.
     At NSA they quickly introduced me to the problem of encryption from their point of view, which was that encryption was wonderful for protecting American secrets—nothing but the best encryption should be used for U.S. government communications. But NSA had become a success as an intelligence agency by trying to make sure that nobody else anywhere in the world had encryption they could completely rely on. NSA’s job was to break encryption. They would steal the keys to the machines. Anything they could do to make sure that the encryption didn’t work around the world for other governments. And one of their tools was export controls on encryption, which guaranteed that the only encryption that could be exported would be encryption that NSA could break. That worked great, from a national point of view. If you’re a nationalist, you say, “Our people deserve the best. Those foreigners deserve nothing. So let’s give them crappy encryption and reserve the best for ourselves.” Which worked for a little while, but came under threat as people like bankers said, “We do business abroad. We need encryption abroad, too.” So a variety of compromises were reached for some industries, but the big conflict that was just emerging was with Silicon Valley, and with Microsoft in particular, which very badly wanted to sell their products around the world.
     They were just embracing the internet, or something like it, which they mostly wanted to turn into a walled garden. But they definitely had a vision—of people buying things and selling things and communicating over computerized networks that were open to the public—that very much resembled the internet that we have today. And their enthusiasm for that vision was tempered by the fear that no one would buy anything if their credit card numbers couldn’t be protected from eavesdropping. No one would say anything if their communications couldn’t be protected, too. So Microsoft wanted to send very strong encryption around the world in software you could buy for 50 bucks. That meant that Microsoft and NSA were on a collision path over whether to allow the widespread use of really strong encryption. And for a variety of reasons, I ended up charged with defending NSA’s position that we shouldn’t export good encryption. That meant that I had to address all of the issues and all of the remarkable advantages that Microsoft painted when they talked about the internet to policymakers, advantages that Microsoft blamed NSA for not allowing the company to deliver. My job was to respond to all of the arguments from Silicon Valley about the need to export strong encryption, and that forced me—because this is what lawyers have to do—to learn all about the internet, its commercial prospects, how it functions and the like. And I became converted to the vision that the internet was going to be a big deal. Of course, I couldn’t afford to be convinced that strong encryption was so necessary to that vision that it should be released to the world. It was a long fight. NSA gave ground slowly from 1992 to about 1999. Now you can get very good encryption in free software. And NSA has had to learn other ways to collect intelligence without expecting to always break the encryption. They had bought enough time to be good at that, mainly by getting the intelligence out of people’s computers. 
     So that was what really got me into technology—being forced to learn it, in what amounted to an adversarial context.

CIN: In my research I bumped into the Clipper chip controversy, which was during your time. That seemed to be a very big deal. And what struck me as remarkable was that the issues that roiled people in the early 1990s, when this all blew up, haven’t disappeared. Back doors, encryption …
SB: No, they’re the same. It is interesting. I sometimes say we’re in what amounts to the Second Worldwide Crypto War. This time mainly focused on law enforcement. The first one was focused on intelligence. I feel a little like a World War I veteran confronted by World War II, which was both similar to and different from the first war. But many of the arguments are at least similar. Some of the arguments that are the same probably shouldn’t be made in the same way. It’s a little discouraging how people dig in and pursue the same argument for 25 years—on both sides. And simply become more convinced than ever that their side is right. I think the latest war is one where there is room for compromise, but no one wants to find it.

CIN: Can you paint a picture of cybersecurity back then? In your view, who were the bad guys, who were the targets and what were the risks?
SB: The unquestioned bad guys—from NSA’s point of view and the U.S. government’s and probably most of industry’s—were foreign nations. If a foreign nation gets access to your computers, nothing good is going to happen. Maybe nothing will happen. But if something does, it’s not going to be good. From the U.S. government’s point of view, it could be a commercial or a technological advantage that is lost. From a company’s point of view, the same. And the U.S. government just doesn’t like to have its nationals or its own computers subject to intrusion by foreign governments. That’s the biggest worry. There are, of course, a host of commercial hackers, and ideological hackers—I’m not sure how serious that last group is. But there are plenty of people who make money doing this. And lots of governments that do it for intelligence purposes, or do it, as in the case of North Korea, to make money. That’s the biggest threat, to my mind. The problem that we faced in the first crypto war was that there had been a group of cryptographers who deeply mistrusted NSA—and had, really, since the Vietnam War—and were sure that it was doing things that reduced security. And they wanted to build products that even NSA could not break into. So there was something between a wary respect and a deep hostility toward NSA on the part of those folks, and that affected the debate almost from the start. They were the people who were talking publicly about NSA’s role with respect to encryption. And its role was certainly not without room for criticism. NSA felt about encryption technology that it was kind of their bailiwick. They had invented all the best encryption techniques. They had invented decryption techniques that no one understood, and only they knew how to write code that was proof against those techniques. If NSA wanted to introduce a weakness into an algorithm that was later relied on by the private sector, that was at least an option. It wasn’t always an option that was pursued. 
     But it certainly was an option that was on the table. Even though there was some risk that would ultimately redound to the disadvantage of the users, who could include Americans. That was part of the consideration, but it wasn’t something guaranteed to win the day.
     Now, you cannot overestimate how significant the decryption victories of World War II were in shaping NSA’s culture. They were, one way or another, part of breaking Japanese codes, and Nazi codes, and everyone agreed that those decryption achievements shortened the war and maybe made it possible to win the war. Given the stakes, no one wanted to be caught in the situation again where we did not have an overwhelming advantage with respect to dealing with foreign nations’ codes. At the same time, the Soviets, who had seen that experience, had developed formidable capabilities of their own. We only occasionally got little glimpses of what was going on inside Russian communications, because their encryption was so good and so disciplined. So everybody was aware that what we had achieved in World War II was not ours by birthright. It was going to have to be something we scrapped and clawed at if we wanted to get that advantage again. So NSA was reluctant to surrender any advantage, including export controls on encryption.
     For Microsoft and the rest of Silicon Valley, they were already deep into the cycle of destroying other people’s businesses by turning them into software. Microsoft had done that successfully and become a massive new company at a time when new companies were rare. Basically by eating other people’s businesses and saying, “Oh, sorry about that!” and moving on. I think they started out with the assumption that NSA’s encryption and decryption advantage was just one more business model that was not going to survive Microsoft’s software advantages. They were more confident maybe than they should have been, but they had plenty of experience and a lot of achievement behind them. You had two very proud, very self-confident organizations dueling with each other over matters that each of them thought were central to their future.

CIN: We’ll come back to that theme later, when I ask you about public and private sector cooperation. But to return to your story, after you left the NSA, you rejoined Steptoe for 11 years, during which time technology, surveillance and national security were all big parts of your practice—until your next big move into the public sector. You spent a year as general counsel of the Robb-Silberman Commission, which reported on U.S. intelligence failures in the lead-up to the Iraq War. Why this toggling between private practice and working for the government? Was it by happenstance or design?
SB: Large chunks of our careers only make sense in retrospect. There was no doubt that I liked government service. I was inclined to say yes when asked. And then inclined to seek out opportunities if I hadn’t been asked in the last eight or 10 years. I lived in Washington. I didn’t have to move to work for the government. There were problems with that, from a career-building and client maintenance point of view, but that’s what I did. And I don’t regret it. The reason I went into the weapons of mass destruction (WMD) commission was, after the 9/11 attacks, it just felt wrong to be sitting on the sidelines if there was something I could do that would be useful to the government. So after 9/11, I was open to going into government. If someone had called before Robb-Silberman, I would have done that.

CIN: What was your role at the commission?
SB: I’m pleased to say that, under the guidance of the two co-chairs, we actually produced a consensus bipartisan report that everyone joined. I was the general counsel in charge of the drafting team, so I put together the draft, with a lot of help. Then Judge [Laurence] Silberman, in particular, took what were probably 100 recommendations and drove them through the Bush administration with enthusiasm and a certain amount of craftiness to make sure that they didn’t just sit on the shelf—which is the usual outcome for commission recommendations. The conclusion was that nobody lied about believing that there were WMD in Iraq. Everyone believed it, including the intelligence professionals, who came to the same conclusion as the Bush administration as a whole. And we thought the reason that they came to that conclusion was that we didn’t have very good intelligence on Iraq, and everything that you could see about Saddam Hussein’s behavior looked like he was guilty. It looked like he was hiding something, and we had caught him hiding WMD a couple of times before. So the smart money bet was that he still had them. That turned out to be wrong. Once we realized that bad intelligence was at the heart of the problem, we spent most of our time looking for ways to improve U.S. intelligence capabilities on WMD, so that we didn’t find ourselves in the future relying on common sense.
     When I finished work on the commission, Judge Silberman recommended me to Secretary [of the Department of Homeland Security Michael] Chertoff. He said, “This is somebody you will want to hire.” And Chertoff did hire me to do policy at DHS.

NEXT: In PART II, Baker will describe early tension between the U.S. and Europe over privacy, his view of Edward Snowden’s effect on the NSA, the big issues he sees now in cybersecurity and the role that general counsel should be playing.