IGF-USA 2015 Breakout – Maintaining Trust Online: Cybersecurity, Encryption, Backdoors, and Privacy
>>DAVID VYORST: I’m going to introduce our
moderator. Jon is a full professor at Carnegie Mellon University, addressing information
networks. In government, he served on the Federal Communications Commission as Chief
Technologist, in the White House as Assistant Deputy Director of Technology Policy, focusing
on telecommunications research, in the House Energy Subcommittee, responsible for telecom
and e-commerce issues, and he helped lead a program to assist developing countries with
their information infrastructure industries. He has been CFO for three high-tech companies, and
a member of AT&T Bell Laboratories, and Microsoft.
too. (Chuckling.) And Jon will introduce the panel. We’re not going to give full bios,
they’re in your handouts for this panel, and on the website. Jon?
>>JON PEHA: Thanks, David. I’m happy to be here with an amazing group. I would like to
take credit, but I can’t. This is David’s fine work. I’m happy to have been drafted
as speaker and moderator. First, let's frame this issue a little bit, and define
what we're talking about. The title of this was "Cybersecurity, Encryption, Backdoors,
and Privacy," which sounds like we just couldn't decide what to talk about, so
we had four topics. There really is one topic here. There's just a lot going on in that
title. I liked Dave's comment this morning about security, because
I make it myself, so it must be good. This is really about security, but
security means many, many, many things. And that confuses us. And there are different
elements of it. Two of those elements in particular I want to pull out. One of
them is that security may mean protecting information, in all sorts of ways, from those we
don't want to get it, which is also a privacy issue. And that's something we care about.
We also care, sometimes, as a security issue, about surveillance, about tracking down entities
that are dangerous to us, which might also be an issue in security. And sometimes those things
come into conflict, which is what this is about. To me, you know, it used to be a fascinating
abstract issue. You heard, you know, some of my past from David a minute ago. As a professor,
I thought about this and have written about this. In government and places like the FCC
and the White House, we thought about this as a problem that might happen to somebody
else. There is an economic adage that, you know, a recession is when your neighbor loses
his job and a depression is when you lose your job.
This is now an important issue. As anyone who's been following the OPM hack of recent weeks
knows, I happened, in these government positions, to go through the most invasive
investigation imaginable in order to get security clearances, had to reveal the embarrassing
things I did in junior high school that I've been able to hide from my mother all these
years, and, you know, now that's all out there. So now this has become a serious issue. And
part of that, of course, makes me want to say, you know, what can we do to better protect
information? What could the organizations, both government and government contractors
have done? Are there ways that technology could have provided better protection and avoided
those vulnerabilities? It also makes me want to say, I want to know where my data went.
Maybe it’s the Chinese government, maybe it’s somebody else. We’re still waiting to find
out. That’s kind of a surveillance issue. I need agencies that are capable of finding
that out. And the realization that I want technology
to be invulnerable to agencies in another country, and yet I would like attackers to be
trackable, leaves me confused. This is what we'd like to talk about here. Let's see.
I would also say, this is an inherently interdisciplinary discussion.
I hope we’re going to hear a couple of sides of this. I am an engineer who has spent a
lot of time with policymakers. My engineering brethren, many of them like to believe that
the policy is irrelevant, when, in fact, the laws and practices and policies and internet
governance we put in place have a lot to do with the incentives to protect that data or
not. The incentives to build in back doors or not.
And it is enormously important. But also, there are people in this town who like to
think that the policies really make the decisions. I happen, right now, to be part of the
Future Internet Architecture project, into which NSF has pumped tens of millions of dollars
to explore the question of what comes after TCP/IP, and I'm pleased to be a piece of that. We have
engineers sitting there making seemingly obscure technical decisions, like how
you do authentication, or, in this case, whether encryption can be on or off, and which is the
default? These seem very technical, and yet, they have
enormous impact. I hope we’re both going to get a little technical and policy discussion,
and a lot that goes in between. And now, let me bring some speakers in who can help us
get started. I guess, Nick volunteered. Maybe I will start this way and move down. Our first
speaker will be Nick Sullivan, who is the Security Engineering Lead at CloudFlare, and
previously worked at obscure technical organizations like Apple that may have thought about security
a little, too.
>>NICK SULLIVAN: Thanks, Jon. So, just maybe
a little bit more about myself. I’m a technologist. I build and develop trust enhancing technology.
And that's been my focus for a while. At Apple, I helped develop iTunes, and iTunes was a
revolutionary new way of getting music delivered over the internet. And one of the most important
things that made iTunes successful was the fact that you could trust that the music that
you were getting, and the manner in which you were getting it, was both secure and trusted.
So, around the time that iTunes came out, there was the whole Napster debate, debacle,
and eventual shutdown. And at the same time, various other organizations and open source
software came up to try to deliver content to people in a, sort of, untrusted, unregulated
way. And this led to a very bad situation for people who had gotten used to getting
music online, but were instead flooded with malware and viruses, and all these sort of
things. So, one of the most important things that helped Apple work with the businesses
that were the content owners and content creators was cryptography and strong encryption.
The fact that Apple built this in meant there was a trusted ecosystem that preserved everything
the internet offered people: access to content, access to technology,
access to information, in a way that was easy, simple, and comfortable for the average person.
Two years ago around the time of the Edward Snowden revelations, I left Apple to join
an internet company called CloudFlare, whose vision was to provide trust for websites and
the internet. And this sort of fit with my previous role at Apple, which was to take
something that was potentially risky and potentially dangerous, which is browsing the web, and
add trust where it could be added, and take technologies of encryption, which I had studied
in school, and done a master’s degree in, and apply it in a way where it could scalably
help an entire industry. So, the internet, and the web specifically,
has grown enormously and exponentially since it came around. It's no longer,
say, a hundred thousand websites indexed on Yahoo or something like that. It's a multiparty,
multidisciplinary, multi-stakeholder, multi-jurisdictional part of our everyday lives.
And the technology that makes it up is owned, operated, and controlled by not only, sort
of, large governance bodies like here, but small businesses who run and operate websites
and provide internet services. So, this is great. And it’s grown, and it’s
been fantastic for enabling the world to access information, but along with that comes a lot
of danger. And the internet was not designed purposefully with security completely baked
in. It was something that was added along the way, and something that a lot of people
from a lot of different places have contributed to help make it better. But, there's no
catch-all solution to securing the way that you surf the internet and improving trust.
So, one of the projects that I took part in at CloudFlare since I joined was Universal
SSL. One of the very basic technologies
you have to help ensure that you as a web visitor and you as a web property are secure
is HTTPS, whose underlying technology is a protocol, previously called SSL, that has been
standardized by the IETF. This technology has been around for almost 20 years, and it
really transforms what was an open communication between a person visiting a website across the
20 or up to a hundred different hops in between. When you send a message to a website and it
comes back to you, it goes through numerous different transformations, physical transformations
through different media, whether wireless, wired, and through different circuits and
computers along the way, routers, proxies, and the like. And what SSL does is
provide a way to make sure that you have confidentiality and authentication between the two end points.
There are certain things it doesn't provide, but, in any case, this is very difficult
for a lot of people to set up operationally because you have to purchase a certificate
from a certificate authority and know how to run a web server and install it, and make
sure it doesn't expire. And there were previously, and still
are, some technical reasons why encrypting data is more expensive than not encrypting
data. So, with Universal SSL, we engineered a solution to allow
websites to use our free service and gain SSL encryption with it. So, the web is
big, and the internet is big, and it crosses everything. So, the more you can do to reduce
the threat surface, the better it is for people’s privacy online. And that’s something that
I’ve been focusing on, is maintaining accessibility of information while improving trust.
>>JON PEHA: Thank you. Next up, you know, I'll just go down the line. We have Eric Burger,
who is a Computer Science Professor at Georgetown University, and a Non-Executive Director of
the Public Interest Registry, in his spare time.
>>ERIC BURGER: This way you can't stop me. You can still kick me. So, I want to focus,
actually, not on the second half of the title, but on the first half of the title of this session:
maintaining trust online. I'm going to deviate off of encryption for a while, but
trust me, I’m going to bring us back to encryption and security. But, to give a level setting
here, basically, as Nick pointed out, we need trust to be online. We need to trust that
we’re not going to be getting malware. We need to trust that the information we’re getting
is the information we’re looking for. We need to trust when we give someone information
that they’re going to treat that information in the way that we expect them to.
And so what I’m going to give an example of is a project that we just did a prototype
of, just wrapped it up at Georgetown, where we were looking at HIV and AIDS patients in
Maryland, the district, and Virginia. And the problem that the public health authorities
were having was, let’s say someone lived in Maryland, worked in Virginia, and went through
the district. And so, they live in Maryland, and they may or may not be known to
the health authorities as having HIV, but they go to their doctor in Virginia because
it’s convenient. It’s near their work. The doctor has records.
Virginia now knows, hey, here's a person who's got HIV, and they buy their drugs, sorry, they
get their medicine, in the district. The people in Maryland are saying, hey, we've got this
person, and, this being an infectious disease, we actually do need to know: are they infected?
Are they getting treatment? Well, they don’t know that because Virginia knows that. Then
if they’re getting treatment, are they taking care of themselves? Are they getting and taking
their drugs? They don’t know that, because that’s information held in the district. So
there’s this need for everyone to share the information.
Now, of course, no one wants to share that information, because that’s highly sensitive
information. And so now, when we get to the technology, is it good enough to encrypt the
traffic? Because you can imagine, we’ve got the server of Georgetown connected to all
these public health authorities sending us the information of people who are infected.
Obviously, that has to be encrypted. But, is that enough? Well, for the first is-that-enough
question, I will hand this out, and this lunatic sitting to my right is one of its authors:
"Risking It All: Locking the Back Doors of the Nation's Cybersecurity."
Let's say we get this information onto a server, but, for all sorts of good reasons,
a back door gets put in. Not we, actually; the vendor of the system put a back door into that system.
And that could be used for finding, you know, bad people that are doing bad stuff, but this
happens to be HIV information, which we, as a public, probably don’t want to have leaked
out. So, what can we do about it? As you can imagine, we encrypted everything. We put this
machine literally in a vault, and, by the way, we didn't have the combination. If it caught
fire, it could well have burned itself out, because it was in a vault, you know. I mean,
it was that secure. And so, we were working on some things in two areas. One is called
multiparty computation. So, this is where Virginia has its information,
the district has its information, Maryland has its information. And they share that information
amongst themselves. You look at the information, you have no idea what it is. The really cool
thing is, you don’t know what the computation is doing. It says, we’ve got a match or we
don’t, that’s the only thing you know. In fact, you can set it up such that you don’t
even know who it is; you just know you need to go look somewhere. Another area of work
that we're working on is called zero-knowledge computation, used a lot where the target was cloud
computing, where I will scramble my program, and I will
scramble my data, and I’ll give you my scrambled program, which is a program. It runs. It takes
the scrambled data, does what it does, and out comes a scrambled answer. The person doing
the computation has no idea what the computation was. They look at the source code and go,
wow, this is scrambled code. It's still code, it still runs. But why are we looking
at that stuff? The idea is that we need to move from, sort of, the Fort Knox security
model of having high grade encryption, trusted personnel, only looking at something when
we have the permission to look at it, to a model where, even if you did look at it, you'd have no
idea what you're looking at. And the idea is that this gives people the
confidence to sort of give up their data. And you might ask, that sounds pretty esoteric,
I’m not infected with HIV, what do we care? Even though, as Jon said, sometimes you think
you don't care and it turns out you do. Let's look at libraries. The American Library
Association has done a lot of campaigning about the Patriot Act. Section 215 is the part where
the government can request records about physical stuff, like books.
By the way, the status is not clear. It may be that with the last round of the Freedom
Act, it might not have been renewed. That’s not clear. But it does give us kind of a morality
tale, a real what if situation, because it certainly was the law for the past ten years.
So, take me: I've read some books on psychological disorders. I watch Sense8 on Netflix. And
I read books on alternative World War II, you know, alternative histories. So, what
kind of profile does that give me? An example I have in one of my patent disclosures, if
I Google HIV, does that mean I’m a researcher looking for information?
Does that mean I have HIV? Or in the Facebook sense, do I like HIV? There’s a lot of worries
about what it means to have all this information at our fingertips. So, I saw a lot of people
shaking their head, yeah, that would be bad. So, listening to the American Library Association
people, a lot of people go, that’s great, you’re protecting our stuff. They’re taking
the 1980s approach to what I've been talking about in terms of our work in multiparty computation and
zero-knowledge, which is: when you borrow a library book, they'll only keep that information
until you return the book. The moment you return the book, they tear
it up, delete the record, whatever. They don’t have it, you can’t get it. What’s the number
one thing people ask at their library? When I get a book from Amazon or a movie on Netflix,
they give me suggestions: here's the book you're going to like next. Can't you do that for me, too?
As Jon pointed out, this is not a black or white, we should or shouldn’t do something.
There are times when we want to have this information out there, as consumers. It's not just,
yeah, we all agree we want to catch terrorists. No, actually, I want you to track me a little
bit. So, as long as we can figure out how to deliver
both, and as long as people are confident with where their data goes when they want
it to go, people keep using the internet and we get more and more and more connected. Thanks
a lot.
>>JON PEHA: Thank you. Next up, Bobby Flaim,
a Special Agent with the Federal Bureau of Investigation, and a big player in a lot of
internet issues.
>>ROBERT FLAIM: Thank you. I work for the
Operational Technology Division, and that’s the part of the FBI that does the lawful intercepts
and digital forensics for all of the FBI, specifically, for our 56 field offices that
we have throughout the country. Insofar as encryption and technology, before I go any
further: everything you're going to hear me say, I'm not the Director of the FBI, and this
is not an official position you're hearing. These are my thoughts, my perceptions, based on the
fact that I've been working in internet governance, and I work in the technology division.
So, let me put that disclaimer out there. You won't mistake me for the FBI Director,
because he's a foot taller than me with a full head of hair. And on encryption, this is
very, very timely, because he just testified last week about this very subject. And this
was after a few times he’s actually testified a few times on Capitol Hill about this very,
very issue. We in the FBI call it going dark. Because of encryption and other technologies,
and how they're advancing, going dark means the growing concern that law enforcement,
not just the FBI and not just in the United States, but international law enforcement,
is losing the ability to do those intercepts and digital forensics.
And I think I can say that, insofar as law enforcement is concerned, we're pretty agnostic about
technology. The only thing we really want to do is our jobs and our mission, which is
to capture criminals. So, we’ve been around a hundred years. It started out with the telegraphs
and underwater cables, or landlines and telephones. We need to progress with technology to ensure
that we are fulfilling our mission, which is public safety.
So, public safety actually cuts both ways, right? On the one hand, we have the OPM breach.
I'm a victim of this crime, insofar as the security was not in place. The system
was hacked. Now my information is God knows where. So one part of public safety is proactive,
working to ensure we are protecting the public. On the other hand, as law enforcement,
we are also reactive: once criminal acts are committed, we find and chase down the criminals. To do
that, we need the ability to pierce through that
technology to accomplish the mission. And that is the crux of the matter. We can
talk about a lot of different things with technology, encryption is just one of them.
But the idea is how are we as a society going to ensure that we can perform the mission
of public safety, which is what law enforcement is designed to do. So, it really even goes
beyond law enforcement. It goes beyond what we do every day, and it really is a question
for society to answer. We want to have a society in which there are norms and regulations, and in
which public safety officials, not just law enforcement but a lot of civil officials
as well, can perform their mission. And if they can't, then, you know, we're at
a different space. We’re at a different question. What is going to happen? And how are we going
to ensure that the norms and regulations of society continue? And if they don't, people
are going to have to be ready for those consequences, because we're really all in it together. And
the idea that we're pitting law enforcement against technologists or encryption really
isn't what it all boils down to. And like I said, a lot of times people have
misconstrued what the Director has said, as if we want a back door, we want special
keys. That is not the case at all. The case is, when we have a lawful order, and that's
what it's all about, a lawful order, we already go through a court process. So, when
you have a legal order, how do we get that information that we are legally entitled to
based on an investigation? And we are not the ones who are supposed to come up with
a solution or prescribe technology, we’re not the experts. We need the experts to do
that. I think that is going to be very important.
And I think some people are asking for us to come up with a solution, but I don’t think
that would be a wise move, insofar as, since we're not the experts, I'm sure
whatever we might come up with may not be the best solution, because it just
might be very parochial in terms of how we look at things. So, with this discussion,
with encryption, I think we as law enforcement, and when I say law enforcement I mean international
law enforcement, are pretty open in terms of what solution we come to. But, if our mission
is to protect the public, if that mission is to be maintained, we have to find a way
to work with encryption and still do what we do. So, I think that’s all I have for an
opening statement for now, so, thank you.
>>JON PEHA: All right, thank you. And last
up, Mieke Eoyang is the Director for National Security at Third Way. She has previously
been a Chief of Staff, and staff on the Select Committee on Intelligence, where I imagine
this comes up a little, too.
>>MIEKE EOYANG: It does. Thank you, Jon.
We’ve heard about what the private sector does to secure the internet, and we’ve talked
about some of the challenges there. And Bobby talked about what the government does to try
and secure all of us, not just on the internet, but just generally. But, one of the tensions
inherent in this panel is, what happens when government is the entity that is trying to
get the unauthorized access? And it's not just about Bobby and the FBI; there's a whole
other set of three-letter agencies that, we've seen in press account after press account,
have gotten access to data, unbeknownst to the companies and users involved.
So, I think everyone would agree that we want government to be in a role where they can
help secure all of us. I don’t think, as technology has evolved, we’ve come to terms both in this
country and internationally about what the right role is for government in terms of their
access to that information, what should the rules of the road be, and what should the
rules of the road be in a world where the government is both the protector, the victim,
and the person who is trying to get access in ways that they may not want someone to
know about. And I think that’s really the challenge here.
I don’t think that we’ve had a good conversation in this country between these three points
of view in terms of what the government wants. I'm very sympathetic to the FBI's concern
that they are not going to be able to see when bad guys communicate with each other.
And as someone who works closely with people in Boston, when you look at the Boston Marathon
bombings, what happened on 9/11, terrorist plots in this country, there are things I
want the government to be able to stop. But I think the question is, when can the government
have access to them? What level of suspicion should they have before they can get
access? How big a pool of information do they get to build in order to find the information
that they are looking for? And if people think that they are acting in
a way that is too broad, and the market response is to provide everyone
with encryption that keeps out all unauthorized people who are trying to access the information,
not just the government, is that okay? Because with any type of access that's provided to the
government, given the number of adversaries, not just on the criminal side but from other
nation states, someone else will find
a way to abuse it. Even if you are able to create access that the government could have
to data, you would have to be able to trust that the person the government gave that
access to would not take that access and that information straight to the Chinese and
the Russians, given everything that we know about how the government conducts the kind
of national security surveillance that it needs.
I don't know that we are in a position where you can honestly say. You know,
Benjamin Franklin said two men can hold a secret if one of them is dead. Can you honestly
say that if you provide access to something that’s available only to authorized users
that that information won’t get out there? What does that mean if we are not in a place
where we can have privacy?
>>JON PEHA: All right, thank you. Well, let's
see if we can drill down and get a little bit more specific. I’m going to start with
you, Nick. You know, this room is probably a little savvy, but most people outside this room
tend to think that if you're in your living room and, you know, the windows are drawn, what
you do on the internet is completely opaque to everyone, right? Let's imagine I am at
home, my windows are drawn. I’m going to go. I’m going to do a little online banking. Maybe
per Eric's example, I'm going to do a little searching on HIV because I'm worried about my
own health, maybe check a little email. As I do those things, what am I exposing, and
to whom am I exposing it?
>>NICK SULLIVAN: Yeah. Well, lots. (Chuckling.)
Obviously, it’s kind of a setup. But, if you are browsing the internet, you’re leaving
a digital trail through hundreds of different organizations in different jurisdictions.
And one of the things that I mentioned earlier, HTTPS, is one of many tools that are used
to help secure different parts of the internet. And as we know, the internet architecture
is extremely complex, with many interrelated pieces. Some of them are encrypted, some of them are
not. Some carry the famous catchword of metadata that we're so happy to talk about.
But, there is metadata, there are various different identifying pieces of information
that fly around. To be more specific, if you are browsing on your web browser and you’re
going to your bank's site, what really flows over the internet, right? That's sort of the easiest
first question. So, in a technical sense, your web browser does a DNS query for, say, www.bank.com,
and it asks various domain name servers, you know, the phone book of the internet, which translate
names into numbers, and you get back IP addresses. For DNS servers, typically, you'll be set up with
one that’s at your ISP, so your internet service provider will know and have a record of which
websites you’re going to. And potentially, if they don’t have the record,
then they will go upstream to another DNS service, such as the DNS administrator for
this website. So, these requests are unencrypted and go over the internet in the clear, and over,
first of all, your local network. And if you're connected to a cable network, every request
inside of your local neighborhood, if you, say, use Comcast or Cox or something like
that, is broadcast. And every single person in your neighborhood can tell that you are
doing these sorts of things. And so, this is just the DNS. (Chuckling.)
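A DNS query really is just plaintext bytes on the wire. Here is a short sketch, using Nick's hypothetical www.bank.com, of the A-record query packet a resolver sends over UDP port 53, following the DNS wire format; every byte of it, including the name you asked for, is readable by anyone on the path.

```python
import struct

def build_dns_query(hostname: str, query_id: int = 0x1234) -> bytes:
    """Build a DNS A-record query packet; note that nothing in it is encrypted."""
    # Header: ID, flags (recursion desired), 1 question, 0 answer/authority/additional.
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # The name is encoded as length-prefixed labels, fully readable on the wire.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

packet = build_dns_query("www.bank.com")
# Anyone observing the packet can read the name you asked for, verbatim:
assert b"bank" in packet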
And so once you get your address, you connect to the web server at the IP address you're
connecting to, and establish a TCP circuit, meaning you send a hello to them, they send
a hello back, and you confirm that this is an actual end point that you want to connect
to. And then there's what SSL does. With HTTPS, almost all banking sites have a little lock icon, and the address
starts with HTTPS. That means that it's encrypted, and it's encrypted with TLS, which is the,
kind of, most popular protocol for encrypting things online.
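The sequence Nick describes, the TCP hello exchange followed by TLS on top of it, maps directly onto the standard socket APIs. A minimal Python sketch (the hostname and timeout are illustrative):

```python
import socket
import ssl

def open_https(hostname: str, port: int = 443) -> ssl.SSLSocket:
    """TCP handshake first (the hellos), then the TLS handshake on top of it."""
    ctx = ssl.create_default_context()          # CA verification and hostname check on
    raw = socket.create_connection((hostname, port), timeout=10)  # TCP circuit
    # server_hostname drives certificate checking, and is also sent as SNI.
    return ctx.wrap_socket(raw, server_hostname=hostname)

# Even without connecting anywhere, the default context shows what the lock icon promises:
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED  # the peer must present a valid certificate
assert ctx.check_hostname                    # and it must match the name we asked for
```

Note the nuance Nick gets to next: the handshake authenticates and encrypts the channel, but the server name itself travels in the clear.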
And what does that give you, really? Well, it gives you privacy and authentication from
your computer to the computer that DNS has said is your bank. So, everything
between you and the bank is scrambled, and nobody listening along the line will know
what the content of that data is. But very clearly, in the internet headers,
you do have a "to" and a "from." So, what you're protected from is
someone reading the actual information that you're typing into your bank, whether it's
your password or your transaction history; between you and the bank's website, it's
encrypted. What it doesn't protect you from is what happens
inside the bank’s infrastructure. So, you’re connected from point to point from your browser
to the website, and it’s protected, but the metadata is there and visible to anybody who
controls any hop along the way. And once it gets to your bank, it usually hits a piece
of machinery called a load balancer, and behind that there's a set of computers: databases,
application servers, various other computers that are connected
together. If your bank is a traditional bank, they have
that in a data center somewhere and it’s all locked down. Potentially, it could be on the
cloud, meaning it's on a rented piece of hardware somewhere else. Here in Ashburn, there are
quite a few computers that people use; AWS, Amazon's data centers, are there, and it's one
of the more popular places. A lot of the time, your data goes there. Once it passes the first
machine, there's no guarantee that anything is encrypted. So, I guess, in a practical
sense, you're sitting at home, you have your browser in incognito mode, but anybody on
any part of these networks, and, say, computers along the way, which may have pieces of hardware
installed in them as back doors, or, you know, may be manufactured in a place where there's
a piece of hardware in them that is not trusted, can read who you're talking to, when
you're talking to them, and how much data you're sending.
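What Nick describes an on-path observer learning, who, when, and how much, but never what, amounts to a flow record. Roughly the shape of data an ISP or network tap could log despite TLS (every field value below is made up for illustration):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FlowRecord:
    """Roughly what an on-path observer can log about a TLS session despite encryption."""
    src_ip: str        # the "from" in the packet headers
    dst_ip: str        # the "to": which server you reached
    dst_port: int      # 443 announces "this is HTTPS"
    server_name: str   # SNI, sent in cleartext during the TLS handshake
    bytes_sent: int    # volume and timing, which leak activity patterns
    start: datetime
    # What is NOT here: the URL path, your password, the page content; all encrypted.

observed = FlowRecord("203.0.113.7", "198.51.100.20", 443,
                      "www.bank.com", 48213, datetime(2015, 7, 16, 9, 30))
```

Enough such records over time are what let an observer build the browsing profile Eric and Nick discuss, without ever breaking the encryption itself.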
And then potentially, once it reaches the bank, you know, stuff happens behind the scenes.
>>JON PEHA: SSL, there are whole movements, right, to expand SSL as a solution.
>>NICK SULLIVAN: That’s what I mentioned before.
>>JON PEHA: So what I've just heard is, my ISP, other ISPs, even my neighbors get some information,
even with SSL, about who I'm talking to. We need to come back to what happens
on the other side when it's unencrypted. Does it matter that my neighbors and these
others can see this? Can you learn anything important simply by observing those things, from the DNS lookup
and from the header information?
>>NICK SULLIVAN: Absolutely. If you're, as
Eric was describing, Googling different things, going to different websites, someone can find
out and potentially make a profile on what you’re doing. That doesn’t necessarily say
who you are, but, depending on who the person is, they can make assumptions about what websites
you’re going to, when you’re doing it, and from a law enforcement perspective, it can
help establish, say, digital trails, digital fingerprints of something that you’re doing.
If you’ve been investigated and your ISP is tracking this, they can go back and say, yes,
this person did visit this site, this type of this crime.
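To make the profiling point above concrete: here is a toy sketch, assuming an on-path observer logs only flow metadata (timestamp, destination hostname, byte count); the hostnames and records are invented for the example, and no content is ever seen.

```python
from datetime import datetime

# Flow records as an on-path observer might log them: even with SSL,
# the DNS lookup and packet headers reveal who you talked to, when,
# and roughly how much data moved -- just not the content.
flows = [
    ("2015-07-16 09:01", "bank.example.com",   52_000),
    ("2015-07-16 09:03", "news.example.org",   310_000),
    ("2015-07-16 21:14", "clinic.example.net", 12_000),
    ("2015-07-17 21:20", "clinic.example.net", 15_000),
]

def build_profile(flows):
    """Summarize visits per destination: count, total bytes, hours seen."""
    profile = {}
    for ts, host, nbytes in flows:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
        entry = profile.setdefault(host, {"visits": 0, "bytes": 0, "hours": set()})
        entry["visits"] += 1
        entry["bytes"] += nbytes
        entry["hours"].add(hour)
    return profile

profile = build_profile(flows)
# The observer never read any content, yet can say: this user checks
# clinic.example.net repeatedly, in the evening.
print(profile["clinic.example.net"])
```

That is the sense in which metadata alone supports a "pattern of life": a handful of timestamped destinations is already a profile.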
>>JON PEHA: Excellent. I have a great CS professor here on my left whom I may bring
in. So, the rise of SSL is one of the things we are seeing change, and we’re seeing that
solve some problems, but it leaves some holes.
>>NICK SULLIVAN: I have to be clear: the
content is protected with SSL, and that’s a very, very important thing. Metadata less
so. And you can derive a lot from metadata, but SSL does protect your content.
>>JON PEHA: Are there other things that are happening right now that are important in
terms of trends of protecting information, and new ways to do it?
>>And this is definitely not knocking CloudFlare, I was one of the first customers.
>>NICK SULLIVAN: Sure.
>>In fact, again, this is a shade of gray.
One of the greatest things that came out was universal SSL, because basically, CloudFlare sits
in front of my website. And I have had, typically after something like this, people asking, what’s
this Burger guy doing? Looking up standardstrack.com and standardstrack.org. One of the problems is
CloudFlare is lying, right, because it’s not me. It’s them as me. Which is what I want. You know, I want them
to present themselves as me. There is another little issue with SSL, which is not necessarily
an SSL issue, or TLS, issue. It’s what we in the IETF call the magic pixie
dust of key distribution. And the magic pixie dust in the old days, before public key cryptography:
the only way we could do stuff encrypted was we had to share a secret. I had to give you
a secret, tell you, I’m going to give you this password that will encrypt this software,
and only someone with that password, or a heck of a lot of computing power, could ever
decrypt the information.
And as you can imagine, that’s great if I’m going to, you know, talk with Jon, or hand
out one of my cards; on the back of my card, it has my public key. But what do you do when
you’re going to your bank? The bank’s not going to
have a million relationships and say here’s our little secret, and by the way, please
don’t tell anyone. So we developed public key cryptography. A lot of neat mathematics
such that, like, Symantec can go to Bank of America and issue them a certificate, and
Bank of America can issue certificates for their servers, you know it’s Bank of America
because they say so, and Symantec says so, and I trust them because when you get your
browser from Mozilla, operating system from Apple, it comes preloaded with Symantec and
U.S. Department of Defense, and 300 others of your most favorite people you trust, like
telecoms. One of the problems we have now is one we are
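The chain of trust described above can be modeled very roughly. This is a toy sketch: there are no real signatures (real X.509 uses public-key cryptography), the entity names are just the panel's examples, and "signing" is only an assertion, but it shows the delegation logic: the browser ships with roots, a root vouches for the bank, the bank vouches for its servers.

```python
# Preloaded roots, as a browser or OS ships them.
TRUSTED_ROOTS = {"Symantec", "U.S. DoD", "SomeTelecom"}

# Toy certificate store: each "cert" records who signed it.
CERTS = {
    "www.bankofamerica.example": "Bank of America",
    "Bank of America": "Symantec",
    "Symantec": "Symantec",  # self-signed root
}

def chain_is_trusted(subject, certs=CERTS, roots=TRUSTED_ROOTS, max_depth=5):
    """Walk issuer links upward until we hit a preloaded root (or give up)."""
    for _ in range(max_depth):
        issuer = certs.get(subject)
        if issuer is None:
            return False        # nobody vouches for this name
        if issuer in roots:
            return True         # reached a root the browser already trusts
        subject = issuer        # climb one level up the chain
    return False

print(chain_is_trusted("www.bankofamerica.example"))  # True: server -> bank -> root
```

Note the weakness the panel goes on to describe: in this model, *any* preloaded root can vouch for any name, which is exactly the gap pinning tries to close.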
addressing with a technology called pinning. If I go to my bank and see it’s signed by
GeoTrust, that’s Symantec, I trust them. If one day, my bank is signed by a telecom, my
browser will say it’s good even though it’s not. This attack, again, was theoretical,
except in the intelligence community, where it may or may not have been used, up until the day
that Lenovo did it. And that’s when, you know, again, it hit everybody.
They were saying, we’re going to improve the experience of our customers by sitting in
the middle, lying with a trusted issuer. That issuer is no longer trusted. So there are some
technologies that we’re working on to improve that. From a policy perspective,
one of the questions you need to ask yourself is, am I going to put my trust in people,
or am I going to put my trust in technology? Ultimately, you need to have your trust in
people, because technology can do some boneheaded things.
But when it comes to privacy, right now our trust model is, we trust the government’s
going to go to the court to get permission to do whatever they need to do. And this has
worked. And in the United States, no matter how much we want to whine, it actually has
worked pretty well. Again, if you look at my introduction there: you know, the most
technologically advanced and most liberal country, one of the richest countries in
the world, democratically elected, and then the government went south from there.
All of their structures based on trust continued to work on trust, but the trust was the wrong
kind, and all of a sudden all of their citizens were exposed and a lot of them were murdered.
I’d rather we come up with technological solutions that can provide access when needed.
We took up General Alexander’s gauntlet, and we think we have an approach. But one of the
downsides is the search has to be auditable, which would be okay for the FBI, because they’ve
gone and gotten their subpoena, but not okay for the intelligence community, so that’s still
an open research problem.
>>I’m dying to take that in a hundred directions,
but I haven’t yet been able to bring in two great speakers, so I can’t do that yet.
But, we’ll tee up things that are coming. I mean, among the things I just heard: there
were systems built that created security, and some of the players may not have had
incentives to use them as was expected, which will be coming up as we get to policy, and
certainly the Lenovo case; and even more generally, the holes are shifting, we’re
plugging some and creating others. For the non-engineers, if you wanted to pull something
out of that: this is complicated. (Laughter.) This is what an engineering professor does,
I just learn to translate. A lot of places to go with that, but I’d love to bring in
a perspective of those who have to think of this from an intelligence, or in this case,
a law enforcement perspective. We just talked about a couple of developments that are
interesting and complicated, such as the rise of SSL, and we could be talking about a bunch
more. You used the phrase “going dark,” which is much in discussion. Push a little harder
on that. And, you know, when things went from telegraph to telephone, change also meant new
things happened. When you say going dark, could you be more specific about what you’re losing,
and maybe, are you gaining anything, too? How are these changes in technology really
affecting how the FBI works?
>>ROBERT FLAIM: Well, I think from the security aspect of it, you know, we want our own communications
to be secure. So we’re relying on private industry, and that’s what we’re using with
our own customized tools, as well. But obviously, we are gaining, too: the better the encryption
and security, the more we as law enforcement gain. Going Dark actually is a program that was
started by the FBI several years ago, and now it’s one of the director’s top initiatives.
And as technology is advancing, it’s becoming increasingly difficult to be able to do intercepts
and digital forensics. And that presents a problem to us, because
as we as the FBI already resourced and we’re pretty lucky in so far as a law enforcement
agency, because we have the most resources, we’re the best funded. But what happens to
the state locals who don’t have a cyber squad or digital forensics? What do they do? And
we do help them as law enforcement, or as the FBI. What it means is that as this technology
is advancing and we don’t have solutions for it, we don’t have the ability to act as quickly
and as efficiently as we need to in response to a terrorist threat, a kidnapping, a pedophile,
a murderer, a bank robbery, whatever the crime is.
And every time you’re losing time, or even worse, the actual inability to solve a crime,
you know, then that presents a real danger. So, with all of these advancements in many
different areas, not just with encryption, there are lots of advancements in technology
that we are having to deal with, both what we call data at rest, which is stored data, and data
in motion, which is intercepts, and it presents a real challenge to us. And if we have to
drill down and do the onesies and twosies solutions to each piece of technology as it
comes out and devote massive amounts of resources to one case, you’re taking away from other
cases that need to be worked on. And we have to start prioritizing. We already
are, and that’s to the detriment of other cases and other issues that we need to address
in total. So, for Going Dark, there are significant issues and problems and challenges that we
need to address. And it needs to be an enterprise solution, not onesies and twosies. And even
people within law enforcement are very used to saying, even if they can’t get to something,
we’ll work around it. Don’t worry, forget that company or this technology, there’s another.
But now the odds are really working against us, and it’s presenting a real danger. So
when we talk about Going Dark, that’s what we’re referring to.
>>MIEKE EOYANG: I find it really interesting, having helped draft laws that govern the way
government looks at information, the presumption in government that technology will bend to
the legal frameworks. I think the real challenge here is the idea that technology
somehow must accommodate the FBI’s mission. That the FBI has a mission, that they have
a court order, and they should be able to get the information because of that court
order. But as Jon pointed out, technology evolves. And actually, if you look at the
kind of information that the FBI, not just the FBI but the government at large, has access
to in terms of the metadata, and people who put things on Twitter, who, you know, whose
data passes through all these points in the chain, all of which have varying degrees of
security, actually, the government has access to far more data about individuals in the
last 20 years than they ever did in a period prior to that.
It used to be, if two people had a conversation with each other, the government had no way
of measuring that conversation. But now, if you send an email to someone else, the government’s
ability to collect domestically the metadata, the to/from information, the routing of the information,
perhaps the subject, there’s a huge amount of information the government can build in
terms of its persistent ability to look at people and say, we can build a pattern of
life just off of metadata, not even content. The government may have less information than
it used to have, but it still has tremendously more information than it ever did before.
In some ways, it’s a question of how well the government is resourcing itself to be
as efficient as possible with the information that it does have, rather than how can we
make sure we still have access to everything. In the days after 9/11, as the government
started to tell people, if you see something say something, and you had huge numbers of
tips flowing in, they were flooded with information that they couldn’t use, with false leads that
sent agents down rabbit holes. The question is, how can the government be better at discriminating,
to identify the information it needs, as opposed to pulling in everything just in case they
might figure out later that they need it.
>>JON PEHA: Bobby, and then Eric.
>>ROBERT FLAIM: I don’t know what the implication was, but, I don’t think the idea is to vacuum
up metadata. What we’re referring to is very specific cases where we have very specific
needs, whether it’s based on a telephone call or a device. So, I agree with you. We don’t
want to suck up metadata, because that was the problem with the flood of tips. Once we have
the information, then we’re responsible. We have to analyze it. We have to go down those
rabbit holes, even if they’re false leads. I remember sitting on the Washington, D.C. sniper case
hotline. I sat on the Boston bombing hotlines in those cases and all of that. We absolutely don’t want
to do that.
And I don’t think it’s a question of technology bending towards, you know, what the FBI
wants. Let me rephrase that: I’m trying to phrase it in a way that makes clear
we’re trying to be very specific, very tactical, very clear. When there
are very specific issues in technologies where that difference leads to the criminal, then
that is the information that we would need to accomplish the job. So, I’m with you in
so far as not vacuuming in data that we may or may not need, or requiring laws or changing
laws that would, kind of, advance that theory. But, we are already seeing it. It’s happening.
And it’s going to happen in greater numbers if we are at a disadvantage and the criminals
are at an advantage. So, that’s the real message, not, you know, bending things or trying to
get more information than we need. We don’t want that. We don’t. We just want to be able
to if there’s a crime being committed, we just need the information so we can solve
the crime. And I think it’s, you know, it’s a very complicated issue in so far as once
you start involving technology. But it does boil down to, you know, that basic principle.
And, you know, obviously, you know, we have to be better. Every agency could be better
at resource allocation and stuff like that. But that isn’t one of the problems that we’re
having now.>>One of the questions we have to ask ourselves,
is the internet, or electronic communication, different, or the same? So, for example, the
FBI could always ask for records of who is sending a letter to whom. When I was young,
a letter cost 8 cents and took about a week to get from here to there, and you had to
think about it. You had to write it. I remember at summer camp, I had to write a letter once
a week home. Really. And I almost did. My son is out of the country and I Skype with
him almost every day. So, is the internet different? Yes, because
the volume of communication has gone up dramatically. Part of that is because the cost
to me has dropped dramatically, but it also means the cost of collecting and archiving
has gone down dramatically. But then we ask ourselves, yeah, but, you know, when we sent
the letter, we had very strict expectations of what was inside the envelope was private.
And nobody was more rabid than the library or post office people: when the police would
say, we want to look inside the letter, it was like, over my dead body.
But, the flip side is, can you show me the envelope? Sure, here it is. There’s no expectation
of privacy. So, is that different? The economics might make it different. One of the things
I’ve been on the record for saying over the past few years is, do I really mind that the
NSA could crack TLS for me? You know, not in general, not a back door thing, but a brute
force attack? Probably not, because it’s going to cost so much they’re really only going
to use it for the real terrorism case. Does that mean the Chinese can do it, too? Yes.
But, again, they’re going to be doing it for the heavy stuff: either their internal security
or, you know, trying to achieve their goals, which is probably not looking at me. Do I want
everybody to have everything on that? Probably not.
>>Yeah, and I think something that you brought
up, which is interesting, is the two arenas where data is important: data at rest, and data
in transit. And something that must be stated here is that these are two very, very different
problems, technologically speaking. Data in transit, computer scientists have talked about
this for a while, and theorized that you can protect data between two points up to a calculable
amount; people like to say until the heat death of the universe is how securely you can
send data from one point to the other.
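The "heat death of the universe" framing is just arithmetic on the key space. A quick back-of-the-envelope sketch, assuming a generous 10^12 guesses per second against a 128-bit key:

```python
# Expected brute-force time for a 128-bit key at 1e12 guesses/second.
keyspace = 2 ** 128
guesses_per_second = 10 ** 12
seconds_per_year = 60 * 60 * 24 * 365

years_worst_case = keyspace / guesses_per_second / seconds_per_year
years_average = years_worst_case / 2  # on average the key is found halfway through

print(f"{years_average:.2e} years")   # on the order of 10^18 years,
# versus roughly 1.4e10 years since the Big Bang
```

So a well-implemented modern cipher is not where the brute-force attack lands; the attack lands on the keys and the endpoints, as the discussion that follows makes clear.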
send data from one point to the other. And as Eric, as you mentioned earlier, the
key management problem is the actual, really difficult problem when securing data. You
can break data in transit if you can find out where the keys are, which are at the end
points, from either, from one point to the other. So data at rest and the management
of data, this is a problem that computer scientists haven’t solved with the magic of public key
cryptography. It’s really, if you have two people communicating with each other, they
have a key, and that key is located on a specific place.
So when you talk about having access to data in transit, or data communicating between
one person and another, that’s a very different question than data at rest at an end point.
And mathematically, it’s different, as well. If you do have an endpoint, a computer, you
do have a key. And these two points need to be, I guess, distinguished.
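A toy round-trip makes the endpoint point concrete. This is a deliberately simplistic stream cipher built from SHA-256 in counter mode, for illustration only and emphatically not a real cipher: the wire bytes are opaque to an on-path observer, but the key necessarily lives at the endpoints, so whoever controls an endpoint reads the data.

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustration only -- do NOT use in place of a vetted cipher."""
    out = bytearray()
    counter = 0
    block = b""
    for i, byte in enumerate(data):
        if i % 32 == 0:  # derive a fresh 32-byte keystream block
            block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        out.append(byte ^ block[i % 32])
    return bytes(out)

key = b"shared endpoint secret"        # lives at the two endpoints
msg = b"transfer $100 to account 42"
wire = keystream_xor(key, msg)          # what an on-path observer sees

assert wire != msg                      # opaque in transit...
assert keystream_xor(key, wire) == msg  # ...but trivially readable at any
                                        # endpoint that holds the key
```

Data in transit is the solved half; protecting the key where it rests at the endpoint is the half public key cryptography does not make disappear.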
>>I’d love to play engineer and jump in, but I’ve got to move on and ask a really hard
policy question. So, I’ve heard, for example, that even great security hasn’t
plugged the things that are exposed in control traffic, but you can do really well with the
data in motion, except for what’s going on at the end systems. Okay. And I’ve heard that
you would like to be able to get at that data when legally authorized for something important,
but you don’t want to specify how. So, are there policies and laws that we are
discussing that can move that needle, that can solve his problem? You know, there are bills
being discussed in the House and the Senate; maybe you can tell us what’s being discussed and
the state of that. Would that affect the issues we’re discussing here and now?
>>MIEKE EOYANG: There’s not a current proposal to change policies on encryption. Congress
just finished debating in the last month a change to surveillance technology that Eric
mentioned, section 215 of the Patriot Act. Under that authority, the government had collected
all the metadata on everyone in the United States communicating domestically on a landline
phone. As some people say, they were not building a haystack, they were creating a government
monopoly on hay for that category of information. When we’re talking about bulk collection,
it has been technologically possible for the government to hoover up everything in certain
categories for some time. The Congress said, we don’t want the government holding that
data. You can leave it in the hands of the telecommunications companies, who have to
keep it for their own purposes. The government can come get the pieces they need when they
need it with appropriate authorization. That’s actually a fairly big change in the way that
we govern that, saying to the government, look, you can take what you need when you
need it to try and solve Bobby’s problem on metadata.
There are other programs in which the government is authorized to collect content data, specifically
content data where one end of it is outside the United States. For content data where your
emails are between two people inside the United States, the government has to get a warrant
for the information. Where one is outside the United States, you’re emailing with your kid
at summer camp who happens to be outside the United States, the government doesn’t know
it’s an American citizen, that information is not protected by the Constitution, because
they’re focusing on the end outside the United States.
And that law and that framework will be considered for renewal in January, sorry, in 2017. It
will expire, Congress will have an opportunity to revisit the protections around that and
make changes, because that is actually one of the programs that Edward Snowden revealed
some court decisions about; the government has subsequently declassified the decisions. And
it is one that the public had a very strong reaction to, because in this day and age, given internet
communications, most people have some kind of communication with someone outside the
United States, or some entity outside the United States.
So, knowing that those communications are not protected is very troubling to a lot of
people, because it’s not just the metadata there, it’s the content. There’s another proposal
that’s pending before Congress on cybersecurity information sharing, which
would allow companies to share freely with the government information related to cybersecurity
threats. But the term cybersecurity threat is actually defined very loosely, so the question
is how the government will read that information about cybersecurity threats
to be able to gather information from those companies, or will those companies be able
to share large quantities of information with the government protected from liability from
people who might say, you didn’t have the appropriate Constitutionally protected checks
and balances around this information before you sent it over.
That cybersecurity legislation may move this Congress some time before the end of the year.
They’ve tried to move it a couple times, but it’s been blocked. The challenge with that
legislation is that, I don’t think people really know what it means. And it could be
a way for companies to share really important threat information to help the government
stop cybersecurity threats, or a fig leaf for the NSA to do what nobody wants them to
do, which is get everything from everyone. Until we figure out what those definitions
mean, I think the interpretation of this law is a real question mark. Given what we’ve
seen about how the NSA has interpreted laws previously to suck in large quantities of
data, I think people should be very concerned about how exactly the government is reading
the statute that Congress is considering.
>>Just a quick one, because we could spend a whole panel on it, but there is a Georgetown
program, just Google cyber, there’s a lot of work going on. There are technology issues
of how to share information. There are policy issues, because, really, you can help a lot by
sharing information. But we want to have it
be the information that we want to have shared. By the way, a lot of people were like, they’re
going to share my email address and who I’m talking with. No, it’s mostly, what’s the
IP address of the botnet command and control server in Romania; it’s typically not
personally identifiable stuff.
But, you’re right, the current draft legislation, it’s getting better and better
each time, but it’s still not quite there.
Frankly, my personal concern is not so much sharing with the government; it’s that this is a
way for companies to collude and say: I’ve been breached, here’s my order book, these are the
ones I don’t want you to bid on. Oh, you’re telling me what you’re bidding? Good,
we’re good. So, there’s a lot of abuse that can go on.
>>MIEKE EOYANG: Right. If a company knows its cybersecurity practices are lax, it could
share that information with the government. Then if a downstream user is negatively affected,
the company could say, I don’t have liability here, because I shared that information with the
government, so you can’t sue me for anything. There are a lot of implications to this law
that I think people haven’t really thought through.
>>Being allowed to, and being compelled to, turn over information are quite different things. And if it’s
encrypted, per the Going Dark comment, is that in debate, or is that not addressed here?
>>MIEKE EOYANG: So, the issue of whether or not, say, for example, the government comes
to Apple and says, I need information on the iPhone 6, and Apple says, I don’t
hold the keys to that information, it’s encrypted. I’ve complied with the law
and given you something not usable from a government perspective. One of the challenges,
this goes back to the third party doctrine, the idea that when an individual, me, shares
information with a third party, Google or Apple, there’s not as much protection for the data
held by the company as there would be for data held by me. The way the government would have to do that
is come to me with a warrant and say you have to turn over your papers and effects, what
you’ve actually seen. But there isn’t a process in any of the current legislation that says
to the companies, you actually have to be able to turn it over to us in a useful fashion.
>>Just to give you warning, while the last comments in response are coming, if you want
to think about your questions, after people respond, I will open the floor to questions.
>>The current law we have is called the CALEA Communications Act.
>>Law enforcement.
>>Sorry, the Communications Assistance for Law Enforcement Act. I should know that like the back
of my hand. It came out in 1994. It specifically does not include encryption. The other thing
we don’t have in the United States is data retention laws. Europe, what’s that? Right.
That’s exactly right: it doesn’t include a wire or encryption. And it’s only for data
in motion, which is intercepts, and there are no laws for a search or data retention
laws in the United States, so there still are a lot of what we would view as loopholes;
I don’t know how other people would view them. But that does impede certain investigations,
especially now. Using the example, excuse me, of Apple and the encryption: yes, they
can provide us the device, or say here’s the information, but if it’s encrypted, we
can’t use it, and we’re stuck at that point, kind of.
>>SHANE TEWS: Hi. Thank you all for your time. This has been
>>Shane, who are you?
>>SHANE TEWS: I’m Vice Chair of today’s event.
I wanted to thank everybody. I have a question. I realize this is complicated from both sides,
trying to keep information private, and trying to understand what’s going on when you need
to know that. From a global perspective, what do we do about attribution when it comes to
cyber crime? Certain people feel we’re in a cyber war. We’re never sure if we’re dealing
with the Chinese, the Russians, or my next-door neighbor. While we’re trying to protect certain
elements of information, attribution continues to be a huge problem. We’re not sure it was
the North Koreans who did the Sony attack. Any thoughts on that, and understanding there’s
technical and legal problems there, there’s potentially legislative issue we’re going
to get into? Thanks.
>>Well, we have a project on that, but, again, it’s not aimed at the IC or law enforcement.
We were actually looking at it for enterprises. And this was one where, you know, on the one
hand, you’re talking about policymakers doing stuff in a vacuum from technology, and
technologists tend to do stuff in a vacuum from policy. And it is actually good to get the
two to meet, because we were looking at how do we solve this problem, and our solution was
the NSA solution: just collect everything
and look when something bad happens. We realized, we can’t do that. That’s what
got us thinking of how we could do, sort of, attribution for the rest of us. How can you
find out, with just the metadata, the traffic flows, where did this stuff come through,
including the fact that, as Nick said, these go through many different systems. So, like
the attack on Estonia, you might remember when Russia shut them down, the attack came
from Iowa. It didn’t really come from Iowa; it came from a bunch of owned machines in
the United States. But, you know, the Estonians could have said, it looks like the U.S. got
mad at us. We didn’t, somebody else did. It is a very, very difficult problem. It’s important
to get right. There are a lot of people out there who say, you know, taking down the Sony
network, meaning I can’t play on my PlayStation, that’s an act of war which might have some
consequences.>>I think one of the challenges on attribution
is there are times when we may know who actually did it, but for reasons of security, we don’t
actually say, because being able to say, people say, how do you know? And being able to prove
that reveals too much about the ways in which we figure out who is actually doing what.
>>This is hard, because they’re humans. There are actually newer technical tools
we are playing with where you do stronger authentication and get a better idea of what
machine caused the problem. But who was in control of that machine is hard, if the human
at the other end made any one of countless
>>That was one of the things we discovered. That’s the way technologists think: we want
to see whose fingers are on the keyboard. Sometimes it’s just important to know, we
don’t know who did it, but it didn’t come from North Korea; that’s a useful answer. As
computer scientists, we’ll never be able to prove whose fingers are on there.
>>AUDIENCE MEMBER: I have notes. I wanted
to start with the structure of the panel. Nick and Robert are sitting next to each
other, when it appears you don’t want any oversight, and Robert says give me more, and Nick
says trust me about security when there isn’t any security. We have to really get to the
point and say, there is none; know that when you go on board. My notes, my background, is
to be predictive. Not necessarily to mitigate, but to be predictive, and put officers where
they’re needed.
Robert, I’m disappointed that you haven’t been predictive and addressed the elephant
in the room, the ICANN transition, on the earlier panel. IANA is largely behind a lot of these
crimes going on with the release of the gTLDs; Bloomberg.market was a case of that, one
that actually tracked down to Verizon.
>>Can I ask you to move to a question?
>>AUDIENCE MEMBER: Well, we keep on making statements. There were points
I wanted to make. We’re talking about crimes
being made when we have technologies that the government set into play a long time ago.
Question, at what point do we sit down and say we created the problem, we have to maybe
go back and look at the problems and how we got to where we are today, rather than to
throw more solutions on top of it.
>>You can answer that.
(Laughter.)
>>Okay. I’ll take it that way. I mean, we have problems because
we have systems, in part, that were wonderful for their time. And the times, they have changed.
And so we have to figure out how to make large transitions. And there are people working
on that. (Laughter.)
>>Let it go at that?
>>I think reimagining the way computer security
works is important, but it’s also important to help put stopgaps in the problems that
we currently have. So, people using the internet having a little bit more security now by adding
things that are proven to have a positive effect, such as HTTPS, I think these are positive
steps to go at the same time as research on next generation technologies.
>>I strongly agree. We need incremental change, we need technology and policy to push it. We
also have a group of people, of which I’m a part, looking at the next generation. And
we have to get it right there. But the cautionary tale from this morning’s session: when was IPv6
decided on as the solution, how many years ago? Even if we had the next solution today, it is not
going to be in place and pervasive for a little while, so we have some problems to solve with
today’s internet.>>I’d like to take issue with the characterization
that I don’t believe in oversight. I believe the government should also have oversight.
>>AUDIENCE MEMBER: Well, I’m Lara from the State Department, I’ve been working on a lot
of comparative research in how oversight works in different countries. I was wondering if
you could comment a little bit; I’m intrigued by the Third Way project of yours. One of
the things I’ve noticed is that a lot of countries, particularly in Europe, we can talk about
France and Germany, they have foreign intelligence and then they have domestic intelligence,
and domestic intelligence is separate from law enforcement.
So, some of the criticism we hear from the Germans is not that you’re conducting intelligence,
but that the FBI combines intelligence and law enforcement under one roof. They consider
that dangerous, because they see intelligence as having a very liberal ability to collect
information but a limited ability to do something; the NSA isn’t arresting anyone or stopping them
at the airport. That’s making them nervous. A study was done exploring the possibility
of a U.S. domestic intelligence agency. I’m just curious to hear your thinking about encryption
layered on top of this discussion about what kinds of agencies are able to obtain information,
and for what purpose, and how, with specific reference to encryption. Because I think on
the encryption thing, the needs and prerogatives of intelligence and law enforcement can be
quite different things.>>MIEKE EOYANG: So, I think that part of
the challenge with the European scene we tend to conflate this they talk to us about our
privacy rules, they think their rules are better, talking about commercial privacy,
which is more strict than ours. There’s a lot of U.S. versus Europe tension happening
because of their frustration about the access the NSA has had to data. Some is disingenuous,
a lot of Europeans rely on information provided by the NSA to identify their threats. As somebody
who served on the intelligence committee, one of the things that's frustrating about the Europeans when they talk about American aggressiveness in the surveillance space is that we have, I would say, the most aggressive oversight of our intelligence-gathering apparatus of any democracy, and frankly, more aggressive than the non-democracies.
than the non democracies. The number of people in Congress with access
to the intelligence programs, all the things we do, we are much more intrusive into the
ways that we do the intelligence business than most other countries complaining about
what we do. Some of what I think they’re complaining about is a jealousy of American capability
because we are so much better and broader in the surveillance space that it’s much more
ubiquitous for us than it is for them. That’s frustrating, but, you know, I would match
our oversight apparatus to theirs any day of the week.
We have permanent staffs on those committees looking at these things all the time. The
German legislative committee that meets about intelligence meets something like once a quarter and has something like eight people on it. We have, between the House and Senate, closer to 40, just counting the elected officials. There is a lot more oversight that happens on our
end than their end from a comparative perspective. And I would say our legislators are better
at bringing different policy considerations into the debate, including when we talk about encryption, because they are also concerned about economic wellbeing: what does it mean for industry, what's the best solution. I think that's part of the conversation that we are not having about U.S. versus Europe.
>>And you were mentioning that the FBI is
both a domestic intelligence and law enforcement agency. That was reviewed in 2001, after the 9/11 attacks, and people at higher pay grades than mine decided it was going to stay together. There have been benefits to it, because a lot of times, what we've seen with domestic terrorists is that they are engaging in criminal acts. We saw with the 9/11 attacks that there was a lot of information that we were not able, as a law enforcement agency, to receive from our domestic intelligence side of the house, and that proved to be very, very costly and very detrimental to national security. There are a lot of times when the marriage works, but it is very complicated, very difficult, and there are a lot of policies and procedures in place to ensure that the line is not crossed where we separate those two functions.
>>Thank you. My name is Chip; I'm from Cisco Systems. A number of years ago, in my previous job, I worked on CALEA, in the late '90s and early '00s, developing how we would support our customers in meeting our obligations.
That’s not what my comment is. It’s interesting to hear that acronym again a number of years,
it’s still relevant and important. What I really wanted to just mention is this discussion
today has been more about law enforcement versus, you know, user data rights, in a way,
but also encryption end to end. What I wanted to address just quickly is the
enterprise issues, and that is that a number of businesses and enterprises have strict
regulations on what they have to do to monitor their data within their networks. There are a number of other enterprises, Cisco being one of them, that have to monitor their data networks to watch for things like botnets, intrusions, exfiltration, and other things. The Going Dark
issue is a problem for enterprise as well as for law enforcement in terms of being able
to monitor networks and protect your internal data and information from attacks and exploitation,
from employees, really. So that’s something we’re struggling with, we have to address
it, looking at it from the switch side, the router side, as well as the encryption side
and the clients. So, if you have any thoughts on that aspect, I’d be interested in hearing
them. Thank you.
>>JON PEHA: We're low on time. Anybody?
>>That's the last question.
>>I can make a comment. Absolutely, this
is something that enterprise has to deal with, as well. And as you mentioned, there’s a lot
of historical regulations around what you have to monitor inside your own network, and basically how the corporate network is designed. There have been changes, and there will continue to be changes in where the borderline between the corporate infrastructure and the outside world lies. We're seeing more and more of this with bring your own device. That's
sort of the touchstone issue here, where people can use personal phones and devices as part of their corporate network and log into VPNs, and encryption is becoming more pervasive.
And some of these challenges that enterprise has to face, they have to do by adapting to
the changing environment. And I think that’s a challenge for all enterprises at this stage.
Corporate environments are changing, and it’s something we have to consider as well.
>>And it is one of those shades of gray where, you know, an enterprise like a bank really, really, really wants its communication with its customers to be very, very secure, and at the same time it's required to record everything and keep it pretty much forever. And there are a bunch of hacks out there today, and a number of companies have made a lot of money on them, hacks that basically present a very secure connection outside the enterprise and a by-design insecure connection on the inside. That will make a very interesting policy and technology problem, which you can always pour more research dollars into, I would think.
>>With a shameless plug for more research
dollars, actually: since Eric was nice enough to bring in this wonderful prop, over which I also shed blood, sweat, and tears, those who want to read more can look up "Risking It All: Unlocking the Back Door to the Nation's Cybersecurity." I want to thank a wonderful panel for trying to get through these very hard issues in 90 minutes.