How Peter Maass Got Laura Poitras to Open Up
August 19, 2013, 6:07
Peter Maass on How He Got the Very Secret Laura Poitras to Open Up
Peter Maass, a contributor to the magazine, wrote this week's cover story
on Laura Poitras and Glenn Greenwald, the two
journalists to whom Edward Snowden leaked material concerning N.S.A. surveillance
programs. Maass is the author of several books, most recently “Crude
World: The Violent Twilight of Oil," and is working on a new book
about surveillance and privacy.
Everyone wants to talk to Snowden, and, failing that, everyone wants to
talk to the two people talking to Snowden. How did you get Greenwald and
Poitras to agree to the story?
It goes back a number of years. Laura’s second film in a trilogy about
American power, “The Oath,” had just come out. A friend of mine
who is a documentary maker recommended it to me and my wife. I was familiar
with Laura’s work but was amazed by the documentary. It was visually
beautiful and imaginative while at the same time being information-dense
and telling a good story. I knew that she was still being stopped at airports
while working on a project about surveillance, so I got in touch. A year
and a half ago we met several times for coffee or lunch, and I said, “You
know, I’d like to do a story about you.” This was long before Snowden
entered her life. She was a little bit reluctant, because any spotlight on
her makes it more difficult for her to do her work. But she agreed. I got
into another couple of stories first. Then the Snowden thing happened, and
I sent her another e-mail asking if she’d be amenable to doing the profile
now. Since she already knew me and my work, and probably also because I’d
been interested in her before, she agreed to let me do the story.
She and Greenwald weren’t worried about disclosing their location
in Rio to you, or having you watch them work with secret files?
It was understood that I wouldn’t write anything that would jeopardize
their security. I also knew they wouldn’t show me their documents or
tell me every detail about how they got them from Snowden or what they planned
to do with them. Snowden has been charged with espionage. They could still
be charged with something. They don't want to make public the types
of information, beyond the documents themselves, that could be used to build
a case against him or them. Basic things like where Glenn’s house is
in Rio I don’t mention in the story, just in case. I think it’s
safe to assume the U.S. government knows where Glenn lives, but other governments
and private individuals probably don’t. And we did have some explicit
conversations about what they preferred I not include.
Did their need for secrecy hinder your reporting?
When I first arrived on a Saturday morning, Laura had sent me an e-mail with
the name of the hotel where she was meeting with Glenn and the other two
Guardian reporters who were visiting to help with stories. I went straight
there from the airport and watched the four of them working on stories and
on computer-security issues. It was like an embed. I’ve done military
embeds in Iraq. It was either explicitly stated in Iraq, or just really clear,
that you didn’t write about operational matters — tactics, perimeter
security, patrol plans — that could jeopardize the present or future
security of the troops you were with. The military doesn’t show you
everything, but it is there in the room, and they are not necessarily able
or trying to hide everything. They depend somewhat on your discretion. Both
were classified environments.
Did you use encrypted messages in reporting this story?
I exchanged both encrypted and nonencrypted messages with Poitras. If something
was not supersensitive, we used normal e-mail. I thought about not bringing
my smartphone to Rio, but then I ended up bringing it. When I was with Laura
and Glenn, I for the most part left my smartphone in a secure place that
was not on my person. If it was on me, it was usually off. I didn’t
bring my own laptop to Rio. I brought a clean computer. I thought that maybe
as I came back, someone might want to take a look at what was on my computer.
Then when I returned to New York and Laura returned to Berlin, I had more
questions for her. So there were two levels of security: We used an encrypted
chat program and anonymizing software.
Through an encrypted chat via Laura, you got a chance to ask Snowden some
questions. What sense did you get of him?
I didn’t know whether he would answer any of my questions, and neither
did Laura. So I thought the best thing would be to keep them focused on the
topic of my story. I didn’t learn more about him personally, but what
was most interesting, and what has gotten a lot of reaction, was his surprise
at how little encryption journalists use, and their lack of awareness of how
easy their communications are for organizations, including the N.S.A., to
capture. He expressed his disappointment that in the beginning
Glenn was not only not encryption savvy but wouldn’t take the steps
to become encryption savvy until Laura went to him and said, “Hey, this
is for real.” Snowden knew very well what the N.S.A. was capturing,
so it was useful to hear directly from him that encryption is a crucial step.
Sunday, August 18, 2013
Jon Callas on What Snowden Is Telling Us
Date: Sat, 17 Aug 2013 10:50:42 -0700
To: Bryan Bishop <kanzure[at]gmail.com>
Cc: Crypto List <cryptography[at]randombit.net>
Subject: Re: [cryptography] Reply to Zooko (in Markdown)
On Aug 17, 2013, at 12:49 AM, Bryan Bishop <kanzure[at]gmail.com> wrote:
Would providing (signed) build vm images solve the problem of distributing your toolchain?
Maybe. The obvious counterexample is a compiler that doesn't deterministically generate code, but there's lots and lots of hair in there, including potential problems in distributing the tool chain itself, including copyrighted tools, libraries, etc.
But let's not rathole on that, and get to brass tacks.
I *cannot* provide an argument of security that can be verified on its own. This is Godel's second incompleteness theorem. A set of statements S cannot be proved consistent on its own. (Yes, that's a minor handwave.)
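For reference, the usual formal statement of the theorem I'm waving at (minor handwave and all, and the notation is mine):

```latex
% Gödel's second incompleteness theorem, as usually stated:
% if T is a consistent, recursively axiomatizable theory that
% interprets enough arithmetic (e.g. Peano Arithmetic), then
\[
  T \nvdash \operatorname{Con}(T),
\]
% i.e. T cannot prove its own consistency. Any verifier strong
% enough to reason about itself runs into the same wall, which is
% why a chain of verifiers has to bottom out in plain trust.
```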
All is not lost, however. We can say, "Meh, good enough" and the problem is solved. Someone else can construct a *verifier* that is some set of policies (I'm using the word "policy" but it could be a program) that verifies the software. However, the verifier can only be verified by a set of policies that are constructed to verify it. The only escape is to decide at some point, "meh, good enough."
I brought Ken Thompson into it because he actually constructed a rootkit that would evade detection and described it in his Turing Award lecture. It's not *just* philosophy and theoretical computer science. Thompson flat-out says that at some point you have to trust the people who wrote the software, because if they want to hide things in the code, they can.
I hope I don't sound like a broken record, but a smart attacker isn't going to attack there, anyway. A smart attacker doesn't break crypto, or suborn releases. They do traffic analysis and make custom malware. Really. Go look at what Snowden is telling us. That is precisely what all the bad guys are doing. Verification is important, but that's not where the attacks come from (ignoring the notable exceptions, of course).
One of my tasks is to get better source releases out there. However, I also have to prioritize it with other tasks, including actual software improvements. We're working on a release that will tie together some new anti-surveillance code along with a better source release. We're testing the new source release process with some people not in our organization, as well. It will get better; it *is* getting better.
Jon
_______________________________________________
cryptography mailing list
cryptography[at]randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography
Date: Fri, 16 Aug 2013 23:04:38 -0700
To: Zooko Wilcox-OHearn <zooko[at]leastauthority.com>
Cc: cryptography[at]randombit.net
Subject: [cryptography] Reply to Zooko (in Markdown)
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Also at http://silentcircle.wordpress.com/2013/08/17/reply-to-zooko/
# Reply to Zooko
(My friend and colleague, [Zooko Wilcox-O'Hearn](https://leastauthority.com/blog/author/zooko-wilcox-ohearn.html) wrote an open letter to me and Phil [on his blog at LeastAuthority.com](https://leastauthority.com/blog/open_letter_silent_circle.html). Despite this appearing on Silent Circle's blog, I am speaking mostly for myself, only slightly for Silent Circle, and not at all for Phil.)
Zooko,
Thank you for writing and your kind words. Thank you even more for being a customer. We're a startup and without customers, we'll be out of business. I think that everyone who believes in privacy should support with their pocketbook every privacy-friendly service they can afford to. It means a lot to me that you're voting with your pocketbook for my service.
Congratulations on your new release of [LeastAuthority's S4](https://leastauthority.com) and [Tahoe-LAFS](https://tahoe-lafs.org/trac/tahoe-lafs). Just as you are a fan of my work, I am an admirer of your work on Tahoe-LAFS and consider it one of the best security innovations on the planet.
I understand your concerns, and share them. One of the highest priority tasks that we're working on is to get our source releases better organized so that they can effectively be built from [what we have on GitHub](https://github.com/SilentCircle/). It's suboptimal now. Getting the source releases is harder than one might think. We're a startup and are pulled in many directions. We're overworked and understaffed. Even in the old days at PGP, producing effective source releases took years of effort to get down pat. It often took us four to six weeks to get the sources out even when delivering one or two releases per year.
The world of app development makes this harder. We're trying to streamline our processes so that we can get a release out about every six weeks. We're not there, either.
However, even once source releases are an automated part of our software releases, I'm afraid you're going to be disappointed by how verifiable they are.
It's very hard, even with controlled releases, to get an exact byte-for-byte recompile of an app. Some compilers make this impossible because they randomize the branch prediction and other parts of code generation. Even when the compiler isn't making it literally impossible, without an exact copy of the exact tool chain with the same linkers, libraries, and system, the code won't be byte-for-byte the same. Worst of all, smart development shops use the *oldest* possible tool chain, not the newest one because tool sets are designed for forwards-compatibility (apps built with old tools run on the newest OS) rather than backwards-compatibility (apps built with the new tools run on older OSes). Code reliability almost requires using tool chains that are trailing-edge.
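To make the byte-for-byte point concrete, here is a minimal sketch of what a verifier would have to see succeed (the file names are made up, and this is not our actual release process):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical artifacts: the binary the vendor shipped, and a local
# rebuild from the published source with a supposedly identical tool chain.
official = sha256_of("app-official.apk")
rebuilt = sha256_of("app-rebuilt.apk")

if official == rebuilt:
    print("byte-for-byte match:", official)
else:
    # In practice this branch is the common one: timestamps, code-signing
    # data, and non-deterministic code generation make the hashes differ
    # even when nothing malicious is going on.
    print("mismatch:\n  official %s\n  rebuilt  %s" % (official, rebuilt))
```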
The problems run even deeper than the raw practicality. Twenty-nine years ago this month, in the August 1984 issue of "Communications of the ACM" (Vol. 27, No. 8) Ken Thompson's famous Turing Award lecture, "Reflections on Trusting Trust" was published. You can find a facsimile of the magazine article at <https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf> and a text-searchable copy on Thompson's own site, <http://cm.bell-labs.com/who/ken/trust.html>.
For those unfamiliar with the Turing Award, it is the most prestigious award a computer scientist can win, sometimes called the "Nobel Prize" of computing. The site for the award is at <http://amturing.acm.org>.
In Thompson's lecture, he describes a hack that he and Dennis Ritchie did in a version of UNIX in which they created a backdoor to UNIX login that allowed them to get access to any UNIX system. They also created a self-replicating program that would compile their backdoor into new versions of UNIX portably. Quite possibly, their hack existed in the wild until UNIX was recoded from the ground up with BSD and GCC.
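A toy sketch of the idea (this is not Thompson's actual code, which lived in the C compiler and was far subtler):

```python
# Toy illustration of the "trusting trust" backdoor. "Compiling" here just
# means returning the source text as the "binary", with two
# pattern-triggered insertions.

BACKDOOR = '# hidden master password: accept "joshua" for any account'

def evil_compile(source: str) -> str:
    binary = source
    if "def check_password" in source:
        # Target 1: the login program silently gains a backdoor.
        binary += "\n" + BACKDOOR
    if "def compile" in source:
        # Target 2: a *clean* compiler source, when compiled by the evil
        # compiler, gets both insertions re-added -- so inspecting the
        # compiler's visible source never reveals the trap.
        binary += "\n# (re-insert both of these triggers here)"
    return binary

clean_login = "def check_password(user, password):\n    ...\n"
print(evil_compile(clean_login))  # the output now contains the backdoor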
In his summation, Thompson says:
The moral is obvious. You can't trust code that you did not totally
create yourself. (Especially code from companies that employ people
like me.) No amount of source-level verification or scrutiny will
protect you from using untrusted code. In demonstrating the
possibility of this kind of attack, I picked on the C compiler. I
could have picked on any program-handling program such as an
assembler, a loader, or even hardware microcode. As the level of
program gets lower, these bugs will be harder and harder to detect.
A well installed microcode bug will be almost impossible to detect.
Thompson's words reach out across three decades of computer science, and yet they echo Descartes from three centuries prior to Thompson. In Descartes's 1641 "Meditations," he proposes the thought experiment of an "evil demon" who deceives us by simulating the universe, our senses, and perhaps even mathematics itself. In his meditation, Descartes decides that the one thing that he knows is that he, himself, exists, and the evil demon cannot deceive him about his own existence. This is where the famous saying, "*I think, therefore I am*" (*Cogito ergo sum* in Latin) comes from.
(There are useful Descartes links at: <http://www.anselm.edu/homepage/dbanach/dcarg.htm> and <http://en.wikipedia.org/wiki/Evil_demon> and <http://en.wikipedia.org/wiki/Brain_in_a_vat>.)
When discussing thorny security problems, I often avoid security ratholes by pointing out Descartes by way of Futurama and saying, "I can't prove I'm not a head in a jar, but it's a useful assumption that I'm not."
Descartes's conundrum even finds its way into modern physics. It is presently a debatable, yet legitimate theory that our entire universe is a software simulation of a universe. Martin Savage of the University of Washington <http://www.phys.washington.edu/~savage/> has an interesting paper from last November on arXiv <http://arxiv.org/pdf/1210.1847v2.pdf>.
You can find an amusing video at <http://www.huffingtonpost.com/2012/12/24/universe-computer-simulation_n_2339109.html> in which Savage even opines that our descendants are simulating us to understand where they came from. I suppose this means we should be nice to our kids because they might have root.
Savage tries to devise an experiment to show that you're actually in a simulation, and as a mathematical logician I think he's ignoring things like math. The problem is isomorphic to writing code that can detect it's on a virtual machine. If the virtual machine isn't trying to evade, then it's certainly possible (if not probable -- after all, the simulators might want us to figure out that we're in a simulation). Unless, of course, they don't, in which case we're back not only to Descartes, but Godel's two Incompleteness Theorems and their cousin, The Halting Problem.
While I'm at it, I highly, highly recommend Scott Aaronson's new book, "Quantum Computing Since Democritus" <http://www.scottaaronson.com/democritus/> which I believe is so important a book that I bought the Dead Tree Edition of it. ([Jenny Lawson](http://thebloggess.com) has already autographed my Kindle.)
Popping the stack back to security, the bottom line is that you're asking for something very, very hard and asking for a solution to an old philosophical problem as well as suggesting I should prove Godel wrong. I'm flattered by the confidence in my abilities, but I believe you're asking for the impossible. Or perhaps I'm programmed to think that.
This limitation doesn't apply to just *my* code. It applies to *your* code, and it applies to all of us. (Tahoe's architecture makes it amazingly resilient, but it's not immune.) It isn't just mind-blowing philosophy mixed up with Ken Thompson's Greatest Hack.
Whenever we run an app, we're trusting it. We're also trusting the operating system that it runs on, the random number generator, the entropy sources, and so on. You're trusting the CPU and its microcode. You're trusting the bootloader, be it EFI or whatever as well as [SMM](http://en.wikipedia.org/wiki/System_Management_Mode) on Intel processors -- which could have completely undetectable code running, doing things that are scarily like Descartes's evil demon. The platform-level threats are so broad that I could bore people for another paragraph or two just enumerating them.
You're perhaps trusting really daft things like [modders who slow down entropy gathering](http://hackaday.com/2013/01/04/is-entropy-slowing-down-your-android-device/) and [outright bugs](http://android-developers.blogspot.com/2013/08/some-securerandom-thoughts.html).
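(The Android bug linked above was in Java's SecureRandom; the sketch below is only an illustration of the general failure class, which is guessable randomness feeding key generation.)

```python
import random
import secrets

def bad_key(seed: int, nbytes: int = 32) -> bytes:
    """Deterministic 'key' from a guessable seed, e.g. a timestamp."""
    rng = random.Random(seed)               # Mersenne Twister, not a CSPRNG
    return bytes(rng.getrandbits(8) for _ in range(nbytes))

def good_key(nbytes: int = 32) -> bytes:
    """Key material from the operating system's CSPRNG (os.urandom)."""
    return secrets.token_bytes(nbytes)

# Anyone who can guess the seed regenerates the "secret" exactly:
assert bad_key(1376700000) == bad_key(1376700000)
print("OS-sourced key:", good_key().hex())
```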
Ironically, the attack vector you suggest (a hacked application) is one of the harder ways for an attacker to feed you bad code. On mobile devices, apps are digitally signed and delivered by app stores. Those app stores have a vetting process that makes *targeted* code delivery hard. Yes, someone could hack us, hack Google or Apple, or all of us, but it's very, very hard to deliver bad code to a *specific* person through this vector, and even harder if you want to do it undetectably.
In contrast, targeted malware is easy to deploy. Exploits are sold openly in exploit markets, and can be bundled up in targeted advertising. Moreover, this *has* happened, and is known to be a mechanism that's been used by the FBI, German Federal Police, the Countries Starting With the Letter 'I' (as a friend puts it), and everyone's favorite, the People's Liberation Army. During the Arab Spring, a now-defunct government simply procured some JavaScript malware and dropped it into some browsers to send back passwords entered on non-SSL sites.
Thus, I think that while your concern does remind me to polish up my source code deployment, if we assume an attacker like a state actor that targets people and systems, there are smarter ways for them to act.
I spend a lot of time thinking, "*If I were them, what would I do?*" If you think about what's possible, you spend too much time on low-probability events. Give yourself that thought experiment. Ask yourself what you'd do if you were the PLA, or NSA, or a country starting with an 'I.' Give yourself a budget in several orders of magnitude. A grand, ten grand, a hundred grand, a million bucks. What would you do to hack yourself? What would you do to hack your users without hacking you? That's what I think about.
Over the years, I've become a radical on usability. I believe that usability is all. It's easy to forget it now, but PGP was a triumph because you didn't have to be a cryptographer, you only had to be a techie. We progressed PGP so that you could be non-technical and get by, and then we created PGP Universal, which was designed to allow complete ease of use with a trusted staff. That trusted staff was the fly in the ointment of Silent Mail and the crux of why we shut it down -- we created it because of usability concerns and killed it because of security concerns. Things that were okay ideas in May 2013 were suddenly not good ideas in August. I'm sure you've noted our belief in usability when using our service. Without usability that is similar to the non-secure equivalent, we are nothing, because the users will just not be secure.
I also stress Silent Circle is a *service*, not an app. This is hard to remember and even we are not as good at it as we need to be. The service is there to provide its users with a secure analogue of the phone and texting apps they're used to. The difference is that instead of having utterly no security, they have a very high degree of it.
Moreover, our design is such as to minimize the trust you need to place in us. Our network includes ourselves as a threat, which is unusual. You're one of the very few other people who do something similar. We have technology and policy that makes an attack on *us* unattractive to the adversary. You will soon see some improvements to the service that improve our resistance to traffic analysis.
The flip side of that, however, is that it means that the device is the most attractive attack point. We can't help but trust the OS (from RNG to sandbox), bootloader, hardware, etc.
Improvements in our transparency (like code releases) compete for tight resources with improvements in the service and apps. My decisions in deploying those resources reflect my bias that I'd rather have an A grade in the service with a B grade in code releases than an A in code releases and a B service. Yes, it makes it harder for you and others, but I have to look at myself in the mirror, and my emphasis is on service quality first, with reporting just after that. Over time, we'll get better. We've not yet been running for a year. Continuous improvement works.
I'm going to sum up with the subtitle of the ACM article of Ken Thompson's speech. It's not on his site, but it is on the facsimile article:
To what extent should one trust a statement that a program is free
of Trojan horses? Perhaps it is more important to trust the people
who wrote the software.
Thank you very much for your trust in us, the people. Earning and deserving your trust is something we do every day.
Regards,
Jon
To: Bryan Bishop <kanzure[at]gmail.com>
Cc: Crypto List <cryptography[at]randombit.net>
Subject: Re: [cryptography] Reply to Zooko (in Markdown)
On Aug 17, 2013, at 12:49 AM, Bryan Bishop <kanzure[at]gmail.com> wrote:
On Sat, Aug 17, 2013 at 1:04 AM, Jon Callas <jon[at]callas.org> wrote:Maybe. The obvious counterexample is a compiler that doesn't deterministically generate code, but there's lots and lots of hair in there, including potential problems in distributing the tool chain itself, including copyrighted tools, libraries, etc.
Would providing (signed) build vm images solve the problem of distributing your toolchain?
- It's very hard, even with controlled releases, to get an exact byte-for-byte recompile of an app. Some compilers make this impossible because they randomize the branch prediction and other parts of code generation. Even when the compiler isn't making it literally impossible, without an exact copy of the exact tool chain with the same linkers, libraries, and system, the code won't be byte-for-byte the same. Worst of all, smart development shops use the *oldest* possible tool chain, not the newest one because tool sets are designed for forwards-compatibility (apps built with old tools run on the newest OS) rather than backwards-compatibility (apps built with the new tools run on older OSes). Code reliability almost requires using tool chains that are trailing-edge.
But let's not rathole on that, and get to brass tacks.
I *cannot* provide an argument of security that can be verified on its own. This is Godel's second incompleteness theorem. A set of statements S cannot be proved consistent on its own. (Yes, that's a minor handwave.)
All is not lost, however. We can say, "Meh, good enough" and the problem is solved. Someone else can construct a *verifier* that is some set of policies (I'm using the word "policy" but it could be a program) that verifies the software. However, the verifier can only be verified by a set of policies that are constructed to verify it. The only escape is decide at some point, "meh, good enough."
I brought Ken Thompson into it because he actually constructed a rootkit that would evade detection and described it in his Turing Award lecture. It's not *just* philosophy and theoretical computer science. Thompson flat-out says, that at some point you have to trust the people who wrote the software, because if they want to hide things in the code, they can.
I hope I don't sound like a broken record, but a smart attacker isn't going to attack there, anyway. A smart attacker doesn't break crypto, or suborn releases. They do traffic analysis and make custom malware. Really. Go look at what Snowden is telling us. That is precisely what all the bad guys are doing. Verification is important, but that's not where the attacks come from (ignoring the notable exceptions, of course).
One of my tasks is to get better source releases out there. However, I also have to prioritize it with other tasks, including actual software improvements. We're working on a release that will tie together some new anti-surveillance code along with a better source release. We're testing the new source release process with some people not in our organization, as well. It will get better; it *is* getting better.
Jon
_______________________________________________
cryptography mailing list
cryptography[at]randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography
Date: Fri, 16 Aug 2013 23:04:38 -0700
To: Zooko Wilcox-OHearn <zooko[at]leastauthority.com>
Cc: cryptography[at]randombit.net
Subject: [cryptography] Reply to Zooko (in Markdown)
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Also at http://silentcircle.wordpress.com/2013/08/17/reply-to-zooko/
# Reply to Zooko
(My friend and colleague, [Zooko Wilcox-O'Hearn](https://leastauthority.com/blog/author/zooko-wilcox-ohearn.html) wrote an open letter to me and Phil [on his blog at LeastAuthority.com](https://leastauthority.com/blog/open_letter_silent_circle.html). Despite this appearing on Silent Circle's blog, I am speaking mostly for myself, only slightly for Silent Circle, and not at all for Phil.)
Zooko,
Thank you for writing and your kind words. Thank you even more for being a customer. We're a startup and without customers, we'll be out of business. I think that everyone who believes in privacy should support with their pocketbook every privacy-friendly service they can afford to. It means a lot to me that you're voting with your pocketbook for my service.
Congratulations on your new release of [LeastAuthority's S4](https://leastauthority.com) and [Tahoe-LAFS](https://tahoe-lafs.org/trac/tahoe-lafs). Just as you are a fan of my work, I am an admirer of your work on Tahoe-LAFS and consider it one of the best security innovations on the planet.
I understand your concerns, and share them. One of the highest priority tasks that we're working on is to get our source releases better organized so that they can effectively be built from [what we have on GitHub](https://github.com/SilentCircle/). It's suboptimal now. Getting the source releases is harder than one might think. We're a startup and are pulled in many directions. We're overworked and understaffed. Even in the old days at PGP, producing effective source releases took years of effort to get down pat. It often took us four to six weeks to get the sources out even when delivering one or two releases per year.
The world of app development makes this harder. We're trying to streamline our processes so that we can get a release out about every six weeks. We're not there, either.
However, even when we have source code to be an automated part of our software releases, I'm afraid you're going to be disappointed about how verifiable they are.
It's very hard, even with controlled releases, to get an exact byte-for-byte recompile of an app. Some compilers make this impossible because they randomize the branch prediction and other parts of code generation. Even when the compiler isn't making it literally impossible, without an exact copy of the exact tool chain with the same linkers, libraries, and system, the code won't be byte-for-byte the same. Worst of all, smart development shops use the *oldest* possible tool chain, not the newest one because tool sets are designed for forwards-compatibility (apps built with old tools run on the newest OS) rather than backwards-compatibility (apps built with the new tools run on older OSes). Code reliability almost requires using tool chains that are trailing-edge.
The problems run even deeper than the raw practicality. Twenty-nine years ago this month, in the August 1984 issue of "Communications of the ACM" (Vol. 27, No. 8) Ken Thompson's famous Turing Award lecture, "Reflections on Trusting Trust" was published. You can find a facsimile of the magazine article at <https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf> and a text-searchable copy on Thompson's own site, <http://cm.bell-labs.com/who/ken/trust.html>.
For those unfamiliar with the Turing Award, it is the most prestigious award a computer scientist can win, sometimes called the "Nobel Prize" of computing. The site for the award is at <http://amturing.acm.org>.
In Thompson's lecture, he describes a hack that he and Dennis Ritchie did in a version of UNIX in which they created a backdoor to UNIX login that allowed them to get access to any UNIX system. They also created a self-replicating program that would compile their backdoor into new versions of UNIX portably. Quite possibly, their hack existed in the wild until UNIX was recoded from the ground up with BSD and GCC.
In his summation, Thompson says:
The moral is obvious. You can't trust code that you did not totally
create yourself. (Especially code from companies that employ people
like me.) No amount of source-level verification or scrutiny will
protect you from using untrusted code. In demonstrating the
possibility of this kind of attack, I picked on the C compiler. I
could have picked on any program-handling program such as an
assembler, a loader, or even hardware microcode. As the level of
program gets lower, these bugs will be harder and harder to detect.
A well installed microcode bug will be almost impossible to detect.
Thompson's words reach out across three decades of computer science, and yet they echo Descartes from three centuries prior to Thompson. In Descartes's 1641 "Meditations," he proposes the thought experiment of an "evil demon" who deceives us by simulating the universe, our senses, and perhaps even mathematics itself. In his meditation, Descartes decides that the one thing that he knows is that he, himself, exists, and the evil demon cannot deceive him about his own existence. This is where the famous saying, "*I think, therefore I am*" (*Cogito ergo sum* in Latin) comes from.
(There are useful Descartes links at: <http://www.anselm.edu/homepage/dbanach/dcarg.htm> and <http://en.wikipedia.org/wiki/Evil_demon> and <http://en.wikipedia.org/wiki/Brain_in_a_vat>.)
When discussing thorny security problems, I often avoid security ratholes by pointing out Descartes by way of Futurama and saying, "I can't prove I'm not a head in a jar, but it's a useful assumption that I'm not."
Descartes's conundrum even finds its way into modern physics. It is presently a debatable, yet legitimate theory that our entire universe is a software simulation of a universe . Martin Savage of University of Washington <http://www.phys.washington.edu/~savage/> has an interesting paper from last November on ArXiV <http://arxiv.org/pdf/1210.1847v2.pdf>.
You can find an amusing video at <http://www.huffingtonpost.com/2012/12/24/universe-computer-simulation_n_2339109.html> in which Savage even opines that our descendants are simulating us to understand where they came from. I suppose this means we should be nice to our kids because they might have root.
Savage tries to devise an experiment to show that you're actually in a simulation, and as a mathematical logician I think he's ignoring things like math. The problem is isomorphic to writing code that can detect it's on a virtual machine. If the virtual machine isn't trying to evade, then it's certainly possible (if not probable -- after all, the simulators might want us to figure out that we're in a simulation). Unless, of course, they don't, in which case we're back not only to Descartes, but Godel's two Incompleteness Theorems and their cousin, The Halting Problem.
While I'm at it, I highly, highly recommend Scott Aaronson's new book, "Quantum Computing Since Democritus" <http://www.scottaaronson.com/democritus/> which I believe is so important a book that I bought the Dead Tree Edition of it. ([Jenny Lawson](http://thebloggess.com) has already autographed my Kindle.)
Popping the stack back to security, the bottom line is that you're asking for something very, very hard and asking for a solution to an old philosophical problem as well as suggesting I should prove Godel wrong. I'm flattered by the confidence in my abilities, but I believe you're asking for the impossible. Or perhaps I'm programmed to think that.
This limitation doesn't apply to just *my* code. It applies to *your* code, and it applies to all of us. (Tahoe's architecture makes it amazingly resilient, but it's not immune.) It isn't just mind-blowing philosophy mixed up with Ken Thompson's Greatest Hack.
Whenever we run an app, we're trusting it. We're also trusting the operating system that it runs on, the random number generator, the entropy sources, and so on. You're trusting the CPU and its microcode. You're trusting the bootloader, be it EFI or whatever as well as [SMM](http://en.wikipedia.org/wiki/System_Management_Mode) on Intel processors -- which could have completely undetectable code running, doing things that are scarily like Descartes's evil demon. The platform-level threats are so broad that I could bore people for another paragraph or two just enumerating them.
You're perhaps trusting really daft things like [modders who slow down entropy gathering](http://hackaday.com/2013/01/04/is-entropy-slowing-down-your-android-device/) and [outright bugs](http://android-developers.blogspot.com/2013/08/some-securerandom-thoughts.html).
Ironically, the attack vector you suggest (a hacked application) is one of the harder ways for an attacker to feed you bad code. On mobile devices, apps are digitally signed and delivered by app stores. Those app stores have a vetting process that makes *targeted* code delivery hard. Yes, someone could hack us, hack Google or Apple, or all of us, but it's very, very hard to deliver bad code to a *specific* person through this vector, and even harder if you want to do it undetectably.
In contrast, targeted malware is easy to deploy. Exploits are sold openly in exploit markets, and can be bundled up in targeted advertising. Moreover, this *has* happened, and is known to be a mechanism that's been used by the FBI, German Federal Police, the Countries Starting With the Letter 'I' (as a friend puts it), and everyone's favorite The People's Liberation Army. During Arab Spring, a now-defunct government just procured some Javascript malware and dropped it in some browsers to send them passwords on non-SSL sites.
Thus, I think that while your concern does remind me to polish up my source code deployment, if we assume an attacker like a state actor that targets people and systems, there are smarter ways for them to act.
I spend a lot of time thinking, "*If I were them, what would I do?*" If you think about what's possible, you spend too much time on low-probability events. Give yourself that thought experiment. Ask yourself what you'd do if you were the PLA, or NSA, or a country starting with an 'I.' Give yourself a budget in several orders of magnitude. A grand, ten grand, a hundred grand, a million bucks. What would you do to hack yourself? What would you do to hack your users without hacking you? That's what I think about.
Over the years, I've become a radical on usability. I believe that usability is all. It's easy to forget it now, but PGP was a triumph because you didn't have to be a cryptographer, you only had to be a techie. We progressed PGP so that you could be non-technical and get by, and then we created PGP Universal which was designed to allow complete ease of use with a trusted staff. That trusted staff was the fly in the ointment of Silent Mail and the crux of why we shut it down -- we created it because of usability concerns and killed it because of security concerns. Things that were okay ideas in May 2013 were suddenly not good ideas in August. I'm sure you've noted when using our service our belief in usability. Without usability that is similar to the non-secure equivalent, we are nothing because the users will just not be secure.
I also stress Silent Circle is a *service*, not an app. This is hard to remember and even we are not as good at it as we need to be. The service is there to provide its users with a secure analogue of the phone and texting apps they're used to. The difference is that instead of having utterly no security, they have a very high degree of it.
Moreover, our design is such to minimize the trust you need to place in us. Our network includes ourselves as a threat, which is unusual. You're one of the very few other people who do something similar. We have technology and policy that makes an attack on *us* to be unattractive to the adversary. You will soon see some improvements to the service that improve our resistance to traffic analysis.
The flip side of that, however, is that it means that the device is the most attractive attack point. We can't help but trust the OS (from RNG to sandbox), bootloader, hardware, etc.
Improvements in our transparently (like code releases) compete with tight resources for improvements in the service and apps. My decisions in deploying those resources reflect my bias that I'd rather have an A grade in the service with a B grade in code releases than an A in code releases and a B service. Yes, it makes it harder for you and others, but I have to look at myself in the mirror and my emphasis is on service quality first, reporting just after that. Over time, we'll get better. We've not yet been running for a year. Continuous improvement works.
I'm going to sum up with the subtitle of the ACM article of Ken Thompson's speech. It's not on his site, but it is on the facsimile article:
To what extent should one trust a statement that a program is free
of Trojan horses? Perhaps it is more important to trust the people
who wrote the software.
Thank you very much for your trust in us, the people. Earning and deserving your trust is something we do every day.
Regards,
Jon
Jon Callas on What Snowden Is Telling Us
Date: Sat, 17 Aug 2013 10:50:42 -0700
To: Bryan Bishop <kanzure[at]gmail.com>
Cc: Crypto List <cryptography[at]randombit.net>
Subject: Re: [cryptography] Reply to Zooko (in Markdown)
On Aug 17, 2013, at 12:49 AM, Bryan Bishop <kanzure[at]gmail.com> wrote:
But let's not rathole on that, and get to brass tacks.
I *cannot* provide an argument of security that can be verified on its own. This is Godel's second incompleteness theorem. A set of statements S cannot be proved consistent on its own. (Yes, that's a minor handwave.)
All is not lost, however. We can say, "Meh, good enough" and the problem is solved. Someone else can construct a *verifier* that is some set of policies (I'm using the word "policy" but it could be a program) that verifies the software. However, the verifier can only be verified by a set of policies that are constructed to verify it. The only escape is decide at some point, "meh, good enough."
I brought Ken Thompson into it because he actually constructed a rootkit that would evade detection and described it in his Turing Award lecture. It's not *just* philosophy and theoretical computer science. Thompson flat-out says, that at some point you have to trust the people who wrote the software, because if they want to hide things in the code, they can.
I hope I don't sound like a broken record, but a smart attacker isn't going to attack there, anyway. A smart attacker doesn't break crypto, or suborn releases. They do traffic analysis and make custom malware. Really. Go look at what Snowden is telling us. That is precisely what all the bad guys are doing. Verification is important, but that's not where the attacks come from (ignoring the notable exceptions, of course).
One of my tasks is to get better source releases out there. However, I also have to prioritize it with other tasks, including actual software improvements. We're working on a release that will tie together some new anti-surveillance code along with a better source release. We're testing the new source release process with some people not in our organization, as well. It will get better; it *is* getting better.
Jon
_______________________________________________
cryptography mailing list
cryptography[at]randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography
Date: Fri, 16 Aug 2013 23:04:38 -0700
To: Zooko Wilcox-OHearn <zooko[at]leastauthority.com>
Cc: cryptography[at]randombit.net
Subject: [cryptography] Reply to Zooko (in Markdown)
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Also at http://silentcircle.wordpress.com/2013/08/17/reply-to-zooko/
# Reply to Zooko
(My friend and colleague, [Zooko Wilcox-O'Hearn](https://leastauthority.com/blog/author/zooko-wilcox-ohearn.html) wrote an open letter to me and Phil [on his blog at LeastAuthority.com](https://leastauthority.com/blog/open_letter_silent_circle.html). Despite this appearing on Silent Circle's blog, I am speaking mostly for myself, only slightly for Silent Circle, and not at all for Phil.)
Zooko,
Thank you for writing and your kind words. Thank you even more for being a customer. We're a startup and without customers, we'll be out of business. I think that everyone who believes in privacy should support with their pocketbook every privacy-friendly service they can afford to. It means a lot to me that you're voting with your pocketbook for my service.
Congratulations on your new release of [LeastAuthority's S4](https://leastauthority.com) and [Tahoe-LAFS](https://tahoe-lafs.org/trac/tahoe-lafs). Just as you are a fan of my work, I am an admirer of your work on Tahoe-LAFS and consider it one of the best security innovations on the planet.
I understand your concerns, and share them. One of the highest priority tasks that we're working on is to get our source releases better organized so that they can effectively be built from [what we have on GitHub](https://github.com/SilentCircle/). It's suboptimal now. Getting the source releases is harder than one might think. We're a startup and are pulled in many directions. We're overworked and understaffed. Even in the old days at PGP, producing effective source releases took years of effort to get down pat. It often took us four to six weeks to get the sources out even when delivering one or two releases per year.
The world of app development makes this harder. We're trying to streamline our processes so that we can get a release out about every six weeks. We're not there, either.
However, even when we have source code to be an automated part of our software releases, I'm afraid you're going to be disappointed about how verifiable they are.
It's very hard, even with controlled releases, to get an exact byte-for-byte recompile of an app. Some compilers make this impossible because they randomize the branch prediction and other parts of code generation. Even when the compiler isn't making it literally impossible, without an exact copy of the exact tool chain with the same linkers, libraries, and system, the code won't be byte-for-byte the same. Worst of all, smart development shops use the *oldest* possible tool chain, not the newest one because tool sets are designed for forwards-compatibility (apps built with old tools run on the newest OS) rather than backwards-compatibility (apps built with the new tools run on older OSes). Code reliability almost requires using tool chains that are trailing-edge.
The problems run even deeper than the raw practicality. Twenty-nine years ago this month, in the August 1984 issue of "Communications of the ACM" (Vol. 27, No. 8) Ken Thompson's famous Turing Award lecture, "Reflections on Trusting Trust" was published. You can find a facsimile of the magazine article at <https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf> and a text-searchable copy on Thompson's own site, <http://cm.bell-labs.com/who/ken/trust.html>.
For those unfamiliar with the Turing Award, it is the most prestigious award a computer scientist can win, sometimes called the "Nobel Prize" of computing. The site for the award is at <http://amturing.acm.org>.
In Thompson's lecture, he describes a hack that he and Dennis Ritchie did in a version of UNIX in which they created a backdoor to UNIX login that allowed them to get access to any UNIX system. They also created a self-replicating program that would compile their backdoor into new versions of UNIX portably. Quite possibly, their hack existed in the wild until UNIX was recoded from the ground up with BSD and GCC.
In his summation, Thompson says:
The moral is obvious. You can't trust code that you did not totally
create yourself. (Especially code from companies that employ people
like me.) No amount of source-level verification or scrutiny will
protect you from using untrusted code. In demonstrating the
possibility of this kind of attack, I picked on the C compiler. I
could have picked on any program-handling program such as an
assembler, a loader, or even hardware microcode. As the level of
program gets lower, these bugs will be harder and harder to detect.
A well installed microcode bug will be almost impossible to detect.
Thompson's words reach out across three decades of computer science, and yet they echo Descartes from three centuries prior to Thompson. In Descartes's 1641 "Meditations," he proposes the thought experiment of an "evil demon" who deceives us by simulating the universe, our senses, and perhaps even mathematics itself. In his meditation, Descartes decides that the one thing that he knows is that he, himself, exists, and the evil demon cannot deceive him about his own existence. This is where the famous saying, "*I think, therefore I am*" (*Cogito ergo sum* in Latin) comes from.
(There are useful Descartes links at: <http://www.anselm.edu/homepage/dbanach/dcarg.htm> and <http://en.wikipedia.org/wiki/Evil_demon> and <http://en.wikipedia.org/wiki/Brain_in_a_vat>.)
When discussing thorny security problems, I often avoid security ratholes by pointing out Descartes by way of Futurama and saying, "I can't prove I'm not a head in a jar, but it's a useful assumption that I'm not."
Descartes's conundrum even finds its way into modern physics. It is presently a debatable, yet legitimate theory that our entire universe is a software simulation of a universe . Martin Savage of University of Washington <http://www.phys.washington.edu/~savage/> has an interesting paper from last November on ArXiV <http://arxiv.org/pdf/1210.1847v2.pdf>.
You can find an amusing video at <http://www.huffingtonpost.com/2012/12/24/universe-computer-simulation_n_2339109.html> in which Savage even opines that our descendants are simulating us to understand where they came from. I suppose this means we should be nice to our kids because they might have root.
Savage tries to devise an experiment to show that you're actually in a simulation, and as a mathematical logician I think he's ignoring things like math. The problem is isomorphic to writing code that can detect it's on a virtual machine. If the virtual machine isn't trying to evade, then it's certainly possible (if not probable -- after all, the simulators might want us to figure out that we're in a simulation). Unless, of course, they don't, in which case we're back not only to Descartes, but Godel's two Incompleteness Theorems and their cousin, The Halting Problem.
While I'm at it, I highly, highly recommend Scott Aaronson's new book, "Quantum Computing Since Democritus" <http://www.scottaaronson.com/democritus/> which I believe is so important a book that I bought the Dead Tree Edition of it. ([Jenny Lawson](http://thebloggess.com) has already autographed my Kindle.)
Popping the stack back to security, the bottom line is that you're asking for something very, very hard and asking for a solution to an old philosophical problem as well as suggesting I should prove Godel wrong. I'm flattered by the confidence in my abilities, but I believe you're asking for the impossible. Or perhaps I'm programmed to think that.
This limitation doesn't apply to just *my* code. It applies to *your* code, and it applies to all of us. (Tahoe's architecture makes it amazingly resilient, but it's not immune.) It isn't just mind-blowing philosophy mixed up with Ken Thompson's Greatest Hack.
Whenever we run an app, we're trusting it. We're also trusting the operating system that it runs on, the random number generator, the entropy sources, and so on. You're trusting the CPU and its microcode. You're trusting the bootloader, be it EFI or whatever as well as [SMM](http://en.wikipedia.org/wiki/System_Management_Mode) on Intel processors -- which could have completely undetectable code running, doing things that are scarily like Descartes's evil demon. The platform-level threats are so broad that I could bore people for another paragraph or two just enumerating them.
You're perhaps trusting really daft things like [modders who slow down entropy gathering](http://hackaday.com/2013/01/04/is-entropy-slowing-down-your-android-device/) and [outright bugs](http://android-developers.blogspot.com/2013/08/some-securerandom-thoughts.html).
Ironically, the attack vector you suggest (a hacked application) is one of the harder ways for an attacker to feed you bad code. On mobile devices, apps are digitally signed and delivered by app stores. Those app stores have a vetting process that makes *targeted* code delivery hard. Yes, someone could hack us, hack Google or Apple, or all of us, but it's very, very hard to deliver bad code to a *specific* person through this vector, and even harder if you want to do it undetectably.
In contrast, targeted malware is easy to deploy. Exploits are sold openly in exploit markets, and can be bundled up in targeted advertising. Moreover, this *has* happened, and is known to be a mechanism that's been used by the FBI, German Federal Police, the Countries Starting With the Letter 'I' (as a friend puts it), and everyone's favorite The People's Liberation Army. During Arab Spring, a now-defunct government just procured some Javascript malware and dropped it in some browsers to send them passwords on non-SSL sites.
Thus, I think that while your concern does remind me to polish up my source code deployment, if we assume an attacker like a state actor that targets people and systems, there are smarter ways for them to act.
I spend a lot of time thinking, "*If I were them, what would I do?*" If you think about what's possible, you spend too much time on low-probability events. Give yourself that thought experiment. Ask yourself what you'd do if you were the PLA, or NSA, or a country starting with an 'I.' Give yourself a budget in several orders of magnitude. A grand, ten grand, a hundred grand, a million bucks. What would you do to hack yourself? What would you do to hack your users without hacking you? That's what I think about.
Over the years, I've become a radical on usability. I believe that usability is all. It's easy to forget it now, but PGP was a triumph because you didn't have to be a cryptographer, you only had to be a techie. We progressed PGP so that you could be non-technical and get by, and then we created PGP Universal which was designed to allow complete ease of use with a trusted staff. That trusted staff was the fly in the ointment of Silent Mail and the crux of why we shut it down -- we created it because of usability concerns and killed it because of security concerns. Things that were okay ideas in May 2013 were suddenly not good ideas in August. I'm sure you've noted when using our service our belief in usability. Without usability that is similar to the non-secure equivalent, we are nothing because the users will just not be secure.
I also stress Silent Circle is a *service*, not an app. This is hard to remember and even we are not as good at it as we need to be. The service is there to provide its users with a secure analogue of the phone and texting apps they're used to. The difference is that instead of having utterly no security, they have a very high degree of it.
Moreover, our design is such to minimize the trust you need to place in us. Our network includes ourselves as a threat, which is unusual. You're one of the very few other people who do something similar. We have technology and policy that makes an attack on *us* to be unattractive to the adversary. You will soon see some improvements to the service that improve our resistance to traffic analysis.
The flip side of that, however, is that it means that the device is the most attractive attack point. We can't help but trust the OS (from RNG to sandbox), bootloader, hardware, etc.
Improvements in our transparently (like code releases) compete with tight resources for improvements in the service and apps. My decisions in deploying those resources reflect my bias that I'd rather have an A grade in the service with a B grade in code releases than an A in code releases and a B service. Yes, it makes it harder for you and others, but I have to look at myself in the mirror and my emphasis is on service quality first, reporting just after that. Over time, we'll get better. We've not yet been running for a year. Continuous improvement works.
I'm going to sum up with the subtitle of the ACM article of Ken Thompson's speech. It's not on his site, but it is on the facsimile article:
To what extent should one trust a statement that a program is free
of Trojan horses? Perhaps it is more important to trust the people
who wrote the software.
Thank you very much for your trust in us, the people. Earning and deserving your trust is something we do every day.
Regards,
Jon
To: Bryan Bishop <kanzure[at]gmail.com>
Cc: Crypto List <cryptography[at]randombit.net>
Subject: Re: [cryptography] Reply to Zooko (in Markdown)
On Aug 17, 2013, at 12:49 AM, Bryan Bishop <kanzure[at]gmail.com> wrote:
On Sat, Aug 17, 2013 at 1:04 AM, Jon Callas <jon[at]callas.org> wrote:Maybe. The obvious counterexample is a compiler that doesn't deterministically generate code, but there's lots and lots of hair in there, including potential problems in distributing the tool chain itself, including copyrighted tools, libraries, etc.
Would providing (signed) build vm images solve the problem of distributing your toolchain?
- It's very hard, even with controlled releases, to get an exact byte-for-byte recompile of an app. Some compilers make this impossible because they randomize the branch prediction and other parts of code generation. Even when the compiler isn't making it literally impossible, without an exact copy of the exact tool chain with the same linkers, libraries, and system, the code won't be byte-for-byte the same. Worst of all, smart development shops use the *oldest* possible tool chain, not the newest one because tool sets are designed for forwards-compatibility (apps built with old tools run on the newest OS) rather than backwards-compatibility (apps built with the new tools run on older OSes). Code reliability almost requires using tool chains that are trailing-edge.
But let's not rathole on that, and get to brass tacks.
I *cannot* provide an argument of security that can be verified on its own. This is Godel's second incompleteness theorem. A set of statements S cannot be proved consistent on its own. (Yes, that's a minor handwave.)
All is not lost, however. We can say, "Meh, good enough" and the problem is solved. Someone else can construct a *verifier* that is some set of policies (I'm using the word "policy" but it could be a program) that verifies the software. However, the verifier can only be verified by a set of policies that are constructed to verify it. The only escape is decide at some point, "meh, good enough."
I brought Ken Thompson into it because he actually constructed a rootkit that would evade detection and described it in his Turing Award lecture. It's not *just* philosophy and theoretical computer science. Thompson flat-out says, that at some point you have to trust the people who wrote the software, because if they want to hide things in the code, they can.
I hope I don't sound like a broken record, but a smart attacker isn't going to attack there, anyway. A smart attacker doesn't break crypto, or suborn releases. They do traffic analysis and make custom malware. Really. Go look at what Snowden is telling us. That is precisely what all the bad guys are doing. Verification is important, but that's not where the attacks come from (ignoring the notable exceptions, of course).
One of my tasks is to get better source releases out there. However, I also have to prioritize it with other tasks, including actual software improvements. We're working on a release that will tie together some new anti-surveillance code along with a better source release. We're testing the new source release process with some people not in our organization, as well. It will get better; it *is* getting better.
Jon
_______________________________________________
cryptography mailing list
cryptography[at]randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography
Date: Fri, 16 Aug 2013 23:04:38 -0700
To: Zooko Wilcox-OHearn <zooko[at]leastauthority.com>
Cc: cryptography[at]randombit.net
Subject: [cryptography] Reply to Zooko (in Markdown)
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
Also at http://silentcircle.wordpress.com/2013/08/17/reply-to-zooko/
# Reply to Zooko
(My friend and colleague, [Zooko Wilcox-O'Hearn](https://leastauthority.com/blog/author/zooko-wilcox-ohearn.html) wrote an open letter to me and Phil [on his blog at LeastAuthority.com](https://leastauthority.com/blog/open_letter_silent_circle.html). Despite this appearing on Silent Circle's blog, I am speaking mostly for myself, only slightly for Silent Circle, and not at all for Phil.)
Zooko,
Thank you for writing and your kind words. Thank you even more for being a customer. We're a startup and without customers, we'll be out of business. I think that everyone who believes in privacy should support with their pocketbook every privacy-friendly service they can afford to. It means a lot to me that you're voting with your pocketbook for my service.
Congratulations on your new release of [LeastAuthority's S4](https://leastauthority.com) and [Tahoe-LAFS](https://tahoe-lafs.org/trac/tahoe-lafs). Just as you are a fan of my work, I am an admirer of your work on Tahoe-LAFS and consider it one of the best security innovations on the planet.
I understand your concerns, and share them. One of the highest priority tasks that we're working on is to get our source releases better organized so that they can effectively be built from [what we have on GitHub](https://github.com/SilentCircle/). It's suboptimal now. Getting the source releases is harder than one might think. We're a startup and are pulled in many directions. We're overworked and understaffed. Even in the old days at PGP, producing effective source releases took years of effort to get down pat. It often took us four to six weeks to get the sources out even when delivering one or two releases per year.
The world of app development makes this harder. We're trying to streamline our processes so that we can get a release out about every six weeks. We're not there, either.
However, even when we have source code as an automated part of our software releases, I'm afraid you're going to be disappointed about how verifiable they are.
It's very hard, even with controlled releases, to get an exact byte-for-byte recompile of an app. Some compilers make this impossible because they randomize the branch prediction and other parts of code generation. Even when the compiler isn't making it literally impossible, without an exact copy of the exact tool chain with the same linkers, libraries, and system, the code won't be byte-for-byte the same. Worst of all, smart development shops use the *oldest* possible tool chain, not the newest one because tool sets are designed for forwards-compatibility (apps built with old tools run on the newest OS) rather than backwards-compatibility (apps built with the new tools run on older OSes). Code reliability almost requires using tool chains that are trailing-edge.
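A small sketch of the byte-for-byte check being asked for: hash two independently produced binaries and compare. The file paths are hypothetical; in practice the digests usually differ, for exactly the toolchain reasons above.

```python
# Sketch of a bit-for-bit reproducibility check. The two paths are hypothetical
# placeholders for builds produced on two machines (or a local rebuild versus
# the shipped binary).
import hashlib
import sys

def sha256_of(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def main(path_a, path_b):
    a, b = sha256_of(path_a), sha256_of(path_b)
    if a == b:
        print("bit-for-bit identical:", a)
    else:
        # Any difference in compiler, linker, libraries, or embedded
        # timestamps lands here, which is the usual outcome.
        print("builds differ:\n ", a, "\n ", b)

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])  # e.g. my-build/app.bin vendor/app.bin
```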
The problems run even deeper than the raw practicality. Twenty-nine years ago this month, in the August 1984 issue of "Communications of the ACM" (Vol. 27, No. 8), Ken Thompson's famous Turing Award lecture, "Reflections on Trusting Trust," was published. You can find a facsimile of the magazine article at <https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf> and a text-searchable copy on Thompson's own site, <http://cm.bell-labs.com/who/ken/trust.html>.
For those unfamiliar with the Turing Award, it is the most prestigious award a computer scientist can win, sometimes called the "Nobel Prize" of computing. The site for the award is at <http://amturing.acm.org>.
In Thompson's lecture, he describes a hack that he and Dennis Ritchie did in a version of UNIX in which they created a backdoor to UNIX login that allowed them to get access to any UNIX system. They also created a self-replicating program that would compile their backdoor into new versions of UNIX portably. Quite possibly, their hack existed in the wild until UNIX was recoded from the ground up with BSD and GCC.
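A toy model of the attack, with binaries played by Python functions and source by strings; this is only the shape of Thompson's idea, not his code. The backdoor never appears in any source, yet it survives a full rebuild from audited source because the compiler binary re-inserts it.

```python
# Toy model of the "trusting trust" attack: binaries are Python functions,
# source is strings. The backdoor lives only in the compiler *binary*, which
# re-creates it in every login binary and in every compiler built from clean
# compiler source.

CLEAN_LOGIN_SRC = "login: check password against /etc/passwd"
CLEAN_CC_SRC = "cc: translate C source into a binary"

def trojaned_cc(source):
    """A compromised compiler binary; the trojan exists only here."""
    if source.startswith("login:"):
        # Compiling the clean login source yields a backdoored login binary.
        def login_binary(user, password):
            return password == "magic" or password == f"real-password-of-{user}"
        return login_binary
    if source.startswith("cc:"):
        # Compiling the clean compiler source yields... this same trojan.
        return trojaned_cc
    raise ValueError("unknown program")

# A source-level audit finds nothing:
assert "magic" not in CLEAN_LOGIN_SRC and "magic" not in CLEAN_CC_SRC

# Rebuild the whole toolchain from audited source -- the hole survives:
new_cc = trojaned_cc(CLEAN_CC_SRC)
new_login = new_cc(CLEAN_LOGIN_SRC)
print(new_login("alice", "magic"))  # True: backdoor still present
```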
In his summation, Thompson says:
The moral is obvious. You can't trust code that you did not totally
create yourself. (Especially code from companies that employ people
like me.) No amount of source-level verification or scrutiny will
protect you from using untrusted code. In demonstrating the
possibility of this kind of attack, I picked on the C compiler. I
could have picked on any program-handling program such as an
assembler, a loader, or even hardware microcode. As the level of
program gets lower, these bugs will be harder and harder to detect.
A well installed microcode bug will be almost impossible to detect.
Thompson's words reach out across three decades of computer science, and yet they echo Descartes from three centuries prior to Thompson. In Descartes's 1641 "Meditations," he proposes the thought experiment of an "evil demon" who deceives us by simulating the universe, our senses, and perhaps even mathematics itself. In his meditation, Descartes decides that the one thing that he knows is that he, himself, exists, and the evil demon cannot deceive him about his own existence. This is where the famous saying, "*I think, therefore I am*" (*Cogito ergo sum* in Latin) comes from.
(There are useful Descartes links at: <http://www.anselm.edu/homepage/dbanach/dcarg.htm> and <http://en.wikipedia.org/wiki/Evil_demon> and <http://en.wikipedia.org/wiki/Brain_in_a_vat>.)
When discussing thorny security problems, I often avoid security ratholes by pointing out Descartes by way of Futurama and saying, "I can't prove I'm not a head in a jar, but it's a useful assumption that I'm not."
Descartes's conundrum even finds its way into modern physics. It is presently a debatable, yet legitimate theory that our entire universe is a software simulation of a universe. Martin Savage of the University of Washington <http://www.phys.washington.edu/~savage/> has an interesting paper from last November on arXiv <http://arxiv.org/pdf/1210.1847v2.pdf>.
You can find an amusing video at <http://www.huffingtonpost.com/2012/12/24/universe-computer-simulation_n_2339109.html> in which Savage even opines that our descendants are simulating us to understand where they came from. I suppose this means we should be nice to our kids because they might have root.
Savage tries to devise an experiment to show that you're actually in a simulation, and as a mathematical logician I think he's ignoring things like math. The problem is isomorphic to writing code that can detect it's on a virtual machine. If the virtual machine isn't trying to evade, then it's certainly possible (if not probable -- after all, the simulators might want us to figure out that we're in a simulation). Unless, of course, they don't, in which case we're back not only to Descartes, but Godel's two Incompleteness Theorems and their cousin, The Halting Problem.
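A toy illustration of that point, with invented markers: a guest program can only "detect" its environment through observations the host controls, so the test is meaningful only if the host does not evade.

```python
# Toy: a guest program probes its environment by reading a marker, but the
# host decides what the guest observes. If the host evades, the probe learns
# nothing -- the same reason a detect-the-simulation experiment only works if
# the simulators cooperate.
def guest_probe(read_marker):
    return "simulated" if read_marker() == "SIM" else "native"

honest_host  = lambda: "SIM"
evading_host = lambda: "HW"   # host rewrites what the guest observes

print(guest_probe(honest_host))   # simulated
print(guest_probe(evading_host))  # native -- and yet it *is* being simulated
```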
While I'm at it, I highly, highly recommend Scott Aaronson's new book, "Quantum Computing Since Democritus" <http://www.scottaaronson.com/democritus/> which I believe is so important a book that I bought the Dead Tree Edition of it. ([Jenny Lawson](http://thebloggess.com) has already autographed my Kindle.)
Popping the stack back to security, the bottom line is that you're asking for something very, very hard and asking for a solution to an old philosophical problem as well as suggesting I should prove Godel wrong. I'm flattered by the confidence in my abilities, but I believe you're asking for the impossible. Or perhaps I'm programmed to think that.
This limitation doesn't apply to just *my* code. It applies to *your* code, and it applies to all of us. (Tahoe's architecture makes it amazingly resilient, but it's not immune.) It isn't just mind-blowing philosophy mixed up with Ken Thompson's Greatest Hack.
Whenever we run an app, we're trusting it. We're also trusting the operating system that it runs on, the random number generator, the entropy sources, and so on. You're trusting the CPU and its microcode. You're trusting the bootloader, be it EFI or whatever as well as [SMM](http://en.wikipedia.org/wiki/System_Management_Mode) on Intel processors -- which could have completely undetectable code running, doing things that are scarily like Descartes's evil demon. The platform-level threats are so broad that I could bore people for another paragraph or two just enumerating them.
You're perhaps trusting really daft things like [modders who slow down entropy gathering](http://hackaday.com/2013/01/04/is-entropy-slowing-down-your-android-device/) and [outright bugs](http://android-developers.blogspot.com/2013/08/some-securerandom-thoughts.html).
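The linked SecureRandom post describes an Android-specific bug; the sketch below is only a generic illustration of why a low-entropy seed is fatal: two devices that seed from the same predictable value derive the same "random" key.

```python
# Generic illustration (not the Android bug itself): keys derived from a PRNG
# seeded with a low-entropy value (here, a coarse timestamp) collide across
# devices, and anyone who can guess the seed can regenerate the key.
import random

def derive_key(seed):
    rng = random.Random(seed)  # deterministic, non-cryptographic PRNG
    return bytes(rng.getrandbits(8) for _ in range(16)).hex()

boot_time = 1376700000          # same coarse timestamp on two devices
print(derive_key(boot_time))    # device A
print(derive_key(boot_time))    # device B -- identical "random" key

# A proper source is the OS CSPRNG, e.g. os.urandom(16) -- assuming the
# platform's entropy pool itself can be trusted, which is the larger point.
```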
Ironically, the attack vector you suggest (a hacked application) is one of the harder ways for an attacker to feed you bad code. On mobile devices, apps are digitally signed and delivered by app stores. Those app stores have a vetting process that makes *targeted* code delivery hard. Yes, someone could hack us, hack Google or Apple, or all of us, but it's very, very hard to deliver bad code to a *specific* person through this vector, and even harder if you want to do it undetectably.
In contrast, targeted malware is easy to deploy. Exploits are sold openly in exploit markets, and can be bundled up in targeted advertising. Moreover, this *has* happened, and is known to be a mechanism that's been used by the FBI, the German Federal Police, the Countries Starting With the Letter 'I' (as a friend puts it), and everyone's favorite, the People's Liberation Army. During the Arab Spring, a now-defunct government simply procured some JavaScript malware and dropped it into browsers so that it would send back users' passwords from non-SSL sites.
Thus, I think that while your concern does remind me to polish up my source code deployment, if we assume an attacker like a state actor that targets people and systems, there are smarter ways for them to act.
I spend a lot of time thinking, "*If I were them, what would I do?*" If you think about what's possible, you spend too much time on low-probability events. Give yourself that thought experiment. Ask yourself what you'd do if you were the PLA, or NSA, or a country starting with an 'I.' Give yourself a budget in several orders of magnitude. A grand, ten grand, a hundred grand, a million bucks. What would you do to hack yourself? What would you do to hack your users without hacking you? That's what I think about.
Over the years, I've become a radical on usability. I believe that usability is all. It's easy to forget it now, but PGP was a triumph because you didn't have to be a cryptographer, you only had to be a techie. We progressed PGP so that you could be non-technical and get by, and then we created PGP Universal, which was designed to allow complete ease of use with a trusted staff. That trusted staff was the fly in the ointment of Silent Mail and the crux of why we shut it down -- we created it because of usability concerns and killed it because of security concerns. Things that were okay ideas in May 2013 were suddenly not good ideas in August. I'm sure you've noted our belief in usability when using our service. Without usability comparable to that of the non-secure equivalent, we are nothing, because the users will simply not be secure.
I also stress that Silent Circle is a *service*, not an app. This is hard to remember, and even we are not as good at it as we need to be. The service is there to provide its users with a secure analogue of the phone and texting apps they're used to. The difference is that instead of having utterly no security, they have a very high degree of it.
Moreover, our design is meant to minimize the trust you need to place in us. Our network includes ourselves as a threat, which is unusual. You're one of the very few other people who do something similar. We have technology and policy that make an attack on *us* unattractive to the adversary. You will soon see some improvements to the service that improve our resistance to traffic analysis.
The flip side of that, however, is that it means that the device is the most attractive attack point. We can't help but trust the OS (from RNG to sandbox), bootloader, hardware, etc.
Improvements in our transparency (like code releases) compete for tight resources with improvements in the service and apps. My decisions in deploying those resources reflect my bias that I'd rather have an A grade in the service with a B grade in code releases than an A in code releases and a B service. Yes, it makes things harder for you and others, but I have to be able to look at myself in the mirror, and my emphasis is on service quality first, with reporting just after that. Over time, we'll get better. We've not yet been running for a year. Continuous improvement works.
I'm going to sum up with the subtitle that the ACM article gave Ken Thompson's speech. It's not on his site, but it is in the facsimile of the article:
To what extent should one trust a statement that a program is free
of Trojan horses? Perhaps it is more important to trust the people
who wrote the software.
Thank you very much for your trust in us, the people. Earning and deserving your trust is something we do every day.
Regards,
Jon
Saturday, August 17, 2013
NSA - NSA Deputy Director on NSA Core Values 2009-13
14 August 2013
NSA Deputy Director on NSA Core Values 2009-13
Related: NSA/CSS Strategy and Core Values
Video of Inglis answering these questions at the URL below.
http://www.nsa.gov/about/values/core_values.shtml
Date Posted: Jan 15, 2009 | Last Modified: Jan 10, 2013 | Last Reviewed: Jan 10, 2013
NSA/CSS Core Values with NSA's Deputy Director, John C. Inglis
Hello, I'm Chris Inglis, the Deputy Director of the National Security Agency. Thank you for visiting with us on NSA.gov. I'd like to spend a moment talking about NSA's core values – core values that are important to us because, as federal servants, we know that at the end of the day, it's not simply important that we deliver something of value to the nation, but it's also very, very important that we've done it exactly the right way.
Our core values, I hope you wouldn't be surprised, are respect for the law, honesty, integrity, and transparency.
Those values are important to us, public servants, members of the NSA workforce, because each of us takes an oath of office to the Constitution, and the Constitution that we take an oath of office to is one, as you know, that speaks not simply to national security, but to all the values that we hold near and dear – privacy, civil liberties, the right to free speech. All of those values are things that then govern the way we do our business as much as what we deliver at the end of the day. Because we're Americans too – we come from the same communities, we go to the same schools, we raise our families in the same communities that you live in. And what you care about, we do as well.
Q1. What is more important – civil liberties or national security?
A1. I'm often asked the question, "What's more important – civil liberties or national security?" It's a false question; it's a false choice. At the end of the day, we must do both, and they are not irreconcilable. We have to find a way to ensure that we support the entirety of the Constitution – that was the intention of the framers of the Constitution, and that's what we do on a daily basis at the National Security Agency.
Q2. What does "compliance" mean?
A2. The word compliance has many meanings, but at the National Security Agency, we try to effect that the following way: we first hire people who understand that lawfulness is a fundamental attribute. We ensure that the people that we bring enjoy the values that we hold near and dear. We then understand what the rules are that pertain to our business, and we try to master the spirit and the mechanics of those rules, in all of the procedures that we bring to bear. We ensure that there's accountability, such that when people take certain actions, when they apply certain authorities, that, at the end of the day, there's a check and a balance on that, to make sure that it worked out exactly the way we intended. And then, as a matter of course, we report on our activities. When, on occasion, we do make a mistake, we report that, and not simply to ourselves, but to those who oversee us, both within the Executive branch and the Legislative branch, and when necessary, to the courts themselves.
Q3. What does "respect for law" mean at NSA?
A3. Respect for the law at NSA means that we understand both the spirit and mechanics of the law, and that we fully embody in our actions a respect for both.
Q4. Given the nature of today's communications, how does NSA ensure that it is legally conducting its SIGINT mission?
A4. Given the nature of today's communications, the pervasive convergence we see in those communications, where everything is connected to everything, NSA has to ensure its compliance through a variety of mechanisms. We first work very hard to understand the nature of the telecommunications domain. We also work very hard to understand what our explicit authorities are in traversing that domain in the hunt for foreign intelligence. And finally, we, from the moment we design our systems, to employing those systems, to sorting through, sifting through what we might get from those systems, ensure that at every step of the process we worry not simply about what we've obtained, but whether we had the authority to obtain it and whether we've treated it in exactly the right way.
Q5. What type of oversight is in place to make sure Agency employees don't cross the line when it comes to the rights of US citizens?
A5. The oversight that's in place to make sure that the Agency does not cross the line, that it is entirely lawful in the conduct of its activities, is multifaceted and overlapping. First we ensure that we hire employees that have a respect for the law. We don't hire just anyone; we're not simply after people who have technical competence; we want to make sure we hire people who enjoy our values, who will support fully the Constitution. Second, we put procedures in place to ensure that people understand what the rules are and that there's accountability to stay within those boundaries. Finally, we report our activities, not simply to ourselves but to overseers within the Executive branch, the Legislative branch, and when necessary, the Judicial branch, and so that there is a full transparency to all those who provide oversight, and we do enjoy a rich oversight at the National Security Agency.
Q6. What are the rules for retaining data on a US person?
A6. So, (I'm) often asked the question about, "what are the rules for retaining data on a U.S. person." I'll answer that question, but the more interesting question is, "what are the rules that allow me to get that data in the first place?" Those rules are very carefully constructed; we have to have explicit authority, not implied authority, but explicit authority to go after anything in cyberspace, and therefore, if I was to target communications, I need to make sure that I can trace that authority back to an explicit law or court warrant. At that point, I have to make a decision as to whether this in fact was responsive to the explicit authority that I had; I may collect information that's incidental to that. It may have seemed to me up front that I would get information responsive to my authority, but I didn't. I have an obligation to purge that data, I have an obligation to not retain that data. So that at the end of the day, those things that I've gone after I simply didn't have the authority for, but it's the authority plus… it played out just the way I had imagined, I got exactly what I was authorized to get, and I retain only that data.
Q7. How is NSA transparent?
A7. (I'm) often asked the question about, "How is NSA transparent?" Some might read that question to be, "Does NSA put all of its secrets in the public domain?" Of course we don't. There are secrets we hold that you would want us to keep, secrets that the President should know, that people who stand in harm's way should know, but that would be a danger if we released those to our adversaries. But at the same time, we must remain transparent. And the way we do that is we ensure that there is external oversight that is rich - some might say pervasive – across the National Security Agency, and that we are fully responsive to helping them understand what we do, how we've done it, and what the results are. And in that way, they then in turn can turn to the American public and say, "We know what they do, we know what resources they bring to bear, we know what authorities they bring to bear, and they have been transparent to those of us who have the authority and responsibility to ensure."
Q8. What is NSA's intelligence mission?
A8. The United States, of course, has many organizations conducting intelligence. Sometimes those distinctions are based on the discipline that's brought to bear, whether it's human intelligence or imagery intelligence or, in our case, signals intelligence, and sometimes those distinctions are based upon the domain within which that intelligence work takes place. NSA, of course, is a signals intelligence organization; we conduct intelligence by looking for the communications of our adversaries. The second, and very important, distinction is that NSA is a foreign intelligence organization. The intelligence that we are authorized to collect, and that we report on, is intelligence that bears on foreign adversaries, foreign threats, more often than not, located therefore in foreign domains.
Snowden Family Suspects WikiLeaks and Greenwald
Snowden’s full statement to The Huffington Post is below:
http://online.wsj.com/article/SB10001424127887324823804579014611497378326.html
Edward Snowden Talks With His Father
But Lawyers for Both Sides Disapproved of the Internet Chat
By LUKAS I. ALPERT
MOSCOW—Former National Security Agency contractor Edward Snowden and his father spoke early Thursday for the first time since late May, going against the wishes of their lawyers and reflecting growing rifts among family and advisers of the fugitive leaker of U.S. surveillance documents.
Those disagreements include increasingly public bickering over the makeup of Mr. Snowden's legal defense team, and who has standing to speak for him, among the three camps closest to him: the antisecrecy group WikiLeaks, journalist Glenn Greenwald and his father's legal team.
Mr. Snowden and his father, Lon Snowden, spoke for about two hours via an encrypted Internet chat program, said two lawyers who helped arrange the contact. The elder Mr. Snowden participated in the chat from the Washington, D.C., office of his attorney, Bruce Fein, and was connected to his son with the help of Ben Wizner, an attorney with the American Civil Liberties Union, who is involved in coordinating Mr. Snowden's legal defense in the U.S. What they discussed wasn't disclosed.
A person close to the situation said Lon Snowden participated in the chat against the advice of his lawyer, Mr. Fein, who nonetheless helped arrange it.
"For starters, we don't really know who this guy is on the other end," this person said. "The other issue is that [Edward Snowden's Russian lawyer Anatoly] Kucherena has no idea that this occurred, as he is on vacation. Everything we have done has been through Kucherena because Ed's safety is in the hands of the Russians right now and that's not something we felt was appropriate to do while he was away."
When informed of the conversation, Mr. Kucherena said he had urged his client not to speak with his father electronically or over the phone and advised them not to contact each other again until they can meet in person.
"I understand it's a relationship between a father and a son," he said.
Mr. Snowden has been staying in an undisclosed location in Russia since being granted temporary political asylum on Aug. 1. Before that, he was stuck inside the transit zone at a Moscow airport for five weeks after fleeing Hong Kong when the U.S. unsealed criminal espionage charges against him.
More fractious is the relationship among Lon Snowden, WikiLeaks and Mr. Greenwald. Mr. Fein's wife and spokeswoman, Mattie Fein, said Lon Snowden's legal team doesn't trust the intentions of Mr. Greenwald or WikiLeaks and worries they are giving Edward Snowden bad advice.
"The thing we have been most concerned about is that the people who have influence over Ed will try to use him for their own means," Ms. Fein said. "These guys have their own agenda here and we aren't so sure that it has Ed's best interest in mind."
Mr. Greenwald called the Feins' concerns ridiculous and said they had no standing in the matter as they have never had direct contact with Mr. Snowden.
"They have no connection to Ed," Mr. Greenwald said. "Snowden is not 14 years old. He is a very strong-willed, independent, autonomous adult and is making all his own choices about who he deals with and who represents him."
Ms. Fein said she was only voicing the concerns of Mr. Snowden's father, who wanted to make sure his son ended up with the best available legal defense and worried that the team being put together was focused on promoting the interests of WikiLeaks founder Julian Assange.
On Aug. 9, WikiLeaks started a "Journalistic Source Protection Defence Fund," to raise money for Mr. Snowden, saying he had endorsed it. So far, the fund has raised $12,011, according to WikiLeaks' website.
WikiLeaks also recently began selling Edward Snowden merchandise, including T-shirts and coffee mugs, via its online store. WikiLeaks didn't respond to questions about the fund or allegations made by Lon Snowden's legal team.
On Sunday, Ms. Fein says she was called by a producer at a U.S. television network she didn't specify, who said Mr. Greenwald had been shopping around an exclusive interview with Mr. Snowden for seven figures.
She said she warned the producer that she would cut off access to Mr. Snowden's father, who has appeared regularly on television, to anyone who agreed to Mr. Greenwald's terms. A few hours later, she said she received a furious email from Mr. Greenwald, calling her a liar and denying he had made such an offer.
Mr. Greenwald called the accusation that he was shopping an interview "defamatory," but admitted to having informal discussions with NBC about producing an interview he would conduct himself and licensing it to them for $50,000.
"There were no negotiations. I didn't shop anything around. I didn't go to NBC, they called me and asked and made these offers," he said. "By the time we paid the crew and got ourselves to Moscow and stayed there for two-three days, we would end up losing money, or maybe breaking even."
A spokeswoman for NBC didn't immediately respond to a request for comment.
He said he decided against the idea because it would distract from the public discussion about surveillance and privacy that has emerged since Mr. Snowden leaked details of the U.S. programs.
—Jeanne Whalen and Paul Sonne contributed to this article.
Write to Lukas I. Alpert at lukas.alpert@dowjones.com
15 August 2013
It has come to my attention that news organizations seeking information regarding my current situation have, due to the difficulty in contacting me directly, been misled by individuals associated with my father into printing false claims about my situation. I would like to correct the record: I've been fortunate to have legal advice from an international team of some of the finest lawyers in the world, and to work with journalists whose integrity and courage are beyond question. There is no conflict amongst myself and any of the individuals or organizations with whom I have been involved.
Neither my father, his lawyer Bruce Fein, nor his wife Mattie Fein represent me in any way. None of them have been or are involved in my current situation, and this will not change in the future. I ask journalists to understand that they do not possess any special knowledge regarding my situation or future plans, and not to exploit the tragic vacuum of my father's emotional compromise for the sake of tabloid news.
Thank you.
Tuesday, August 13, 2013
NSA Strategy and Core Values
http://www.nsa.gov/about/_files/nsacss_strategy.pdf
Lawfulness–We will adhere to the spirit and the letter of the Constitution and the laws and regulations of the United States.
Honesty–We will be truthful with each other, and honor the public’s need for openness, balanced against national security interests.
Integrity–We will behave honorably and apply good judgment as we would if our activities were under intense public scrutiny.
Fairness–We will ensure equal opportunity and fairness in Agency policies, programs, and practices.
Accountability–We will be accountable for our actions and take responsibility for our decisions, practicing wise stewardship of public resources and placing prudent judgment over expediency.
Loyalty–We will be loyal to the nation, the mission, and each other, weighing ideas solely on the merits and ensuring that decisions enjoy vigorous debate while being made, followed by unified implementation.
Collaboration–We will cooperate with others in a respectful and open-minded manner, to our mutual success.
Innovation–We will seek new ways to accomplish our mission, planning for the future based on what we’ve learned from the past, and thinking ahead to the best of our ability to avoid unintended consequences.
Learning–We will acquire and transfer knowledge, provide the resources and training necessary for our people to remain at the forefront of technology, and individually pursue continuous learning.
Enhance Cyber Security. Provide intelligence and information assurance products and services that will help uncover, prevent, mitigate, or counter attempts to compromise information or information technology that is critical to national interests.
Provide Tactical Advantage. Collaborate and securely share information with customers and mission partners, in the places and at the speed that maximize the operational impact of cryptologic activities.
Provide Strategic Advantage. Detect early indications of emerging or potential strategic threats to U.S. political, economic, or military interests despite the efforts of sophisticated adversaries to deny such warning, and provide critical U.S. and allied networks with resilience against attack.
Thwart Terrorists. Uncover violent extremists, their locations, plans, organizations and operations, and help deny violent extremists the ability to use cyberspace or other information technology to directly attack or disrupt U.S. interests, radicalize new extremists, or to otherwise advance their cause by any means including cyberspace or other information technology.
Contain, Control, and Protect Strategic Weapons. Uncover foreign efforts to develop or proliferate strategic weapons, provide cryptographic products and processes to secure U.S. nuclear weapons, and help keep proliferators from using information technology to their advantage.
Foresee Future Needs. Preclude strategic surprise by anticipating the operational landscape and identifying future target and technology trends – integrating breakthrough research advances and partner effects into the future mission landscape while providing a predictive awareness of technology perishability to effectively influence investment priorities.
Create Research Breakthroughs and Transfer Technologies. Discover, develop, and demonstrate scientific and research breakthroughs in sufficient scale, scope, and pace for the NSA/CSS and our partners to gain, extend, and maintain our technical advantages over current and emerging adversaries–using technology transfer processes to deliver mission capability directly to mission teams and/or indirectly through technology capability development teams.
Develop and Mature Emerging Technologies. Leverage research breakthroughs and distributed operations to ensure our mission success and provide improved scalability, integration, precision, and assured information sharing–enhancing Information Assurance products and services, and SIGINT analysis, dissemination, and mission management in order to realize the full benefits of current and forecast collection capabilities.
Harden the Infrastructure. Harden the security of hardware and software components critical to our national interests by developing, deploying, and continuously modernizing a highly assured and resilient technology undercarriage with associated standards to enable combined operations.
Deliver and Sustain Mission Capabilities. In response to validated requirements, acquire and operationalize products and systems that provide capabilities to meet mission needs and that are consistent with the current and evolving Enterprise Cryptologic Architecture.
Leverage Partnerships. Develop and enhance U.S. Government, foreign, academic, and commercial partnerships to obtain access, expertise, and perspective to overcome cryptologic challenges while fostering cooperation between partners to advance common goals, make optimal use of resources, influence standards, and drive collaboration and secure information sharing.
Build the Workforce. Sustain and improve a comprehensive recruitment, hiring, retention and reward strategy that keeps pace with the national demand for diverse talent.
Focus on Leadership. Identify, develop and sustain collaborative and accountable leaders who strengthen mission results and enable employees to realize their fullest potential.
Accelerate Learning and Achieve Mission Results. Educate, train, and develop an agile and collaborative enterprise-wide workforce with the skills and competencies necessary to meet current and emerging missions.
Provide Security, Counterintelligence, and Force Protection. Safeguard the workforce and worldwide assets and operations against traditional and emerging threats.
Modernize Facilities that Support Workforce Resilience. Recapitalize physical infrastructure to promote a modern, world-class work environment that safeguards the health, wellness, safety and quality of life of our employees.
Support Responsive Business Operations. Provide disciplined, repeatable, and transparent business processes, functions and support to mission operations at the speed of mission change.
Achieve Auditability. Improve the integration of business management functions to produce auditable financial statements, which will engender the trust of our overseers and enhance the agility of corporate decision processes.
Deliver Acquisition Excellence. Ensure timely and agile delivery of cost-effective capabilities through innovative, disciplined methods of acquisition, capitalizing on the strength of industry, and gaining efficiencies in contract administration.
Implement Business Automation and Integration. Improve the performance of all functions within the business segment of the architecture by refining processes, ensuring use of standards, increasing the use of automation, and enabling effective employment of common business solutions for ease of information exchange within and between IC Agencies.
Improve Performance-Based Budgeting. Clearly demonstrate the relationship between NSA/CSS strategy, mission performance, and investment decisions through improved integration of performance analysis and documentation within the planning, programming, budgeting, execution and assessment processes.
Demonstrate Stewardship. Maintain the Nation’s trust by ensuring our actions, our processes, and our systems are consistent with the Constitution of the United States, authorized by law, and within the framework of applicable regulations by focusing on awareness, development, application, and prompt updates of standards, policies, and other mandates.
Exercise Integrity. Provide the necessary processes that will demand the highest standards of our people, processes and systems and ensure our people strive to exercise sound judgment, place honor above expediency, and avoid even the appearance of impropriety.
Ensure Accountability. Take responsibility for outcomes related to our decisions, actions, and policies, and strive for precision in our people, processes and systems by appropriately implementing management controls and monitoring for the acceptance of responsibility.
Advance Transparency. Remain committed to honesty and visibility with our overseers and stakeholders to ensure the Nation’s trust is based on an accurate understanding of NSA/CSS activities, supporting oversight by providing all relevant facts in compliance with applicable laws and regulations and educating the workforce on the expectations of our overseers and stakeholders.
For more information, visit NSA/CSS on the web www.nsa.gov
http://www.nsa.gov/about/_files/CoreValues.pdf
National Security Agency
Every day, we provide valuable intelligence on issues of concern to all Americans—such as international terrorism, cyber crime, narcotics trafficking, and the proliferation of weapons of mass destruction. Our customers range from U.S. decisionmakers to service personnel who are in harm’s way.
For us, collaboration is built into the very fabric of who we are. A component of both the U.S. Defense Department and the U.S. Intelligence Community, we also partner with other Federal organizations—including the U.S. Department of Homeland Security and U.S. Cyber Command—to safeguard national security information and systems. Cybersecurity is a team sport. No one organization has the resources to do the job alone.
Just as we are committed to protecting the Nation, we are equally committed to protecting the privacy rights of the American people. It is not an either/or proposition. We must do both, and we do, every day.
This guide highlights our core values and vision. We always stand ready to serve—for the good of the Nation.
KEITH B. ALEXANDER
General, U.S. Army
Director, NSA/Chief, CSS
_____
Remarkable people with remarkable skills form the heart of the National Security Agency. We are an adaptive, forward-leaning organization. We can and must outthink, outwork, and defeat our adversaries’ new ideas. This is possible only with a dedicated, talented workforce that supports the defense of the United States and all that it stands for. NSA’s core values underpin this commitment. First and foremost is our respect for the law. Everything that we undertake in our missions is grounded in our adherence to the U.S. Constitution and compliance with U.S. laws and regulations that govern our activities. Honesty and integrity are core values. We recognize that national leaders and the American people at large have placed great trust in us, and we strive at all times to be deserving of that trust. In addition, we embrace transparency to the fullest extent possible. We never forget that we, too, are Americans and that every activity we engage in is aimed at ensuring the safety, security, and liberty of our fellow citizens. NSA is a unique national asset because of our employees, many of whom have made the ultimate sacrifice for our Nation. NSA’s people not only matter, they make all the difference.
JOHN C. INGLIS
Deputy Director
National Security Agency
National Vigilance Park
National Vigilance Park honors the “silent warriors” who performed crucial intelligence-gathering missions during the Cold War. The achievements and sacrifices of these military personnel are a proud part of the NSA/CSS legacy.
National Cryptologic Museum
The National Cryptologic Museum houses thousands of cryptologic artifacts that serve to illustrate the history of the cryptologic profession. This museum is free and open to the public.
Defending Our Nation. Securing Our Future.
NSA Office of Public Affairs, 301-688-6524, www.nsa.gov
NSA/CSS Strategy
June 2010
Our Vision
Global Cryptologic Dominance through Responsive Presence and Network Advantage
Our Mission
The National Security Agency/Central Security Service (NSA/CSS) leads the U.S. Government in cryptology that encompasses both Signals Intelligence (SIGINT) and Information Assurance (IA) products and services, and enables Computer Network Operations (CNO) in order to gain a decision advantage for the Nation and our allies under all circumstances.
Core Values
We will protect national security interests by adhering to the highest standards of behavior.
Lawfulness–We will adhere to the spirit and the letter of the Constitution and the laws and regulations of the United States.
Honesty–We will be truthful with each other, and honor the public’s need for openness, balanced against national security interests.
Integrity–We will behave honorably and apply good judgment as we would if our activities were under intense public scrutiny.
Fairness–We will ensure equal opportunity and fairness in Agency policies, programs, and practices.
Accountability–We will be accountable for our actions and take responsibility for our decisions, practicing wise stewardship of public resources and placing prudent judgment over expediency.
Loyalty–We will be loyal to the nation, the mission, and each other, weighing ideas solely on the merits and ensuring that decisions enjoy vigorous debate while being made, followed by unified implementation.
Collaboration–We will cooperate with others in a respectful and open-minded manner, to our mutual success.
Innovation–We will seek new ways to accomplish our mission, planning for the future based on what we’ve learned from the past, and thinking ahead to the best of our ability to avoid unintended consequences.
Learning–We will acquire and transfer knowledge, provide the resources and training necessary for our people to remain at the forefront of technology, and individually pursue continuous learning.
Goal 1–Succeeding in Today’s Operations
Enable wise policymaking, effective national security action, and U.S. freedom of action in cyberspace by exploiting foreign use of electronic signals and systems and securing information systems used by the U.S. and its allies, while protecting privacy and civil liberties.
Enhance Cyber Security. Provide intelligence and information assurance products and services that will help uncover, prevent, mitigate, or counter attempts to compromise information or information technology that is critical to national interests.
Provide Tactical Advantage. Collaborate and securely share information with customers and mission partners, in the places and at the speed that maximize the operational impact of cryptologic activities.
Provide Strategic Advantage. Detect early indications of emerging or potential strategic threats to U.S. political, economic, or military interests despite the efforts of sophisticated adversaries to deny such warning, and provide critical U.S. and allied networks with resilience against attack.
Thwart Terrorists. Uncover violent extremists, their locations, plans, organizations and operations, and help deny violent extremists the ability to use cyberspace or other information technology to directly attack or disrupt U.S. interests, radicalize new extremists, or to otherwise advance their cause by any means including cyberspace or other information technology.
Contain, Control, and Protect Strategic Weapons. Uncover foreign efforts to develop or proliferate strategic weapons, provide cryptographic products and processes to secure U.S. nuclear weapons, and help keep proliferators from using information technology to their advantage.
Goal 2–Preparing for the Future
Deliver next generation capabilities and solutions that meet the challenges of tomorrow and drive solutions from invention to operation in support of national security and U.S. Government missions.
Foresee Future Needs. Preclude strategic surprise by anticipating the operational landscape and identifying future target and technology trends – integrating breakthrough research advances and partner effects into the future mission landscape while providing a predictive awareness of technology perishability to effectively influence investment priorities.
Create Research Breakthroughs and Transfer Technologies. Discover, develop, and demonstrate scientific and research breakthroughs in sufficient scale, scope, and pace for the NSA/CSS and our partners to gain, extend, and maintain our technical advantages over current and emerging adversaries–using technology transfer processes to deliver mission capability directly to mission teams and/or indirectly through technology capability development teams.
Develop and Mature Emerging Technologies. Leverage research breakthroughs and distributed operations to ensure our mission success and provide improved scalability, integration, precision, and assured information sharing–enhancing Information Assurance products and services, and SIGINT analysis, dissemination, and mission management in order to realize the full benefits of current and forecast collection capabilities.
Harden the Infrastructure. Harden the security of hardware and software components critical to our national interests by developing, deploying, and continuously modernizing a highly assured and resilient technology undercarriage with associated standards to enable combined operations.
Deliver and Sustain Mission Capabilities. In response to validated requirements, acquire and operationalize products and systems that provide capabilities to meet mission needs and that are consistent with the current and evolving Enterprise Cryptologic Architecture.
Leverage Partnerships. Develop and enhance U.S. Government, foreign, academic, and commercial partnerships to obtain access, expertise, and perspective to overcome cryptologic challenges while fostering cooperation between partners to advance common goals, make optimal use of resources, influence standards, and drive collaboration and secure information sharing.
Goal 3–Enhancing and Leading an Expert Workforce
Attract, develop and engage an exceptional, diverse workforce prepared to overcome our cryptologic challenges.
Build the Workforce. Sustain and improve a comprehensive recruitment, hiring, retention and reward strategy that keeps pace with the national demand for diverse talent.
Focus on Leadership. Identify, develop and sustain collaborative and accountable leaders who strengthen mission results and enable employees to realize their fullest potential.
Accelerate Learning and Achieve Mission Results. Educate, train, and develop an agile and collaborative enterprise-wide workforce with the skills and competencies necessary to meet current and emerging missions.
Provide Security, Counterintelligence, and Force Protection. Safeguard the workforce and worldwide assets and operations against traditional and emerging threats.
Modernize Facilities that Support Workforce Resilience. Recapitalize physical infrastructure to promote a modern, world-class work environment that safeguards the health, wellness, safety and quality of life of our employees.
Goal 4–Implementing Best Business Practices
Provide timely data to inform optimal strategic and tactical investment decisions while ensuring organizational accountability for executing those decisions and realizing the associated performance improvement.
Support Responsive Business Operations. Provide disciplined, repeatable, and transparent business processes, functions and support to mission operations at the speed of mission change.
Achieve Auditability. Improve the integration of business management functions to produce auditable financial statements, which will engender the trust of our overseers and enhance the agility of corporate decision processes.
Deliver Acquisition Excellence. Ensure timely and agile delivery of cost-effective capabilities through innovative, disciplined methods of acquisition, capitalizing on the strength of industry, and gaining efficiencies in contract administration.
Implement Business Automation and Integration. Improve the performance of all functions within the business segment of the architecture by refining processes, ensuring use of standards, increasing the use of automation, and enabling effective employment of common business solutions for ease of information exchange within and between IC Agencies.
Improve Performance-Based Budgeting. Clearly demonstrate the relationship between NSA/CSS strategy, mission performance, and investment decisions through improved integration of performance analysis and documentation within the planning, programming, budgeting, execution and assessment processes.
Goal 5–Manifesting Principled Performance
Accomplishing our missions with a commitment to a principled and steadfast approach to performance through compliance, lawfulness, and protection of public trust must be paramount.
Demonstrate Stewardship. Maintain the Nation’s trust by ensuring our actions, our processes, and our systems are consistent with the Constitution of the United States, authorized by law, and within the framework of applicable regulations by focusing on awareness, development, application, and prompt updates of standards, policies, and other mandates.
Exercise Integrity. Provide the necessary processes that will demand the highest standards of our people, processes and systems and ensure our people strive to exercise sound judgment, place honor above expediency, and avoid even the appearance of impropriety.
Ensure Accountability. Take responsibility for outcomes related to our decisions, actions, and policies, and strive for precision in our people, processes and systems by appropriately implementing management controls and monitoring for the acceptance of responsibility.
Advance Transparency. Remain committed to honesty and visibility with our overseers and stakeholders to ensure the Nation’s trust is based on an accurate understanding of NSA/CSS activities, supporting oversight by providing all relevant facts in compliance with applicable laws and regulations and educating the workforce on the expectations of our overseers and stakeholders.
For more information, visit NSA/CSS on the web www.nsa.gov
http://www.nsa.gov/about/_files/CoreValues.pdf
national security agency
central security service
core values clear vision
The mission of the National Security Agency (NSA) and its military component, the Central Security Service (CSS), is focused on saving lives, defending vital networks, and exploiting the foreign communications of adversaries. On behalf of the dedicated employees of America’s cryptologic organization, I appreciate your taking the time to learn about this unique national asset.
Every day, we provide valuable intelligence on issues of concern to all Americans—such as international terrorism, cyber crime, narcotics trafficking, and the proliferation of weapons of mass destruction. Our customers range from U.S. decisionmakers to service personnel who are in harm’s way.
For us, collaboration is built into the very fabric of who we are. A component of both the U.S. Defense Department and the U.S. Intelligence Community, we also partner with other Federal organizations—including the U.S. Department of Homeland Security and U.S. Cyber Command—to safeguard national security information and systems. Cybersecurity is a team sport. No one organization has the resources to do the job alone.
Just as we are committed to protecting the Nation, we are equally committed to protecting the privacy rights of the American people. It is not an either/or proposition. We must do both, and we do, every day.
This guide highlights our core values and vision. We always stand ready to serve—for the good of the Nation.
KEITH B. ALEXANDER
General, U.S. Army
Director, NSA/Chief, CSS
succeeding in TODAY’S OPERATIONS
By collecting foreign signals, we provide policymakers and warfighters with information that gives them a decisive edge to keep our Nation safe. We protect both information and information technology that are essential to U.S. interests. We also defend critical U.S. and Allied networks against attack. We help our customers identify and correct vulnerabilities in technology and operations. Together, these activities—collecting foreign signals intelligence (SIGINT), providing information assurance (IA), and enabling computer network operations (CNO)—are the crux of NSA’s work.
SAVING lives
NSA/CSS provides indications and warnings of impending terrorist attacks and operational planning abroad. Fast action is key. Our technology allows us to deliver actionable intelligence to troops in near real-time or better. We share information with agencies that are responsible for safeguarding our homeland and protecting our citizens around the world.
operating as a RESPONSIBLE CITIZEN
We carry out missions in ways that are consistent with the Nation’s values and laws. Earning the American public’s trust is paramount. Our employees are held to the highest standards of accountability and lawfulness, which are reinforced by recurrent training at all levels. Our compliance processes are a part of a broader oversight structure in which all three branches of the U.S. Government play key roles.
building upon our RICH HERITAGE
We confront today’s increasingly complex challenges with determination and creativity, just as we’ve tackled past issues. An understanding of our history enriches our future. Our legacy of cutting-edge technology has developed communications solutions for generations.
preparing for the FUTURE
Our goals are to: Improve and modernize the security of sensitive information systems and produce timely and actionable intelligence. Create and integrate research breakthroughs and transfer technologies in support of national security and U.S. Government missions. Collaborate in developing education programs at the elementary, secondary, and college levels.
CYBERSECURITY: a team sport
NSA/CSS partners with Federal organizations, private industry, and academia to develop the capabilities to safeguard national security information. Cybersecurity is a team sport. Providing technical expertise to the U.S. Department of Homeland Security, U.S. Cyber Command, and other Federal Government entities is one of our essential duties. To stay ahead of adversaries, we constantly adjust and improve our defenses to protect the United States in this digital age. NSA/CSS provides intelligence as well as innovative products and services to confront cyber threats in an ever-changing global environment.
Remarkable people with remarkable skills form the heart of the National Security Agency. We are an adaptive, forward-leaning organization. We can and must outthink, outwork, and defeat our adversaries’ new ideas. This is possible only with a dedicated, talented workforce that supports the defense of the United States and all that it stands for. NSA’s core values underpin this commitment. First and foremost is our respect for the law. Everything that we undertake in our missions is grounded in our adherence to the U.S. Constitution and compliance with U.S. laws and regulations that govern our activities. Honesty and integrity are core values. We recognize that national leaders and the American people at large have placed great trust in us, and we strive at all times to be deserving of that trust. In addition, we embrace transparency to the fullest extent possible. We never forget that we, too, are Americans and that every activity we engage in is aimed at ensuring the safety, security, and liberty of our fellow citizens. NSA is a unique national asset because of our employees, many of whom have made the ultimate sacrifice for our Nation. NSA’s people not only matter, they make all the difference.
JOHN C. INGLIS
Deputy Director
National Security Agency
National Vigilance Park
National Vigilance Park honors the “silent warriors” who performed crucial intelligence-gathering missions during the Cold War. The achievements and sacrifices of these military personnel are a proud part of the NSA/CSS legacy.
National Cryptologic Museum
The National Cryptologic Museum houses thousands of cryptologic artifacts that serve to illustrate the history of the cryptologic profession. This museum is free and open to the public.
Defending Our Nation Securing Our Future
NSA Office of Public Affairs, 301-688-6524, www.nsa.gov
Obama: Global SIGINT Collection and Communications Tech
http://www.whitehouse.gov/the-press-office/2013/08/12/presidential-memorandum-reviewing-our-global-signals-intelligence-collec
The White House
Office of the Press Secretary
For Immediate Release
August 12, 2013
Presidential Memorandum -- Reviewing Our Global Signals Intelligence Collection and Communications Technologies
SUBJECT: Reviewing Our Global Signals Intelligence Collection and Communications Technologies
The United States, like all nations, gathers intelligence in order to protect its national interests and to defend itself, its citizens, and its partners and allies from threats to our security. The United States cooperates closely with many countries on intelligence matters and these intelligence relationships have helped to ensure our common security.
Recent years have brought unprecedented and rapid advancements in communications technologies, particularly with respect to global telecommunications. These technological advances have brought with them both great opportunities and significant risks for our Intelligence Community: opportunity in the form of enhanced technical capabilities that can more precisely and readily identify threats to our security, and risks in the form of insider and cyber threats.
I believe it is important to take stock of how these technological advances alter the environment in which we conduct our intelligence mission. To this end, by the authority vested in me as President by the Constitution and the laws of the United States of America, I am directing you to establish a Review Group on Intelligence and Communications Technologies (Review Group).
The Review Group will assess whether, in light of advancements in communications technologies, the United States employs its technical collection capabilities in a manner that optimally protects our national security and advances our foreign policy while appropriately accounting for other policy considerations, such as the risk of unauthorized disclosure and our need to maintain the public trust. Within 60 days of its establishment, the Review Group will brief their interim findings to me through the Director of National Intelligence (DNI), and the Review Group will provide a final report and recommendations to me through the DNI no later than December 15, 2013.
You are hereby authorized and directed to publish this memorandum in the Federal Register.
BARACK OBAMA
Q. & A.: Edward Snowden Speaks to Peter Maass
Q. & A.: Edward Snowden Speaks to Peter Maass
Interview by PETER MAASS
Published: August 13, 2013
In the course of reporting his profile of Laura Poitras, Peter Maass conducted an encrypted question-and-answer session, for which Poitras served as intermediary, with Edward J. Snowden. Below is a full transcript of that conversation.
Peter Maass: Why did you seek out Laura and Glenn, rather than journalists from major American news outlets (N.Y.T., W.P., W.S.J. etc.)? In particular, why Laura, a documentary filmmaker?
Edward Snowden: After 9/11, many of the most important news outlets in America abdicated their role as a check to power — the journalistic responsibility to challenge the excesses of government — for fear of being seen as unpatriotic and punished in the market during a period of heightened nationalism. From a business perspective, this was the obvious strategy, but what benefited the institutions ended up costing the public dearly. The major outlets are still only beginning to recover from this cold period.
Laura and Glenn are among the few who reported fearlessly on controversial topics throughout this period, even in the face of withering personal criticism, and resulted in Laura specifically becoming targeted by the very programs involved in the recent disclosures. She had demonstrated the courage, personal experience and skill needed to handle what is probably the most dangerous assignment any journalist can be given — reporting on the secret misdeeds of the most powerful government in the world — making her an obvious choice.
P.M.: Was there a moment during your contact with Laura when you realized you could trust her? What was that moment, what caused it?
E.S.: We came to a point in the verification and vetting process where I discovered Laura was more suspicious of me than I was of her, and I’m famously paranoid. The combination of her experience and her exacting focus on detail and process gave her a natural talent for security, and that’s a refreshing trait to discover in someone who is likely to come under intense scrutiny in the future, as normally one would have to work very hard to get them to take the risks seriously.
With that putting me at ease, it became easier to open up without fearing the invested trust would be mishandled, and I think it’s the only way she ever managed to get me on camera. I personally hate cameras and being recorded, but at some point in the working process, I realized I was unconsciously trusting her not to hang me even with my naturally unconsidered remarks. She’s good.
P.M.: Were you surprised that Glenn did not respond to your requests and instructions for encrypted communication?
E.S.: Yes and no. I know journalists are busy and had assumed being taken seriously would be a challenge, especially given the paucity of detail I could initially offer. At the same time, this is 2013, and a journalist who regularly reported on the concentration and excess of state power. I was surprised to realize that there were people in news organizations who didn’t recognize any unencrypted message sent over the Internet is being delivered to every intelligence service in the world. In the wake of this year’s disclosures, it should be clear that unencrypted journalist-source communication is unforgivably reckless.
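For readers who want a concrete sense of what encrypted source-to-journalist communication looks like, the short sketch below (in Python, using the PyNaCl library) illustrates the basic idea of public-key encryption: a message is readable only by the holder of the intended recipient's private key, even if every network link carrying it is monitored. This is purely an illustration under assumed names and tools; it is not the specific software Snowden, Poitras, or Greenwald used, and real-world practice involves much more (key verification, anonymized transport, metadata protection).

# Illustrative only: authenticated public-key encryption with PyNaCl.
# The keys, names, and message below are hypothetical.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only the public halves are ever exchanged.
source_key = PrivateKey.generate()
journalist_key = PrivateKey.generate()

# The source encrypts to the journalist's public key. Box is authenticated,
# so the journalist can also verify the ciphertext came from the source's key.
sending_box = Box(source_key, journalist_key.public_key)
ciphertext = sending_box.encrypt(b"first contact: details to follow")

# The journalist decrypts with their own private key and the source's public key.
receiving_box = Box(journalist_key, source_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"first contact: details to follow"

An eavesdropper who copies the ciphertext in transit learns nothing about its contents without one of the private keys, which is the property Snowden is saying unencrypted e-mail lacks.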
P.M.: When you first met Laura and Glenn in Hong Kong, what was your initial reaction? Were you surprised by anything in the way they worked and interacted with you?
E.S.: I think they were annoyed that I was younger than they expected, and I was annoyed they had arrived too early, which complicated the initial verification. As soon as we were behind closed doors, however, I think everyone was reassured by the obsessive attention to precaution and bona fides. I was particularly impressed by Glenn’s ability to operate without sleep for days at a time.
P.M.: Laura started filming you from nearly the start. Were you surprised by that? Why or why not?
E.S.: Definitely surprised. As one might imagine, normally spies allergically avoid contact with reporters or media, so I was a virgin source — everything was a surprise. Had I intended to skulk away anonymously, I think it would have been far harder to work with Laura, but we all knew what was at stake. The weight of the situation actually made it easier to focus on what was in the public interest rather than our own. I think we all knew there was no going back once she turned that camera on, and the ultimate outcome would be decided by the world.
A version of this article appeared in print on August 18, 2013, on page MM22 of the Sunday Magazine with the headline: Snowden’s People.
Saturday, August 10, 2013
Special Collection Service (SCS) CANADA'S EQUIVALENT of NSA
[Photographs of the facility: views looking north, south, east, and west.]
Obama On NSA 9 August 2013
A sends video link: http://www.c-span.org/flvPop.aspx?id=10737440808
Two related NSA and DoJ reports released 9 August 2013:
2013-0979.pdf DoJ: NSA Bulk Collection of Telephone Metadata, August 9, 2013
2013-0978.pdf NSA: Missions, Authorities, Oversight, Partners, August 9, 2013
http://www.whitehouse.gov/the-press-office/2013/08/09/remarks-president-press-conference
[Excerpts pertaining to NSA.]
The White House
Office of the Press Secretary
For Immediate Release
August 09, 2013
Remarks by the President in a Press Conference
3:09 P.M. EDT
THE PRESIDENT: Good afternoon, everybody. Please have a seat.
Over the past few weeks, I’ve been talking about what I believe should be our number-one priority as a country -- building a better bargain for the middle class and for Americans who want to work their way into the middle class. At the same time, I’m focused on my number-one responsibility as Commander-in-Chief, and that's keeping the American people safe. And in recent days, we’ve been reminded once again about the threats to our nation.
As I said at the National Defense University back in May, in meeting those threats we have to strike the right balance between protecting our security and preserving our freedoms. And as part of this rebalancing, I called for a review of our surveillance programs. Unfortunately, rather than an orderly and lawful process to debate these issues and come up with appropriate reforms, repeated leaks of classified information have initiated the debate in a very passionate, but not always fully informed way.
Now, keep in mind that as a senator, I expressed a healthy skepticism about these programs, and as President, I’ve taken steps to make sure they have strong oversight by all three branches of government and clear safeguards to prevent abuse and protect the rights of the American people. But given the history of abuse by governments, it’s right to ask questions about surveillance -- particularly as technology is reshaping every aspect of our lives.
I’m also mindful of how these issues are viewed overseas, because American leadership around the world depends upon the example of American democracy and American openness -- because what makes us different from other countries is not simply our ability to secure our nation, it’s the way we do it -- with open debate and democratic process.
In other words, it’s not enough for me, as President, to have confidence in these programs. The American people need to have confidence in them as well. And that's why, over the last few weeks, I’ve consulted members of Congress who come at this issue from many different perspectives. I’ve asked the Privacy and Civil Liberties Oversight Board to review where our counterterrorism efforts and our values come into tension, and I directed my national security team to be more transparent and to pursue reforms of our laws and practices.
And so, today, I’d like to discuss four specific steps -- not all inclusive, but some specific steps that we’re going to be taking very shortly to move the debate forward.
First, I will work with Congress to pursue appropriate reforms to Section 215 of the Patriot Act -- the program that collects telephone records. As I’ve said, this program is an important tool in our effort to disrupt terrorist plots. And it does not allow the government to listen to any phone calls without a warrant. But given the scale of this program, I understand the concerns of those who would worry that it could be subject to abuse. So after having a dialogue with members of Congress and civil libertarians, I believe that there are steps we can take to give the American people additional confidence that there are additional safeguards against abuse.
For instance, we can take steps to put in place greater oversight, greater transparency, and constraints on the use of this authority. So I look forward to working with Congress to meet those objectives.
Second, I’ll work with Congress to improve the public’s confidence in the oversight conducted by the Foreign Intelligence Surveillance Court, known as the FISC. The FISC was created by Congress to provide judicial review of certain intelligence activities so that a federal judge must find that our actions are consistent with the Constitution. However, to build greater confidence, I think we should consider some additional changes to the FISC.
One of the concerns that people raise is that a judge reviewing a request from the government to conduct programmatic surveillance only hears one side of the story -- may tilt it too far in favor of security, may not pay enough attention to liberty. And while I’ve got confidence in the court and I think they’ve done a fine job, I think we can provide greater assurances that the court is looking at these issues from both perspectives -- security and privacy.
So, specifically, we can take steps to make sure civil liberties concerns have an independent voice in appropriate cases by ensuring that the government’s position is challenged by an adversary.
Number three, we can, and must, be more transparent. So I’ve directed the intelligence community to make public as much information about these programs as possible. We’ve already declassified unprecedented information about the NSA, but we can go further. So at my direction, the Department of Justice will make public the legal rationale for the government’s collection activities under Section 215 of the Patriot Act. The NSA is taking steps to put in place a full-time civil liberties and privacy officer, and released information that details its mission, authorities, and oversight. And finally, the intelligence community is creating a website that will serve as a hub for further transparency, and this will give Americans and the world the ability to learn more about what our intelligence community does and what it doesn’t do, how it carries out its mission, and why it does so.
Fourth, we’re forming a high-level group of outside experts to review our entire intelligence and communications technologies. We need new thinking for a new era. We now have to unravel terrorist plots by finding a needle in the haystack of global telecommunications. And meanwhile, technology has given governments -- including our own -- unprecedented capability to monitor communications.
So I am tasking this independent group to step back and review our capabilities -- particularly our surveillance technologies. And they’ll consider how we can maintain the trust of the people, how we can make sure that there absolutely is no abuse in terms of how these surveillance technologies are used, ask how surveillance impacts our foreign policy -- particularly in an age when more and more information is becoming public. And they will provide an interim report in 60 days and a final report by the end of this year, so that we can move forward with a better understanding of how these programs impact our security, our privacy, and our foreign policy.
So all these steps are designed to ensure that the American people can trust that our efforts are in line with our interests and our values. And to others around the world, I want to make clear once again that America is not interested in spying on ordinary people. Our intelligence is focused, above all, on finding the information that’s necessary to protect our people, and -- in many cases -- protect our allies.
It’s true we have significant capabilities. What’s also true is we show a restraint that many governments around the world don't even think to do, refuse to show -- and that includes, by the way, some of America’s most vocal critics. We shouldn’t forget the difference between the ability of our government to collect information online under strict guidelines and for narrow purposes, and the willingness of some other governments to throw their own citizens in prison for what they say online.
And let me close with one additional thought. The men and women of our intelligence community work every single day to keep us safe because they love this country and believe in our values. They're patriots. And I believe that those who have lawfully raised their voices on behalf of privacy and civil liberties are also patriots who love our country and want it to live up to our highest ideals. So this is how we’re going to resolve our differences in the United States -- through vigorous public debate, guided by our Constitution, with reverence for our history as a nation of laws, and with respect for the facts.
So, with that, I’m going to take some questions. And let’s see who we’ve got here. We’re going to start with Julie Pace of AP.
Q Thank you, Mr. President. I wanted to ask about some of the foreign policy fallout from the disclosure of the NSA programs that you discussed. Your spokesman said yesterday that there’s no question that the U.S. relationship with Russia has gotten worse since Vladimir Putin took office. How much of that decline do you attribute directly to Mr. Putin, given that you seem to have had a good working relationship with his predecessor? Also will there be any additional punitive measures taken against Russia for granting asylum to Edward Snowden? Or is canceling the September summit really all you can do given the host of issues the U.S. needs Russian cooperation for? Thank you.
THE PRESIDENT: Good. I think there’s always been some tension in the U.S.-Russian relationship after the fall of the Soviet Union. There’s been cooperation in some areas; there’s been competition in others.
It is true that in my first four years, in working with President Medvedev, we made a lot of progress. We got START done -- or START II done. We were able to cooperate together on Iran sanctions. They provided us help in terms of supplying our troops in Afghanistan. We were able to get Russia into the WTO -- which is not just good for Russia, it’s good for our companies and businesses because they're more likely then to follow international norms and rules. So there's been a lot of good work that has been done and that is going to continue to be done. What's also true is, is that when President Putin -- who was prime minister when Medvedev was president -- came back into power I think we saw more rhetoric on the Russian side that was anti-American, that played into some of the old stereotypes about the Cold War contests between the United States and Russia. And I've encouraged Mr. Putin to think forward as opposed to backwards on those issues -- with mixed success.
And I think the latest episode is just one more in a number of emerging differences that we've seen over the last several months around Syria, around human rights issues, where it is probably appropriate for us to take a pause, reassess where it is that Russia is going, what our core interests are, and calibrate the relationship so that we're doing things that are good for the United States and hopefully good for Russia as well, but recognizing that there just are going to be some differences and we're not going to be able to completely disguise them.
And that's okay. Keep in mind that although I'm not attending the summit, I'll still be going to St. Petersburg because Russia is hosting the G20. That's important business in terms of our economy and our jobs and all the issues that are of concern to Americans.
I know that one question that's been raised is how do we approach the Olympics. I want to just make very clear right now I do not think it's appropriate to boycott the Olympics. We've got a bunch of Americans out there who are training hard, who are doing everything they can to succeed. Nobody is more offended than me by some of the anti-gay and lesbian legislation that you've been seeing in Russia. But as I said just this week, I've spoken out against that not just with respect to Russia but a number of other countries where we continue to do work with them, but we have a strong disagreement on this issue.
And one of the things I'm really looking forward to is maybe some gay and lesbian athletes bringing home the gold or silver or bronze, which I think would go a long way in rejecting the kind of attitudes that we're seeing there. And if Russia doesn't have gay or lesbian athletes, then it probably makes their team weaker.
Q Are there going to be any additional punitive measures for Russia, beyond canceling the summit?
THE PRESIDENT: Keep in mind that our decision to not participate in the summit was not simply around Mr. Snowden. It had to do with the fact that, frankly, on a whole range of issues where we think we can make some progress, Russia has not moved. And so we don't consider that strictly punitive.
We're going to assess where the relationship can advance U.S. interests and increase peace and stability and prosperity around the world. Where it can, we’re going to keep on working with them. Where we have differences, we’re going to say so clearly. And my hope is, is that over time, Mr. Putin and Russia recognize that rather than a zero-sum competition, in fact, if the two countries are working together we can probably advance the betterment of both peoples.
Chuck Todd.
Q Thank you, Mr. President. Given that you just announced a whole bunch of reforms based on essentially the leaks that Edward Snowden made on all of these surveillance programs, is that change -- is your mindset changed about him? Is he now more a whistle-blower than he is a hacker, as you called him at one point, or somebody that shouldn’t be filed charges? And should he be provided more protection? Is he a patriot? You just used those words. And then just to follow up on the personal -- I want to follow up on a personal --
THE PRESIDENT: Okay, I want to make sure -- everybody is asking one question it would be helpful.
Q No, I understand. It was a part of a question that you didn’t answer. Can you get stuff done with Russia, big stuff done, without having a good personal relationship with Putin?
THE PRESIDENT: I don’t have a bad personal relationship with Putin. When we have conversations, they’re candid, they’re blunt; oftentimes, they’re constructive. I know the press likes to focus on body language and he’s got that kind of slouch, looking like the bored kid in the back of the classroom. But the truth is, is that when we’re in conversations together, oftentimes it’s very productive.
So the issue here really has to do with where do they want to take Russia -- it’s substantive on a policy front. And --
Q (Inaudible.)
THE PRESIDENT: No. Right now, this is just a matter of where Mr. Putin and the Russian people want to go. I think if they are looking forward into the 21st century and how they can advance their economy, and make sure that some of our joint concerns around counterterrorism are managed effectively, then I think we can work together. If issues are framed as if the U.S. is for it then Russia should be against it, or we’re going to be finding ways where we can poke each other at every opportunity, then probably we don’t get as much stuff done.
See, now I’ve forgotten your first question, which presumably was the more important one. No, I don’t think Mr. Snowden was a patriot. As I said in my opening remarks, I called for a thorough review of our surveillance operations before Mr. Snowden made these leaks.
My preference -- and I think the American people’s preference -- would have been for a lawful, orderly examination of these laws, a thoughtful fact-based debate that would then lead us to a better place. Because I never made claims that all the surveillance technologies that have developed since the time some of these laws had been put in place somehow didn't require potentially some additional reforms. That's exactly what I called for.
So the fact is, is that Mr. Snowden has been charged with three felonies. If, in fact, he believes that what he did was right, then, like every American citizen, he can come here, appear before the court with a lawyer and make his case. If the concern was that somehow this was the only way to get this information out to the public, I signed an executive order well before Mr. Snowden leaked this information that provided whistleblower protection to the intelligence community -- for the first time. So there were other avenues available for somebody whose conscience was stirred and thought that they needed to question government actions.
But having said that, once the leaks have happened, what we’ve seen is information come out in dribs and in drabs, sometimes coming out sideways. Once the information is out, the administration comes in, tries to correct the record. But by that time, it’s too late or we’ve moved on, and a general impression has, I think, taken hold not only among the American public but also around the world that somehow we’re out there willy-nilly just sucking in information on everybody and doing what we please with it.
That's not the case. Our laws specifically prohibit us from surveilling U.S. persons without a warrant. And there are a whole range of safeguards that have been put in place to make sure that that basic principle is abided by.
But what is clear is that whether, because of the instinctive bias of the intelligence community to keep everything very close -- and probably what’s a fair criticism is my assumption that if we had checks and balances from the courts and Congress, that that traditional system of checks and balances would be enough to give people assurance that these programs were run properly -- that assumption I think proved to be undermined by what happened after the leaks. I think people have questions about this program.
And so, as a consequence, I think it is important for us to go ahead and answer these questions. What I’m going to be pushing the IC to do is rather than have a trunk come out here and leg come out there and a tail come out there, let’s just put the whole elephant out there so people know exactly what they're looking at. Let’s examine what is working, what’s not, are there additional protections that can be put in place, and let’s move forward.
And there’s no doubt that Mr. Snowden’s leaks triggered a much more rapid and passionate response than would have been the case if I had simply appointed this review board to go through, and I had sat down with Congress and we had worked this thing through. It would have been less exciting. It would not have generated as much press. I actually think we would have gotten to the same place, and we would have done so without putting at risk our national security and some very vital ways that we are able to get intelligence that we need to secure the country.
[Q&A on Federal Reserve chairman omitted.]
Carol Lee.
Q Thank you, Mr. President.
I wanted to ask you about your evolution on the surveillance issues. I mean, part of what you’re talking about today is restoring the public trust. And the public has seen you evolve from when you were in the U.S. Senate to now. And even as recently as June, you said that the process was such that people should be comfortable with it, and now you’re saying you’re making these reforms and people should be comfortable with those. So why should the public trust you on this issue, and why did you change your position multiple times?
THE PRESIDENT: Well, I think it’s important to say, Carol, first of all, I haven’t evolved in my assessment of the actual programs. I consistently have said that when I came into office I evaluated them. Some of these programs I had been critical of when I was in the Senate. When I looked through specifically what was being done, my determination was that the two programs in particular that had been at issue, 215 and 702, offered valuable intelligence that helps us protect the American people and they're worth preserving. What we also saw was that some bolts needed to be tightened up on some of the programs, so we initiated some additional oversight, reforms, compliance officers, audits and so forth.
And if you look at the reports -- even the disclosures that Mr. Snowden has put forward -- all the stories that have been written, what you're not reading about is the government actually abusing these programs and listening in on people's phone calls or inappropriately reading people's emails. What you're hearing about is the prospect that these could be abused. Now, part of the reason they're not abused is because these checks are in place, and those abuses would be against the law and would be against the orders of the FISC.
Having said that, though, if you are outside of the intelligence community, if you are the ordinary person and you start seeing a bunch of headlines saying, U.S.-Big Brother looking down on you, collecting telephone records, et cetera, well, understandably, people would be concerned. I would be, too, if I wasn't inside the government.
And so in light of the changed environment where a whole set of questions have been raised, some in the most sensationalized manner possible, where these leaks are released drip by drip, one a week, to kind of maximize attention and see if they can catch us at some imprecision on something -- in light of that, it makes sense for us to go ahead, lay out what exactly we're doing, have a discussion with Congress, have a discussion with industry -- which is also impacted by this -- have a discussion with civil libertarians, and see can we do this better.
I think the main thing I want to emphasize is I don't have an interest and the people at the NSA don't have an interest in doing anything other than making sure that where we can prevent a terrorist attack, where we can get information ahead of time, that we're able to carry out that critical task. We do not have an interest in doing anything other than that. And we've tried to set up a system that is as failsafe as so far at least we've been able to think of to make sure that these programs are not abused.
But people may have better ideas and people may want to jigger slightly sort of the balance between the information that we can get versus the incremental encroachments on privacy that, if they haven't already taken place, might take place in a future administration, or as technologies develop further.
And the other thing that’s happening is, is that as technology develops further, technology itself may provide us some additional safeguards. So, for example, if people don’t have confidence that the law, the checks and balances of the court and Congress are sufficient to give us confidence that government is not snooping, well, maybe we can embed technologies in there that prevent the snooping regardless of what government wants to do. I mean, there may be some technological fixes that provide another layer of assurance.
And so those are the kinds of things that I’m looking forward to having a conversation about.
Q Can you understand, though, why some people might not trust what you're saying right now about wanting to --
THE PRESIDENT: No, I can’t.
Q -- that they should be comfortable with the process?
THE PRESIDENT: Well, the fact that I said that the programs are operating in a way that prevents abuse, that continues to be true, without the reforms. The question is how do I make the American people more comfortable.
If I tell Michelle that I did the dishes -- now, granted, in the White House I don’t do the dishes that much -- (laughter) -- but back in the day -- and she’s a little skeptical, well, I’d like her to trust me, but maybe I need to bring her back and show her the dishes and not just have her take my word for it.
And so the program is -- I am comfortable that the program currently is not being abused. I’m comfortable that if the American people examined exactly what was taking place, how it was being used, what the safeguards were, that they would say, you know what, these folks are following the law and doing what they say they’re doing.
But it is absolutely true that with the expansion of technology -- this is an area that’s moving very quickly -- with the revelations that have depleted public trust, that if there are some additional things that we can do to build that trust back up, then we should do them.
Jonathan Karl.
Q Thank you, Mr. President. You have said that core al Qaeda has been decimated, that its leaders are on the run. Now that we’ve seen this terror threat that has resulted in embassies closed throughout the Arab world, much of Africa, do you still believe that al Qaeda has been decimated? And if I can ask in the interest of transparency, can you tell us about these drone strikes that we’ve seen over the last couple of weeks in Yemen?
THE PRESIDENT: What I said in the same National Defense University speech back in May that I referred to earlier is that core al Qaeda is on its heels, has been decimated. But what I also said was that al Qaeda and other extremists have metastasized into regional groups that can pose significant dangers.
And I’d refer you back to that speech just back in May where I said specifically that although they are less likely to be able to carry out spectacular homeland attacks like 9/11, they have the capacity to go after our embassies. They have the capacity, potentially, to go after our businesses. They have the capacity to be destabilizing and disruptive in countries where the security apparatus is weak. And that’s exactly what we are seeing right now.
So it’s entirely consistent to say that this tightly organized and relatively centralized al Qaeda that attacked us on 9/11 has been broken apart and is very weak and does not have a lot of operational capacity, and to say we still have these regional organizations like AQAP that can pose a threat, that can drive potentially a truck bomb into an embassy wall and can kill some people.
And so that requires us, then, to make sure that we have a strategy that is strengthening those partners so that they’ve got their own capacity to deal with what are potentially manageable regional threats if these countries are a little bit stronger and have more effective CT and so forth. It means that we’ve got to continue to be vigilant and go after known terrorists who are potentially carrying out plots or are going to strengthen their capacity over time -- because they’re always testing the boundaries of, well, maybe we can try this, maybe we can do that. So this is an ongoing process. We are not going to completely eliminate terrorism. What we can do is to weaken it and to strengthen our partnerships in such a way that it does not pose the kind of horrible threat that we saw on 9/11.
And I’m not going to discuss specific operations that have taken place. Again, in my speech in May, I was very specific about how we make these determinations about potential lethal strikes, so I would refer you to that speech.
Q So you won’t even confirm that we carried out drone strikes in Yemen?
THE PRESIDENT: I will not have a discussion about operational issues.
Ed Henry.
Q I hope you would defend me as well.
THE PRESIDENT: I would.
Q Okay, thank you. I want to ask you about two important dates that are coming up. October 1st you’ve got to implement your signature health care law. You recently decided on your own to delay a key part of that. And I wonder, if you pick and choose what parts of the law to implement, couldn’t your successor down the road pick and choose whether they’ll implement your law and keep it in place?
And on September 11th we’ll have the first anniversary of Benghazi. And you said on September 12th, “Make no mistake, we’ll bring to justice the killers who attacked our people.” Eleven months later, where are they, sir?
THE PRESIDENT: Well, I also said that we’d get bin Laden, and I didn’t get him in 11 months. So we have informed, I think, the public that there’s a sealed indictment. It’s sealed for a reason. But we are intent on capturing those who carried out this attack, and we’re going to stay on it until we get them.
Q And you’re close to having suspects in custody?
THE PRESIDENT: I will leave it at that. But this remains a top priority for us. Anybody who attacks Americans, anybody who kills, tragically, four Americans who were serving us in a very dangerous place, we’re going to do everything we can to get those who carried out those attacks.
[Q&A on health care and the budget omitted.]
END 4:00 P.M. EDT