Crypto-Gram
July 15, 2013
by Bruce Schneier
Chief Security Technology Officer, BT
A free monthly newsletter providing summaries, analyses, insights, and
commentaries on security: computer and otherwise.
For back issues, or to subscribe, visit
You can read this issue on the web at
These same essays and news items appear in the "Schneier on Security" blog
along with a lively and intelligent comment section. An RSS feed is available.
In this issue:
There's one piece of blowback that isn't being discussed -- aside from the
fact that Snowden has killed the chances of any liberal arts major getting
a DoD job for at least a decade -- and that's how the massive NSA surveillance
of the Internet affects the US's role in Internet governance.
Ron Deibert makes this point:
But there are unintended consequences of the NSA scandal that will undermine
U.S. foreign policy interests -- in particular, the "Internet Freedom" agenda
espoused by the U.S. State Department and its allies.
The revelations that have emerged will undoubtedly trigger a reaction abroad
as policymakers and ordinary users realize the huge disadvantages of their
dependence on U.S.-controlled networks in social media, cloud computing,
and telecommunications, and of the formidable resources that are deployed
by U.S. national security agencies to mine and monitor those networks.
Writing about the new Internet nationalism, I talked about the ITU meeting
in Dubai last fall, and the attempt of some countries to wrest control of
the Internet from the US. That movement just got a huge PR boost. Now, when
countries like Russia and Iran say the US is simply too untrustworthy to
manage the Internet, no one will be able to argue.
We can't fight for Internet freedom around the world, then turn around and
destroy it back home. Even if we don't see the contradiction, the rest of
the world does.
The new Internet nationalism:
There's been some interesting speculation that the NSA is storing everyone's
phone calls, and not just metadata. The first link, below, is definitely
worth reading.
I expressed skepticism about this just a month ago. My assumption had always
been that everyone's compressed voice calls are just too much data to move
around and store. Now, I don't know.
There's a bit of a conspiracy-theory air to all of this speculation, but
underestimating what the NSA will do is a mistake. General Alexander has
told members of Congress that they *can* record the contents of phone calls.
And they have the technical capability.
I believe that, to the extent that the NSA is analyzing and storing
conversations, they're doing speech-to-text as close to the source as possible
and working with that. Even if you have to store the audio for conversations
in foreign languages, or for snippets of conversations the conversion software
is unsure of, it's a lot fewer bits to move around and deal with.
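A back-of-envelope comparison shows why transcripts are so much cheaper to move and store than audio. The bitrate and word-rate figures below are illustrative assumptions, not numbers from this essay:

```python
# Back-of-envelope: one minute of compressed phone audio versus
# one minute of its transcript. All figures are rough assumptions.
seconds = 60
codec_bps = 8_000                  # ~8 kbps, typical compressed voice codec
audio_bits = codec_bps * seconds   # 480,000 bits per minute of audio

words_per_minute = 150             # typical conversational speaking pace
bytes_per_word = 6                 # average English word plus a space
text_bits = words_per_minute * bytes_per_word * 8   # 7,200 bits per minute

print(audio_bits // text_bits)     # -> 66: audio is ~66x larger than text
```

Even with these conservative assumptions, text is well over an order of magnitude smaller, which is why converting speech near the collection point makes engineering sense.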
And, by the way, I hate the term "metadata." What's wrong with "traffic
analysis," which is what we've always called that sort of thing?
My previous skepticism:
In an excellent essay about privacy and secrecy, law professor Daniel Solove
makes an important point. There are two types of NSA secrecy being discussed.
It's easy to confuse them, but they're very different.
Of course, if the government is trying to gather data about a particular
suspect, keeping the specifics of surveillance efforts secret will decrease
the likelihood of that suspect altering his or her behavior.
But secrecy at the level of an individual suspect is different from keeping
the very existence of massive surveillance programs secret. The public must
know about the general outlines of surveillance activities in order to evaluate
whether the government is achieving the appropriate balance between privacy
and security. What kind of information is gathered? How is it used? How securely
is it kept? What kind of oversight is there? Are these activities even legal?
These questions can't be answered, and the government can't be held accountable,
if surveillance programs are completely classified.
This distinction is also becoming important as Snowden keeps talking. There
are a lot of articles about Edward Snowden cooperating with the Chinese
government. I have no idea if this is true -- Snowden denies it -- or if
it's part of an American smear campaign designed to change the debate from
the NSA surveillance programs to the whistleblower's actions. (It worked
against Assange.) In anticipation of the inevitable questions, I want to
change a previous assessment statement: I consider Snowden a hero for
whistleblowing on the existence and details of the NSA surveillance programs,
but not for revealing specific operational secrets to the Chinese government.
Charles Pierce wishes Snowden would stop talking. I agree; the more this
story is about him the less it is about the NSA. Stop giving interviews and
let the documents do the talking.
Back to Daniel Solove, this excellent 2011 essay on the value of privacy
is making the rounds again. And it should.
Many commentators had been using the metaphor of George Orwell's "1984" to
describe the problems created by the collection and use of personal data.
I contended that the Orwell metaphor, which focuses on the harms of surveillance
(such as inhibition and social control) might be apt to describe law
enforcement's monitoring of citizens. But much of the data gathered in computer
databases is not particularly sensitive, such as one's race, birth date,
gender, address, or marital status. Many people do not care about concealing
the hotels they stay at, the cars they own or rent, or the kind of beverages
they drink. People often do not take many steps to keep such information
secret. Frequently, though not always, people's activities would not be inhibited
if others knew this information.
I suggested a different metaphor to capture the problems: Franz Kafka's "The
Trial," which depicts a bureaucracy with inscrutable purposes that uses people's
information to make important decisions about them, yet denies the people
the ability to participate in how their information is used. The problems
captured by the Kafka metaphor are of a different sort than the problems
caused by surveillance. They often do not result in inhibition or chilling.
Instead, they are problems of information processing -- the storage, use,
or analysis of data -- rather than information collection. They affect the
power relationships between people and the institutions of the modern state.
They not only frustrate the individual by creating a sense of helplessness
and powerlessness, but they also affect social structure by altering the
kind of relationships people have with the institutions that make important
decisions about their lives.
The whole essay is worth reading, as is -- I hope -- my essay on the value
of privacy from 2006.
I have come to believe that the solution to all of this is regulation. And
it's not going to be the regulation of data collection; it's going to be
the regulation of data use.
Blog entry URL:
Snowden and the Chinese:
My previous Snowden essays:
Charles Pierce on Snowden:
Solove's 2011 essay:
A good rebuttal to the "nothing to hide" argument:
I have signed a petition calling on the NSA to "suspend its domestic surveillance
program pending public comment." This is what's going on:
In a request today to National Security Agency director Keith Alexander and
Defense Secretary Chuck Hagel, the group argues that the NSA's recently revealed
domestic surveillance program is "unlawful" because the agency neglected
to request public comments first. A federal appeals court previously ruled
that was necessary in a lawsuit involving airport body scanners.
"In simple terms, a line has been crossed," Marc Rotenberg, executive director
of the Electronic Privacy Information Center, told CNET. "The agency's function
has been transformed, and we think the public should have an opportunity
to say something about that."
It's an ambitious -- and untested -- legal argument. No court appears to
have ever ruled that the Administrative Procedure Act, which can require
agencies to solicit public comment, has applied to the supersecret intelligence
community. The APA explicitly excludes from judicial review, for instance,
"military authority exercised in the field in time of war."
EPIC is relying on a July 2011 decision (PDF) it obtained from the U.S. Court
of Appeals for the D.C. Circuit dealing with installing controversial full-body
scanners at airports. The Transportation Security Agency, the court said,
was required to obtain comment on a rule that "substantively affects the
public to a degree sufficient to implicate the policy interests animating
notice-and-comment rulemaking."
This isn't an empty exercise. While it's unlikely that a judge will order
the NSA to suspend the program pending public approval, the process will
put pressure on Washington to subject the NSA to more oversight, and pressure
the NSA into more transparency. We've used these tactics before. Two decades
ago, EPIC launched a similar petition against the Clipper Chip, a process
that eventually led to the Clinton administration and the FBI abandoning
the effort. And EPIC's more recent action against TSA full-body scanners
is one of the reasons we have privacy safeguards on the millimeter wave scanners
they are still using.
The more people who sign this petition, the clearer the message it
sends to Washington: a message that people care about the privacy of their
telephone records, Internet transactions, and online communications. Secret
judges should not be allowed to use secret interpretations of secret laws
to authorize the NSA to engage in domestic surveillance. Sooner or later,
a court is going to recognize that. Until then, the more noise the better.
Add your voice here. It just might work.
This article, on the cozy relationship between the commercial personal-data
industry and the intelligence industry, has new information on the security
of Skype:
Skype, the Internet-based calling service, began its own secret program,
Project Chess, to explore the legal and technical issues in making Skype
calls readily available to intelligence agencies and law enforcement officials,
according to people briefed on the program who asked not to be named to avoid
trouble with the intelligence agencies.
Project Chess, which has never been previously disclosed, was small, limited
to fewer than a dozen people inside Skype, and was developed as the company
had sometimes contentious talks with the government over legal issues, said
one of the people briefed on the project. The project began about five years
ago, before most of the company was sold by its parent, eBay, to outside
investors in 2009. Microsoft acquired Skype in an $8.5 billion deal that
was completed in October 2011.
A Skype executive denied last year in a blog post that recent changes in
the way Skype operated were made at the behest of Microsoft to make snooping
easier for law enforcement. It appears, however, that Skype figured out how
to cooperate with the intelligence community before Microsoft took over the
company, according to documents leaked by Edward J. Snowden, a former contractor
for the N.S.A. One of the documents about the Prism program made public by
Mr. Snowden says Skype joined Prism on Feb. 6, 2011.
Reread that Skype denial from last July, knowing that at the time the company
knew that they were giving the NSA access to customer communications. Notice
how it is precisely worded to be technically accurate, yet leave the reader
with the wrong conclusion. This is where we are with all the tech companies
right now; we can't trust their denials, just as we can't trust the NSA --
or the FBI -- when it denies programs, capabilities, or practices.
Back in January, we wondered whom Skype lets spy on their users. Now we know.
The article quoted:
We can't trust the NSA:
My post from last January:
This quote is from the Spring 1997 issue of "CRYPTOLOG," the internal NSA
newsletter. The writer is William J. Black, Jr., the Director's Special Assistant
for Information Warfare.
Specifically, the focus is on the potential abuse of the Government's
applications of this new information technology that will result in an invasion
of personal privacy. For us, this is difficult to understand. We *are* "the
government," and we have no interest in invading the personal privacy of
This is from a Seymour Hersh "New Yorker" interview with NSA Director General
Michael Hayden in 1999:
When I asked Hayden about the agency's capability for unwarranted spying
on private citizens -- in the unlikely event, of course, that the agency
could somehow get the funding, the computer scientists, and the knowledge
to begin making sense out of the Internet -- his response was heated. "I'm
a kid from Pittsburgh with two sons and a daughter who are closet libertarians,"
he said. "I am not interested in doing anything that threatens the American
people, and threatens the future of this agency. I can't emphasize enough
to you how careful we are. We have to be so careful -- to make sure that
America is never distrustful of the power and security we can provide."
It's easy to assume that both Black and Hayden were lying, but I believe
them. I believe that, 15 years ago, the NSA was entirely focused on intercepting
communications outside the US.
What changed? What caused the NSA to abandon its non-US charter and start
spying on Americans? From what I've read, and from a bunch of informal
conversations with NSA employees, it was the 9/11 terrorist attacks. That's
when everything changed, the gloves came off, and all the rules were thrown
out the window. That the NSA's interests coincided with the business model
of the Internet is just a -- lucky, in their view -- coincidence.
A few weeks ago, the "Guardian" published two new Snowden documents. These
outline how the NSA's data-collection procedures allow it to collect lots
of data on Americans, and how the FISA court fails to provide oversight over
the process.
The documents are complicated, but I strongly recommend that people read
both the "Guardian" analysis and the EFF analysis -- and possibly the "USA
Today" story.
Frustratingly, this has not become a major news story. It isn't being widely
reported in the media, and most people don't know about it. At this point,
the only aspect of the Snowden story that is in the news is the personal
story. The press seems to have had its fill of the far more important policy
issues.
I don't know what there is that can be done about this, but it's how we all
lose.
More Snowden documents analyzed by the "Guardian" -- two articles -- discuss
how the NSA collected e-mails and data on Internet activity of both Americans
and foreigners. The program might have ended in 2011, or it might have continued
under a different name. This is the program that resulted in that bizarre
tale of Bush officials confronting then-Attorney General John Ashcroft in
his hospital room; the "New York Times" story discusses that. What's interesting
is that the NSA collected this data under one legal pretense. When that
justification evaporated, they searched around until they found another pretense.
This story is being picked up a bit more than the previous story, but it's
obvious that the press is fatiguing of this whole thing. Without the Ashcroft
human interest bit, it would be just another story of the NSA eavesdropping
on Americans -- and that's last week's news.
"Final Report on Project C-43." This finally explains what John Ellis was
talking about in "The Possibility of Non-Secret Encryption" when he dropped
a tantalizing hint about wartime work at Bell Labs.
Details of NSA data requests from US corporations.
John Mueller and Mark Stewart ask the important questions about the NSA
surveillance programs: why were they secret, what have they accomplished,
and what do they cost?
This essay attempts to figure out if they accomplished anything.
This essay attempts to figure out if they can be effective at all.
Companies allow US intelligence to exploit vulnerabilities before they patch
them. No word on whether these companies would delay a patch if asked nicely
-- or if there's any way the government can require them to. Anyone feel
safer because of this?
A fine piece: "A Love Letter to the NSA Agent who is Monitoring my Online
Activity."
A similar sentiment is expressed in this video.
Lessons from Japan's response to terrorism by Aum Shinrikyo.
The future of satellite surveillance is pretty scary -- and cool.
Remember, it's not any one thing that's worrisome; it's everything together.
Interesting story of a spear phishing attack against the "Financial Times."
Rod Beckstrom gives a talk (video and transcript) about "Mutually Assured
Destruction," "Mutually Assured Disruption," and "Mutually Assured Dependence."
Great story on the cracking of the Kryptos Sculpture at the CIA headquarters.
Interesting article on the history of, and the relationship between, secrecy
and privacy: "As a matter of historical analysis, the relationship between
secrecy and privacy can be stated in an axiom: the defense of privacy follows,
and never precedes, the emergence of new technologies for the exposure of
secrets. In other words, the case for privacy always comes too late. The
horse is out of the barn. The post office has opened your mail. Your photograph
is on Facebook. Google already knows that, notwithstanding your demographic,
you hate kale."
Lessons from biological security.
I recommend his book, "Learning from the Octopus: How Secrets from Nature
Can Help Us Fight Terrorist Attacks, Natural Disasters, and Disease."
This is an interesting article about a new breed of malware that also hijacks
the victim's phone text messaging system, to intercept one-time passwords
sent via that channel.
Adding a remote kill switch to cell phones would deter theft.
Here we can see how the rise of the surveillance state permeates everything
about computer security. On the face of it, this is a good idea. Assuming
it works -- that 1) it's not possible for thieves to resurrect phones in
order to resell them, and 2) that it's not possible to turn this system into
a denial-of-service attack tool -- it would deter crime. The general category
of security is "benefit denial," like ink tags attached to garments in retail
stores and car radios that no longer function if removed. But given what
we now know, do we trust that the government wouldn't abuse this system and
kill phones for other reasons? Do we trust that media companies won't kill
phones they decide are sharing copyrighted materials? Do we trust that phone
companies won't kill the phones of delinquent customers? What might have been
a straightforward security system becomes a dangerous tool of control, when
you don't trust those in power.
The NSA has published some new symmetric algorithms: SIMON and SPECK.
It's always fascinating to study NSA-designed ciphers. I was particularly
interested in the algorithms' similarity to Threefish, and how they improved
on what we did. I was most impressed with their key schedule. I am *always*
impressed with how the NSA does key schedules. And I enjoyed the discussion
of requirements. Missing, of course, is any cryptanalytic analysis.
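For a sense of how spare these designs are, here is a minimal Python sketch of Speck32/64, the smallest SPECK variant, written from the public specification (the function and variable names are mine, not the designers'). The round function is nothing but modular addition, rotation, and XOR, and the key schedule reuses the same round function:

```python
MASK = 0xFFFF  # Speck32/64 operates on 16-bit words

def ror(v, r):  # rotate a 16-bit word right by r bits
    return ((v >> r) | (v << (16 - r))) & MASK

def rol(v, r):  # rotate a 16-bit word left by r bits
    return ((v << r) | (v >> (16 - r))) & MASK

def speck32_64_encrypt(x, y, key):
    """Encrypt one 32-bit block (two 16-bit words x, y) under a 64-bit
    key (four 16-bit words, most significant first): 22 rounds,
    rotation amounts 7 and 2."""
    # Key schedule: generates 22 round keys using the round function itself.
    l = [key[2], key[1], key[0]]
    k = [key[3]]
    for i in range(21):
        l.append(((k[i] + ror(l[i], 7)) & MASK) ^ i)
        k.append(rol(k[i], 2) ^ l[-1])
    # The ARX round: add, rotate, XOR -- and nothing else.
    for rk in k:
        x = ((ror(x, 7) + y) & MASK) ^ rk
        y = rol(y, 2) ^ x
    return x, y

# Test vector from the designers' paper:
# key 1918 1110 0908 0100, plaintext 6574 694c -> ciphertext a868 42f2
print([f"{w:04x}" for w in
       speck32_64_encrypt(0x6574, 0x694C, [0x1918, 0x1110, 0x0908, 0x0100])])
```

The entire cipher fits in a couple dozen lines, which is exactly the point of these lightweight designs.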
This is a really good paper describing the unique threat model of children
in the home, and the sorts of security philosophies that are effective in
dealing with them. Stuart Schechter, "The User IS the Enemy, and (S)he Keeps
Reaching for that Bright Shiny Power Button!" Definitely worth reading.
The US Department of Defense is blocking sites that are reporting about the
Snowden documents. I presume they're not censoring sites that are smearing
him personally. Note that the DoD is only blocking those sites on its own
network, not on the Internet at large. The blocking is being done by automatic
filters, presumably the same ones used to block porn or other sites it deems
inappropriate.
Interesting law journal article: "Privacy Protests: Surveillance Evasion
and Fourth Amendment Suspicion," by Elizabeth E. Joh.
Read this while thinking about the lack of any legal notion of civil disobedience
in cyberspace.
Here's a transcript of a panel discussion about NSA surveillance. There's
a lot worth reading here, but I want to link to Bob Litt's opening remarks.
He's the General Counsel for ODNI, and he has a lot to say about the programs
revealed so far in the Snowden documents.
As always, the fundamental issue is trust. If you believe Litt, this is all
very comforting. If you don't, it's more lies and misdirection. Taken at
face value, it explains why so many tech executives were able to say they
had never heard of PRISM: it's the internal NSA name for the database, and
not the name of the program. I also note that Litt uses the word "collect"
to mean what it actually means, and not the way his boss, Director of National
Intelligence James Clapper, Jr., used it to deliberately lie to Congress.
How Apple continues to make security invisible.
iOS security white paper:
Evgeny Morozov makes a point about surveillance and big data: it just looks
for useful correlations without worrying about causes, and leads people to
implement "fixes" based simply on those correlations -- rather than understanding
and correcting the underlying causes.
A philosophical perspective on the value of privacy.
This study concludes that there is a benefit to forcing companies to undergo
privacy audits: "The results show that there are empirical regularities
consistent with the privacy disclosures in the audited financial statements
having some effect. Companies disclosing privacy risks are less likely to
incur a breach of privacy related to unintentional disclosure of privacy
information; while companies suffering a breach of privacy related to credit
cards are more likely to disclose privacy risks afterwards. Disclosure after
a breach is negatively related to privacy breaches related to hacking, and
disclosure before a breach is positively related to breaches concerning insider
This is a really interesting article on secret languages. It starts by talking
about a "cant" dictionary of 16th-century thieves' argot, and ends up talking
about secret languages in general.
Nice history of Project SHAMROCK, the NSA's illegal domestic surveillance
program from the 1970s. It targeted telegrams.
We don't know what they mean, but there are a bunch of NSA code names on
LinkedIn profiles: ANCHORY, AMHS, NUCLEON, TRAFFICTHIEF, ARCMAP, SIGNAV,
COASTLINE, DISHFIRE, FASTSCOPE, OCTAVE/CONTRAOCTAVE, PINWALE, UTT, WEBCANDID,
MICHIGAN, PLUS, ASSOCIATION, MAINWAY, FASCIA, OCTSKYWARD, INTELINK, METRICS,
This is a *really* interesting article on something I've never thought about
before: how free games trick players into paying for stuff.
Today, the United States is conducting offensive cyberwar actions around
the world.
More than passively eavesdropping, we're penetrating and damaging foreign
networks for both espionage and to ready them for attack. We're creating
custom-designed Internet weapons, pretargeted and ready to be "fired" against
some piece of another country's electronic infrastructure on a moment's notice.
This is much worse than what we're accusing China of doing to us. We're pursuing
policies that are both expensive and destabilizing and aren't making the
Internet any safer. We're reacting from fear, and causing other countries
to counter-react from fear. We're ignoring resilience in favor of offense.
Welcome to the cyberwar arms race, an arms race that will define the Internet
in the 21st century.
Presidential Policy Directive 20, issued last October and released by Edward
Snowden, outlines US cyberwar policy. Most of it isn't very interesting,
but there are two paragraphs about "Offensive Cyber Effect Operations," or
OCEO, that are intriguing:
OCEO can offer unique and unconventional capabilities to advance US national
objectives around the world with little or no warning to the adversary or
target and with potential effects ranging from subtle to severely damaging.
The development and sustainment of OCEO capabilities, however, may require
considerable time and effort if access and tools for a specific target do
not already exist.
The United States Government shall identify potential targets of national
importance where OCEO can offer a favorable balance of effectiveness and
risk as compared with other instruments of national power, establish and
maintain OCEO capabilities integrated as appropriate with other US offensive
capabilities, and execute those capabilities in a manner consistent with
the provisions of this directive.
These two paragraphs, and another paragraph about OCEO, are the only parts
of the document classified "top secret." And that's because what they're
saying is very dangerous.
Cyberattacks have the potential to be both immediate and devastating. They
can disrupt communications systems, disable national infrastructure, or,
as in the case of Stuxnet, destroy nuclear reactors; but only if they've
been created and targeted beforehand. Before launching cyberattacks against
another country, we have to go through several steps.
We have to study the details of the computer systems they're running and
determine the vulnerabilities of those systems. If we can't find exploitable
vulnerabilities, we need to create them: leaving "back doors," in hacker
speak. Then we have to build new cyberweapons designed specifically to attack
those systems.
Sometimes we have to embed the hostile code in those networks -- these are
called "logic bombs" -- to be unleashed in the future. And we have to keep
penetrating those foreign networks, because computer systems always change
and we need to ensure that the cyberweapons are still effective.
Like our nuclear arsenal during the Cold War, our cyberweapons arsenal must
be pretargeted and ready to launch.
That's what Obama directed the US Cyber Command to do. We can see glimpses
of how effective we are in Snowden's allegations that the NSA is currently
penetrating foreign networks around the world: "We hack network backbones
-- like huge Internet routers, basically -- that give us access to the
communications of hundreds of thousands of computers without having to hack
every single one."
The NSA and the US Cyber Command are basically the same thing. They're both
at Fort Meade in Maryland, and they're both led by Gen. Keith Alexander.
The same people who hack network backbones are also building weapons to destroy
those backbones. At a March Senate briefing, Alexander boasted of creating
more than a dozen offensive cyber units.
Longtime NSA watcher James Bamford reached the same conclusion in his recent
profile of Alexander and the US Cyber Command (written before the Snowden
revelations). He discussed some of the many cyberweapons the US purchases:
According to Defense News' C4ISR Journal and Bloomberg Businessweek, Endgame
also offers its intelligence clients -- agencies like Cyber Command, the
NSA, the CIA, and British intelligence -- a unique map showing them exactly
where their targets are located. Dubbed Bonesaw, the map displays the geolocation
and digital address of basically every device connected to the Internet around
the world, providing what's called network situational awareness. The client
locates a region on the password-protected web-based map, then picks a country
and city -- say, Beijing, China. Next the client types in the name of the
target organization, such as the Ministry of Public Security's No. 3 Research
Institute, which is responsible for computer security -- or simply enters
its address, 6 Zhengyi Road. The map will then display what software is running
on the computers inside the facility, what types of malware some may contain,
and a menu of custom-designed exploits that can be used to secretly gain
entry. It can also pinpoint those devices infected with malware, such as
the Conficker worm, as well as networks turned into botnets and zombies --
the equivalent of a back door left open...
The buying and using of such a subscription by nation-states could be seen
as an act of war. 'If you are engaged in reconnaissance on an adversary's
systems, you are laying the electronic battlefield and preparing to use it'
wrote Mike Jacobs, a former NSA director for information assurance, in a
McAfee report on cyberwarfare. 'In my opinion, these activities constitute
acts of war, or at least a prelude to future acts of war.' The question is,
who else is on the secretive company's client list? Because there is as of
yet no oversight or regulation of the cyberweapons trade, companies in the
cyber-industrial complex are free to sell to whomever they wish. "It should
be illegal," said the former senior intelligence official involved in
cyberwarfare. "I knew about Endgame when I was in intelligence. The intelligence
community didn't like it, but they're the largest consumer of that business."
That's the key question: How much of what the United States is currently
doing is an act of war by international definitions? Already we're accusing
China of penetrating our systems in order to map "military capabilities that
could be exploited during a crisis." What PPD-20 and Snowden describe is
much worse, and certainly China, and other countries, are doing the same.
All of this mapping of vulnerabilities and keeping them secret for offensive
use makes the Internet less secure, and these pretargeted, ready-to-unleash
cyberweapons are destabilizing forces on international relationships. Rooting
around other countries' networks, analyzing vulnerabilities, creating back
doors, and leaving logic bombs could easily be construed as acts of war.
And all it takes is one overachieving national leader for this all to tumble
into actual war.
It's time to stop the madness. Yes, our military needs to invest in cyberwar
capabilities, but we also need international rules of cyberwar, more transparency
from our own government on what we are and are not doing, international
cooperation between governments, and viable cyberweapons treaties. Yes, these
are difficult. Yes, it's a long, slow process. Yes, there won't be international
consensus, certainly not in the beginning. But even with all of those problems,
it's a better path to go down than the one we're on now.
We can start by taking most of the money we're investing in offensive cyberwar
capabilities and spending it on national cyberspace resilience. MAD, mutually
assured destruction, made sense because there were two superpowers opposing
each other. On the Internet there are all sorts of different powers, from
nation-states to much less organized groups. An arsenal of cyberweapons begs
to be used, and, as we learned from Stuxnet, there's always collateral damage
to innocents when they are. We're much safer with a strong defense than with
a counterbalancing offense.
This essay originally appeared on CNN.com. It had the title "Has U.S. Started
an Internet War?" -- which I had nothing to do with. Almost always, editors
choose titles for my essays without asking my opinion -- or telling me beforehand.
Cyberwar arms race:
Presidential Policy Directive 20:
EPIC's suit from last October:
James Bamford's writing:
US accuses China:
Here's an essay on the NSA's -- or Cyber Command's -- TAO: the Office of
Tailored Access Operations. This is the group in charge of hacking China.
None of this is new. Read this Seymour Hersh article on this subject from
2010.
On his blog, Scott Adams suggests that it might be possible to identify
sociopaths based on their interactions on social media.
My hypothesis is that science will someday be able to identify sociopaths
and terrorists by their patterns of Facebook and Internet use. I'll bet normal
people interact with Facebook in ways that sociopaths and terrorists couldn't
duplicate.
Anyone can post fake photos and acquire lots of friends who are actually
acquaintances. But I'll bet there are so many patterns and tendencies of
"normal" use on Facebook that a terrorist wouldn't be able to successfully
Okay, but so what? Imagine you had such an amazingly accurate test...then
what? Do we investigate those who test positive, even though there's no suspicion
that they've actually done anything? Do we follow them around? Subject them
to additional screening at airports? Throw them in jail because we *know*
the streets will be safer because of it? Do we want to live in a "Minority
Report" world?
The problem isn't just that such a system is wrong, it's that the mathematics
of testing makes this sort of thing pretty ineffective in practice. It's
called the "base rate fallacy." Suppose you have a test that's 90% accurate
in identifying both sociopaths and non-sociopaths. If you assume that 4%
of people are sociopaths, then the chance of someone who tests positive actually
being a sociopath is only about 27%. (For every thousand people tested, 90%
of the 40 sociopaths -- 36 people -- will test positive, but so will 10% of
the 960 non-sociopaths -- 96 people.)
You have to postulate a test with an amazing 99% accuracy -- only a 1% false
positive rate -- even to have an 80% chance of someone testing positive actually
being a sociopath.
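The arithmetic behind those percentages is just Bayes' theorem, and it's easy to check:

```python
def posterior(prevalence, sensitivity, false_positive_rate):
    """P(condition | positive test), by Bayes' theorem."""
    true_positives = prevalence * sensitivity           # fraction correctly flagged
    false_positives = (1 - prevalence) * false_positive_rate  # fraction wrongly flagged
    return true_positives / (true_positives + false_positives)

# 90%-accurate test, 4% base rate of sociopaths:
print(round(posterior(0.04, 0.90, 0.10), 2))   # -> 0.27
# 99%-accurate test (1% false positive rate):
print(round(posterior(0.04, 0.99, 0.01), 2))   # -> 0.8
```

The rarer the condition, the more the false positives swamp the true ones, which is why even very accurate tests fail against rare targets like terrorists.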
This fallacy isn't new. It's the same thinking that caused us to intern
Japanese-Americans during World War II, stop people in their cars because
they're black, and frisk them at airports because they're Muslim. It's the
same thinking behind massive NSA surveillance programs like PRISM. It's one
of the things that scares me about police DNA databases.
Many authors have written stories about thoughtcrime. Who has written about
The 4% number:
BTW, if you want to meet an actual sociopath, I recommend this book and blog.