25 May 2015
What should GCHQ do? Part 3
From: "t byfield" <tbyfield[at]panix.com>
Date: Mon, 25 May 2015 00:39:32 -0400
Subject: Re: <nettime> What should GCHQ do?
You make some excellent points, Morlock, but -- if I understand your first
sentence correctly -- most of what I said doesn't follow from an ironclad
assumption that there are two sides. A simple proof: the DIY approach you
advocate would have the same effect of ~privatizing records. Why? Because
monolithic and DIY approaches both cast unknown third-party readers as
'attackers.' Unfortunately, third-party readers also include regulators,
the public, scholars, historians, serendipity. I think erasing the past by
encrypting it is functionally equivalent to erasing the past for political
purposes -- the difference mainly boils down to motive, which as we know
is less durable than concrete outcomes.
Many communications and records have no real need to be very secure. That
is, in many cases, if there is a need, it's often external and systemic --
for example, the need to lock down *all* I/O in a given domain in order to
prevent indirect attacks (for example, spearphishing a mid-level employee
in order to compromise a system that has access to another system ad nauseam,
in order to achieve some high-level goal).
Also, I think you're mistaken that the widespread use of idiosyncratic crypto
would have much of an impact on state actors. Of course the bulk of their
currently implemented systems are tailored to the use of standardized
cryptography. But those same actors are quite capable of accurately analyzing
unknown objects, and of doing so on a large scale. ('Objects' includes arbitrary,
incomplete, and/or noisy portions of streams, *all* activity in a given frequency
range, and so on.) They'd certainly be able to keep pace with the adoption
of idiosyncratic crypto. The moment it becomes 'too expensive' to rely on
the known-crypto approach, state actors -- being *state* actors -- will just
revalue the currency, as it were, by switching over to more flexible, exploratory
systems. The 'increase the cost' argument may be one of the few things less
durable than motives.
But this is just quibbling. I think your main point is that reducing these
questions to two sides is a mistake. One implication of that, which you didn't
explore, is what we're seeing: the dissolution of this area of the state
into a 'community' -- a plurality of more or less connected, more or less
official entities. As that progresses, we'll see (or maybe it'll be there
but we won't see it) a stratification based on different levels of resources
and access. Some will have the horsepower to break whatever you implement,
others won't. The risk is that civil society -- regulators, the public, scholars,
historians, serendipity, etc -- will have the least.
Date: Sun, 24 May 2015 22:14:02 -0700
From: Rob Myers <rob[at]robmyers.org>
Subject: Re: <nettime> What should GCHQ do?
On 24/05/15 07:09 PM, t byfield wrote:
> I'm skeptical about crypto absolutism because one of its first effects
> would be, in effect, to *privatize* everything. 'Public' would be
> reduced to whatever was cracked or leaked, as if Wikileaks and Snowden
> were the norm rather than the harrowing exception. And that would
> apply not just to social or communicative records but also -- as anyone
> who has lost a key or a password knows -- to one's own records.
This is true to the same extent that it's true that paper can burn or get
wet. Or be eaten by termites.
Imagine if you had to read about whatever is 'public' on paper rather than
hear it in the town square. Or if the government could destroy historical
records with furnaces. Or if written information could be sent around the
world secretly in diplomatic cases without the public ever hearing it.
Literary absolutism is just clerical determinism. It destroyed the public
realm millennia ago.
Date: Mon, 25 May 2015 09:27:15 +0100 (BST)
From: William Waites <ww[at]styx.org>
Subject: Re: <nettime> What should GCHQ do?
On Sun, 24 May 2015 22:09:00 -0400, "t byfield" wrote:
> I'm skeptical about crypto absolutism because one of its first
> effects would be, in effect, to *privatize* everything. 'Public'
> would be reduced to whatever was cracked or leaked
As was pointed out to me on IRC (and I agree, and tried to include this point),
the main problem is that most people cannot accurately distinguish between
public and private when it comes to communication. The way the network treats
their data often does not match their intentions.
Most often this happens in the direction of mistakenly making something public
that was intended to be private, such as a message between you and your spouse.
It can happen in the other direction too, but the situation is not symmetric:
you can publish things that were once private but you cannot unpublish things.
> But I do think that the growing 'moral' push toward secure
> communications is troubling, and that preserving 'insecure'
> communications channels as a legitimate choice is vital.
Publishing something -- making it public -- is one thing. This message is
public. However, the act of publishing, and the act of reading, can be private.
In sending this message, some details about exactly where and how and by
whom it was sent are obscured. In my case it doesn't really matter much.
I even put my real name on it and anyone who wants to find me can easily
do so. But for some people -- the prototypical example being journalists
in a hostile place -- it matters very much. By arranging for it to be difficult
to see, on the wire, what is going on, we help them because it means they do
not stand out. That's the moral argument.
Insecure channels generally are still opaque to most people. The only ones
who benefit from them are those in a privileged position to watch what is
happening on the wire. There is no practical difference to the reader or
author if a message is transmitted over a secure or an insecure channel.
It only matters to someone else who might be watching.
Storage is a little different, but only a little. If you store your information
on a computer that you control, then there is not much benefit to encryption --
unless it is possible for someone else to come to control it without your
permission, and there are many ways that this can happen. If you store your
information on somebody else's computer then you had better trust them and
transitively anyone else who is in a position to see their computer. Or you
can ``trust the math and the engineers'' as you put it.
But the thing is, you don't have to just trust the math. You can check it
for yourself. You can check the implementations by the engineers. That's
difficult and impractical for most people but it is possible in principle.
Maybe you have a friend that you trust who tries to keep on top of these
things. I am not a mathematician or a cryptographer but I know some of them,
and I find that in virtually all cases I trust their *motivations*. They
are human so there is a gap between the theory and what is the case in the
world, but we try to narrow that gap. To me it seems better on average to
place trust in people who are in the business of clearly explaining things
rather than obfuscating and appealing to emotions in order to profit.
Date: Mon, 25 May 2015 11:04:45 -0700
Subject: <nettime> Re: What should GCHQ do?
I think that there are two distinct issues conflated here, both of them
already mentioned in the comments:
Like literacy or a ship's voice pipes, the Internet and crypto are technologies.
They are presented in end-user contraptions called computers (as literacy
is presented on the contraption called paper.) These technologies can carry
and replicate communication, in ways invisible to the speaker (couriers,
postal service, photocopiers, voice pipes, fiber, switches) to some end
recipients one may not see or even know about.
There was a huge difference between public and private before these technologies
existed. Private was something you communicated to one or a few individuals
who could hear you; public was something you shouted to the whole gathering.
Today you whisper and shout into the same contraption, and this is totally
non-intuitive. Is that voice pipe ending in the machine room, ship's mess,
or captain's bedroom?
The only way to re-establish the intuitive concept of public/private is,
again, to use technology, in this case cryptography. Like literacy, not everyone
will be able to effectively do that, but many will (though likely fewer than
in the case of literacy.)
There is really no choice - technologies are here to stay. If you don't want
to learn to read and write, and must rely on and trust scribes, you shall be
f*cked. Same thing with crypto. Stop wasting time begging governments to
stop listening - it's stupid and silly, at the same time. Instead, become
literate, or join the ranks of the f*cked. It's called civilization.
A side technical note on:
> known-crypto approach, state actors -- being *state* actors -- will just
> revalue the currency, as it were, by switching over to more flexible,
> exploratory systems. The 'increase the cost' argument may be one of the
> few things less durable than motives.
There are no known automated ways to efficiently decipher human-generated
scramblings of any quality. That strategy raises the bar in the sense that
it requires a similar (if not greater) expenditure of wetware brain cycles
to de-scramble than to scramble. Narrowing the context to individual
correspondent pairs -- a context not shared with any other pair -- makes the
de-scrambling expensive. As an example, the two can pick a book or mp3
song and then communicate by using page and word (byte) numbers. Today this
can be done electronically very smoothly. The common objection here is that
such a naive pad implementation is easy to break -- much easier than, say,
breaking 256-bit ChaCha20-Poly1305. That's very true, but it has to be broken
*per* correspondent pair, not once for all. Even if it takes only one minute
to try petabytes of all known mp3s, do frequency analysis, guess the language,
and, assuming that it was simple XOR (without doing ROT-13 first), retrieve
the plaintext, that one minute per pair makes mass surveillance dead in the
water. They don't have that minute to spend.
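The per-pair book scheme described above can be sketched roughly as follows.
This is a hypothetical illustration, not the author's code: the shared 'book'
is a short text, and word positions stand in for the page/word numbers
mentioned in the post:

```python
import random

def build_index(book_text):
    # Map each word in the shared book to every position where it occurs.
    index = {}
    for pos, word in enumerate(book_text.lower().split()):
        index.setdefault(word, []).append(pos)
    return index

def encode(message, index):
    # Replace each word with a randomly chosen occurrence in the book, so
    # repeated words need not always map to the same number (blunting simple
    # frequency analysis). Words absent from the shared book raise KeyError.
    return [random.choice(index[word]) for word in message.lower().split()]

def decode(positions, book_text):
    # The receiver, holding the same book, just looks the positions up.
    words = book_text.lower().split()
    return " ".join(words[p] for p in positions)

book = "we hold these truths to be self evident that all men are created equal"
idx = build_index(book)
ciphertext = encode("all men are created equal", idx)
print(ciphertext)                # [9, 10, 11, 12, 13]
print(decode(ciphertext, book))  # all men are created equal
```

As cryptography this is trivially weak; the point in the post is different:
an attacker must first learn *which* book each correspondent pair shares, so
the (small) breaking cost is paid once per pair rather than once for everyone.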