-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

I have a number of comments about the New York Times article on PGP 5.5 for
Business of which Martin Minow sent a synopsis.

If we had built what they said we had, then we'd deserve all the derision
people have directed at us. But we didn't. The New York Times got it flat
wrong.

I'll describe what we built, how it works, and its limitations. But first,
some background on the problem we're trying to solve in PGP 5.5.

A couple of years ago, the government sugarcoated their surveillance plans
by switching from "key escrow" to "key recovery," trying to sell
surveillance to people by pointing out some of the downsides of strong
cryptography and selling key recovery as the way around them.

One of the downsides of cryptography is that if you lose your passphrase
(or token, PIN, smart card, or whatever), you've lost your data. My
favorite way of expressing this problem is, "if you lose the keys to your
car, then you have to get a new car."

This downside is particularly insidious for a number of reasons. First,
without fixing that problem, strong cryptography will be in some sort of
limbo. You want to use it to protect your valuable information, but you
won't want to use it for any information that's *too* valuable, because
it's easily lost. Crypto-protected information is fragile, and this
fragility could hurt its widespread deployment.

Worse, this gives the government a rationale for regulating cryptography.
Like it or not, government has a mandate to protect the people from
dangerous technologies, be they in foods, drugs, autos, or information
technologies. Many people believe that the government uses this mandate as
a rationale for acquiring power, and many would prefer that they let us
take our chances, but that's not germane to this discussion.

It *is* germane to note that when you hear some ham-fisted remark about how
surveillance is like air bags, they are saying: they have to protect people
from dangerous things, crypto is dangerous, therefore they have to protect
people from crypto.

When they started mumbling along these lines, the privacy community got
their own act together and started describing what we believe to be the
real solution. This is called "data recovery." The first time I heard the
term, I hated it. I still hate it. The reason I hate it is that it's got
the word "recovery" in it, which makes it sound to someone who isn't paying
a lot of attention that all recovery systems are basically the same thing.
Most of the people in the world don't pay a lot of attention most of the
time. When I was at HIP97 this August, I was amused to hear cypherpunks
chanting, "Data recovery good, key recovery bad." The sublimely Orwellian
tone of this mantra makes me laugh and cringe at the same time. (To explain
the reference, in Orwell's "Animal Farm," there's a revolution in the farm
and the animals take over, run by the pigs. One of the slogans they have is
"four legs good, two legs bad." By the end of the book, the pigs are nigh
indistinguishable from the people. But I digress.)

The essence of data recovery is that focusing on the keys is a canard. If
you've misplaced your data, you want the data back, not the keys. The only
people who want your keys are people who want to spy on you. If you've
locked yourself out of your car, you want the use of your car, not just the
key. Thus, the solution to encrypted data being fragile is to let
people get to the data. Simple, obvious, but subtle, because the key to
getting the data is the key.

If you don't like data recovery, you aren't going to like what we did in
PGP 5.5 -- we built a data recovery system. Some people aren't going to
like it, and some of those will think this missive is a load of
self-serving twaddle. Myself, it gives me the same mildly uncomfortable
feeling that fake rocks for spare keys, skeleton keys, financial audits, or
any other similar technology do. Uncomfortable feelings aside, if
the fragility problem is not solved, then many people who should be using
crypto won't, and government will continue to view this problem as a
question of public safety, and thus in their mandate to regulate.

Data recovery is useful for a number of things. Perhaps you lost your
passphrase. Or data might have been encrypted by an employee or co-worker
who was in an accident. (As an aside, fifteen years ago, the architect of a
product I worked on was in a severe car wreck. He was not killed, but
suffered brain damage and has never returned to work.) Your spouse might
need access to financial records. Everyone, be they an individual,
business, or corporation, has a right to have their data protected, and
protection means not only being able to put it into a safe, but getting it
out of that safe later.

What makes data recovery different from key recovery? In my opinion, data
recovery allows you to get encrypted data without compromising the key of
the person who encrypted it. Data is property, and keys are property. An
ethically built system allows emergency access to data without destroying
the property of the key owner.

Ethically built data recovery software has a number of properties:

(1) It is surveillance-surly. It should be impossible or unwieldy for an
adversary, be they government (yours or foreign), dirtballs (such as
crackers), business competitors (who sometimes count as dirtballs), or
others to use this against you. The system should also be aware of how
passive surveillance (like traffic analysis) interacts with it.

(2) It is an "opt-in" system. Users must consent to it, and must take some
action to start using it. It should be as easy as possible to stop using
the system. The system must also allow someone who does not opt in to use
all the system's features. Please note that abuses of consent (for example,
an employer who says, "consent or we fire you") are something we can't
prevent in any system.

(3) It must obey the principle of fair warning. If you send me a message
that is subject to data recovery, you should know that before sending the
message. This way, if you don't agree with my policy, you can decide not to
send me that message. This interacts closely with the opt-in principle above.

(4) The data recovery system should be preferable to an escrow system. A
number of corporations who use PGP keep copies of their employees' secret
keys. This is both odious and dangerous. Escrowed keys are a target for
attackers and subpoena-bait, and they potentially ruin the value of digital
signatures. It's just bad policy.

(5) The system has to allow someone under a legal threat to respond
effectively to that threat. Legal threats include warrants, subpoenas, and
discovery processes. You have to be able to respond to the request for
information without losing your keys and thus all of your data.

(6) It must also provide a response to those who would regulate crypto in
the name of public safety. Fortunately for us, potential regulators have
focused on the horsemen of the infocalypse. There are other
pseudo-public-welfare issues, including the rights of a person to their
spouse's records, or the rights of heirs to information property. We, the
people who design privacy systems, have to think about what happens when
the regulators stop dragging out the pornographers and start dragging out
the poor widows and orphans.

Note that these requirements are not completely consistent with each other.
For example, an opt-in system is riskier than an opt-out system, yet
friendlier to one's own privacy. Balancing these requirements is part of
the difficulty of good software design.

If you have been skimming the above, wondering when I'm going to get around
to actually telling you what we did in PGP 5.5, this is it.

With PGP 5, there are a number of attributes of your key that are stored in
a self-signature. For example, your preferred symmetric algorithms are
stored in your self-signature. The data recovery feature -- which we call
"Corporate Message Recovery" -- is an attribute in your self-signature that
tells anyone who receives your key that you want messages encrypted to you
to also be encrypted to another key -- the recovery key. There is also a
flag that tells the encryptor, "please" or "I insist." Architecturally,
there can even be more than one recovery key. That's it. That's all it is.

Well, that's mostly all it is. There are other bits of the system. For
example, if I look up Alice's key on a key server and Alice has a recovery
key, I get Alice's recovery key, too. If Alice's recovery key is a "please
use" key, then I can encrypt to Alice alone. In any case, the PGP software
tells me that Alice has a recovery key, so I can decide to use some other
mechanism to talk to her.

Note that this design satisfies the opt-in and fair-warning requirements.
Also, since Alice's recovery key is an attribute of her self-signature, she
can change it. She can even have a second user name (let's call it Bob) that
has no recovery key.
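
To make that concrete, here is a rough sketch, in Python, of the kind of
information the attribute carries and of the fair-warning behavior
described above. Everything in it is invented for illustration -- the
structure names, the key IDs, and the function -- and it is not PGP's
actual key format or source code.

    # Illustrative sketch only; not PGP's real key format or code.
    from dataclasses import dataclass, field
    from typing import List

    MODE_PLEASE = "please"   # "please also encrypt to my recovery key"
    MODE_INSIST = "insist"   # "I insist that you use my recovery key"

    @dataclass
    class RecoveryAttribute:
        recovery_key_id: str  # 64-bit key ID of the recovery key (hex here)
        mode: str             # MODE_PLEASE or MODE_INSIST

    @dataclass
    class UserName:
        name: str
        recovery: List[RecoveryAttribute] = field(default_factory=list)

    # Alice's user name carries a corporate recovery key in its
    # self-signature; her second user name, "Bob", carries none.
    alice = UserName("Alice <alice@example.com>",
                     [RecoveryAttribute("0xAAAABBBBCCCCDDDD", MODE_PLEASE)])
    bob = UserName("Bob <bob@example.com>")

    def plan_recipients(user, key_id, add_please_keys=False):
        """Give fair warning about any recovery keys, then return the
        key IDs a message to this user name would be encrypted to."""
        targets = [key_id]
        for rec in user.recovery:
            print("Note: %s has a recovery key %s (mode: %s)"
                  % (user.name, rec.recovery_key_id, rec.mode))
            if rec.mode == MODE_INSIST or add_please_keys:
                targets.append(rec.recovery_key_id)
        return targets

    # Warns me about Alice's recovery key, but since it is a "please use"
    # key, I may still encrypt to Alice's key alone:
    print(plan_recipients(alice, "0x1111222233334444"))
    print(plan_recipients(bob, "0x1111222233334444"))  # no warning at all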

Also, we have three encryption products: PGP freeware, PGP for Personal
Privacy, and PGP for Business Security. Corporate Message Recovery is
included *only* in PGP for Business Security. It is not, and never will be,
in either the freeware or the Personal Privacy product. It is an extra cost
item that we created for businesses as per their requirements. As I stated
above, a number of these businesses keep copies of their employees' secret
keys. One of the reasons we created this feature is to satisfy their
requirements with some mechanism that is less blunt than key escrow.

When a PGP message is formed, there are a number of "packets" that make up
the message. The usual construction is that there is a "session key" packet
for each public key that the message is to be read by. Following that is
the actual message packet, which is encrypted with a symmetric cypher using
the session key. The session key packets specify the *key*, by its 64-bit
key ID.

This is an important and subtle point. Let's go back to Alice, a.k.a. Bob.
The information that specifies a recovery key is in a self-signature of a
user name, but the session key specifies a public key by keyID. It is
impossible, solely from looking at a message, to know if it is addressed to
Alice or Bob because that information is not stored in the message. A
message that does not honor recovery is syntactically correct.
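
As a hedged sketch of that construction (Python again, with made-up packet
fields and key IDs rather than the real OpenPGP wire format), a message is
just a list of packets, and nothing in it names a user:

    def build_message(plaintext, recipient_key_ids):
        """Model a PGP message: one session-key packet per 64-bit key ID,
        followed by one symmetrically encrypted data packet."""
        packets = [{"type": "session-key",
                    "key_id": kid,  # a 64-bit key ID, never a user name
                    "esk": "<session key encrypted to this public key>"}
                   for kid in recipient_key_ids]
        packets.append({"type": "encrypted-data",
                        "body": "<%r encrypted with the session key>"
                                % plaintext})
        return packets

    # Both of these are syntactically correct messages to the same key;
    # only the second also encrypts to the recovery key, and neither says
    # whether it was addressed to "Alice" or to "Bob".
    to_key_only   = build_message("hi", ["0x1111222233334444"])
    with_recovery = build_message("hi", ["0x1111222233334444",
                                         "0xAAAABBBBCCCCDDDD"])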

I don't know why Bruce Schneier said that this is everything the FBI wants.
If it is, then they have changed their spots! One of the major ways PGP's
system
differs from anything else I've seen is that it has no enforcement built
into the protocol. This helps make PGP surveillance-surly, with or without
Corporate Message Recovery. If this is all the FBI needs, then they've
decided the way to get your files is to knock on your door with a warrant,
and that's a big, big, big step forward.

Getting back to the system, I'm sure you've noticed a gotcha there. If I
mail Alice a message that I encrypted to Bob, she can decrypt it, but the
recovery key can't. If you've been paying very close attention, you may have
wondered something akin to, "hmm, if Alice's key accidentally lost its
self-signature, there would be no way to encrypt to her recovery key."
You're right. If Alice really wants recovery on messages sent to her, then
she has to use our SMTP Policy Management Agent.

The policy management agent is an SMTP proxy. You can configure it to do a
number of things. Most relevant to this discussion is that Alice can use
the policy agent to require that her recovery key gets used. However, the
policy agent does *not* decrypt the message. One of the very good features
of Business PGP is that it does not decrypt the message. It does not
prevent or even try to prevent multiple-encryption. It's really, really
easy to encrypt a message to Alice alone, and then encrypt that message to
both Alice and her recovery key. We're not going to change that. Nor does
the policy management agent archive messages, make copies, notify your
mother, or any of the other things we've been accused of doing with it.
It's simply the gatekeeper that enforces Alice's corporate policy.
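
Purely to illustrate the gatekeeping idea, here is a hedged sketch of that
check (this is not the Policy Management Agent's actual code; the packet
dicts follow the made-up format of the earlier sketch). It looks only at
which key IDs appear in the session-key packets, and it never decrypts
anything:

    REQUIRED_RECOVERY_KEY_ID = "0xAAAABBBBCCCCDDDD"  # hypothetical key ID

    def passes_policy(packets, required=REQUIRED_RECOVERY_KEY_ID):
        """Accept a message only if some session-key packet is addressed
        to the required recovery key; never touch the encrypted body."""
        session_key_ids = [p["key_id"] for p in packets
                           if p["type"] == "session-key"]
        return required in session_key_ids

    # With the messages from the earlier sketch:
    #   passes_policy(to_key_only)    -> False: bounced back to the sender
    #   passes_policy(with_recovery)  -> True:  relayed, never decrypted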

To sum up, we created the Corporate Message Recovery feature to satisfy the
requirements of our customers who need emergency access to data. We made
careful decisions to make it useful and effective for honest people, while
minimizing its potential for abuse. No one has to use it; we do not include
it with PGP freeware, nor with PGP for Personal Privacy. We alert all users
of all products when they encrypt to someone who has a message recovery
key. It is an opt-in system that you can opt out of. It is not a
surveillance system. A few weeks ago, we showed it to the FBI and asked
their opinion. They told us it doesn't meet any of their needs.

- - ------
Jon Callas                                         jon@pgp.com
Chief Scientist                                    555 Twin Dolphin Drive
Pretty Good Privacy, Inc.                          Suite 570
(415) 596-1960                                     Redwood Shores, CA 94065
Fingerprints: D1EC 3C51 FCB1 67F8 4345 4A04 7DF9 C2E6 F129 27A9 (DSS)
              665B 797F 37D1 C240 53AC 6D87 3A60 4628           (RSA)

-----BEGIN PGP SIGNATURE-----
Version: PGP for Business Security 5.5

iQA/AwUBNDqoyn35wubxKSepEQJ9FQCfQcaS8aWdXcTZild0nKe5+LatDRsAnA5n
GTIb2dYUx4+Uh/Frim2hKFuF
=4u2g
-----END PGP SIGNATURE-----

