I've been trying for a couple of weeks to put together some interesting posts on the cryptographic modes of operation for confidentiality and integrity, and I just can't do it. I'm finding it boring to write about, and if it bores me to write it, I know there's no way that it's going to be engaging to readers!

So, I'm going to move on. I've explained the basic idea of the message

authentication code as an integrity check, and I've described one simple way of

integrating it into a common mode of operation. If you're really interested in

learning more, I recommend Bruce Schneier's book on cryptography, which has a ton of

material on modes of operation and protocols, how they work, and how they can

fail.

Meanwhile, I'm going to move on to something that doesn't bore me to write about,

and therefore hopefully won't bore *you* to read about: asymmetric

cryptography, also commonly referred to (although not entirely accurately) as public

key cryptography.

What we've looked at so far is symmetric cryptography. The basic idea of it is illustrated to the right. In a symmetric

cryptosystem, you've got two parties who want to communicate, and they've got a

*shared secret*. That shared secret is the key to the cryptosystem. Put in

pseudo-mathematical syntax, the symmetric cryptosystem is a pair of functions, En and De, such that given a secret key S:

- Ciphertext = En(S, Plaintext)
- Plaintext = De(S, Ciphertext)
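That pair of functions can be sketched in a few lines of Python. The repeating-XOR scheme below is purely a stand-in for a real cipher like AES, and every name in it is illustrative; it is deliberately *not* secure, it just shows the shape of a symmetric system, where one shared secret works in both directions:

```python
# Toy symmetric cryptosystem: one shared secret S, and a pair of
# functions En/De that are inverses under that key.
# Repeating-key XOR is for illustration only -- NOT a real cipher.

def keystream(key: bytes, length: int) -> bytes:
    """Repeat the key to cover the message length (demo only)."""
    return bytes(key[i % len(key)] for i in range(length))

def En(S: bytes, plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, keystream(S, len(plaintext))))

def De(S: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return En(S, ciphertext)

shared_secret = b"our shared secret"
message = b"meet me at noon"
ct = En(shared_secret, message)
assert De(shared_secret, ct) == message   # the same key works both ways
```

Note that `En` and `De` are literally the same function here; in general they need not be, but both parties hold the same secret and can perform both operations, which is exactly the symmetry referred to below.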

We call it symmetric because both the sender and the receiver need the

*same* information. Both of them can encrypt messages using the key; both can

decrypt. There's no way to distinguish between the two on the basis of what

information they have, or how they encrypt their messages.

In an asymmetric system (as illustrated above) that's no longer true. Asymmetric systems use

two keys, S_{1} and S_{2} (also called the private key and the public key) such that:

- Ciphertext_{S1} = En(S_{1}, Plaintext)
- Plaintext = De(S_{2}, Ciphertext_{S1})
- Ciphertext_{S2} = En(S_{2}, Plaintext)
- Plaintext = De(S_{1}, Ciphertext_{S2})
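A toy RSA example shows this two-key relationship concretely: whatever one key encrypts, the other decrypts. The tiny primes here are purely illustrative (real RSA moduli are hundreds of digits long), and raw "textbook" RSA like this is not secure:

```python
# Toy RSA illustrating the S1/S2 relationship above.
# DO NOT use numbers this small for anything real.

p, q = 61, 53
n = p * q                    # the modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent (modular inverse; Python 3.8+)

S1, S2 = (d, n), (e, n)      # private key, public key

def En(key, m):
    exp, mod = key
    return pow(m, exp, mod)

De = En  # in RSA, "decrypting" is the same operation with the other key

m = 65                       # a message, encoded as a number < n
assert De(S2, En(S1, m)) == m   # encrypt with private, decrypt with public
assert De(S1, En(S2, m)) == m   # encrypt with public, decrypt with private
```

The crucial property is in that last pair of assertions: the two keys undo each other, in either order, yet knowing `(e, n)` doesn't let you feasibly compute `d` (for real-sized primes, that would require factoring `n`).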

In English, suppose you've got two people, Ann and Bob, who want to communicate.

They each have *their own* key. To send a message to Bob, Ann encrypts the

message with *her* key. When Bob receives the message, he decrypts it

*not* with the same key that Ann used to encrypt it, but with *his own*

key. When he wants to send a message back to Ann, he takes the same key that he used

to *decrypt* the message from Ann, and uses it to *encrypt* a message

*to* Ann.

This leads to one of the basic terminological confusions. Reading that description, you'll notice an element of symmetry: Bob's key decrypts messages *from* Ann, and encrypts messages *to* Ann. There is a very elegant symmetry to

asymmetric encryption! But the terms asymmetric and symmetric in cryptography refer to

the information that each of the parties uses in a cryptosystem. In a symmetric

system, both parties have *exactly the same information*. In an asymmetric

system, they've got *different* information.

For the most part, the basic requirements of an asymmetric cryptosystem are

very similar to the requirements of a symmetric one:

- given the encryption key and the plaintext, you want to be able to compute the ciphertext easily;
- given the ciphertext and the decryption key, you want to be able to compute the plaintext easily;
- given the ciphertext and the plaintext, you *don't* want to be able to compute the key;

and so on. But there's one *major* complication in asymmetric cryptosystems that creates an additional requirement: given the encryption key, it must be computationally infeasible to generate the decryption key.

That last requirement kills the overwhelming majority of potential and proposed

asymmetric systems. Having one of the keys in a pair means that you've effectively

got access to an unlimited chosen plaintext attack: you've got the ability to generate

the ciphertext of any plaintext, as many times as you want, and to use that to try

to find the value of the key that will decrypt that text.

The most common use of asymmetric cryptography is public key cryptography. One of the really fascinating things about it is that instead of trying to keep both keys private, you deliberately make one of the two keys widely available - you give it to *everyone*.

The idea is that you take one key, which we call your private key, and you lock

that away. No one but you should *ever* have access to your private key. The

second key, called your public key, you give to everyone. You post it on public

websites, you register it with key libraries, you send it to all of your friends, etc.

Anyone who wants to be able to read encrypted messages from you needs to be able to

get your public key.

Now, when you want to send someone a message and prove it's from you, you encrypt

it with your private key. Anyone can decrypt it - but *only you* could have

encrypted it in a way that allows your public key to decrypt it.

Similarly, when someone wants to send you an encrypted message, they encrypt it

with your public key. Then *only you* can decrypt it.

To set up a secure cryptographic channel with someone, you need to have

*their* public key. So, suppose that we have Amy and Bill, who want to send

secure messages to each other. What are the goals of the cryptosystem here? When Amy

wants to send a message to Bill, she wants to make sure that *only Bill* can

read it. When Bill gets a message from Amy, he wants to make sure that *only
Amy* could have sent it.

The way we can provide both of those is by having Amy take her message, and

*first* encrypt it with *her* private key. Then she encrypts it with

*Bill's* public key. The message has been encrypted twice: the first encryption

pass guarantees that the message is from her (because no one else could have encrypted

a message with her private key); the second encryption pass guarantees that no one but

Bill can read it (because no one but Bill can decrypt a message with his private

key).

Bill does exactly the same thing when he wants to send a message to Amy. First he encrypts it with his private key, to show that it's really him who sent the message.

And then he encrypts it with *Amy's* public key, which ensures that only Amy can read it.
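This double-encryption scheme can be sketched with toy "textbook" RSA numbers. Everything below is illustrative (real systems use enormously larger keys, plus padding); the one structural detail worth noting is that Amy's modulus must be smaller than Bill's, so her inner ciphertext still fits as a "plaintext" for the outer pass:

```python
# Sketch of sign-then-encrypt with toy RSA. Illustrative only.

def make_keys(p, q, e=17):
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)            # modular inverse; Python 3.8+
    return (d, n), (e, n)          # (private key, public key)

amy_priv, amy_pub = make_keys(61, 53)     # n = 3233
bill_priv, bill_pub = make_keys(89, 97)   # n = 8633, larger than Amy's

def crypt(key, m):
    exp, mod = key
    return pow(m, exp, mod)

m = 1234                                  # Amy's message, as a number < 3233
inner = crypt(amy_priv, m)                # pass 1: only Amy could produce this
outer = crypt(bill_pub, inner)            # pass 2: only Bill can undo this

# Bill undoes the passes in the opposite order:
recovered = crypt(amy_pub, crypt(bill_priv, outer))
assert recovered == m
```

Bill first strips off the layer only he can remove (with his private key), then strips off the layer that proves the message came from Amy (with her public key).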

Towards the beginning of this post, I said that asymmetric cryptography is sometimes *incorrectly* called public key cryptography. The reason I said that is that public key cryptography is really the combination of an asymmetric cryptosystem *together with* the infrastructure that allows the kind of scenario I described above, for Amy and Bill, to work. For it to work, Amy and Bill need a way of getting each other's public keys - and of *being sure* that they are really getting the correct, valid public keys for each other. There's a very elegant cryptographic attack, called a *man in the middle* attack, that relies on weaknesses in the public key infrastructure rather than in the underlying asymmetric cryptosystem; I'll describe it in a future post.

Are you going to explore the different types of asymmetric encryption? I've always been interested in Elliptic Curve Cryptography, but not due to any real understanding of it, just because NeXT had it when I was young, and NeXT was cool. 🙂

Years ago I had the idea of burning the private key into a processor chip during manufacture, in such a way that the processor could use the key but it was unavailable from the outside both physically and electronically. The manufacturing equipment that created the key pair would print the public key on the casing or otherwise publish it and then forget the private key completely.

The chip would then be uniquely self-identifying; it can encode and decode with the private key, and no one else can. Nowadays you could build the chip into a cell phone, add some really good biometric hardware to make sure that only the owner can use it (say a DNA scanner), and you have a nearly unbreakable communications and identification system.

We applied for a patent and got it; it was assigned to my employer at the time, Prime Computer. After a while I left, Prime died, and the patent expired.

I really wish I knew if it was a good idea or not.

It was a good idea, but it was born too early. http://en.wikipedia.org/wiki/Trusted_Platform_Module

What is wrong with using plain old "Alice and Bob"?

Tobias:

Nothing's wrong with Alice and Bob. It's just boring to always use the same names, so I went for a variation. 🙂

Robert:

How far I get in asymmetric encryption will be determined by a combination of how long it takes before I get bored, and how well readers respond to it. If lots of people really want to know about elliptical-curve encryption, I'll probably wind up writing about it.

I can't speak for Alice, but I'm sick and tired of just encrypting and decrypting the whole day long. Half the time it's just another string from lorem ipsum. I have to do this by moving rocks around in a desert, you know. It's not fun.

Bob: It doesn't sound like such a great idea to me. Consider:

1. The chip gets fried. Now no one, including yourself, can decrypt or sign any messages to/from you. Oops.

2. The chip gets hit by a random alpha particle that flips one of the bits in the secret key. Similar to #1, except that you may not realize the problem for a while, and send out a number of critical messages signed with a random key.

3. You really have to trust the manufacturer not to keep an extra copy of the secret key, don't you? In view of #1 and #2, this may even be what a lot of the customers want as a backup plan. But then you have to trust that the manufacturer doesn't make an extra copy of the key list for the NSA, or whoever. Given how easily most telecoms seem to have agreed to break the FISA wiretapping law at the government's request, how far do you think a manufacturer can be trusted? Now consider that a lot of chip manufacturing is currently taking place offshore...

Lassi: Thanks. I hadn't looked at that effort for a couple of years. They talk about doing a hash of the OS code to verify that it hasn't been tampered with, something I also had in my patent. I really think that if the patent hadn't expired, they'd be subject to it.

Dave W:

#1: Losing the key is always something you have to worry about with encryption. You just have to keep the possibility in mind when you're deciding how to use it. For instance, I wouldn't use a key embedded in my cell phone for important, long-term documents, because I might lose the phone. I wouldn't store important documents in just one place, or encrypt them with just one key. I wouldn't store all my keys in the same place.

#2: Checksums and other redundancy can reduce this problem to near-zero probability.

#3: There's always going to be something or someone that has to be trusted somewhere in the implementation and use of a secure system. If the manufacturer can't be trusted, then you need an oversight agency that inspects and monitors their equipment. And you have to trust that agency.

#1: Make spares.

#3: Use a field-programmable gate array.

"...Bruce Schneier's book on cryptography..." Which one?

Wait a minute, Mark, you have this wrong.

In public key cryptography you don't have Alice encrypting with her private key and Bob decrypting with his private key (as you write in the text and the picture suggests)!

Alice encrypts with Bob's public key and sends the message to Bob, who can decrypt with his private key.

Alice only uses her private key to sign the message (or better: a hash of the message) and Bob can use her public key to verify that she did the signing.

At least, that's as far as I understand asymmetric cryptography - how could decryption possibly work with two private keys that have nothing to do with each other?!

RE: WiseWoman (comment 10)

> In public key cryptography you don't have Alice

> encrypting with her private key and Bob decrypting

> with his private key.

Mark has it right.

You _DO_ have A encrypting with her private key, in order to "sign" the message. If it were signed with her public key, it wouldn't be secure (and B would need A's private key). If it were signed with B's public key, it wouldn't be verifying A's identity. And we hope that A doesn't have B's private key, so that leaves us with only one option.

And you _DO_ have B decrypting with his private key, as that's the heart of the security. If it were decryptable with his public key, anyone could do it. If it were decryptable with A's public key, it would defeat the purpose. And if it were decryptable with A's private key, B would be out of luck.

No - the private keys have no connection! Alice cannot encrypt with her private key and Bob decrypt with his private key, that's nonsense.

We have to differentiate signing and encrypting (two different things, that need to be done with two different keys, or there are attacks that can succeed):

Alice signs the message to Bob with her private key, and encrypts with Bob's public key.

Bob decrypts with his private key, and verifies the signature with Alice's public key.

Normally, however, Alice prepares a hash of her message, signs that, and then encrypts the whole mess, to keep from encrypting twice. Asymmetric cryptography can be 1000 times slower than symmetric.
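The hash-then-sign flow described in this comment can be sketched with toy numbers. This is a hypothetical illustration only: reducing a SHA-256 digest modulo a tiny RSA modulus destroys its collision resistance, so this shows the shape of the protocol, not a secure construction:

```python
import hashlib

# Sketch of hash-then-sign with toy RSA: Alice signs a short digest
# instead of the (possibly long) message, because asymmetric operations
# are slow. Illustrative numbers only -- NOT secure.

p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)                    # Alice's private exponent

message = b"pay Bob 5 dollars"

# Alice: hash the message, then sign the digest with her private key.
# (Reducing mod n is only so the toy numbers fit; real schemes pad instead.)
digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
signature = pow(digest, d, n)

# Bob: recompute the digest and undo the signature with Alice's public key.
check = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
assert pow(signature, e, n) == check   # signature verifies
```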

@WiseWoman

You and Michael are saying the same things really, he's just taking it for granted that the order of actions is:

1) A encrypts with A's Private Key.

2) A encrypts with B's Public Key.

3) B decrypts with B's Private Key.

4) B decrypts with A's Public Key.

As you pointed out, 1 and 4 are usually described as signing (and it's convenient to just hash and sign said hash, as AC systems are slow). There is no real mathematical difference between signing and "encrypting", it's simply a different way of using the cryptosystem (at least with RSA, which is the algorithm I am most familiar with).

Since WiseWoman touched on the speed of Asymmetric Cryptography, another common practice is only using an Asymmetric Cryptography algorithm to encrypt a Symmetric Key, and to encrypt the payload with a Symmetric algorithm (AES/Twofish/3DES).

Also, +1 vote for a posting on Elliptic Curve Cryptography.
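The hybrid approach mentioned in that last comment - use the slow asymmetric algorithm only to wrap a random symmetric key, and encrypt the actual payload with a fast symmetric cipher - can be sketched like this. As before, the toy RSA numbers and the XOR "cipher" are illustrative stand-ins, not secure primitives:

```python
import secrets

# Sketch of hybrid encryption: RSA-wrap a random symmetric key,
# symmetric-encrypt the payload. Toy numbers only -- NOT secure.

p, q = 2003, 1999
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)                  # recipient's private exponent

def xor(data: bytes, key: bytes) -> bytes:
    """Stand-in for a real symmetric cipher like AES."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

payload = b"a long message better suited to a fast symmetric cipher"

# Sender: pick a fresh symmetric key, wrap it asymmetrically,
# encrypt the payload symmetrically.
sym_key = secrets.token_bytes(2)     # tiny so its integer value fits under n
wrapped = pow(int.from_bytes(sym_key, "big"), e, n)
ct = xor(payload, sym_key)

# Recipient: unwrap the symmetric key with the private key, then decrypt.
k = pow(wrapped, d, n).to_bytes(2, "big")
assert xor(ct, k) == payload
```

Only the short key ever goes through the expensive asymmetric operation; the bulk of the data is handled by the symmetric cipher, which is why essentially all real protocols (TLS, PGP, etc.) work this way.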