Posted by: jonkatz | July 7, 2010

Surveying cryptography

Yesterday I gave a day-long lecture on cryptography at the ACE Cyber Security Boot Camp run out of the Air Force Research Lab in Rome, NY. The ACE boot camp could make an interesting post in its own right, but for now I want to focus just on my lecture.

My aim was to present the aspects of cryptography that “information security professionals” should know, but I was a bit unsure exactly what to present even though I face exactly the same dilemma every time I teach cryptography as part of my Computer Security class. (In fact, except for cutting some material due to lack of time, I ended up covering pretty much what I cover in my Computer Security class.) I definitely feel that what I cover in both these cases should be targeted differently from what I cover in an undergraduate cryptography class — the goal of the latter is to really develop a deep understanding of cryptography, while the goal in the former cases is (in my opinion) to teach people how to use cryptography. (There is, of course, also the issue of time — there is no way I can cover all of a crypto undergrad course in one day, and when I teach Computer Security I cannot devote the entire semester to cryptography.)

So, what to teach? Here is roughly what I covered:

  • I began with a discussion of “modern cryptography”, stressing the importance of definitions, explicit (cryptographic) assumptions, and proofs. While I don’t expect information security professionals to ever write (or read) a formal definition or proof, I do want them to know that these are out there: they should be able to informally define security requirements for some task; they should understand that different encryption schemes, say, provide different levels of security; and they should know to be very wary of using any crypto scheme that doesn’t come with a proof of security. I also want to correct the misconception that so many people seem to have about cryptography being an “art” rather than much more of a “science”. (Sadly, even many people teaching crypto at the university level seem to have this misconception…)
  • I discussed private-key encryption, beginning with perfect secrecy and its limitations and using this to motivate computational security. I went through the exercise of asking them to propose a good definition of security to illustrate how subtle this really is. I covered PRGs and PRFs/block ciphers (3DES, AES), and defined CPA-security. I showed proofs for some simple constructions, though in retrospect this was probably a mistake since I think it was too much to cover in such a short time. (My goal was just to show one proof so they got the idea of how such proofs work.) I taught them about CBC mode and CTR mode, and stressed that they should use a standard mode with a standard block cipher.
  • I then spoke about message authentication codes. Again I gave a simple construction and proof (and, again, this may have been too much), and then showed CBC-MAC and HMAC. Along the way I got to talk about hash functions. After this I talked briefly about CCA-security and authenticated encryption, and taught them to use the “Encrypt-then-Authenticate” technique to obtain authenticated encryption.
  • I then moved on to a discussion of public-key cryptography. I began with the Diffie-Hellman protocol, and it was interesting to me how little number theory is needed to follow it. (I was planning to cover more, but cut it short due to lack of time; on the other hand, this forced me to cut some details of Diffie-Hellman that I was going to address.) I talked about El Gamal encryption and then moved on to RSA. I mentioned “textbook RSA” encryption and why it is insecure, and then introduced RSA PKCS #1 v1.5 (“padded RSA”). I then taught them about hybrid encryption. Finally, I discussed chosen-ciphertext attacks/malleability, and then told them about the existence of RSA-OAEP/PKCS #1 v2.1 (without going into any details). I stressed that they should always use a CCA-secure scheme.
  • I ended with signature schemes, where I found (relatively speaking) not very much to say. After describing what signatures can be used for (along with the definition of security), I showed them “textbook RSA” signatures and why they are insecure, and then showed them “hashed RSA” (i.e., FDH). I mentioned the existence of DSA, without any details. I also mentioned the hash-and-sign technique.
  • I had planned to talk a bit about PKI, and end with a discussion of “cryptography implementation pitfalls” but ran out of time. (I do cover these topics in my Computer Security class.)
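
The “Encrypt-then-Authenticate” approach mentioned above can be sketched in a few lines. This is a toy illustration, not production code: the “encryption” here is a stand-in CTR-style keystream built from SHA-256 purely so the example is self-contained; real code should use AES-CTR (or, better, an off-the-shelf authenticated mode) from a vetted library. The two points to notice are the independent keys for encryption and authentication, and that the MAC is verified before any decryption happens:

```python
import hashlib, hmac, secrets

def keystream(key, nonce, n):
    # Toy CTR-style keystream from SHA-256 -- a stand-in for AES-CTR,
    # used here only to keep the example dependency-free.
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt_then_mac(enc_key, mac_key, msg):
    nonce = secrets.token_bytes(16)
    ct = bytes(m ^ k for m, k in zip(msg, keystream(enc_key, nonce, len(msg))))
    # MAC covers the nonce and the *ciphertext* (Encrypt-then-Authenticate).
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag

def decrypt(enc_key, mac_key, nonce, ct, tag):
    # Verify the tag in constant time *before* decrypting; reject on mismatch.
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))

ek, mk = secrets.token_bytes(32), secrets.token_bytes(32)  # independent keys
nonce, ct, tag = encrypt_then_mac(ek, mk, b"attack at dawn")
assert decrypt(ek, mk, nonce, ct, tag) == b"attack at dawn"
```

Flipping a single ciphertext bit makes `decrypt` raise before any plaintext is produced, which is exactly the property the composition is meant to provide.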

I was clearly basing most of these topics on what is covered in my book, and I am overall happy with this coverage.
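
As noted above, surprisingly little number theory is needed to follow Diffie-Hellman; the whole exchange fits in a few lines. The parameters below are toys (a 127-bit Mersenne prime, and a generator chosen without justification) so the example runs instantly; real deployments use standardized groups of at least 2048 bits, or elliptic curves:

```python
import secrets

# Public parameters: prime modulus p and generator g.
# Toy-sized for illustration -- far too small for real security.
p = 2**127 - 1   # a Mersenne prime
g = 3

# Each party picks a secret exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's secret
b = secrets.randbelow(p - 2) + 1   # Bob's secret
A = pow(g, a, p)                   # Alice sends A
B = pow(g, b, p)                   # Bob sends B

# Both sides compute the same shared secret g^(ab) mod p.
k_alice = pow(B, a, p)
k_bob   = pow(A, b, p)
assert k_alice == k_bob
```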

What would I do differently next time? I would like to mention PKI and talk a bit about “crypto in the real world” (i.e., implementation issues), since this seems more critical for this audience than some of the more theoretical topics I covered. I would also like to put more stress on some of the concrete issues (e.g., what RSA modulus size should currently be used), although I should mention that this did come up in response to questions from the audience.

To make room for the above (and also because I ran short of time even with what I covered), I would have to cut something. I would probably cut all the proofs (while still talking about why proofs are important), since it’s simply too difficult to convey a cryptographic reduction to someone who has never seen one before in the limited amount of time I had.
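
To make the “textbook RSA is insecure” point above concrete, here is the classic malleability attack, with toy parameters small enough to check by hand (real moduli are 2048+ bits; the fix is proper padding such as OAEP):

```python
# Textbook RSA is multiplicatively homomorphic:
#   Enc(m1) * Enc(m2) mod n = Enc(m1 * m2 mod n)
# so an attacker can transform ciphertexts without knowing the key.
p, q = 61, 53
n = p * q               # 3233 -- toy modulus for illustration only
e = 17
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)     # private exponent (modular inverse; Python 3.8+)

m = 42
c = pow(m, e, n)              # honest ciphertext for m
c2 = (c * pow(2, e, n)) % n   # attacker multiplies in an encryption of 2
assert pow(c2, d, n) == (2 * m) % n   # the mauled ciphertext decrypts to 2*m
```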



  1. two comments.

    first, the ACE mission is great: The Advanced Course in Engineering Cyber Security Boot Camp develops the next generation of cyber leaders through education, hands-on training, officer development, and weekly 8-mile runs. how different would grad school be if we went on weekly 8-mile runs?

    second, i think it’s an interesting question how well the “standard” treatment of crypto resonates with security professionals. phil rogaway makes a great point about this in Section 7 of his essay. It’s not even just a conceptual difference, since I think not all of the possible ways of combining encryption and authentication work for the AEAD notion (thanks to Yevgeniy for pointing that out to me).

  2. Fortunately, instructors do not have to do the 8-mile runs. (Believe me, I made sure to check that it was not a requirement…)

  3. In venues like this, I like to spend an hour or so covering ‘snake oil warning signs.’ My experience with students of this kind (military officers being sent for post-bachelors technical training or education) is that they are actually not likely to develop computer systems themselves. Instead, they are likely to be in charge of programs: development programs, acquisitions programs, etc. They will control a lot of money, and will be the target of many, many sales pitches. To that end, they need the ability to form accurate judgments about people, products, and companies. So in addition to showing them what ‘good’ cryptography looks like, I like to show them what *bad* cryptography looks like.

    Bruce Schneier has a good list of the most egregious snake oil signs (one-time pads, absurdly long keys, etc.). But I feel this list should be extended to include the kinds of ‘honest mistakes’ I see well-meaning security professionals make all the time: believing that encryption provides integrity or source-authentication, that all AES modes have the same properties, that the adversary knows nothing about the plaintext being encrypted, etc.

  4. Sounds like fun! It was fun to read about your experience.

    I know it will pain you to hear this, but I think you should skip the definitions, theorems, proofs, reductions, emphasis on modern cryptography, provable security, etc. I think you should replace that with a brief overview saying that, when dealing with a newly designed protocol, it needs to be analyzed using those methods, and teaching them how to hire a competent cryptographer-consultant who does know provable security. (Hint: there are some really smart cryptographers at many universities who would probably welcome some extra consulting income.) These folks don’t need to know anything about provable security themselves; they just need to know when it can be useful and how to hire a specialist.

    You might also find it interesting to teach the top ten most common crypto mistakes.

    They might also find it interesting to read about flawed crypto in a variety of deployed products — case studies, so they get a sense of what goes wrong.

    I agree with the value of covering ‘snake oil warning signs’, and how to evaluate crypto stuff if you’re not a cryptographer. I also agree with the value of covering implementation problems (e.g., crypto that has failed due to errors in the implementation, not in the algorithm), including buggy code, side channels, fault attacks, etc.

  5. Maybe I’m just being naive, but it seems like security professionals would be well served by understanding the basics of provable security, which are quite empowering. Especially given that Jon is a leading expert on the subject, it would seem a shame for him to simply talk about warning signs, implementation issues, etc., which don’t require that kind of expertise to explain. It seems to me that anyone with a reasonable technical background could understand the basic ideas, and if we simply avoid them then we’re in some sense telling people they’re too complicated/mysterious for them to understand.

  6. I work as a “security professional” in application security, and I have found the most useful crypto information for me has been about how existing crypto primitives can be misused and attacked: why CBC mode doesn’t stop tampering attacks, that length-extension attacks on hash functions exist, how many PRNG constructions are predictable, that reseeding them with time() down some triggerable code path is possibly the worst thing you could do, how some DES implementations used for hashing only really hash 7 bytes of the data because the implementation only grabs 7 bytes of key, etc.

    And if you know any good resources on similar things, I would be very interested in seeing them. (Also, if you know any good primers on attacks on crypto primitives, such as for an undergrad crypto course, I would be interested in seeing those, but I am probably an exception in infosec there)

    As a security professional working in industry (as opposed to .gov), I’m generally not interested in proofs of security because no-one will ever have one, probably because no-one is writing a new crypto algorithm; they’re just butchering whichever primitives their framework provides and saying it is secure because they can’t figure out a way to break it.

    Also, related to another post on this blog, when reviewing real world applications, you *will* find crazy things like decryption oracles (typically down some code path no-one thought about), and sometimes keys will be reused in some other part of the application for something else and so a CCA attack would be useful to a real-world adversary.
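
The last commenter’s point about time()-seeded PRNGs is easy to demonstrate. In this sketch (the token-generation code is hypothetical), an attacker who knows only the approximate time a “random” 128-bit token was generated recovers it by brute-forcing a two-hour window of seeds:

```python
import random, time

def make_token(seed):
    # Mersenne Twister seeded with a timestamp -- NOT cryptographically safe.
    rng = random.Random(seed)
    return rng.getrandbits(128)

server_seed = int(time.time())      # the app seeds its PRNG with time()
token = make_token(server_seed)

# Attacker tries every second in a +/- 1 hour window around a rough estimate.
guess = server_seed + 137           # attacker's imperfect clock (hypothetical)
recovered = None
for s in range(guess - 3600, guess + 3600):
    if make_token(s) == token:
        recovered = s
        break
assert recovered == server_seed     # seed (and hence all future output) found
```

Once the seed is known, every past and future output of that PRNG instance is predictable; the fix is to use an OS-backed CSPRNG (e.g., Python’s `secrets` module) rather than a time-seeded generator.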
