Algorithm agility and OATH

The startling recent announcement that the SHA-1 hash function was not as secure as previously believed also raised interesting questions in the world of one-time password (OTP) technology, since the newly proposed HOTP algorithm is based on SHA-1.

Should the industry standardize around a single OTP algorithm? And what role should algorithm agility have in the future of OTPs?

HOTP, the HMAC-based One-Time Password algorithm, is favored by OATH, a consortium organized last year to promote OTP technology. HOTP is based on the HMAC-SHA-1 algorithm (HMAC stands for Hash-based Message Authentication Code), which in turn is based on SHA-1. In HOTP, an OTP is computed as a function of a token secret and a counter value:

one-time password = HMAC-SHA-1 (token secret, counter)
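
For illustration, here is a minimal sketch of that computation in Python, using the standard hmac and hashlib modules. The secret and counter values are placeholders, and the final truncation to a short decimal code follows the general approach of the published HOTP specification so that a user could actually type the result.

    # A minimal sketch of HMAC-SHA-1-based OTP generation (placeholder values).
    import hashlib
    import hmac
    import struct

    def generate_otp(token_secret: bytes, counter: int, digits: int = 6) -> str:
        # Encode the counter as an 8-byte big-endian value and apply HMAC-SHA-1.
        digest = hmac.new(token_secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        # Truncate: take 4 bytes at an offset given by the last nibble of the
        # digest, clear the sign bit, and reduce to the desired number of digits.
        offset = digest[-1] & 0x0F
        code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    print(generate_otp(b"12345678901234567890", counter=0))  # placeholder token secret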

Although HOTP is new, HMAC-SHA-1 itself is fairly widely standardized as a method for ensuring message integrity and is also often recommended for additional purposes such as key derivation.

As it turns out, the recent research results, which affect only SHA-1's collision resistance -- the difficulty of finding two different messages with the same hash value -- don't directly affect HMAC-SHA-1, which primarily depends on the one-wayness of SHA-1. Since HOTP depends on the strength of HMAC-SHA-1, not the collision resistance of SHA-1, the research results don't directly affect HOTP, either.
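
To see why, it helps to write the construction out. HMAC applies SHA-1 twice, with the secret key mixed into both an inner and an outer hash, so an attacker is never simply searching for public collisions. The sketch below spells out the standard HMAC-SHA-1 construction (64-byte block size, inner pad 0x36, outer pad 0x5C) rather than calling a library:

    # The standard HMAC-SHA-1 construction, written out to show the keyed nesting.
    import hashlib

    def hmac_sha1(key: bytes, message: bytes) -> bytes:
        block_size = 64                          # SHA-1 processes 64-byte blocks
        if len(key) > block_size:
            key = hashlib.sha1(key).digest()     # overly long keys are hashed first
        key = key.ljust(block_size, b"\x00")     # then padded to the block size
        inner = bytes(b ^ 0x36 for b in key)     # inner pad (ipad)
        outer = bytes(b ^ 0x5C for b in key)     # outer pad (opad)
        return hashlib.sha1(outer + hashlib.sha1(inner + message).digest()).digest()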

Nevertheless, there is still good reason to question whether HOTP is suitable as a standard algorithm for OTP generation, and, more generally, whether such a standard algorithm is even necessary at all.

When an algorithm supports a protocol that is employed on a one-to-many basis, standardization can be quite important, because the "many" may involve multiple implementations from a variety of vendors. For instance, code signing and digital certificates need standard algorithms to ensure that the signatures generated by one party can be verified by many others.

OTP algorithms that are based on a shared token secret, however, are inherently one-to-one: one token generates an OTP, and one authentication authority verifies it -- namely, the one that shares the token secret. Other parties (a desktop client, an application server) may transport the OTP, but they don't need to know how to generate or verify it. (Although the authentication authority might be implemented across multiple servers, these servers act in concert, being under the same administrative control.)

If a single, standard OTP algorithm is not necessary, one might ask if there is any harm in establishing a single standard. There are two major reasons why it would be counterproductive to do so.

First, algorithms come and go over the years. SHA-1 itself was already on course for replacement within the next decade, based simply on its originally expected security level against collisions. That 80-bit security level, following recommendations by the National Institute of Standards and Technology and the American National Standards Institute X9F1 working group, has a "best before" date of 2010. It's not that the algorithm will suddenly become insecure at that point; it's that conservative design calls for planning gradual upgrades to higher security levels, and such upgrades take a long time in practice. A system based on HMAC-SHA-1 would need to accommodate stronger algorithms over time anyway, just to keep up with these recommendations.

Second, application requirements change over time, and innovation in OTP algorithms is needed to anticipate them. At RSA Security, we've been developing a number of enhancements to our traditional time-based algorithm that offer a variety of new features. Other token vendors likely have their own extensions to offer as well. A single, standard algorithm would make this kind of innovation difficult.

Both these reasons are instances of the principle of algorithm agility: A system should be flexible in its choice of algorithm, where possible, both to maintain security and to meet application requirements over the long term.

The principle of algorithm agility is evident in many of the security specifications in wide use today. X.509 certificates, for instance, can convey any type of public key and can be signed with any digital signature algorithm -- even algorithms that weren't envisioned when X.509 was first proposed two decades ago. Meanwhile, the industry has been able to transition from one hash function to another (e.g., MD5 to SHA-1) and to support multiple public-key algorithms (RSA, DSA, ECC) without any change in the certificate structure itself (although the structure has seen improvements over time for other reasons).
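
As a small illustration of that flexibility (assuming the modern pyca/cryptography package and a hypothetical certificate file name), application code reads the signature algorithm and key type out of identifier fields, so the same parsing logic handles whatever algorithms the issuer chose:

    # A sketch assuming the pyca/cryptography package; the file name is hypothetical.
    from cryptography import x509

    with open("example_cert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    # The certificate structure stays fixed; only the algorithm identifiers vary.
    print(cert.signature_algorithm_oid)       # e.g., the OID for sha256WithRSAEncryption
    print(type(cert.public_key()).__name__)   # e.g., an RSA or elliptic-curve key class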

The Secure Sockets Layer and Transport Layer Security protocols likewise support multiple alternative algorithms through the concept of "cipher suites," an approach that has facilitated algorithm innovation for new applications. In addition, the PKCS #11 interface for cryptographic tokens works with a large variety of algorithms, so that a one-time investment in the interface can return value across multiple algorithm environments.
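
To make the cipher-suite idea concrete, here is a brief sketch using Python's standard ssl module; the exact suites listed depend on the local TLS library build, which is precisely the point: the protocol framework stays the same while the negotiable algorithms vary.

    # A sketch using the standard ssl module; output depends on the local build.
    import ssl

    context = ssl.create_default_context()
    for suite in context.get_ciphers()[:5]:    # show a few of the enabled cipher suites
        print(suite["name"], suite["protocol"])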

Even if one could argue that the industry should move toward a single, standard OTP algorithm, it's not clear that HOTP would be the best one. Counter-based algorithms are fine for many applications, but they carry the risk that user error may force significant resynchronization -- what if I accidentally click the token too far ahead? -- and they also provide no assurance of the actual time at which the OTP was generated. Time-based algorithms and challenge-response algorithms both address these concerns, and applications benefit from their availability as well.
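
For comparison, a time-based design replaces the counter with a moving factor derived from the clock, so token and server stay synchronized without user interaction. The sketch below is illustrative only: the 30-second step is an assumption, and this is not RSA Security's own time-based algorithm.

    # An illustrative time-based variant; the 30-second interval is an assumption.
    import hashlib
    import hmac
    import struct
    import time

    def time_based_otp(token_secret: bytes, step_seconds: int = 30, digits: int = 6) -> str:
        moving_factor = int(time.time()) // step_seconds      # clock-derived counter
        digest = hmac.new(token_secret, struct.pack(">Q", moving_factor), hashlib.sha1).digest()
        offset = digest[-1] & 0x0F
        code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)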

By focusing so far on standardizing the generation of OTPs rather than their use, OATH really hasn't done much to encourage innovation in this space, except perhaps among the vendors that will compete on implementations of an HOTP token. Competition around the features enabled by different algorithms, across a common framework, is a lot more interesting -- and more robust for the long term. This is why RSA Security has focused on building out that framework through the One-Time Password Specifications (OTPS), which include techniques for provisioning token secrets, retrieving OTPs from tokens, transporting them to applications and authentication servers, and validating them -- but, notably, not for generating them. Accordingly, the OTPS framework can work with any OTP algorithm, including HOTP. This framework will encourage more widespread use of many kinds of stronger user authentication, which will benefit the industry as a whole.
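
As a rough sketch of what algorithm independence can look like at the validation layer (the names here are hypothetical, and this is not the OTPS specification itself), the server treats the OTP generator as a pluggable component and handles validation the same way regardless of which algorithm produced the value:

    # A hypothetical, algorithm-agnostic validation interface (illustrative only).
    import hmac
    from typing import Callable, Dict

    OtpAlgorithm = Callable[[bytes, dict], str]   # token secret + token state -> expected OTP
    REGISTRY: Dict[str, OtpAlgorithm] = {}

    def register(name: str, algorithm: OtpAlgorithm) -> None:
        REGISTRY[name] = algorithm

    def validate(name: str, token_secret: bytes, state: dict, submitted: str) -> bool:
        # Provisioning, transport and validation stay the same no matter which
        # algorithm is registered; only the generator differs.
        expected = REGISTRY[name](token_secret, state)
        return hmac.compare_digest(expected, submitted)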

Ultimately, the most important issue is what is in the best interests of users. For the reasons just explained, standardizing on a single OTP algorithm doesn't fulfill the promise of stronger authentication for users. Industry collaboration toward a standard framework for integrating a wide range of OTP algorithms can. By ensuring that users and the organizations they interact with can leverage the OTP algorithms that best meet their needs -- within whatever context they need to authenticate -- the industry will be encouraged to make necessary long-term investments in stronger user authentication.

Burt Kaliski is vice president of research at Bedford, Mass.-based RSA Security and chief scientist of its research center, RSA Laboratories. Active in the development of cryptographic standards, he coordinated the development of the Public-Key Cryptography Standards (PKCS), working with major early adopters of public-key cryptography. He also was chairman of the IEEE P1363 working group, which developed a standard, IEEE Std 1363-2000, covering the three main families of public-key cryptography. Kaliski is a graduate of MIT, where he received bachelor's, master's and Ph.D. degrees in computer science.
