Computerworld - The startling recent announcement that the SHA-1 hash function was not as secure as previously believed also raised interesting questions in the world of one-time password (OTP) technology, since the newly proposed HOTP algorithm is based on SHA-1.
Should the industry standardize around a single OTP algorithm? And what role should algorithm agility have in the future of OTPs?
HOTP, the HMAC-based One-Time Password algorithm, is favored by OATH, a consortium organized last year to promote OTP technology. HOTP is based on the HMAC-SHA-1 algorithm (HMAC stands for Hash-based Message Authentication Code), which in turn is based on SHA-1. In HOTP, an OTP is computed as a function of a token secret and a counter value:
one-time password = HMAC-SHA-1 (token secret, counter)
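The formula above can be sketched in Python with the standard library's `hmac` module. Note that the full HOTP specification (RFC 4226) adds a "dynamic truncation" step to reduce the 20-byte HMAC output to a short decimal code; the parameter names here are illustrative:

```python
import hashlib
import hmac
import struct

def hotp(token_secret: bytes, counter: int, digits: int = 6) -> str:
    """Compute an HMAC-SHA-1-based one-time password, per RFC 4226."""
    # The counter is encoded as an 8-byte big-endian value.
    msg = struct.pack(">Q", counter)
    digest = hmac.new(token_secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte select a
    # 4-byte window; the top bit is masked off to avoid sign issues.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(hotp(b"12345678901234567890", 0))  # → "755224" (RFC 4226 test vector)
```

Because both sides derive the code from the same shared secret and counter, the authentication authority can verify an OTP simply by recomputing it.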
Although HOTP is new, HMAC-SHA-1 itself is fairly widely standardized as a method for ensuring message integrity and is also often recommended for additional purposes such as key derivation.
As it turns out, the recent research results, which affect only SHA-1's collision resistance -- the difficulty of finding two new messages with the same hash value -- don't directly affect HMAC-SHA-1, which primarily depends on the one-wayness of SHA-1. Since HOTP depends on the strength of HMAC-SHA-1, not the collision-resistance of SHA-1, the research results don't directly affect HOTP, either.
Nevertheless, there is still good reason to question whether HOTP is suitable as a standard algorithm for OTP generation, and, more generally, whether such a standard algorithm is even necessary at all.
When an algorithm supports a protocol that is employed on a one-to-many basis, standardization can be quite important, because the "many" may reflect multiple different implementations from a variety of vendors. For instance, code signing and digital certificates need standard algorithms to ensure that the signatures generated by one party can be verified by many others.
OTP algorithms that are based on a shared token secret, however, are inherently one-to-one: one token generates an OTP, and one authentication authority verifies it -- namely, the one that shares the token secret. Other parties (a desktop client, an application server) may transport the OTP, but they don't need to know how to generate or verify it. (Although the authentication authority might be implemented across multiple servers, these servers act in concert, being under the same administrative control.)
If a single, standard OTP algorithm is not necessary, one might ask whether there is any harm in establishing one anyway. There are two major reasons why it would be counterproductive to do so.
First, algorithms come and go over the years. SHA-1 itself was already on the way out.