Hidden malware in offshore products raises concerns

"You've got to be a little paranoid to survive in this business." -- Andrew S. Grove, chairman and founder, Intel Corp., circa 1980

The extreme difficulty in discovering back doors hidden deep within a complex application, buried among numerous modules developed offshore in a global software marketplace, is forcing those assigned to protect sensitive national security information to take defensive actions.

The threat of hidden Trojan horses and back doors surfaced this summer when the governments of the U.S. and China announced plans to strengthen national security policies covering information processed by applications written in the global software marketplace. The private sector joined the fray with the August announcement of the File Signature Database, which will use hash values to protect software integrity from malicious additions (see story).
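Hash-based integrity checking of the kind the File Signature Database describes is straightforward in principle: publish a cryptographic digest of a known-good binary, then recompute and compare before trusting it. The sketch below illustrates the idea in Python; the file path and known-good digest are hypothetical, and the actual database's formats and interfaces are not specified here.

```python
# Minimal sketch of hash-based software integrity checking, in the spirit
# of the File Signature Database described above. The paths and the
# "known-good" digest are placeholders, not real published signatures.
import hashlib

def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks so large
    binaries do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, known_good_digest: str) -> bool:
    """True only if the file's digest matches the published signature.
    Any tampering with the binary changes the digest and fails the check."""
    return file_sha256(path) == known_good_digest
```

A check like this can only detect that a file differs from the signed original; it says nothing about whether the original itself already contained a back door, which is why the article's sources treat hashing as one layer among several.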

The National Security Agency's information assurance director, Daniel Wolf, in testimony before the House Select Committee on Homeland Security's cybersecurity subcommittee in July, called for a federal lab that would "find malicious software routines that are designed to morph and burrow into critical applications." Separately, the State Council of the People's Republic of China in August directed all government ministries to buy only Chinese software in the next upgrade cycle in an effort to not only encourage the development of local software companies but also to protect sensitive government data.

Mark Willoughby

Steps taken so far

The simmering global paranoia is rooted in the realization that no simple solution exists today, experts say. It's virtually impossible to find unauthorized malware hidden deep within a sophisticated multitiered application with data normalization, messaging middleware and other modules originating from labs in a half-dozen countries.

Robert Lentz, the U.S. Defense Department's director of information assurance, said in a written statement, "The DoD currently is studying several aspects of software assurance. The DoD has a current software acquisition policy. The group studying software assurance is looking to supplement that policy with strengthened mechanisms to increase our confidence in the security of both foreign and domestic software products."

Input, a Chantilly, Va.-based technology research firm, says federal government spending on IT products and services will grow 8.5% yearly from 2003 to 2008, from $45.4 billion to $68.2 billion. Approximately half of that spending will be in areas in which the government would like to see stronger information assurance.

The number of incidents in which back doors have compromised sensitive national security information may never be known. That's not so in the private sector.

"There have been a number of cases where software was found with intentionally planted back doors," said Shawn Hernan, team leader for vulnerability handling at the CERT Coordination Center at Carnegie Mellon University. "Most of these were for providing support, although no such support option was given to commercial customers. It's happened in both proprietary and open-sourced software."

Hernan said discovering hidden malware is one of the most difficult tasks facing an assurance investigator. CERT doesn't track vulnerabilities by country of origin, he added.

Software engineering processes are only now beginning to focus on providing traceability in security code. Traceability, which would allow a given line of code or a software module to be tracked back to the developer, is viewed as the Holy Grail in combating hidden malware. Traceability is also an effective tool for discovering software defects that expose an application to a myriad of exploits, viruses and worms.

Watts Humphrey, a fellow at the Software Engineering Institute (SEI), also at Carnegie Mellon, said the SEI's Capability Maturity Model (CMM) Level 5, the highest software quality certification, doesn't "extend traceability to individual programmers."

However, the SEI is introducing Team Software Process (TSP), which brings traceability of specific code modules to individual programmers, said Humphrey, a former IBM software engineering executive. Indian software companies and a few U.S. developers, notably Microsoft Corp., are aggressively implementing TSP.

"TSP gives team ownership of the [software] product," Humphrey said. It brings methods and processes that provide traceability and therefore accountability to individual programmers, and is equally adept at identifying and fixing software defects that result in security breaches.

The SEI conducted a special TSP security workshop at Microsoft headquarters last year. Microsoft's director of security engineering strategy, Steve Lipner, said individual accountability for developers is the primary focus of the company's Trustworthy Computing initiative.

Microsoft's focus on processes that make developers accountable will help to detect software defects, and "the same processes protect the integrity of the source code from internal threats," Lipner said.

Microsoft is also putting the Windows operating system through Common Criteria testing for an Evaluation Level 4 rating, which offers a baseline of assurance by indicating the rigors of the test performed, described as a security profile. The Common Criteria security profile is a battery of tests to check for common vulnerabilities. It doesn't specify the actual security of the software being evaluated, nor does it require identifying country of software origin.

However, "malware can be detected at the higher levels of Common Criteria testing," said Jim Fink, the U.S. director of the Common Criteria Testing Laboratory at Computer Sciences Corp. in El Segundo, Calif. But an Evaluation Level 5 through 7 rating requires examination of the application's source code, a costly and inexact process. Commercial software vendors are reluctant to share their source code, mindful of the intellectual property ramifications.

The weaknesses in Common Criteria testing in finding hidden malware in commercial software are contributing to paranoia on the matter. A spokesperson for the U.S. National Security Agency said a source code evaluation is likely to discover only "obvious vulnerabilities." Skilled programmers are very good at hiding malware of "moderate" and "high" attack potential, which even the most skilled code auditor may need some luck to discover, the NSA spokesperson said.

In an effort to allay hidden malware paranoia and protect access to the Chinese government software market, Microsoft earlier this year granted Windows source-code inspection rights to the Chinese government (see story). IDC, a Framingham, Mass.-based technology analyst company, forecasts the Chinese public-sector software market to grow at 28% yearly, from $427 million this year to $939 million in 2006.

Oracle Corp. Chief Security Officer Mary Ann Davidson said the globalization of software development dictates global development processes. "The assumption is that everybody physically located outside the U.S. is more of a risk." But that assumption is incorrect, Davidson said, citing the many documented and publicized security lapses from trusted U.S. employees in both the public and private sector. Still, at the end of the day, it's difficult to find hidden malware with current tools, she said.

Fifteen countries, mostly NATO members but also Israel, support the Common Criteria. Discussions are under way to add Japan, South Korea and Russia. India and China haven't committed to the process. Approximately 12 certified private-sector labs test off-the-shelf software for a Common Criteria Evaluation Level.

Special Report

Offshore Buyer's Guide

Copyright © 2003 IDG Communications, Inc.
