
Guidelines released for antivirus software tests

By Jeremy Kirk
June 14, 2010 01:20 PM ET

IDG News Service - A coalition of security companies and researchers has agreed on guidelines for how security software products should be tested, which may help put an end to long-running disputes about different testing methodologies.

Two sets of guidelines covering principles for testing security software for performance and testing entire security suites were adopted by the Anti-Malware Testing Standards Organization (AMTSO) at its latest meeting in Helsinki.

The guidelines are part of a series of documents on AMTSO's Web site aimed at introducing standards broadly agreed upon across the industry as appropriate for ranking the effectiveness of security software.

Security companies have bickered over tests run by organizations such as AV-Test.org and Virus Bulletin, which regularly test and rank security software.

Among the contentious issues is the age of the malicious software samples used to test antivirus programs. Companies that failed to detect certain kinds of malware have argued in the past that the samples were no longer threats and that they had removed the corresponding signatures from their databases.

Other companies have countered that a failed test wasn't accurate because other technologies in their software would have stopped the particular threat. Some antivirus testing programs perform static testing, in which an antivirus engine is simply run against a set of samples.

But many security companies have incorporated other complex ways to detect malware. One such method, known as behavioral detection, checks to see how malware behaves on a computer.
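The distinction the article draws can be sketched in code. The following is an illustrative Python sketch, not an implementation from any real product: the names (`SIGNATURES`, `scan_static`, `scan_behavioral`, and the action strings) are all hypothetical, chosen only to show why a static, signature-only test can miss a sample that a behavioral check would flag.

```python
# Hypothetical sketch of static vs. behavioral detection.
# All names and data here are illustrative, not from a real product.

SIGNATURES = {b"EICAR-STANDARD-ANTIVIRUS-TEST"}  # known-bad byte patterns

def scan_static(file_bytes: bytes) -> bool:
    """Static test: match the file's bytes against known signatures only."""
    return any(sig in file_bytes for sig in SIGNATURES)

def scan_behavioral(observed_actions: list[str]) -> bool:
    """Behavioral test: flag suspicious runtime actions, even when the
    file matched no signature."""
    suspicious = {"modify_boot_record", "inject_into_process", "disable_av"}
    return any(action in suspicious for action in observed_actions)

# A sample with no known signature evades the static scan...
sample = b"\x90" * 16
print(scan_static(sample))  # False

# ...but its behavior at runtime can still give it away.
actions = ["read_file", "inject_into_process"]
print(scan_behavioral(actions))  # True
```

This is the crux of the testing dispute: a product judged only by `scan_static`-style checks gets no credit for the protection its behavioral layer would have provided.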

The new guidelines are voluntary, but many testing organizations have agreed to go along with them, such as Virus Bulletin, AV-Comparatives, West Coast Labs and ICSA Labs, said David Harley, an AMTSO board member and director of malware intelligence for ESET. AV-Test.org will also use the guidelines, according to Andreas Marx, who founded the company.

"We're just trying to get people to think harder about their methodologies so that they actually make sense," Harley said. "It doesn't mean you can't do things different ways, it just means you have to try and conform to a rationality."

Virus Bulletin has contributed to the guidelines covering performance testing, said John Hawes, technical consultant and test team director.

"We've already started implementing some of the ideas developed while discussing and designing this document, with some major expansions to the performance data we report in our comparatives in recent months and more improvements on the way," Hawes said. "We're also hard at work developing a new style of test which will allow us to measure the full range of features in many of today's security solutions."

The guidelines may not entirely end the feuding between security companies and testing organizations but will better inform those who write reviews of security software, said Stuart Taylor, manager of Sophos' threat lab and chairman of the board for AMTSO.

Reprinted with permission from IDG.net. Story copyright 2014 International Data Group. All rights reserved.