2006-11-01
Abstract
Updates and corrections to last month's Windows 2000 Server comparative review.
Copyright © 2006 Virus Bulletin
Following the publication of last month's Windows 2000 Server comparative review, questions have arisen over several files in the clean test set which caused false positives for a number of vendors. After some deeper analysis, VB concludes that amendments are required to the clean test set, as well as to the number of VB 100% awards given in last month's review.
The file that spoiled BitDefender's chances of gaining a VB 100% award, along with those of G DATA and AEC (manufacturer of Trustport), has been identified as a hacker tool, detection for which was recently added to the BitDefender product. The file will be struck from the clean set, and since this was the single point of failure for all three of these products, all three are now awarded a VB 100%. G DATA also joins the elite group of products detecting 100% of samples across all the test sets in October's review. VB extends its apologies to all three companies.
The file labelled 'suspicious' by Symantec has also been identified as a hacker tool, and as such it will be removed from the clean set (since Symantec's product merely labelled the file as 'suspicious', rather than claiming that it was malicious, the product was not denied a VB 100% in last month's review).
Finally, a corrupted zip file which Avira's Antivir product flagged as infected has been identified as a file that should have been removed from the clean set some time ago. The file has been confirmed as containing code of the Fosforo virus which, after careful extraction, remains a working threat. Antivir was the only product to detect this. The remaining clean set file alerted on by Avira has been confirmed to be a false positive - we are told that Avira developers spotted and fixed this issue in late September.
Moving on from false positives, VB regrets that typographical errors appeared in both the on-demand and on-access tables published for the October 2006 comparative review. In both tables the numbers of files missed by Antivir in the polymorphic and standard test sets were transposed: the figures should have read '0' in the standard set and '150' in the polymorphic set. The percentages reported in the tables are correct as they stand. VB apologises for the confusion.
A thorough review of the VB clean test set will be conducted before the next comparative review, which will test products for the Windows XP 64-Bit platform. The results of that review will be published in next month's issue of VB. Vendors wishing to submit products for future reviews should contact John Hawes at [email protected].