2006-02-01
Abstract
Matt Ham fully expected a bumper harvest of VB 100% awards this month, simply due to the familiarity of the Windows NT platform to the developers.
Copyright © 2006 Virus Bulletin
Windows NT is such an ancient platform that writing a review for it seems more akin to writing about history than present-day affairs. The platform is still used by a fair number of people the world over, though, so the review will be relevant to many.
For a reviewer, both very old and very new platforms are of great interest. When products are tested on very new platforms one tends to see many oddities as developers struggle to accommodate unexpected technology, while products tested on very old platforms have the potential to be utterly broken by those very struggles. Symantec, for example, no longer supports Windows NT in its most recent product line (SAV 10), and thus SAV 9 was submitted for testing here.
That said, I was fully expecting a bumper harvest of VB 100% awards on this occasion, simply due to the familiarity of the platform to developers.
The test sets were aligned to the October 2005 WildList, which was the most recent edition available on the product submission date, 9 January 2006.
The overwhelming majority of new samples in the test sets were of W32/Mytob, with close to 50 new variants added this time. Other additions were also predominantly bot-related – perhaps VB should consider Bot Bulletin as an alliterative name change.
avast! is the first of a small number of products in this test in which archive scanning is not activated by default. Where scanning and detection were concerned, however, default settings seemed to have been well chosen.
A selection of files were missed – primarily polymorphics and some macro samples – though none of these were in the In the Wild (ItW) test set and no false positives were generated when scanning clean files. avast! is thus the first product to receive a VB 100% award in this test.
Please refer to the PDF for test data
With a string of recent good results behind it, Avira had ample opportunity in this test to fall from grace and little room to improve. In the event, however, the test results were exactly the same as the last time the product was tested: full detection in all sets. With no false positives in the clean test sets, this performance gains Avira another VB 100%.
Please refer to the PDF for test data
As usual, this version of eTrust is included here for information only, since the InoculateIT engine is provided within the eTrust installation but is not activated by default. Also as usual, the logging functions within eTrust remain utterly abominable – screenshots being more useful than the dumped log versions available from within the scanner. It was notable that the InoculateIT version of eTrust detected more viruses than the default Vet engine on this occasion.
Please refer to the PDF for test data
Much of the comment about eTrust has already been made in the previous section, and with the interface here being identical, there remains little to discuss other than the scanning results.
With 100% detection of samples in the ItW test set and no false positives generated, the results were sufficient to guarantee eTrust a VB 100% when using its default Vet engine.
Please refer to the PDF for test data
Quick Heal remains a fast and easy product to test, and its performance was once more sufficient for the product to earn a VB 100% award. There was little else to note about CAT's product, so this remains a short write-up.
Please refer to the PDF for test data
Logging became a problem while testing Command AntiVirus, with logs available only in RTF format – one of the least friendly formats for automated parsing. Since all logs were truncated in any case, they were sufficiently useless that parsing was not attempted. Thankfully, the number of misses was small enough that manual inspection of the truncated logs, combined with scan summary information, could easily pin down the missed files on demand. Such a small number of misses is always a promising sign, and indeed Command AntiVirus receives a VB 100%.
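For the curious, recovering readable text from RTF is possible with a crude preprocessing pass that drops control words and group braces. The following minimal Python sketch is purely illustrative – a hypothetical helper, not something used in these tests, and no substitute for a proper RTF parser:

    import re

    def rtf_to_text(rtf: str) -> str:
        # Turn paragraph marks into newlines, then strip control words
        # (e.g. \fs20, \lang1033), control symbols and group braces.
        # Escaped literals are mishandled, which is tolerable when the
        # aim is merely to eyeball scan results.
        text = re.sub(r"\\par\b", "\n", rtf)
        text = re.sub(r"\\[a-zA-Z]+-?\d* ?", "", text)  # control words
        text = re.sub(r"\\.", "", text)                 # control symbols
        return text.replace("{", "").replace("}", "")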
Please refer to the PDF for test data
Before installation of Dr.Web would complete, a version of psapi.dll needed to be installed on the machine in order to make on-access scanning possible. A welcome change was noted in the on-access scanner: it seems that a reboot is no longer necessary after changes are made to its configuration. After many years of constant restarts during testing, this came as a happy event.
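Dependencies of this sort are at least easy to probe for. A hypothetical Python check along the following lines – illustrative only, and not part of Dr.Web or any other product here – would confirm whether a system DLL is present before relying on it:

    import ctypes

    def has_dll(name: str) -> bool:
        # True if the named system DLL (e.g. 'psapi') can be loaded;
        # False on machines where it is absent, such as a bare
        # Windows NT 4 installation.
        try:
            ctypes.WinDLL(name)
            return True
        except OSError:
            return False

    if not has_dll("psapi"):
        print("psapi.dll missing: install it before enabling on-access scanning")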
Probably more happily for the developers, the number of missed files remains very low – the only misses occurred during on-access scanning, and then only for samples in EML or ZIP format. It comes as no surprise, therefore, that a VB 100% award is in order.
Please refer to the PDF for test data
NOD32 was the second product in this test that required archive file scanning to be activated when testing the zipped clean files. With an otherwise uneventful set of tests I was able to come up with only one event of note: the log file for on-demand scanning was 1337 KB in size – clearly this is highly significant if one favours conspiracy theories or numerology. Less shocking will be the news that Eset gains a VB 100% as a result of the tests.
Please refer to the PDF for test data
FortiClient's performance was sufficient for another VB 100% to be added to Fortinet's collection. The misses that remain are scattered through the test sets to such an extent that no real pattern emerges. One suspects that results will improve gradually.
Please refer to the PDF for test data
Perhaps to make the log files seem more interesting, a very large amount of information was included – though it was at least easy to filter out during parsing. Only one sample was missed during on-demand scanning, in the standard test set, though a few more misses were added on-access. None of these are currently rated as In the Wild, however, so a VB 100% is awarded.
Please refer to the PDF for test data
For FSAV, very small numbers of misses are something of a habit. On this occasion the product missed only the stored .TMP sample of W32/Nimda.A on demand. On access, the total was increased by the two zipped samples of W32/Heidi. However, since these samples are all currently in the standard test set (not In the Wild), F-Secure is also the recipient of a VB 100%.
Please refer to the PDF for test data
Continuing the theme, AVK managed to miss even fewer samples than the previous products – no samples went undetected. With no false positives, it goes without saying that these results earn AVK a VB 100% award. However, the product's scanning performance does come at a small cost: a slightly slow scan rate. The trade-off between detection and scanning speed is a common dilemma for anti-virus developers; many misses in these tests occur as a result of pragmatism, with developers opting for faster on-access scanning at the cost of some detection when files are being manipulated rather than executed.
Please refer to the PDF for test data
AVG missed a number of files in the various test sets, though none were classified as In the Wild. Of those missed, the majority were polymorphic in nature or packaged in slightly unusual formats. With no false positives and full detection of ItW viruses, AVG earns itself a VB 100%.
Please refer to the PDF for test data
ViRobot held the dubious distinction of having by far the largest number of false positive detections in this test. Six files were reported as infected, while a further clean file was declared to be suspicious. As a result, the product does not qualify for a VB 100% this month. This will be something of a disappointment, since detection rates were respectable.
Please refer to the PDF for test data
Other than a lower price and older graphics, AntiVir is essentially identical to Avira internally and thus similar scanning results were expected. This was indeed the case. Minor variations in the scanning throughput rates were noted, though with Windows being host to numerous unpredictable background processes, it would be surprising if results here were found to be identical.
Please refer to the PDF for test data
The ever-productive interface developers at Kaspersky have been at work once more for this version. Personally, I am less of a fan of this latest incarnation than the previous interface, though this is more due to unfamiliarity than any obvious faults. The only oddity noted was in the 'time remaining' bar on the scanning interface, which demonstrated some interesting time dilation and compression phenomena.
On the detection front, however, there were few changes to be seen. Two zipped W32/Heidi samples on access were the sum total of missed files, leaving Kaspersky the holder of a VB 100% yet again.
Please refer to the PDF for test data
VirusScan was the third and last of the products in this test to require manual activation of archive file scanning during the clean set tests. It also showed notable differences between scanning on access and on demand, with several samples of W32/Etap missed on access. No misses were noted on demand, however, and no false positives surfaced either. McAfee thus receives a VB 100% award for VirusScan's performance.
Please refer to the PDF for test data
As a rebadged version of the GDATA product, eScanWin might be expected to show similarities to that product, despite being blue in places rather than yellow.
Somewhat disturbingly, however, there was a major difference, in that the on-access scanner crashed during testing. This occurred only once, though, and did not seem easily reproducible. Happily, the differences in performance did not extend to detection capabilities and, with 100% detection of ItW viruses and no false positives, eScanWin also gains a VB 100%.
Please refer to the PDF for test data
Installation of Virus Chaser failed initially because a newer version of mfc42.dll was required. Installing the redistributable Microsoft C++ libraries on the machine solved the problem.
Somewhat less easily solved was the total lack of control over on-access scanning available within the program. In the end, scanning was performed by locking an appropriate key in a depressed position – scanning in this way took a little over 24 hours to complete. On demand, scanning progressed more easily, though the logs must have been the creation of either a sadist or a fan of complex logic problems. Overall, there was an impressive degree of user unfriendliness in this product.
Such irritations aside, the product's scanning performance was less than awesome, with a smattering of misses across the test sets. The fact that samples of W32/Yaha.G and W32/Yaha.E were missed on access was sufficient to deny Virus Chaser a VB 100% on this occasion.
Please refer to the PDF for test data
Minor improvements seem to have been made to the creation of new tasks in Norman Virus Control of late, since the process seemed less painful than it has done in the past. Of course, this could merely be due to the fact that I have gained familiarity with the interface, but either way the effect was appreciated.
When the logs were analysed the results were much as expected: some polymorphic and a few other samples were missed, but with no ItW samples missed and no false positives, NVC is a VB 100% winner.
Please refer to the PDF for test data
BitDefender continues to be a solid performer in our tests, with little in the way of comment necessary. It is presumably this solidity which has led to its being the basis of detection in several other products, including Hauri's offering in this test. SOFTWIN will be pleased that none of the false positive issues apparent with that derived product were present in BitDefender, thus entitling it to a VB 100%.
Please refer to the PDF for test data
Of note in Sophos's clean file scans was the fact that archive scanning is now activated by default. This is a recent and much appreciated configuration change. With both detection and lack of false positives in their usual respectable state, Sophos earns itself a VB 100% award. That said, the logging functions were not without their niggles: various unnecessary spaces are added to lines, serving no purpose but to make parsing a little more complex. Meanwhile, archives are designated merely by appending \[archivename] directly to the path in which the infected archive is located. This makes parsing these entries more complex too, and would have been an ideal place to use the spare spaces just mentioned.
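To illustrate the parsing burden, an entry of the assumed form C:\samples\subdir\[infected.zip] can be split back into directory and archive name along these lines – a hypothetical Python helper, not anything supplied by Sophos:

    import re

    def split_entry(line: str):
        # Collapse the stray runs of spaces first, then peel off a
        # trailing \[archivename] component if one is present.
        line = re.sub(r" {2,}", " ", line.strip())
        m = re.match(r"^(?P<path>.*)\\\[(?P<archive>[^\]]+)\]$", line)
        if m:
            return m.group("path"), m.group("archive")
        return line, None

    # split_entry(r"C:\samples\subdir\[infected.zip]")
    #   -> ('C:\samples\subdir', 'infected.zip')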
Please refer to the PDF for test data
SAV's scanning speed was far slower with some settings activated than with others. The default scanning settings are not pleasant when large numbers of infected files are present, though acceptable when files are mostly clean. Unfortunately, it seems that, on access, no POT or PPT files were checked in the default mode, resulting in samples of O97M/Tristate.C being missed in the ItW test set and no VB 100% being awarded this time.
Update: subsequent to this review it was discovered that an incorrect version of SAV had been provided for testing. The correct version – v9.0.5 – was retested and achieved VB 100% status satisfactorily.
Please refer to the PDF for test data
The good news for UNA is that scanning was fast and no false positives were flagged. The bad news is that there was a multitude of missed detections in every test set. Only two files were missed In the Wild, both on access and on demand, but this is ample reason to deny a VB 100%.
Please refer to the PDF for test data
VirusBuster was perhaps the most troubled of all the products. First, it required mfc42.dll to be installed – a hurdle that was easily passed. When scanning on access, however, the scanner failed repeatedly. This failure was silent, with no indication other than the fact that no files were being checked. It seemed reproducible simply by passing around 6,000 infected files through the on-access scanner. Woes continued in the clean sets too, where a suspicious file was noted. Matters on the detection front were no more inspiring: despite having .EML files flagged for scanning, the .EML version of W32/Nimda.A was missed both on access and on demand. A VB 100% award is thus out of reach for VirusBuster this month.
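Silent failures of this kind are a good argument for a canary check in the test harness. A hypothetical sketch using the standard EICAR test string, and assuming the scanner blocks or removes detected files on access, might look like this:

    import os

    # The industry-standard EICAR test string, detected by all scanners.
    EICAR = r"X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"

    def on_access_alive(guarded_dir: str) -> bool:
        # Drop a harmless but detectable probe file and see whether the
        # on-access scanner intervenes; if the probe can be written and
        # read back untouched, the scanner has silently stopped working.
        probe = os.path.join(guarded_dir, "probe.com")
        try:
            with open(probe, "w") as f:
                f.write(EICAR)
            with open(probe) as f:
                f.read()
            return False    # probe survived: scanner is asleep
        except OSError:
            return True     # write or read was blocked: scanner active
        finally:
            try:
                os.remove(probe)
            except OSError:
                pass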
Please refer to the PDF for test data
The biggest surprise for me in this test was not the products that failed to detect virus samples, but the issues concerning operating system support. NOD32, for example, included the Microsoft Foundation Classes as part of its installation package and asked whether they should be installed. Several products, however, were missing DLLs when installed onto the Windows NT platform. This shows a certain lack of care for Windows NT, even if it is aged and mostly ignorable as far as new installations are concerned.
The instabilities noted with on-access scanning are more worrying, and are presumably due to the operating system rather than any basic software flaws, since the same issues have not been noted with these products on other platforms. Essentially, developers are caught between the most modern and most ancient incarnations of the NT family of operating systems and the desire to produce one package that will install on every variant. With the differences apparent between Windows XP and Windows NT, this is obviously easier said than done.
While Microsoft can drop support for a platform, the same is not true for developers. Without the Microsoft monopoly to back them up, anti-virus developers can gain customers through the range of platforms they support, and lose them by cutting back when a customer demands support for machines of more historical than practical interest. One wonders whether Microsoft's entry into anti-virus, currently restricted to Windows-only platforms, will be influenced by this in future.
Test environment. Three 1.6 GHz Intel Pentium 4 workstations with 512 MB RAM, 20 GB dual hard disks, DVD/CD-ROM and 3.5-inch floppy, all running Windows NT 4 Workstation SP 6.
Virus test sets. Complete listings of the test sets used can be found at http://www.virusbtn.com/Comparatives/WinNT/2006/test_sets.html .
Results calculation protocol. A complete description of the results calculation protocol can be found at http://www.virusbtn.com/virusbulletin/archive/1998/01/vb199801-vb100-protocol .