2006-06-01
Abstract
In Matt Ham's final comparative review for Virus Bulletin he puts 26 products for Windows XP through their paces. Two products enter the test line-up for the first time this month: TrustPort Antivirus and the rather more well-known Microsoft OneCare. In his own inimitable style, Matt provides rude comments and/or praise for all products, as well as the all-important VB 100% results.
Copyright © 2006 Virus Bulletin
Yet again the Windows XP comparative review is upon us, with the usual throng of products arriving to be tested and to test my patience. On this occasion two new products were submitted: TrustPort Antivirus and the rather more famous Microsoft OneCare. Rude comments and/or praise for these products can be found later in the review.
As this is the last review I will conduct for Virus Bulletin, I had hoped for an easy run overall – sadly this was not the case for several products. Although instability was less common than in previous tests, scanning speeds for some products were even slower than they have been in the past. There were also a number of products in this test whose feature sets can only have been designed by folk who are either totally ignorant of usability or bred for enhanced sadism.
The test sets were aligned to the February 2006 WildList. As always, the contents of the WildList can be viewed at http://www.wildlist.org/.
When I first started anti-virus testing, the WildList consisted of some 300 different viruses, one third of which were boot sector types. I have none-too-fond memories of inserting 90 floppies into a machine for scanning on demand, then repeating the process on access. Thankfully for my successor, this month's tests saw a major, if long foreseen, change in that there are no longer any boot sector viruses that are considered to be in the wild. Similarly anticipated was the fact that all but a small number of macro viruses dropped out of the test sets this month, including all Excel and WM/ samples.
Numerous other files also dropped out of the test set this month – and, as ever, yet more were added to replace them. Overall numbers in the test set increased marginally; more than 100 samples were added and not quite as many removed. Samples of W32/Rbot, W32/Mytob and W32/Sdbot accounted for the majority of these changes and, together, these three fill around half of the space in the WildList.
Starting the line-up on this occasion, AhnLab's V3Pro managed one of the slowest installation routines I have witnessed. It also demonstrated some odd logging behaviour, with the result that detection ultimately had to be determined by the deletion of infected files.
Unfortunately, a false positive and a suspicious file in the clean test set were sufficient to deny AhnLab a VB 100% this month, though scanning of these files was notably speedy. In addition there were numerous misses of samples in the In the Wild (ItW) test set, which suggests that slow updates could be the problem here.
Please refer to the PDF for test data
As ever, on-access detection for avast! was performed by copying the test set and deleting infected files – on-access scanning is not triggered simply by opening files. avast! also suffered from a round of false positives – a total of three being sufficient to dash any hopes of a VB 100%. However, there were no misses during the scanning of infected files in the ItW test set, and misses elsewhere were at the same low background level as ever.
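For readers unfamiliar with this copy-and-check approach, the following is a minimal sketch in Python. The paths and timing value are hypothetical and the script is only an illustration of the method, not the actual test harness: the test set is copied into a folder watched by the on-access scanner, and any copies the scanner removes are counted as detections.

```python
import shutil
import time
from pathlib import Path

# Hypothetical locations - substitute the real test set and a folder
# that the on-access scanner is actually watching.
TEST_SET = Path(r"D:\testsets\itw")
MONITORED = Path(r"C:\onaccess_target")

def copy_and_check():
    """Copy every sample into the monitored folder, then report which
    copies the on-access scanner has removed (i.e. detected)."""
    samples = [p for p in TEST_SET.rglob("*") if p.is_file()]

    for sample in samples:
        dest = MONITORED / sample.relative_to(TEST_SET)
        dest.parent.mkdir(parents=True, exist_ok=True)
        try:
            shutil.copy2(sample, dest)
        except OSError:
            # The scanner may block or remove the file mid-copy;
            # the existence check below still counts it correctly.
            pass

    time.sleep(60)  # give the scanner time to act on the copies

    detected = [s for s in samples
                if not (MONITORED / s.relative_to(TEST_SET)).exists()]
    print(f"Detected (removed on access): {len(detected)}")
    print(f"Missed (copy still present):  {len(samples) - len(detected)}")

if __name__ == "__main__":
    copy_and_check()
```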
NOTE: 01 July 2006. After discussions with the developers and further investigation of the files, VB now considers that the files Alwil produced false positives on were inappropriate for inclusion in the clean test set. The files have been removed and VB extends its apologies to Alwil. With a faultless performance across the ItW test sets, and an admirable performance elsewhere, a VB 100% is belatedly awarded to avast!.
Please refer to the PDF for test data
At first glance, AntiVir looked very much to be taking a step backwards in this version, since many options seemed no longer to be present. Happily, it turned out that these are merely somewhat hidden in the default interface view. With this minor hitch disentangled, AntiVir went on to detect all infected files in all test sets – a performance that earned the product a well-deserved VB 100% award.
Please refer to the PDF for test data
Having progressed to version 8, both the eTrust products now rejoice in a new interface. However, the new interface seems to prioritise looking new and trendy over being intuitive and easy to use. Something I found particularly irritating was the fact that the interface is launched as HTML in a browser window, which is almost unusable on lower-resolution screens.
I was hoping for an improvement in eTrust's reporting of infections. However, hard to credit though it is, on-screen reporting proved to be even worse than it had been previously. In this version of the product infections are reported in a tiny text box which, by default, is truncated and cannot be resized.
It is thus impossible to tell which files are infected through the use of the on-screen display. This can be overcome by printing the log file, though there is no obvious way of obtaining a useful version of this as a file.
As in previous comparative reviews, this version of eTrust is not eligible for a VB 100% award, since the InoculateIT engine is not the product's default.
Please refer to the PDF for test data
Of course, the comments made in the previous section also apply to this version of eTrust. As mentioned, the Vet engine is the default for use in scanning – in fact eTrust reverts back to Vet on each restart of the GUI.
Despite the interface woes, eTrust's detection rates were up to their usual good levels, and since no false positives were detected in the clean test set a VB 100% is the result. Scanning speeds were also good for both of the engines.
Please refer to the PDF for test data
Problems for CAT started in the clean test sets, where the generation of a false positive denied the product any chance of a VB 100% immediately. On a truly bizarre front, Quick Heal reported internally that all scans of clean objects took exactly one hour each. In reality, scanning speeds were good. Unfortunately, there was a second major disappointment for CAT in that samples of W32/Bagle.X were missed in the ItW test set.
Please refer to the PDF for test data
Vexira bears a very close resemblance to VirusBuster – which can be explained by the fact that it is a rebadged version of VirusBuster. Purists might point out that one product is red and the other blue, but my advanced skills of observation saw past this dissimulation.
Unfortunately, stability was not a strength of this product, which caused the test machine to hang after on-access scanning. On demand, matters were substantially worse, with repeated crashes while scanning PowerPoint files. After this performance had been tolerated for long enough to obtain results, a number of samples in the ItW test set remained undetected, thus preventing the product from obtaining a VB 100%.
Please refer to the PDF for test data
Once again, the most irritating thing about this product was the log – which is available only in a very truncated RTF format. An extensive search of the machine did not help in finding a useful log, thus infected files were deleted to determine detection rates.
After having jumped through the appropriate hoops, the scanning results were good, with only very few, non-ItW, infected files being missed. As a result, Authentium earns itself a VB 100% award.
Please refer to the PDF for test data
On the negative side, Dr.Web's on-access monitor SpIDer Guard lies about its configuration settings – option changes are only ever implemented after a reboot, a fact not reflected by the interface.
The story improved, though: scanning was perfect on demand, with only archived files missed on access. This performance was certainly ample for a VB 100% to be on its way to Doctor Web.
Please refer to the PDF for test data
NOD32 was the first product in this month's test with which I could find no real fault. Full detection across all test sets and a lack of false positives leave me little to comment on and earn Eset a well-deserved VB 100% to add to its collection.
Please refer to the PDF for test data
Another product that displayed no remarkably bad or notably new features, FSAV also obtains a VB 100% for its performance. Misses here were limited to viral code stored in a form that is not directly executable.
Please refer to the PDF for test data
The trend of good results with few shocks is continued with Fortinet's offering. Although the product missed a noticeable number of polymorphic files, detection results across other test sets were very strong. As a result, FortiClient adds another VB 100% to its collection.
Please refer to the PDF for test data
Unfortunately, the run of products displaying excellent results and few faults is cut short here, since all was not perfection for F-Prot. Scanning speeds were fair, but unfortunately a smattering of misses across the test sets included a sample of W32/Aimbot, which is classified as in the wild. A VB 100% award therefore is out of the grasp of FRISK on this occasion.
Please refer to the PDF for test data
Despite a somewhat slow performance, GDATA managed full detection of all samples in all categories, with no false positives. AVK's developers should be pleased with this performance, and a VB 100% should add to their contentment.
Please refer to the PDF for test data
One of the more common user queries I have been faced with during my time at Virus Bulletin concerns how to delete infected files using AVG. Having tried to do so, the frequency of complaints no longer surprises me. Numerous files, although flagged as infected, were not subject to any automated deletion or disinfection.
Apart from this there were no surprises in either the clean or infected test sets, with a VB 100% being the pleasing result for Grisoft.
Please refer to the PDF for test data
Unfortunately, Hauri's chances of gaining a VB 100% evaporated with a false positive and suspicious file noted in the clean set – and scanning rates were not particularly speedy here either.
Misses in detecting infected files were plentiful too, although looking on the brighter side, none of the missed detections occurred in the ItW set.
Please refer to the PDF for test data
KAV includes various self-protection features which turn out to be a double-edged sword. The less-than-welcome aspect is that the virus definitions are so well protected that, by default, they cannot be updated manually. Since the update function does not allow updates from a local folder, this is somewhat irritating.
There also seem to have been some changes in scanning methods, the effects of which are particularly unpleasant. On-access scanning was seemingly interminable, while the clean set scanning rate was pretty indicative of the speeds seen while scanning the infected sets. This was not an effect of low scanning priority, however – during scanning KAV remained steadily at 99% processor usage.
All of this work was, at least, for good reason as all files in all test sets were detected and no false positives were produced. A VB 100% award thus acts as a distraction from the various problems encountered.
Please refer to the PDF for test data
Happily, with VirusScan we return to a product that had no nasty surprises in store and gave a good performance with full detection of infected samples across all test sets. With no false positives noted in the clean test sets either, VirusScan is awarded a well-deserved VB 100%.
Please refer to the PDF for test data
As might be expected of a Microsoft product, OneCare operates in the guise of paranoid nanny. The user is not trusted to make many decisions of their own, which made certain parts of the test process frustrating.
The progress counter that is displayed during scans is particularly laughable, reaching 99% in ten minutes and then remaining at that point for another 20 minutes or so. This is a result of the automatic disinfection and quarantine (the user has no say in the matter). Indeed, Microsoft's idea of quarantining is somewhat novel, consisting of appending what looks like a checksum to the end of the file name.
What with constantly resetting the areas to be scanned and hanging after the on-access scan, this product cannot be said to be one of my favourites. However, its detection rates were sufficient for a VB 100% to be in order.
Please refer to the PDF for test data
eScan is a rebadged version of GDATA's AntiVirusKit, so it should come as no great surprise that the results for eScan include full detection of samples across all test sets, a VB 100% award and no adverse comment.
With little else to say, let's move on to a product that behaved badly instead.
Please refer to the PDF for test data
Having been a source of frustration in previous reviews (see VB, April 2006, p.17), Norman Virus Control continued to manifest new problems on this occasion.
On-access scanning was subject to repeated crashes, whether dealing with infected or previously disinfected files. The effects were sufficient to reduce Windows to a state of complete paralysis, in which only a hard reboot had any effect on the test machines.
Upon reboot the splash screen displays the question 'Would you go for anything but green?' (green being Norman's corporate colour). My answer would be that anything would be better than this.
Unfortunately for the forces of truth and justice, after strenuous efforts scanning results were sufficient to warrant a VB 100% for this shockingly behaved product.
Note: 1 August 2006. In VB's June 2006 comparative review it was reported that the Norman product behaved badly, with repeated crashes on dealing with infected or previously disinfected files. VB would like to note that since then, neither Norman's developers nor VB's new resident product tester have been able to reproduce the bad behaviour described.
Please refer to the PDF for test data
Since Virus Chaser is a rebadged version of Dr.Web, it should come as little surprise that it shares both the irritations and praise of that product.
With faultless detection rates across all the test sets and no false positives noted in the clean test set, a VB 100% can be included in the shared experience.
Please refer to the PDF for test data
There were few notable moments during the testing of BitDefender, though the scanning of clean executables was certainly slow enough to be tedious to oversee.
As far as detection was concerned, BitDefender had a small number of missed detections, although no real pattern was discernible among them. Happily for SOFTWIN, however, there were no misses in the ItW set and no false positives were picked up in the clean test set, thus BitDefender also earns a VB 100%.
Please refer to the PDF for test data
Sophos's product was as well behaved as ever. Whether it was practice with the GUI or some small changes to it, something made the product seem very much simpler to use than I can remember it being recently, which is always a plus point. With an admirable performance across the test sets, a VB 100% is in order for the Sophos product.
Please refer to the PDF for test data
The Symantec GUI has remained the same for many years and on this occasion the product's full detection rate across all test sets leaves little scope for discussion.
Not even my pathological hatred of the colour yellow can detract from the fact that the product's performance was ample for SAV to be awarded a VB 100%.
Please refer to the PDF for test data
Since this product is based on a combination of BitDefender and Norman scanning engines, I was fearful, when I first launched TrustPort, that its scanning performance would resemble blue whales forced into pogo-stick races. Thankfully, scanning speeds were not absolutely terrible, just pretty bad.
The combination of the two engines may be responsible for one of TrustPort's oddities, namely that it reported many more files as having been scanned than actually existed in the test sets.
A further mystery was the sheer variation in the actions taken upon detection of a virus. Using the default settings, samples were deleted, disinfected, quarantined, renamed and simply left to fester – all in the course of one scan.
All this aside, the detection rates demonstrated by the product came close to decent, but there were certainly a sufficient number of ItW misses to deny TrustPort a VB 100%. Maybe my successor will see an improved performance in the not too distant future.
Please refer to the PDF for test data
Not surprisingly, VirusBuster suffered some of the same woes as Vexira, though thankfully to a lesser extent. Instability on demand meant that scanning simply became unavailable after existing scans aborted while in progress, and only a reboot restored it to a working state. Misses of samples in the ItW test set merely added to these woes, meaning that VirusBuster was denied a VB 100% on this occasion.
As a side note, after discussion with the developers, the cause of the scanning speed issues that plagued VirusBuster in the Linux comparative review (see VB, April 2006, p.13) was determined to be the handling of alert messages. In the default setting, alerts are sent to the client; since the client is, by default, set not to accept these alerts, each send waits until it times out, causing a dramatic slowdown in scanning rates. Clearly this problem can be solved easily by some simple changes to the client or scanner configuration.
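To illustrate why such a default is so costly, here is a minimal, purely hypothetical sketch in Python (none of the names correspond to VirusBuster's actual code): a per-file scan loop that delivers each alert with a blocking connection and a fixed timeout. If the client never answers, every detection stalls the loop for the full timeout.

```python
import socket

ALERT_HOST, ALERT_PORT = "192.168.0.10", 9000  # hypothetical alert client
ALERT_TIMEOUT = 10.0                            # seconds to wait per alert

def send_alert(message: str) -> None:
    """Deliver one alert. If the client silently ignores the connection
    attempt, create_connection() blocks for the full ALERT_TIMEOUT."""
    try:
        with socket.create_connection((ALERT_HOST, ALERT_PORT),
                                      timeout=ALERT_TIMEOUT) as conn:
            conn.sendall(message.encode())
    except OSError:
        pass  # unreachable or refused: the alert is simply dropped

def scan(paths, is_infected):
    """Toy scan loop: each detection triggers an alert, so a non-accepting
    client adds up to ALERT_TIMEOUT seconds per infected file."""
    for path in paths:
        if is_infected(path):
            send_alert(f"infected: {path}")
```

Over an infected test set containing thousands of samples, those per-file waits quickly come to dominate the total scan time, which is consistent with the slowdown seen in the Linux review; disabling the alerts, or configuring the client to accept them, removes the delay.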
Please refer to the PDF for test data
My final words should be statements, grave judgements and moments of prescience, so as to leave a lasting memory of the quality of my reviews. Unfortunately for this line of thinking, the only thoughts I have to offer are of a cynical nature.
The names and descriptions of the threats may change, but the anti-virus industry remains pretty much the same as it ever has been. The major companies are the same, user ignorance is unchanged and the hyperbolic press releases are the same. Even the claims that 'soon all will change' are simply repeats of the past.
If I should return to the anti-virus field in the future, I really don't think it would take more than a few minutes to become re-acclimatised – I just hope that NetWare is extinct by then.
Test environment. Identical 1.6 GHz Intel Pentium machines with 512 MB RAM, 20 GB dual hard disks, DVD/CD-ROM and 3.5-inch floppy drive running Windows XP Professional SP2.
Virus test sets. Complete listings of the test sets used can be found at http://www.virusbtn.com/Comparatives/WinXP/2006/test_sets.html.
Results calculation protocol. A complete description of the results calculation protocol can be found at http://www.virusbtn.com/virusbulletin/archive/1998/01/vb199801-vb100-protocol.