VB100 Comparative review on Windows XP Professional SP3

2012-05-01

John Hawes

Virus Bulletin
Editor: Helen Martin

Abstract

As expected, the annual VB100 test on Windows XP was an epic. A higher than usual pass rate was tempered by numerous stability issues with the products under test, prompting the unveiling of a new stability rating system. John Hawes has all the details.


Table of contents
Introduction
Platform and test sets
Results
Agnitum Outpost Security Suite Pro 7.5
Avast Software avast! Free Antivirus
Avertive VirusTect
AVG Internet Security Business Edition 2012
Avira Free Antivirus
Avira Professional Security
Bitdefender Antivirus Plus 2012
BullGuard Antivirus 2012
Central Command Vexira Antivirus Professional
Check Point ZoneAlarm Extreme Security
Check Point ZoneAlarm Internet Security Suite
Clearsight Antivirus
Commtouch Command Anti-Malware
Coranti 2012
Coranti Cora Antivirus
Defenx Security Suite 2012
Digital Defender Antivirus Premium
Digital Defender Antivirus Pro
eEye Digital Security Blink Professional
Emsisoft Anti-Malware
eScan Internet Security Suite
ESET NOD32 Antivirus 5
Filseclab Twister AntiVirus v7
Fortinet FortiClient
Frisk F-Prot Antivirus for Windows
F-Secure Client Security
G Data AntiVirus 2013
GFI VIPRE Antivirus 2012
Ikarus virus.utilities
Iolo System Shield
K7 Total Security
Kaspersky Endpoint Security 8 for Windows
Kaspersky Internet Security 2012
Logic Ocean Gprotect
McAfee VirusScan Enterprise + AntiSpyware Enterprise 8.8
Microsoft Security Essentials
Norman Security Suite
Optenet Security Suite PC
PC Booster AV Booster
PC Tools Internet Security
PC Tools Spyware Doctor with AntiVirus
Preventon Antivirus Premium
Preventon Antivirus Pro
Qihoo 360 Antivirus
Quick Heal Total Security 2012
Sophos Endpoint Security and Control 10
SPAMfighter VIRUSfighter Pro
SureGuardian Antivirus Premium
Total Defense Inc. Internet Security Suite
Total Defense Inc. Total Defense r12
Troppus Software Digital Life Now Anti-Virus Premium
TrustPort Antivirus 2012
UnThreat AntiVirus Professional
UtilTool AntiVirus Premium
UtilTool Antivirus Pro
VirusBuster Professional
Untestable products
Results tables
Conclusions
Technical details

Introduction

The approach of spring has always been met with mixed emotions by the VB lab team. As the semi-rural surroundings of the VB office erupt in greenery and blossom, and the sun begins to nudge its way through the winter cloud, our joy at the environmental improvements is dampened by the knowledge that the annual XP test is upon us. The platform itself is as simple and comforting as a familiar old blanket, and its ongoing global popularity ensures solid and unquestioning support from every anti-malware vendor under the sun. As a result, this test is all but guaranteed to be the biggest of the year.

As the deadline day approached, all the evidence pointed to yet another monstrous haul of submissions. On the deadline day itself, the flood of submissions became a torrent and the numbers rose still further, with a selection of new faces appearing on top of our ever-growing clutch of regulars. In the final reckoning though, things did not look too bad, with a cunningly late deadline announcement helping fend off those with slower reactions and a handful of expected entries failing to materialize.

The initial total came in at 60 entries, plus a pair of new faces which were accepted as tentative submissions only, pending some experimental test runs to see if their design would fit in with our testing methodologies. So, not quite the behemoth we’d feared, but still comfortably inside the top five biggest hauls in the history of VB100 testing.

This month saw several companies submitting multiple products. This is something which we have been quite open to up until now – our policy has been to accept and test up to two products per company free of charge, and to impose a small fee for any additional products to help support the extra work involved. In the future, though, we will be tightening our policy to allow only one product free of charge per vendor.

A remarkable chunk of this month’s participants were based on the same underlying engine, with some 17 products making use of the ubiquitous VirusBuster engine in some form or another.

One change introduced this month was the intention to be stricter in dealing with highly unstable products. In the past, where submissions have proven difficult we have done our best to nurse them through the tests, but with time so tight under our new multi-part testing processes, we decided that more than a handful of crashes or failed scans would be enough to see a product excluded from further testing, freeing up the test bench for more reliable products. We also decided that we would not keep the names of the most troublesome products to ourselves – they will be called out at the end of this report as ‘untestable products’.

A final related development we had hoped to add this month was a table of data listing such details as install and update time. One of the core things we planned to include here was an indication of product stability, evolving from the comments which have been made in the written reviews for some time now. We worked out a simple system, splitting problems into levels of severity and rating them depending on whether they occurred in normal use or only in high-stress situations, ranging from ‘Solid’ for products with no issues at all, through ‘Stable’, ‘Fair’ and ‘Buggy’ for those with increasing numbers of minor issues, and ‘Flaky’ for the most unreliable. Once this system has settled in and become reasonably repeatable and scientific, we hope to be able to include stability as one of the VB100 certification requirements.
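Purely as an illustration of how such a scheme might be expressed (the thresholds and issue counts below are placeholders of our own invention, not the finalized criteria), a rough sketch in Python could look like this:

```python
# Illustrative sketch only: one possible way to express the proposed stability
# ratings. The thresholds are arbitrary placeholders, not VB's final criteria.

def stability_rating(minor_normal, minor_stress, major_normal, major_stress):
    """Map counts of minor/major issues, seen in normal use or only under
    high stress, onto the proposed rating labels."""
    if major_normal or major_stress > 1:
        return 'Flaky'    # serious problems affecting everyday use
    if minor_normal > 2 or major_stress:
        return 'Buggy'    # several minor issues, or a serious one under stress
    if minor_normal:
        return 'Fair'     # a few minor issues in normal use
    if minor_stress:
        return 'Stable'   # minor issues only in high-stress situations
    return 'Solid'        # no issues observed at all

# Example: two minor issues in normal use, nothing else -> 'Fair'
print(stability_rating(minor_normal=2, minor_stress=0,
                       major_normal=0, major_stress=0))
```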

A more detailed breakdown of this system will be merged into our procedures once it is finalized – unfortunately, as this month’s testing went on, and it became clear how much time it would take to gather the existing fields of data, our hopes of having the new system fully operational this month fell by the wayside. Nevertheless, we will include some indication of how it would have looked as we go through the products – we hope to provide a full table as part of the next comparative.

Platform and test sets

Windows XP has been with us for well over a decade, and even now it is only just being supplanted as the most popular operating system worldwide. Most estimates put it on between 25% and 35% of end-user systems, with the much more modern Windows 7 on somewhere around 35%. The lab team have been through the process of setting up XP machines dozens, perhaps hundreds of times, so it presented few difficulties. In fact, for this comparative an old image was recycled, refreshed and updated with a few handy tools before the final test image was taken and spread to the suite of test machines.

Sample sets were then prepared and placed on all systems, with the bulk of the work going into the clean set – alongside the usual tidying up, a hefty bundle of end-user software was added to the set, mainly harvested from a large collection of DVDs given away with popular magazines. With much of the stuff on these disks being either somewhat obscure or (in some cases) clearly a little less than trustworthy, we limited it to only the most popular and mainstream items.

The WildList sets were built using the January 2012 lists, which were released just a few days before the deadline for this comparative (15 February). With the Extended WildList still going through a settling-in process, and many items still being removed from the list well after it has officially been finalized each month, we needed to be fairly flexible about what we included. Essentially, we accepted any challenge against any item in the list which our contacts felt to be inappropriate, excluding anything that did not fully satisfy our requirements to be labelled as malware. In the end, most of these decisions proved justified, with several post-release addenda confirming the removal of the items.

The RAP sets were built, as usual, in the weeks around the deadline, and the Response sets were compiled on a daily basis from the latest available samples. Some additional work has gone into honing the content of the Response sets. We have improved automation and fine-tuned the measures we take to ensure that no single provider swamps a set with unexpectedly large contributions, while at the same time aligning the contents more closely with prevalence information. This has had a knock-on effect for the RAP sets, which are put together using the same basic set-up – we hoped to see the impact of this in the form of a more rigorous exercise of the products’ detection capabilities, especially in the proactive set. In the end, the RAP sets averaged around 30,000 samples, with the daily sets used for the Response tests ranging from 2,000 to 6,000 samples, averaging around 4,000.
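To illustrate the general idea of that balancing (this is only a sketch, with invented field names, cap and target sizes; it is not a description of our actual build scripts), the per-provider capping and prevalence weighting might look something like this in Python:

```python
# Illustrative sketch only: capping per-provider contributions and biasing a
# daily test set towards more prevalent samples. The field names ('provider',
# 'prevalence') and the cap/target sizes are assumptions for this example.
from collections import defaultdict

def build_daily_set(candidates, per_provider_cap=500, target_size=4000):
    # Group incoming samples by the feed that supplied them.
    by_provider = defaultdict(list)
    for sample in candidates:
        by_provider[sample['provider']].append(sample)

    # Within each feed, keep only the most prevalent items, up to the cap,
    # so that no single provider swamps the set.
    pool = []
    for samples in by_provider.values():
        samples.sort(key=lambda s: s['prevalence'], reverse=True)
        pool.extend(samples[:per_provider_cap])

    # Favour the most prevalent samples overall when trimming to size.
    pool.sort(key=lambda s: s['prevalence'], reverse=True)
    return pool[:target_size]
```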

Speed and performance sets remained essentially unchanged, but we hope to find time to expand our selection of samples used in the speed measures, and to soup up our set of performance measures a little, in time for the next test.

With everything in place, we were ready to begin our epic trawl through the seemingly endless list of products.

Results

Agnitum Outpost Security Suite Pro 7.5

Product version: 7.5.1 (3791.596.1681)

Update versions: 15/02/2012, 14/03/2012, 21/03/2012, 27/03/2012

First up this month is Agnitum’s Outpost suite, a very regular participant in our tests, and one which rarely gives us much trouble. The package provided measured 112MB, including an initial set of updates. The set-up process is fairly lengthy, much of the time being devoted to initiation of the firewall components. As in recent tests, we observed that the option to join a feedback scheme is rather sneakily hidden on the EULA acceptance page of the install process. Updates for the main part of the test – which were run between two and five weeks after the initial installer was sent to us – averaged around 14 minutes.

The interface is clear and sensible, providing a fairly decent set of controls for the anti-malware component, although most space is devoted to the firewall. Operation was smooth and stable, with no problems observed throughout the tests, and all jobs (including high-stress scans of infected sample sets) completed in good time. Scanning speed measures showed some decent initial rates, speeding up massively in the warm runs, but on-access lag times seemed a little on the high side. Resource use was also fairly high, and our set of tasks took noticeably longer than the baseline measures.

Detection rates were reasonable, dropping off somewhat in the last few days of the Response sets, and falling rather sharply in the proactive week of the RAP sets too, as expected. There were no problems in the certification sets though, with flawless coverage of the WildList sets and no false alarms in the clean sets, and Agnitum starts this month’s test off with an easy pass. Testing took no more than the 48 hours allotted to each product, and with no bugs or problems encountered, the product was rated as ‘Solid’ for stability. In the vendor’s recent test history, we see ten passes from ten attempts in the last two years.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Avast Software avast! Free Antivirus

Product version: 6.0.1367

Update versions: 120215-1, 120315-0, 120312-0, 120327-0

Avast released its latest upgrade, version 7, shortly after the deadline for this month’s test, so we were stuck with version 6 – which was no great pain for us, given its solid performance to date. For a change, the install package came separately from the latest updates, making for a slightly larger download than usual: 62MB for the main installer and 51MB for the updates. The set-up process is simple with no big surprises, completing in under a minute, and in general updates ran quickly too. In one of the three runs there was a bit of a snag though, when updating seemed not to be working at all. After leaving it sitting at 0% for quite some time, we finally gave up, and from there two reboots were required to get things moving. Once it had got over this temporary glitch, things got back on track and testing continued; ignoring this period, updates averaged only a couple of minutes.

The interface is beautifully designed, combining eye-pleasing visuals with mind-pleasing clarity of layout and an extremely thorough set of controls made easily usable and unthreatening to the novice. Testing was generally fairly straightforward, although at one point when setting up the on-access tests the system became unresponsive and required another reboot. Having learned from past experiences, we whitelisted some of the test scripts and tools, and the behavioural monitoring was partially disabled to enable tests to proceed unimpeded.

From here on things moved along very nicely indeed, although speeds were a little slower than expected. On-access overheads were also a little high, but our set of tasks ran through in good time and resource usage was on the low side. Detection rates were excellent, with some superb scores in the Response sets and the reactive parts of the RAP sets; the proactive week did show quite a decline compared to recent tests, but it is likely that this is mainly due to adjustments in how the sets were developed, and we expected to see similar if not steeper drops in other products.

The core certification sets were handled impeccably, with no problems achieving the required standard for certification. After a small hiccup a few months ago, things seem to be back on track for Avast, with a single fail and 11 passes in the last two years. A few fairly minor bugs were noted during testing, for which reboots were required; as these occurred outside the high-stress parts of the test, the product achieves only a ‘Fair’ rating for stability.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Avertive VirusTect

Product version: 1.1.81

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

The first of a raft of products from the same root source, Avertive’s VirusTect is based on the VirusBuster engine via the Preventon SDK, and has become a fairly regular visitor to our test bench. The installer was provided with latest definitions included for the RAP tests, and weighed in at 85MB. The set-up process is smooth and simple with no surprises, and completes in good time. Updates were likewise simple, reliable and speedy, with total install time averaging under ten minutes.

The GUI sticks to the older format that is familiar from many previous tests, and proved simple to operate. Our only gripes were the dropping of old log data after 20MB or so, and the default setting of logging everything checked by the product, which resulted in much pointless logging. A registry tweak changed the former, while an interface option was provided for the latter, and tests proceeded smoothly from there on. The only real bug observed when using the product was that the on-demand scanner ignored a setting to ‘detect only’ when fired off via the context menu – there was no such issue when starting scans from the interface itself, and this is the most minor of bugs.

Scanning speeds were not bad, and overheads just a little on the high side, while resource use and impact on our set of tasks were both fairly average. Detection rates were reasonable if not especially impressive, with a downward trend towards the more recent days of the Response test and a fairly steep drop into the proactive week of the RAP sets. The certification sets were handled well though, with thorough coverage of both WildList sets, only a single item ignored on-access thanks to being a self-extracting archive – as this was in the Extended list, it presented no obstacle to certification and a VB100 is duly awarded.

Avertive has only entered three tests in the last six, but has passed each time; the two-year view shows four passes and two fails from six attempts. With only a single, very minor issue observed, our bugginess rating is ‘Stable’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

AVG Internet Security Business Edition 2012

Product version: 2012.0.1913

Update versions: 2112/4809, 2114/4872, 2114/4884, 2114/4897

As usual, AVG provided its premium business suite for testing, with the installer weighing in at a fairly hefty 142MB and the set-up process running through a fair number of steps, including offers to install a security toolbar and change the default search provider. The whole process was fairly speedy though, completing in just a few minutes, and later online updates were fast too, averaging only three minutes.

The interface is simple and unthreatening – a little angular, and with a sombre colour scheme to keep business folk from getting overexcited. The layout is clear and usable and a very thorough level of options is provided in very accessible fashion. Tests ran smoothly, with no issues of any kind.

Scanning speeds started off pretty decent and increased to lightning speeds in the warm runs, while overheads were mostly pretty light – a little above average in some sets to start with, but benefiting hugely from some smart caching in the warm measures. RAM usage was low and CPU use below average too, with a notable but not extreme effect on our suite of standard tasks.

Detection rates were splendid, with a very consistent showing through the Response sets and good scores in the RAP sets too – once again that rather sharper than usual drop-off into the proactive week reflected our adjustments to the compilation of the set. The WildList and clean sets were brushed aside without any problems, and a VB100 award is easily earned by AVG.

A momentary lapse just under a year ago means AVG now has five passes and a single fail in the last six tests; 11 passes from 12 attempts in the last two years. There were no stability issues, meaning that the product earns a ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Avira Free Antivirus

Product version: 12.0.0.849

Update versions: 7.11.23.32, 7.11.25.110, 7.11.25.198, 7.11.26.28

As usual in our desktop tests, Avira submitted both its free version and its paid-for version, with the free one up first simply due to alphabetical ordering.

The product was provided with an offline update bundle, the main installer being 80MB and the updater a further 58MB. The install process was pretty simple, enlivened only by the offer to install the Ask toolbar (versions of which are flagged by many products as ‘potentially unwanted’) – an option which, pleasingly, is unchecked by default. The remainder of the process is speedy and unchallenging, taking no more than two minutes, and updates ran through in five minutes on average.

The interface is much improved after a recent facelift, and is generally pretty usable and easy to navigate, with a pretty thorough set of controls available. Tests ran through without problems, with decent speeds and reasonable overheads, low resource use and a low impact on our set of tasks.

Detection rates were excellent, with reliably strong scores through the Response sets and a splendid start to the RAP sets, declining a little into the later weeks as expected. The certification sets presented no problems, and a VB100 award is well deserved. Our test history for this version shows three passes from three entries in the last six tests; five from five in the last two years. With no bugs or problems noted, a ‘Solid’ stability rating is earned.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Avira Professional Security

Product version: 12.0.0.1209

Update versions: 7.11.23.32, 7.11.25.112, 7.11.25.198, 7.11.26.28

The paid-for version of Avira is similar in many ways to its free sibling, with the install package measuring 83MB and the same 58MB initial update bundle shared by the two. The set-up process ran along similar lines, although without the offer of a third-party toolbar, and completed in two minutes. Later updates were surprisingly slow, however – on most occasions claiming that a full hour would be needed, but actually taking between 20 and 30 minutes.

The interface closely resembles that of the free version but does provide a few little extras in terms of configuration, including a more granular scanning system in the interface (although in both cases most scan jobs were run using the context menu option). The layout is clear and usable.

Scanning speeds were noticeably faster here than in the free version, and overheads distinctly lighter – this is something we have observed before, and the developers regularly query it as they insist that the two products should be more or less identical. However, we observed the same differences over several repeat runs on different test systems. In the performance measures, while RAM use was low, rather more CPU cycles seemed to be used, and impact on our set of tasks was a little on the high side. Again, on observing the differences between the two products, the tests were re-run several times, but the same distinction between the two was observed.

Detection rates were identical to those of the free product in the RAP sets, thanks to that shared update, but in the Response tests the Pro version had a slight edge in most sets – perhaps thanks to luck in the timing of the tests, which were run on the same day but a few hours apart in most cases. There were no differences in the certification sets though, and another VB100 award is easily earned by Avira. Our test history shows a very solid 12 passes in the last two years. With no bugs observed, stability was also ranked ‘Solid’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Bitdefender Antivirus Plus 2012

Product version: 15.0.31.1283

Update versions: 7.41018/7750778, 7.41495/6996989, 7.41561/6956643, 7.41654/6984626

Bitdefender’s flagship range has had a rather drastic makeover of late, with the latest version of its Antivirus Plus 2012 product provided as a hefty 212MB package. The set-up is fast and simple, adorned with the company’s new ‘Dragon Wolf’ styling, and takes not much more than a couple of minutes to complete, even including an initial ‘quick scan’ (which is very quick indeed). Online updates took four to five minutes, although in one instance as soon as this initial update was complete and the interface showing, a second update was requested, which took about another five minutes.

The GUI is dark and a little forbidding, but attractive and simple to operate, with an impeccable range of options provided. Speeds were OK to start with, showing somewhat irregular speed-ups in the warm runs. On-access lag times also started fairly well and improved notably on repeat attempts. Resource use was extremely low, almost imperceptible in terms of CPU use, and our set of tasks blasted through in splendid time.

Detection rates were similarly impressive, with the Response sets showing only the slightest downward trend into the more recent days, and some solid figures in the RAP sets too – even the proactive week was pretty well covered, given the toughness of the challenge this month. The certification sets presented no issues, and Bitdefender comfortably earns a VB100 award.

The last six tests show a perfect six passes, with the longer-term view showing only a brief lapse, with a single fail and 11 passes in the last two years. No stability problems were noted, earning Bitdefender a ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

BullGuard Antivirus 2012

Product version: 12.0.213

Update versions: 12.0.215

BullGuard is based on the Bitdefender engine, but has its own distinct look and feel. The installer measured 148MB including updates, and the product installed with minimal fuss in a little over a minute. Updates were also speedy, averaging less than five minutes – although on one occasion a rather unassuming alert did suggest a reboot might be a good idea.

The interface is fairly minimalist and pared-down, with a warm orange bar along the top and a sparse, pale grey area below where simple messages and large buttons provide the bulk of the product’s feedback and control. Options are fairly basic in some areas, a little more complete in others, but usage is relatively simple and intuitive.

Scanning speeds were blisteringly fast, and overheads fairly light, except over the archive set. RAM use was around average, but CPU use was one of the highest observed this month and our set of tasks took much longer than necessary to complete – this could be related to some archive work in the activities set, with the sets of samples fetched and manipulated in zip format in several of the stages.

Detection rates closely mirrored those of Bitdefender, outstripping them slightly in the earlier days of the Response sets, but dropping slightly lower in the last few days. Once again, RAP scores were solid in the reactive weeks and still pretty impressive in the proactive week. With no issues in the certification sets, BullGuard also comfortably qualifies for VB100 certification this month.

The company’s history shows five passes from five attempts in the last six comparatives, only the annual Linux test having been skipped. Longer term entries are a little less regular, with seven passes from seven attempts in the last two years. With no stability problems, BullGuard earns a ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Central Command Vexira Antivirus Professional

Product version: 7.3.33

Update versions: 5.4.1/14.1.219, 5.4.1/14.1.263, 5.4.1/14.1.270, 5.4.1/14.1.282

Another of the 17 products based on the VirusBuster engine in this month’s test, Vexira is the most similar to the mother ship, with only the colour scheme differentiating it from VirusBuster’s own offering. The 69MB installer ran through fairly well, although the process of applying a licence key was a little odd, with the focus jumping to the ‘next’ button each time a key was pressed, making it rather fiddly to enter the full key code. The set-up was supplemented by a 75MB update bundle, which was applied quickly and easily offline; later online updates were a little more troublesome, however.

As noted last time we looked at this product, several of the update methods, including a few spots in the main interface and one of the options in the system tray, seemed to have problems initiating updates properly. Messages would suggest the job had started, but no progress could be observed and even after leaving it alone for several hours no change in status was apparent. In the past, leaving it overnight seemed to get the job done, but this month time was too tight for us to leave it that long. Fortunately, we discovered that one of the update options – which opens what appears to be a dedicated updating GUI – did work properly, running through a number of steps (an option to progress from one to the next was available, but given the earlier problems we chose to ignore it). With the download time a little slow, the total process of installing and updating using this method took 25 minutes on average.

The interface has had a minor polish recently, but remains rather bland and wordy, with a decent degree of fine control available, but much of it presented in a fiddly, clunky fashion. Thanks to much practice with the product we were able to get things moving along quickly though, and in the speed measures we saw some fairly mediocre scan times and somewhat high overheads on access. Resource use was not excessive though, and our set of tasks took only slightly longer than average to complete. Detection rates were similarly mid-range – fairly steady through the Response sets and reasonable in the reactive parts of the RAP sets, dropping quite sharply into the proactive week. The WildList and clean sets did not turn up any surprises though, with good coverage throughout, and a VB100 award is duly earned.

Central Command had a spot of bad luck a few months ago, with a rare false positive blemishing an otherwise solid record – the vendor currently stands on five passes from six attempts in the last six tests; 11 passes and a single fail in the last two years. The issues observed with the update system were the only problems encountered, and Vexira is thus rated as ‘Fair’ for stability.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Check Point ZoneAlarm Extreme Security

Product version: 10.1.079.000

Update versions: 8.1.8.79/1078935552, 8.1.8.79/1081090624, 8.1.8.79/1081435200, 8.1.8.79/1081774656

Check Point tends to take part in our tests only once or twice a year, but generally does well thanks to the underlying Kaspersky engine. This month the vendor submitted two products, which for the most part seemed fairly similar.

The installer for the ‘Extreme’ suite came as a slimline 5MB downloader tool, which fetched the noticeably larger 360MB main installer. A bundle of updates was provided for offline application for the RAP tests, which needed to be dropped into place in safe mode; later updates took around 15 minutes on average. The set-up process itself was fairly straightforward, taking only three to four minutes once the main installer was fetched (and taking an additional 25 minutes on the first install). The option of a browser toolbar was once again noted. The process did need a reboot to complete, and on at least one occasion a second reboot was required after running the online update.

The interface is a little short on the glitz and glamour one expects of end-user products these days, looking a little old-fashioned and clunky, and although a reasonable degree of fine-tuning was available, it was occasionally tricky to find and lacking in consistency across the product. During testing we noted a number of minor issues and irritations, although for the most part these only occurred under high stress. We observed the interface freezing several times after large scan jobs, and occasionally saw other error messages which resulted in the GUI restarting. Logging also proved somewhat unreliable, with a couple of jobs requiring a re-run after no log information could be found at the end of the scan.

Scanning speeds were rather slow, except over our set of media and document samples, but overheads were not too high. RAM use was reasonable, but CPU use was very high. Our set of activities showed a fairly low impact in terms of runtime, however.

Detection rates were excellent, with a dependably high rate throughout our Response sets and solid levels in the RAP sets too, not dropping off too sharply in the proactive week. The core sets were handled well, with only a few items in the clean sets labelled as being of a hacker-ish bent, and Check Point earns a VB100 award for its efforts. This is the vendor’s only entry in the last six tests, but the two-year view shows three passes from three attempts. With a few issues observed, mainly under high stress, stability was rated ‘Fair’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Check Point ZoneAlarm Internet Security Suite

Product version: 10.1.079.000

Update versions: 8.1.8.79/1078935552, 8.1.8.79/1081097984, 8.1.8.79/1081424032

Similar in most respects to its ‘Extreme’ sibling, the ZoneAlarm suite held few surprises, with the small downloader fetching the full 343MB main package. The set-up process ran through the same set of steps, although at one point an error message appeared (complete with entirely unhelpful error code). Updates took around 15 minutes on average.

The interface is fairly angular and wordy, prone to occasional wobbliness, and again some high-stress work required multiple attempts after freezing up or failing to complete properly. Scanning speeds were also rather slow, although noticeably faster in the media and documents set. Overheads were fairly average, and low RAM use was countered by high consumption of CPU cycles, while our set of tasks completed in good time. Detection rates were very similar indeed, showing some splendid scores just about everywhere. This solidity of detection (if not of interface) extended to the core certification sets, with just a few warnings of possible hacker tools in the clean sets, and a VB100 award is duly earned.

With no history of entering multiple products, we can only share the single track previously reported – this one pass in the last six tests; three passes from three attempts in the last two years. Stability was hit by a few issues, the more significant ones at least only occurring under unusually high stress, but we would rate it as no more than ‘Fair’, verging on ‘Buggy’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Clearsight Antivirus

Product version: 3.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

Another of this month’s glut of Preventon/VirusBuster offerings, Clearsight has gone for a slightly newer iteration of the product with a number of additions. The submission provided us with only a ‘Pro’ version, however, which meant that many of the extras, such as behavioural monitoring and web filtering, were unavailable. There was little change in the set-up process, with the 89MB installer running through standard steps and completing in good time; updates averaged only six minutes or so.

Opening the interface showed us an initial bug, with the desktop icon not responding. Opening via the system tray did work though, and from then on the desktop icon seemed to come alive as well. On making the usual registry changes to fix the log capping, a reboot was attempted to apply the changes, but the only effect of clicking the restart button was to shut down the product interface. A second reboot attempt restarted the machine.

Speed tests showed similar results to others in the range, with decent scanning speeds and reasonable overheads, average resource use and average impact on our set of tasks. Detection rates were a little harder to gather, with an initial run through the first part of the clean sets and the initial version of the Response sets apparently zipping through in record time. A closer look at the logs showed that a nasty file had tripped something up somewhere, and all subsequent files were flagged with an engine error. Checking the system, it appeared that the on-access protection was disabled, although the interface insisted it was operational. A similar issue has been noted with this product line in previous tests, where a file had apparently locked up the engine in such a way that it failed to ‘open’. A reboot was needed to fix things, and the Response set was re-run from where it had fallen over.

Eventually, we could see detection rates just as expected – fairly reasonable in most areas, dropping off sharply into the proactive part of the RAP sets. The core sets were handled well though, and a VB100 award is earned. The product has notched up five passes from five entries in the last six tests; six passes and one fail in the last two years. This month’s performance rates as distinctly ‘Buggy’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

Commtouch Command Anti-Malware

Product version: 5.1.15

Update versions: 5.3.9/201202151549, 5.3.9/201203191034, 5.3.9/201203221633, 5.3.9/201203261124

Commtouch has been having something of a rocky time in our comparatives of late, with bad luck following it from test to test – presumably its developers will be hoping for a change of fortune on this occasion. The current version came as a compact 13MB installer, with an offline update bundle of 28MB. The set-up process was very fast and simple, taking not much more than a minute with no need to reboot, and initial online updates averaged 20 minutes.

The product interface is fairly simplistic and minimal, with little opportunity for getting lost or making mistakes. Fine-tuning opportunities are rather limited, but the options that are provided are clear, sensible and responsive. In the past we’ve had some issues with exporting of the jumbo-sized logs we often generate, but these seem to have been fixed, and even under the heaviest of stress there was no sign of wobbliness.

Scanning speeds were not bad, but our on-access runs over the same sample sets took an enormous amount of time – in some cases more than double the baseline measures. The same effect showed in our set of standard tasks too, which took an age, with fairly high CPU use throughout, although RAM use was not excessive.

Detection rates were quite impressive in the Response sets, dropping off only slightly in the last day, but were fairly mediocre in the RAPs, tailing off steeply into Week +1 – implying that stronger detection is in place when cloud access is available. The fact that the RAP set scan took an epic 4,475 minutes (a little over three full days – some products completed it in under two hours this month) supports this theory, with the extra time put down to failed look-ups.

The core sets were handled well, with no signs of false alarms or misses in the WildList sets, and a VB100 award is well deserved. This hopefully marks the dawn of a new era for Commtouch, which now has two passes and three fails from five attempts in the last six tests; four passes and four fails in the last two years. Stability was rated ‘Solid’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Coranti 2012

Product version: 1.005.0006

Update versions: 21105, 21132, 21240

Once again, multi-engine behemoth Coranti submitted a brace of products, the first being the vendor’s mainstream offering straight from its Japanese HQ. Thanks to a mix-up at submission time, the lab was not informed that it would need to be installed and updated on the deadline day (indeed, the submission arrived too late to fit this work in anyway). After much head-scratching, we concluded that the smallish 53MB install package contained no detection data and would need to be excluded from the RAP tests. Attempts to install without online access led to some distinctly odd behaviour, including the product at one point insisting that the system clock was inaccurate and needed changing. The time was changed by a few hours, and initially we assumed this was to sync the systems with the company’s Japanese base – it was not until much later that we noticed the date had been set, quite bizarrely, to 1928.

Later installs ran through a fairly simple process, but once installed, online updates ran for some time, averaging close to two hours. This is doubtless thanks to the many engines contained within the product, each of which needs its own set of data, and perhaps in part due to the distance between our test lab and the company’s home market region.

The interface is wordy, but pleasantly laid out and simple to operate, with a good level of controls in most areas. It seemed to run solidly and reliably through the bulk of our testing, with some decent scanning speeds helped by smart caching in the warm runs. On-access and performance measures proved more difficult to gather, however, as the standard set of scripts we use (most of which are fairly rudimentary batch files) threw up clusters of errors and failed to produce data for chunks of the tests. No explanation could be found for this, as in most cases identical jobs which should have run multiple times were blocked on some occasions, but not on others. Re-running the tests on several fresh installs, on different hardware with fresh copies of both scripts and sample sets, produced similar, if not identical results. Thus, both on-access overheads and performance scores were estimated based on what data was available, and may be less accurate than we would like.

Detection tests (RAPs excluded) were much less fraught with difficulty though, and some splendid scores were seen in the Response sets, tipping only slightly downward in the later days. The WildList sets caused no problems and, with no mistakes in the clean sets either, Coranti’s mainline product earns itself a VB100 award, despite having given us a few minor headaches.

The product’s history shows three passes and one fail from four entries in the last six tests; five passes and three fails in the last two years. Stability in the product itself seemed good, but the issues with our testing scripts – which were clearly caused by something the product was doing, with no information given as to why the system was not working as expected – must be counted as a bug, giving it only a ‘Fair’ rating for stability.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Coranti Cora Antivirus

Product version: 2.003.00013

Update versions: 21105, 21132, 21240

Coranti’s second offering, Cora, comes from its Ukrainian office, and some of the splash screens and other displays contain Cyrillic characters alongside Roman ones. Other than that, it seemed pretty similar in most regards. The installer was even smaller this time, measuring just 26MB, and the set-up process was once again unable to operate properly without running an online update (again we noted an issue with resetting the date, this time picking the year 2057). The update took an age to complete, downloading around 250MB of data, but taking an average of over two hours to do so.

The interface is again wordy but usable, and scanning speeds were decent, becoming super-fast. A difference between the sibling products emerged in the on-access and performance tests though – on this occasion they all ran through unimpeded, producing a full set of results without complaint. Some pretty light overheads were seen, and reasonable resource consumption, with a good rate getting through our set of tasks. In fact, results fairly closely mirrored the figures we had pulled together for the 2012 product, so perhaps filtering out the chunks of failed results did not cause too much inaccuracy for the 2012 version after all.

Detection rates were pretty similar: very solid throughout the Response sets with only a slight downturn in the later days, and the core sets were dealt with well, earning Cora another VB100 award. That makes it three passes from three attempts in the last six tests, the product’s first appearance coming last autumn. With no issues to report other than the extreme download time and the oddities when installing offline, the product is rated ‘Stable’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Defenx Security Suite 2012

Product version: 2012 (3734.575.1669)

Update versions: 15/02/2012, 18/03/2012, 21/03/2012, 27/03/2012

Number five of 17, Defenx is an evolution of Agnitum’s Outpost suite, providing Agnitum’s firewall alongside VirusBuster’s malware engine, with a few additions of its own. The user experience closely mirrors that of Outpost, with the set-up process from the 114MB installer (provided pre-updated) running through a fair number of stages and taking quite some time. Updates ran smoothly and reliably, but again were fairly slow, averaging 20 minutes for the initial runs.

The interface is clear and welcoming without being overly fluffy, and is clearly laid out with a good basic set of controls that are easy to find and adjust. Operation was stable and reliable, with no noticeable problems, and tests proceeded nicely. Scanning speeds were reasonable, aided by speed-ups in the warm runs, and on-access overheads likewise sped up nicely from a decent starting point. RAM use was a little high, and as with Agnitum’s product, CPU use and impact on our set of tasks were both very high.

Detection rates were no more than reasonable in the Response sets, and tailed off quite steeply in the RAPs, but the certification sets were properly dealt with and Defenx earns a VB100 award. The product’s history is solid, with five passes from five entries in the last six tests; ten from ten in the last two years. In terms of stability, the product is rated ‘Solid’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Digital Defender Antivirus Premium

Product version: 3.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

Another from the Preventon/VirusBuster stable, Digital Defender is one of the older names on the list, and in this test it appears in both Premium and Pro versions. Differences between the two are minimal though: they use the same installer, and the additional components provided in the Premium edition are activated by the application of a licence key. The set-up, from the 89MB install package, ran through quickly and smoothly, and updates seemed reliable, taking around seven minutes on average.

As with previous products using the newer variant of Preventon’s GUI, some oddities were noticed when opening the interface for the first time, and also when trying to reboot the system. In this case, with several extra defensive layers enabled, we noticed some further issues, with several initial scan attempts freezing up completely. Carefully picking through the changes revealed that the ‘Safety Guard’ component was at fault – its main purpose seems to be to check detections in the cloud to minimize false positives, but something was clearly not right with it. Once this component was disabled, things seemed to run fine – apart from the scan engine being snarled up nastily by a single file in the first round of Response sets, leaving it incapable of scanning any further files until the system was rebooted.

We also noted, when running on-demand speed tests, that scans of the C: partition frequently froze up even without the ‘Safety Guard’ enabled. Attempts to scan other areas occasionally resulted in the dead C: scan reviving, snagging up as before in the same spot. To complicate matters further, the disabled component also frequently reactivated after reboot.

We eventually managed to complete all our tests, the results showing scanning speeds around average and overheads perhaps a little on the high side. Resource use and impact on our set of tasks were similarly standard. Detection rates were mediocre in the RAP sets, with a sharp decline in the proactive week. They were a little better in the Response sets, but again declined slightly towards the more recent sets. The certification sets were handled properly though, and a VB100 award is granted.

Digital Defender’s recent test history is decent, with five passes from five entries in the last six tests (previous ones all being for the Pro product only). Longer term, things are less impressive, with six passes and four fails in the last two years. This month’s performance showed a number of stability issues, some of them fairly worrying, putting the product on the outer edge of ‘Buggy’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

Digital Defender Antivirus Pro

Product version: 3.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

This is the same product with a different licence key and some features disabled, so the set-up process was unsurprisingly much the same – updates again took seven minutes or so on average. The main feature expected to make a difference between the two in static detection tests was disabled in the Premium version for practical purposes, and is absent from this version as it falls outside the Pro protection level; everything else was as similar as might be expected, with slightly fewer bugs thanks to the problem feature being switched off from the start.

The product showed mid-range speeds, overheads and resource drains, and mid-range detection rates. The VB100 requirements were met once more, but like its sibling, the product earned a stability rating in the ‘Buggy’ range.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

eEye Digital Security Blink Professional

Product version: 5.0.1, Rule version 1630

Update versions: 1.1.2000, 1.1.2043, 1.1.2054, 1.1.2058

Blink is a bit out of the ordinary for our tests, with its creator, eEye, specializing in vulnerability monitoring and management. Indeed, the product is a little non-standard for the vendor too, and is listed on the company’s website under ‘Additional products’. It has become fairly familiar to us over several years of testing, though. The latest version came as a 220MB install package with updates included. The set-up process was reasonably simple, requiring online access to check licensing details – for which a lot of personal information is requested, including a postal address. As we have noted before with this product, updates were rather lengthy – routinely taking more than two hours, and fetching close to 200MB of data on each install.

The interface is clean and follows a fairly standard template, but is occasionally a little confusing to navigate, providing a limited set of controls. It seemed generally fairly stable and responsive though. Scanning speeds were slow in some areas, but reasonable in others, while overheads proved pretty heavy across the board. Our performance measures showed some good results though, with memory use below average, CPU use just a little higher, and our set of tasks running through in good time.

Detection was pretty good too, with decent scores in the Response sets – a little lower in the second half than the first – and a very impressive starting week in the RAP sets, curving fairly sharply downwards in the latter two weeks. The WildList and clean sets presented no difficulties, and a VB100 award is earned by eEye. The vendor has been doing fairly well of late, with four passes and a single fail in the last six tests; six passes and two fails in the last two years. Stability was good throughout, earning the product a ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Emsisoft Anti-Malware

Product version: 6.0.0.57

Update versions: 5,288,487; 5,506,826; 5,565,809; 5,561,228

The product formerly known as ‘A-Squared’ returns once more, hoping to end its run of bad luck in our tests. The install package came fully updated, at 115MB, and set-up was very fast, completing in under a minute. Later online updates took an average of 14 minutes.

The interface is a little quirky, leaving one searching for ‘back’ or ‘home’ buttons quite regularly, and some of the language is a little unusual too, but in general it makes a reasonable degree of sense, providing a limited set of controls. There were a few issues with larger scans once again, with jobs freezing near the end and on one occasion simply disappearing without trace, but these only occurred under heavy stress.

Scanning speeds were not bad, and overheads a little heavy in some runs, but better in others. RAM use barely registered, with CPU use also fairly low, but our set of tasks did take a little while longer than usual to complete. Detection scores were excellent as usual, with good scores throughout the Response sets and RAP scores which started very strongly and didn’t fade away too sharply. However, a handful of items in the Extended WildList set were not picked up, and in the clean set a single item – a component of some reporting software from Microsoft – was labelled as a Lolbot trojan. As a result, Emsisoft once again misses out on certification by a whisker.

Recent tests haven’t been kind to Emsisoft, with five fails from five attempts in the last six tests; in the last two years, the vendor has managed two passes, alongside seven fails. Stability seemed reasonable, and fine for use under everyday circumstances, earning the product a ‘Stable’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 99.67%

ItW Extd (o/a): 99.67%

False positives: 1

eScan Internet Security Suite

Product version: 11.0.1139.1146

Update versions: NA

India’s eScan is one of our most regular participants, rarely missing a test, and it appears here once again with its latest suite version. The install package was a large 186MB, with the installation process running along standard lines – although it did point out, after running an initial ‘quick scan’, that some erroneous registry entries had been spotted and fixed. The whole process took around three minutes, with initial updates taking an extra 15 minutes on average.

No problems were spotted during testing, and speed measures were reasonable, speeding up a little in the warm runs. Overheads were pretty light, with resource use on the lighter side of average, and there was a fairly low impact on our set of tasks. Detection rates, aided by the Bitdefender engine included in the product, were excellent, showing only the slightest downward trend in the Response sets and not too steep a decline in the RAPs. With the core sets also dealt with easily, eScan earns another VB100 award.

The company has a strong record, with a perfect six in the last six tests; ten passes and two fails in the last two years. Stability this month was sound, earning the product a comfortable ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

ESET NOD32 Antivirus 5

Product version: 5.0.95.5

Update versions: 6886, 6886, 6986, 6886

Another product almost guaranteed to appear in every comparative, and equally likely to perform well, ESET’s NOD32 was provided this month as a slimline 55MB installer, including updates. It set up in little over a minute, the process enlivened as always by the company’s unusual approach to the thorny ‘Potentially Unwanted’ issue – forcing users to make a clear decision as to whether or not to point out such items. Online updates were impressively speedy, taking less than five minutes, but activation proved a little fiddly on occasion, with one attempt to start a trial simply disappearing part-way through, without an error message or any other explanation.

The interface is attractive and stylish, providing a broad range of controls in a generally usable style, although in places it does seem a little repetitive. As usual, we were unable to fathom the settings to open archives on access, and despite our best efforts they went unexamined.

During testing, we also observed a brace of blue-screens – the only ones observed during the whole testing period. These both occurred at seemingly innocuous moments: the first when attempting to paste a screenshot of the product interface into Paint for our records. The second, on a different install, happened after running the initial on-access check of our archive set – 100 or so archives containing the EICAR test file – with only the unarchived samples detected. Having run the test, we tried to open the opener tool’s csv log file in Notepad, and there we were again, rebooting unexpectedly.

Scanning speeds were very impressive though – almost getting back to their old form of five years ago – and overheads were featherlight too, barely registering in some areas. RAM and CPU use were fairly average, but our set of tasks blasted through, taking not much longer than our baseline measures.

Detection rates were oddly disappointing in the Response sets, but pretty decent in the RAPs, tailing off not too steeply. The core sets were handled splendidly, with a cluster of adware and toolbar warnings in the clean sets, all of which appeared to be justified. ESET thus earns a VB100 award to add to its ongoing epic record: 12 out of 12 in the last two years and stretching back way further than that.

On putting this report together and noting down the version information for each run from screenshots taken at the time, it became clear that some of the updates had not completed properly – although the interface reported a successful update (even recording the time of the last update run as just moments earlier), the actual data in use remained old. This happened on two of the three main test runs, and explains the unexpectedly low scores in the Response sets. On top of the two blue screens, this tips the product’s stability rating into ‘Buggy’ territory.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Filseclab Twister AntiVirus v7

Product version: V7 r3

Update versions: 14.251.43669, 15.38.49880, 15.44.15917, 15.50.54570

Filseclab is the perennial battler, bravely entering our tests month after month with little reward, but edging ever closer to certification standard. The current version came as a 130MB main installer with 112MB of updates, in an executable bundle openly available on the company’s website. Main set-up stages seemed minimal and ran smoothly, but later update attempts were less reliable, with several attempts failing and needing to be re-run. The actual download time for successful updates was only five to six minutes, but some extra time must be allowed for the fact that only around one in three attempts actually succeeded.

The interface is little changed since we first saw it – slightly out of the ordinary, but fairly simple to figure out and navigate, with a decent if not very thorough set of controls. These were not necessarily reliable: options were provided to extend the depth of archive scanning on access, but they appeared to have no effect. We were also unable to make the product scan non-standard file extensions on access, thus limiting our set of speed measures in this mode.

Scanning speeds were a little slow on demand, and overheads fairly high on access, in the areas in which they could be properly measured. Performance measures showed nothing too extraordinary though, with resource use unexceptional, and impact on our set of activities fairly reasonable. Detection rates were pretty decent, with good levels in the Response sets and a good start to the RAPs too, tailing off fairly sharply in the latter weeks. Results in the certification sets proved disappointing once again however, with a number of misses in the WildList sets and quite a few false alarms in the clean sets, including items from major players such as IBM (one of whose packages apparently contained the EICAR test file), SAP and Sun, alongside popular tools such as WinZip and VLC.

No VB100 award can thus be granted to Filseclab, which now has two fails in the last six tests; five fails in the last two years. Stability was generally good though, with issues in the updating process the only ones noted, thus earning the product a ‘Stable’ rating.

ItW Std: 97.48%

ItW Std (o/a): 97.48%

ItW Extd: 95.17%

ItW Extd (o/a): 93.32%

False positives: 37

Fortinet FortiClient

Product version: 4.1.3.145

Update versions: 4.3.392/15.215, 4.3.392/15.320, 4.3.392/15.344, 4.3.392/15.357

Fortinet routinely competes for the title of smallest main install package, and must be well up there this month with a tiny 10MB offering. Offline updates were fairly hefty though, at 135MB, and after a very rapid, simple install (which took less than a minute with no need to reboot), online updates were fairly lengthy. In most cases the initial download apparently took only seven or eight minutes, but after this, an additional period of up to half an hour was needed to ‘process’ the update.

The product interface is efficient and businesslike, providing an excellent set of controls in a lucid and logical manner, and testing was a pleasant process, free from shocks or surprises. Scanning speeds were decent, pretty good over some types of files, while overheads were a little high. Resource use was on the high side – particularly CPU consumption – and our set of activities was impacted fairly heavily. Detection rates were impressive though, with some good scores in the Response sets and a good showing in the RAP sets too, dropping away only a little into the latter weeks.

The core test sets presented no problems, and VB100 certification is comfortably earned, leaving Fortinet with five passes in the last six tests (the annual Linux test being skipped); nine passes and a single fail in the last two years. Stability was excellent, comfortably earning a ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Frisk F-Prot Antivirus for Windows

Product version: 6.0.9.6

Update versions: 4.6.5

Frisk’s F-PROT seems not to have changed at all in many years, and the current build promised few surprises. The 36MB installer ran through very speedily, and the 29MB update bundle was dropped manually into place. A reboot is demanded to complete the set-up process. Online updates seemed reliable and simple, averaging just under six minutes, and the interface is basic and minimalist, with only a handful of options and nothing to confuse even the most inexpert user.

Most tests ran without issues, showing some reasonable scanning speeds but fairly hefty on-access lag times, with RAM use on the low side, CPU use around average, and our set of tasks running through surprisingly quickly. Detection rates were unspectacular, but respectable in most areas.

Running large scans was, as ever, fraught with minor issues, with most jobs crashing out at least once, but restarting from where things had left off was simple enough and tests did not take too long to get finished. The core certification sets were handled well, and with no false alarms or important misses F-PROT earns a VB100 award. It now has five passes and one fail in the last six tests; nine passes and three fails in the last two years. The only problems noted were large scans of infected sets crashing out, with no issues in more usual ‘everyday’ use, thus a ‘Stable’ rating seems appropriate.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 99.74%

ItW Extd (o/a): 98.54%

False positives: 0

F-Secure Client Security

Product version: 9.20 build 274

Update versions: NA

F-Secure often submits two fairly similar products for our tests, resulting in a rather complex history on our website, but this month only one was entered. It came as a 64MB installer with a 145MB updater executable. The set-up was fairly fast and simple, needing a reboot to complete. Online updates were mostly reliable, but took up to 20 minutes for initial runs to complete. On occasion it seemed to be stuck in a loop, with progress bars hitting 100% multiple times and the process apparently restarting immediately.

Most tests ran smoothly, with speeds impressive to start with and powering through in the warm runs, while lag times were mostly decent, resource use low and some impact noted on our set of tasks. The first two parts of the test programme ran smoothly, although one large scan did impose a heavy weight on the system, using up 555MB of RAM and 86% of CPU time, but it completed without issues and the machine remained reasonably responsive throughout.

On the third run, however, everything went completely haywire. When running through the WildList sets on access, protection seemed to switch on and off at random, and even with several runs, no consensus could be reached on which files should be blocked and which ignored. On-demand work was similarly troublesome, with scans freezing, vanishing without trace, or claiming completion but producing no log data and reporting far fewer items scanned than were actually present. Multiple reinstalls, on almost all of our test systems over a period of more than two weeks, repeatedly brought similar experiences, and eventually we had no choice but to abandon the entire job.
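
To give a flavour of the kind of cross-run comparison this forced on us, the short Python sketch below (not part of the lab’s actual tooling – the log file names and one-path-per-line format are purely hypothetical) takes the list of samples blocked in each on-access run and reports those on which the runs disagree.

```python
# A minimal sketch, not the lab's tooling: compare which samples were blocked
# on access across repeated runs and report those with no consensus.
# The run log names and one-path-per-line format are hypothetical.
from pathlib import Path

def blocked_samples(log_path):
    """Return the set of file paths recorded as blocked in one run log."""
    return {line.strip() for line in Path(log_path).read_text().splitlines() if line.strip()}

runs = [blocked_samples(name) for name in ("run1.log", "run2.log", "run3.log")]

seen_in_any = set().union(*runs)
seen_in_all = set.intersection(*runs)
inconsistent = sorted(seen_in_any - seen_in_all)  # blocked in some runs, missed in others

print(f"{len(inconsistent)} of {len(seen_in_any)} samples blocked inconsistently")
for sample in inconsistent:
    hits = sum(sample in run for run in runs)
    print(f"  {sample}: blocked in {hits} of {len(runs)} runs")
```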

The detection results reported thus cover only the first two of the usual three runs; there at least we did see some solid work, with high rates in the Response sets declining very gently through the days and the RAP sets looking excellent, although dropping off considerably in the proactive week. In the WildList sets, even ignoring the disastrous final run, a few items appeared to be missed in both sets, and while no false alarms were noted in the first two-thirds of the clean set, no results could be obtained for the final portion, which contained the bulk of the most recent additions. No VB100 award can be granted this month, but we have been informed that F-Secure’s developers have investigated and fixed the issues we reported. Until this month, the vendor had managed to record a pass with one or other of its products in every test (except those on Linux platforms) over the last two years. Although most problems occurred in high-stress work involving multiple malware samples, this month’s showing could only be rated as ‘Flaky’.

ItW Std: 100.00%

ItW Std (o/a): 98.95%

ItW Extd: 100.00%

ItW Extd (o/a): 97.25%

False positives: ??

G Data AntiVirus 2013

Product version: 23.0.0.19

Update versions: NA

G Data’s product routinely vies for the title of biggest installer, and things looked promising this month with a jumbo 353MB install package submitted. Set-up is uncomplicated though, taking little more than a minute to run through the standard steps, requesting a reboot at the end. Online updates averaged 20 minutes.

Scanning speeds started off fairly decent, and most jobs sped up to under a second in the warm runs, while lag times on access were a little on the high side but showed some signs of improvement in warm measures too. Resource use was a little above average but far from excessive, and our set of tasks completed in good time. Detection rates, as ever, were remarkable, with very little missed in the Response sets or the reactive weeks of the RAP sets – even the proactive week handled pretty impressively. The core sets were dealt with admirably too, and a VB100 award is easily earned by G Data, whose history shows a stable pattern of four passes and one fail in the last six tests; eight passes and two fails in the last two years, with only the Linux tests not entered. No problems were noted during testing, and a ‘Solid’ rating is duly earned.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

GFI VIPRE Antivirus 2012

Product version: 5.0.5134

Update versions: 11549, 11671, 11693, 11717

GFI’s VIPRE has a history of annoying the lab team with odd quirks and stability problems, but has been making some great strides of late, just in time to be rated under our new system. The current version has a tiny 11MB installer, and we also fetched 80MB of updates for use offline in the RAP tests. Set-up was very quick and easy, accompanied by an information slide show, and updates seemed effective too, initial runs averaging around eight minutes.

The interface is a little different from most, for some reason ignoring standard approaches and going its own way. For the most part only minimal configuration is possible, and some of it is less than clear at first glance. Operation seemed fairly stable though, with none of the issues under high stress noted in past comparatives, although our cautious approach – instinctively running each job in small chunks rather than single large runs – may have helped with this.

Scanning speeds were a little slow in most areas, but overheads were fairly acceptable, and RAM use was low. CPU use was around average, and impact on our set of tasks was pretty low. Detection rates proved excellent, rivalling the very best in the bulk of the Response sets and only dropping off a little in the most recent day, with a similar pattern in the RAP sets: excellent in the older weeks and still decent in the latter ones. The core sets presented no issues, and a VB100 award is comfortably earned.

GFI’s history shows four passes and one fail in the last six tests; seven passes and one fail in the last two years. With no stability issues to report (somewhat to our surprise), a ‘Solid’ stability rating is merited.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Ikarus virus.utilities

Product version: 2.0.125

Update versions: 2.0.127

Ikarus is another victim of a recent run of bad luck. Its current product was, as usual, provided as an ISO image of a full install CD, measuring 209MB but including much else besides the basic product installer (such as a redistributable version of Microsoft’s .NET framework which needed to be installed prior to the main product being set up). This was all done automatically, but added several minutes to the total runtime, which was around five minutes in the end. Updates were performed offline from a 72MB bundle for the RAP sets, and online for the other parts of the test, initial runs averaging 18 minutes.

The interface leans heavily on .NET, and is thus rather ugly and clunky, but these days at least it is generally responsive and usable. Options are pretty limited, but those that are provided seemed usable and reasonably easy to find.

Scanning speeds were a little slow, and on-access overheads very heavy indeed, with RAM use a little above average and CPU use pretty high; our set of tasks didn’t take too long to complete though.

Detection rates were pretty good – very high indeed in the Response sets and similarly stellar in the first half of the RAP sets, dropping a little into the proactive week, as we would expect. The WildList sets were well covered, and in the clean sets for once no false alarms emerged, earning Ikarus a VB100 award after a lengthy spell of failures. The vendor now stands on one pass and three fails in the last six tests; three passes and four fails in the last two years. We noticed a little wobbliness in the interface when running large scans, but no issues in everyday use, thus rating it ‘Stable’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Iolo System Shield

Product version: 4.2.4

Update versions: NA

Iolo continues to submit to our tests, but despite over a year of pleading, its developers have so far been unable to provide us with any details on how to decrypt its awkward log format – which can be displayed well in the product interface, but is all but impossible to handle offline. The product itself came as a 450KB download tool, which as usual was set up and updated online on the deadline day, no facility being in place for offline updating. Fetching the initial installer took around six minutes, and from there on standard steps were run through fairly speedily, taking another two minutes or so before requesting a reboot to complete. Online updates were then needed, taking six minutes on average.

The product interface is slick and professional, and provides a reasonable level of controls in a fairly usable manner, although one common item – the option to run scans from the Explorer context menu – was notably absent. It seemed fairly stable, with no issues observed running through the tests. Scanning speeds were reasonable, but overheads were very heavy on access, and despite low RAM use, CPU consumption was very high and there was a significant impact on our set of standard activities.

Detection rates were fairly mediocre across the board, as far as we could ascertain from the gnarly logs. On access, our own logging system recorded a 100% block rate through the WildList sets, and this was confirmed from our reading of log data, but either on-demand logging was less easily deciphered or there were a fair number of misses. Thus, despite no apparent false alarms in the clean sets, Iolo cannot be granted a VB100 award. For mostly the same reasons, the vendor’s test history now shows one pass and three fails in the last six tests; two passes and four fails in the last two years. Stability was ‘Solid’ though.

ItW Std: 99.37%

ItW Std (o/a): 100.00%

ItW Extd: 98.61%

ItW Extd (o/a): 100.00%

False positives: 0

K7 Total Security

Product version: 11.1.0072

Update versions: 9.130.6177, 9.134.6437, 9.135.6532

Enjoying something of a golden spell of late, despite rather sporadic entries, K7 submitted its latest version as an 81MB installer with 85MB of updates. It zipped through the set-up process in very good time, the whole thing completing with just a couple of clicks in a little over 30 seconds, with no need for a reboot. Updates were similarly speedy, averaging only two minutes for the initial runs.

The interface is bright to the point of gaudiness, and a little on the wordy side, but it is reasonably easy to find one’s way around and does provide a thorough level of controls. It behaved well throughout testing, remaining responsive even under heavy pressure. Scanning speeds were not the fastest, but were decent nevertheless, and overheads were very light indeed, at least in the warm runs. RAM use was around average, but CPU use barely noticeable, and our set of tasks ran through in good time.

Detection results were decent – not challenging the leaders this month, but more than respectable, and the core certification sets were handled well, earning K7 another VB100 award. That puts the vendor on two passes from two attempts in the last six tests; five from five in the last two years. The product showed no stability issues, earning a ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Kaspersky Endpoint Security 8 for Windows

Product version: 8.1.0.646

Update versions: NA

Kaspersky once again submitted both business and consumer products for testing, with the corporate version up first. It came as a fairly hefty 248MB installer, with offline updates fetched using a special tool which builds a local mirror of update server content. The set-up process ran through quite a few steps, including building a list of trusted applications on the local system, but took no more than two minutes. Initial online updates averaged around 12 minutes.

The interface is glossy and glitzy, with several unusual touches regarding how it operates, but with a little practice and exploration it soon becomes highly usable, and provides an impeccable range of fine-tuning options. It ran through the tests well, with only one issue noted: in the on-demand speed tests, most sets were dealt with fairly slowly at first, speeding up massively for the warm runs, but in the set of miscellaneous files something seemed to snag somewhere, and each attempt to run the job was aborted after the maximum permitted time of 30 minutes (most others this month took no more than two minutes to complete this job).

Other tests were problem-free though, and on-access lag times were low, with average RAM use, CPU use a little high and a fairly big impact on our set of activities. Detection rates were splendid, extremely thorough just about everywhere, with even the proactive week of the RAP sets showing a very respectable score. The WildList sets were dealt with flawlessly, and in the clean sets we only saw a few, entirely accurate alerts on potential hacker tools. A VB100 award is thus earned without trouble.

The test history for Kaspersky’s business line shows a rather rocky road of late, with three passes, two fails and a rather historic no-entry in the last six tests; eight passes and three fails in the last two years. The only issue was the freezing speed scan, and as this occurred over normal, clean files it is judged more significant than similar problems under high stress; still, a ‘Stable’ rating is granted.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Kaspersky Internet Security 2012

Product version: 12.0.0.374

Update versions: NA

The second product from Kaspersky this month is pretty similar in most respects, using the same mirror of updates for the RAP tests, but the installer was noticeably smaller at just 78MB. After 30 seconds or so preparing to run, the actual process only required a couple of clicks (an option to display more advanced set-up controls was provided), and the whole thing took not much more than a minute, with no need to reboot. Updates took around 15 minutes on average for the initial run.

The interface looks much like the business product, with a little more colour. Once again, some of the buttons and controls are a little funky, giving us a few surprises and a little confusion at first, but after we’d settled in it all proved usable and fairly intuitive, with a splendid range of controls available.

Scanning speeds started fairly slow, with less sign of improvement in the warm runs, but the on-access lags, which were not too heavy from the off, did show considerable speed-ups. Resource use closely mirrored the business product, with low RAM use, fairly high CPU use and a fairly heavy impact on our set of tasks. Detection rates were again superb just about everywhere, with an excellent showing in the RAP sets. The core sets proved no problem, with just a few alerts on suspect items in the clean sets. A second VB100 award is thus earned by Kaspersky this month.

The consumer product line’s test history shows four passes and one fail from five entries in the last six tests; nine passes and two fails in the last two years. With one of the speed sets again causing repeated freezes during on-demand scans, the stability rating is no higher than ‘Stable’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Logic Ocean Gprotect

Product version: 1.1.81

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

Another from the Preventon/VirusBuster gang, Logic Ocean has a few entries under its belt already. The submission weighed in at just under 85MB – smaller than most of the others and hinting, encouragingly, that it was still using the older, simpler, and this month much more stable version of the interface. The smooth and speedy set-up was completed in good time, and updates took around nine minutes on average for the first runs. The GUI was indeed pleasingly familiar with no additional bells and whistles, providing a good basic set of controls in a clear and usable format, and it maintained reasonable stability throughout.

Scanning speeds were not bad, with slightly high overheads and fairly high use of resources and impact on our set of tasks. One oddity we noticed (on top of the usual disregard for action settings when running scans from the context menu) was a tendency to kick off unrequested scans of our set of archives (the first job done during our speed tests) each time a setting was adjusted.

Detection rates were rather dreary, as expected, but the core sets were handled well and a VB100 award is earned. Logic Ocean now has two passes from two entries in the last six tests; three from three entries in the last two years. With just a few minor bugs spotted this month, the product earns a ‘Stable’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

McAfee VirusScan Enterprise + AntiSpyware Enterprise 8.8

Product version: 5400.1158

Update versions: 6620.0000, 6649.0000, 6656.0000, 6662.0000

McAfee’s enterprise offering is one of a small band to have seen very few drastic changes in the last five years, and with a solid record of performance and reliability, it is always a welcome sight on the test bench. The installer is on the small side at 38MB, with 119MB of updates provided for offline use. The installation process is clear and straightforward, completing in little more than a minute. Updates ran smoothly, taking an average of eight minutes for the initial runs. The familiar interface is unflashy and plain, but provides a comprehensive set of controls which are simple to find and operate.

Speed tests were pretty quick to start off and faster still in the warm runs, with lag times starting off fairly high but also benefiting from some handy speed-ups. Resource use was low – particularly CPU use – and our set of tasks zipped through in splendid time. Detection rates were decent in the Response sets, with a slightly downward lean in the last few days, and a little lower in the RAP sets, again dropping away in the later parts – the company’s cloud system clearly paying dividends in the online Response tests. The WildList sets were dealt with without problems, and with no issues in the clean sets either, McAfee comfortably earns a VB100 award. From three entries in the last six tests, the vendor has three passes, but longer term things are a little more shaky, with a rough spell last year leaving it on five passes and two fails in the last two years. Stability was rated an excellent ‘Solid’ throughout.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Microsoft Security Essentials

Product version: 2.1.1116.0/3.0.8402.0/1.1.8001.0

Update versions: 1.119.1976.0, 1.121.1834.0, 1.123.62.0, 1.123.554.0

Microsoft’s home-user solution tends to alternate in our tests with the vendor’s corporate offering, appearing only in desktop tests such as this month’s. The submission came as a typically compact 8MB installer with 61MB of updates. The set-up process was a little slow, with a number of false starts where the installer looked like it was about to start its business, then came back with just one more question. In total, it took around six minutes to complete, with initial online updates running nice and quickly, adding only another four minutes.

The interface looks crisp and clean but provides only minimal controls, many of them less than clear thanks to some rather over-complex wording. In general, though, it seemed fairly usable and responsive, with no problems standing up to the high stress of our tests. Speeds were not great, but overheads were fairly light, with low use of RAM, average CPU use and a very low hit on the runtime of our set of tasks. Detection rates were pretty solid across our Response sets, and good in the RAP sets too, dropping a little into the final weeks, but no more so than most this month.

The core certification sets presented no difficulties and a VB100 award is easily earned. Security Essentials now has three passes from three entries in the last six tests; five from five in the last two years. This month’s uneventful performance earns a ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Norman Security Suite

Product version: 9.00

Update versions: 6.07.13, 6.08.03

Norman’s Suite product has proven distinctly flaky in the past but has shown some improvements of late, and we hoped to see an eradication of all bugs in time for our new stability rating system to come into force. The installer submitted this month measured 169MB, including updates. There were several noticeable pauses in the set-up process where not much seemed to be happening. Only a few steps had to be clicked through, but with a reboot at the end (after which there was no sign of any product for at least 30 seconds), the whole process took over three minutes. The product initially claimed that 15 minutes would be needed for updates, but on average they took only seven minutes to complete.

The interface is not pretty and is a little awkward, its browser-based design leaving much to be desired, but at least for the most part it seemed to do as it was told. For on-demand jobs, a much simpler and more traditional interface was also available. Scanning speeds were sluggish but did speed up considerably on the warm runs, while overheads also improved a little but remained slightly heavy throughout. Resource use seemed low though, and our set of tasks was completed in very good time.

Detection rates showed a continuation of recent improvements, trailing off a little in the second half of the Response sets and dropping away quite sharply into the proactive part of the RAP sets, but the core sets were well handled and a VB100 award is earned. A spell of bad luck a few years ago is now firmly behind Norman, with recent test history showing five passes and one fail in the last six tests; nine passes and three fails in the last two years. With only a few hiccups noted, and only during high-stress tests (mainly where large scans slowed to a halt and had to be restarted), Norman earns a ‘Stable’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Optenet Security Suite PC

Product version: 1.0.4

Update versions: NA

Optenet is a fairly rare participant in our tests, with only two appearances so far. The product includes the Kaspersky engine, but has yet to show results that reach the rarefied heights we would expect to see from such a source. The installer measured 158MB including updates, and took only a few clicks and a minute or so to get set up, with a reboot needed to complete. Online updates were slow and hard to keep track of, with only an on-hover progress readout from the system tray icon to indicate that the job was still ongoing; there seemed to be no sign of progress anywhere in the main interface. As far as we could tell, the job averaged at least 25 minutes over the several installs performed.

The interface is again browser-based and suffers most of the problems associated with such an approach. The design is also a little awkward and uncomfortable, providing only limited controls and making what options are available tricky to find and fiddly to operate. Most large scan jobs suffered from GUI problems too, with the interface implying that scans had some way to go long after the logging indicated that they had finished.

Speed measures were not bad, but lag times were quite heavy, and in our performance measures there seemed to be some kind of snag in the set of activities – which took more than twice as long as the second slowest product this month. It seems that much of this extra time was spent doing nothing, as RAM measures were very low and CPU measures were actually much lower than the baselines – suggesting that the system was idle for long periods.

Detection rates were respectable, but nowhere near as high as one might expect, with reasonable scores in the Response and RAP sets, both showing a downward trend into the later parts. The core certification sets raised no problems though, with clear runs through the WildList and clean sets each time, earning Optenet another VB100 award.

Although this is the product’s first appearance in over six tests, it has three passes from three attempts in the last two years. With a few, mostly fairly minor issues noted, a ‘Fair’ stability rating is earned.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 97.55%

False positives: 0

PC Booster AV Booster

Product version: 1.1.81

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

Another Preventon/VirusBuster offering, again with the slightly smaller 85MB installer promising a slightly smoother run. Installation was, as expected, fairly smooth and simple, with updates taking seven minutes or so on average, and the older-style interface proving fairly clear and usable, with limited but sensible options that were easy to find and operate.

Scanning speeds were decent on demand, overheads a little high on access, with reasonable use of resources and reasonable impact on our set of tasks.

Detection rates were mediocre, as anticipated, but the core sets were handled well and PC Booster earns a VB100 award. The vendor now has two passes from two entries in the last six tests; three from four with a single fail over the last two years. This month’s performance earns the product a ‘Stable’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

PC Tools Internet Security

Product version: 2012 (9.0.0.909)

Update versions: 367717/6.19260, 364231/6.19490, 364774/6.19520, 365118/6.19560

Generally only appearing in our desktop tests, PC Tools makes up for missing out on half our comparatives by entering two products for the rest. As usual, the suite product is up first. The pre-updated package submitted for testing measured 237MB, and did its business with fairly few steps to click through and little time to wait – it was all done in under a minute. Online updates ensued, taking 10 minutes on average, and then a licence key was applied, following which on each run an additional update was required, taking a little less time and averaging an extra six minutes.

The interface has a glossy new look, but much of the layout is familiar. Controls are pretty basic – limited to on and off in most areas – and in parts it can be a little unclear as to how things are supposed to be operated, but we managed to complete most tests without difficulty. Scanning speeds were around average to start with, but blazing fast in the warm runs, and overheads were fairly light – again, improving on repeat runs. RAM use was a little below average, but CPU use slightly on the high side, as was impact on our set of standard activities.

Detection rates were harder to fathom, with logging clearly not designed for human use. A confusing system – detections are recorded in a main log file (which is sometimes deleted after being sent back to home base), but some are later marked as ‘Exonerated’ by cloud look-ups in a different, usually truncated log – added considerably to the work involved in processing the data, and in the end some of the entries were simply ignored. We did manage to conclude that both RAP and Response scores were pretty good – with ‘Suspicious’ alerts included, they would be near-perfect, even in the proactive part of the RAP sets – and that nothing was rated above ‘Suspicious’ in the clean sets without later being marked ‘Exonerated’. So, a VB100 award is granted, but we have had to take the unusual step of leaving the ‘suspicious’ column marked with a question mark in our final tables.
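
For illustration, the Python sketch below shows the general shape of the reconciliation involved – cross-referencing detections from the main log against items later marked as exonerated. The file names and field layouts are invented; the real PC Tools logs are structured quite differently.

```python
# Rough illustration only - the real PC Tools log formats differ and are far
# less tidy; the file names and field layout here are invented.
import csv

def load_detections(path):
    """Assume a CSV main log with (file_path, detection_name) per row."""
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        return {row[0]: row[1] for row in csv.reader(f) if len(row) >= 2}

def load_exonerated(path):
    """Assume the secondary log lists one 'Exonerated' file path per line."""
    with open(path, encoding="utf-8", errors="replace") as f:
        return {line.strip() for line in f if line.strip()}

detections = load_detections("main_scan_log.csv")
exonerated = load_exonerated("cloud_exonerated.log")

# Only detections never subsequently marked 'Exonerated' count towards the score.
final = {p: name for p, name in detections.items() if p not in exonerated}
print(f"{len(final)} detections stand; {len(detections) - len(final)} were exonerated")
```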

PC Tools’ suite offering has a decent history in our tests, with a brief blip last summer; it currently stands on two passes and one fail from three entries in the last six tests; five from six entries in the last two years. No bugs were noted, earning the product a ‘Solid’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.74%

False positives: 0

PC Tools Spyware Doctor with AntiVirus

Product version: 9.0.0.909

Update versions: 367717/6.19260, 364231/6.19490, 364774/6.19520, 365118/6.19560

The Spyware Doctor variant of the PC Tools range is pretty similar to the suite, although lacking the firewall and providing a few slight differences in the control systems. Its set-up package was thus a little smaller, at 217MB, but the experience was very similar, offering to install Google’s Chrome browser as part of the install process, but otherwise running along the same lines as its sibling product. Again, in most cases a double update was required, making the full average time to install and update around 15 minutes. The interface is similarly glossy and colourful, with the same scarcity of controls and occasional moments of bafflement. Once again, the logging system is fiddly in the extreme. Scanning speeds were a little faster here and, bizarrely, on-access lags actually increased in some areas in the warm runs – but resource use was similar, as was impact on our set of tasks.

Detection rates were almost identical, again with large numbers of ‘Suspicious’ alerts discounted, but many others marked with a heuristic ‘ZeroDayThreat’ tag left in. Working to this rule, scores were decent (and would have been close to 100% including the suspicious alerts, but ignoring the cloud exclusions), and there were no issues in the core sets, thus earning PC Tools a VB100 award for its second product. The product’s history is identical to its sibling’s: two passes and one fail in the last six tests; five from six in the last two years. Without any problems noted during testing, this product also rates as ‘Solid’ for stability.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.74%

False positives: 0

Preventon Antivirus Premium

Product version: 5.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

The mother of what seems like the bulk of the entrants in this month’s test, Preventon’s own product came in both Premium and Pro versions, although some of the additional features of the Premium edition were disabled as they were too unstable. The installer measured 89MB, and set-up was speedy and simple, with seven minutes needed to update. As elsewhere, we noted some nasty issues with the ‘Safety Guard’ component and had to switch it off, and a file in one of the runs tripped the product up and left it inactive (despite it still claiming full functionality). We also observed settings we had changed reverting on reboot, and scans running over areas we had not requested and quickly locking up.

Scanning speeds were reasonable though, with medium overheads and average resource use, CPU use somewhat lower than the rest of the family, while our set of tasks ran through in decent time. With no issues in the core sets a VB100 award is earned – officially the first for this Premium edition. Stability was not good though, coming in at the buggier end of ‘Buggy’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

Preventon Antivirus Pro

Product version: 5.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

With not much else to say beyond what has already been discussed for the Premium version, this one had the same installer and interface with just a few areas disabled by default, and slightly fewer bugs. Speeds were almost identical – in the middle of the range – with slightly higher overheads, and CPU use also a little higher, but detection rates were identical in their mediocrity.

No problems emerged in the certification sets and a VB100 award is earned, adding to Preventon’s mainline history which now shows five passes from five attempts in the last six tests; six passes and two fails in the last two years. Although not quite as wobbly as the Premium version, the Pro version also earns a ‘Buggy’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

Qihoo 360 Antivirus

Product version: 3.0.0.2121

Update versions: 2012-02-13, 2012-03-19, 2012-03-26, 2012-03-29

Qihoo’s product is another that is based on a third-party engine – in this case Bitdefender – and which routinely performs less well than might be expected. The latest build came as a 132MB installer including updates. It ran, with only a single click required, via a screen combining a welcome message, the selection of an install destination, and access to a EULA which is not displayed unless requested. Later, online updates took around ten minutes to complete.

The interface is clear, bold and colourful, providing a basic set of controls which are mostly easy to fathom (although in places clarity suffers a little in translation). As we have noted repeatedly in past reports, the on-access component functions rather differently from the norm, with no sign of on-read protection, and on-write not actually blocking immediately (as it claims to do), but allowing files to be written to quite happily and later (often quite a bit later) logging that it has observed them. This somewhat throws off our speed measures; on-demand scans were extremely slow in some areas – notably archives and binaries – and very fast in the others, but on-access overheads appeared very light, thanks mainly to no real filtering actually occurring. This also impacted our resource measures, which show low figures, and even our activities – which should at least in part be affected by the on-write scanning – appear very fast.

We saw some other issues with the on-access component when dealing with infected samples, including apparently no detection at all for the EICAR test file, which is picked up without problems on demand. Some jobs using many malware samples also caused the on-access component to simply shut down after a few hundred detections, and several times the interface became so confused over whether things were active or not that we could no longer operate it – a complete wipe of the system and a fresh install were required.
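
A simple way to demonstrate the on-write behaviour described above is to write the EICAR test string to disk and see whether the file is blocked immediately or can still be read back moments later. The Python sketch below shows the general idea; it is not the lab’s harness, and exact timings will vary from product to product.

```python
# A simple sketch (not the lab's harness) to see whether on-write protection
# blocks a detection immediately or merely notes it some time later.
import time

# Standard EICAR test string, split in two so this script is not itself flagged.
EICAR = "X5O!P%@AP[4\\PZX54(P^)7CC)7}$" + "EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"

path = "eicar_test.com"
try:
    with open(path, "w") as f:
        f.write(EICAR)
except PermissionError:
    print("Write blocked immediately - on-write protection intervened")
else:
    time.sleep(2)  # give a lazily acting scanner a moment to react
    try:
        with open(path) as f:
            intact = f.read() == EICAR
        print("File still readable after writing" + (" (content intact)" if intact else ""))
    except (FileNotFoundError, PermissionError):
        print("File removed or locked shortly after writing")
```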

Detection rates were eventually cobbled together from several runs. They turned out to be quite good in the RAP sets, starting pretty high and not tailing off too much, but in the Response sets they were decidedly lower than expected, hinting that once again there were some issues with updates applying properly. In the certification sets, on-access scores were pretty decent – enough to have merited an award – but the on-demand scores were strangely much lower; as we have noted in the past, the product appears to be a little hasty at times and has some issues writing out its logs properly, which may be behind this oddity.

No VB100 award is granted this month, making for one pass and two fails in the last six tests; four passes and three fails in the last two years. Some wobbliness in the GUI was noted, with most problems only appearing under high stress, but some of them fairly severe; a ‘Buggy’ rating is the best we can offer.

ItW Std: 97.07%

ItW Std (o/a): 100.00%

ItW Extd: 99.17%

ItW Extd (o/a): 99.80%

False positives: 0

Quick Heal Total Security 2012

Product version: 13.00 (6.0.0.2)

Update versions: NA

Quick Heal’s latest edition was provided as a bulky 313MB install package, including all required updates for the RAP tests. The installation process begins with a somewhat lengthy pause, which had us wondering if we had indeed executed the file. It suddenly woke up though, did some preparatory work including a memory scan, asked a couple of basic questions, then got down to work, with the whole process completing in a minute and a half. Initial online updates were mostly speedy, averaging three minutes, but on one occasion after the first update was complete a second one was requested, taking a further 15 minutes.

The interface is glossy and shiny, with large icons covering the various components, and under the covers a fairly thorough range of controls can be found. In some cases it is a little less than simple to dig the controls out though, and in others two very similar items are found rather far apart – which can lead to the odd mistake in set-up. Things mostly moved along nicely though, with scanning speeds very fast and overheads pretty light. RAM use was high, but CPU use fairly low, and our set of activities ran through in slightly longer than average time. At the end of the performance tests, the system appeared to be completely unusable, and a reboot was needed to set things right, despite no work with malware having been performed.

Detection rates in both the Response and RAP sets were a little disappointing, and also rather uneven, and in the core sets a couple of items in the standard WildList set were not picked up on access, despite complete coverage on demand. Thus, despite there being no false alarms in the clean sets, Quick Heal does not make the grade for VB100 certification this month. The vendor’s history shows a run of good times coming to an end, with two fails and three passes in the last six tests; nine passes from 11 attempts in the last two years. With the rather odd issue noted after the performance tests, when the system should not have been under stress, the product earns no more than a ‘Fair’ stability rating.

ItW Std: 100.00%

ItW Std (o/a): 99.87%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Sophos Endpoint Security and Control 10

Product version: 10.0.1

Update versions: 3.28.1/4.74G, 3.29.0/4.75G

The Sophos product line continues to evolve without any drastic changes being made to the product’s overall design, and once rather empty, the home screen of the product interface is starting to fill up nicely. The latest version was provided as a 103MB installer with a svelte 7MB update bundle for offline use. It installed in good order for the most part, taking not much more than a minute, with only an additional four minutes required for the initial online update, on average. On one occasion the installer did complain about being unable to remove a competitor product, despite there being nothing installed on the machine other than the base operating system. Looking closer, we spotted that the system clock had been altered rather drastically by a previous install attempt, and fixing this got rid of the problem.

The GUI is clear and pleasant to use, with a wealth of fine-tuning controls, and navigation seems fairly straightforward and intuitive – although familiarity may be helping somewhat here. Speed measures were reasonable, overheads a little high in some areas but OK in others, with RAM use perhaps a shade above average, but CPU use low and our set of tasks zipping through very quickly.

Detection rates were a little below par in the RAP sets, but considerably better in the Response sets where the company’s cloud look-up system comes into play. The value of this system is somewhat challenged by the certification results however, where a single item in the standard WildList was missed, along with a few more from the Extended list. Looking closer, it was clear that the standard list item at least was detected fine without a web connection, and the firm confirmed that the issue was indeed caused by an erroneous whitelisting of the file in question in the cloud.

No false alarms were noted, only the usual adware alert on a game in the clean sets, but no VB100 award can be granted to Sophos this month. That makes for two fails in three attempts for the usually reliable performer, which now stands on four passes and two fails in the last six tests; ten passes in the last two years. A single issue was observed, but given the minuteness of the bug, the fact that it only appears to occur when the system clock is set several decades out of whack, and that it was dealt with gracefully, a little flexibility keeps the product in the ‘Solid’ range for stability.

ItW Std: 99.79%

ItW Std (o/a): 99.79%

ItW Extd: 99.74%

ItW Extd (o/a): 99.74%

False positives: 0

SPAMfighter VIRUSfighter Pro

Product version: 7.1.138

Update versions: NA

By some way the most interesting of the ever-expanding Preventon crowd, SPAMfighter’s product makes some effort to establish its own identity, beyond the usual tweaks to the colour scheme. The 91MB installer runs through some fairly standard steps, including demanding an email address, and completes in under a minute, with later updates also fairly speedy, averaging less than five minutes. The interface has some similarities to related products but a definite flavour of its own, with a good basic level of controls provided. In this case there is no option to put a stop to the overly verbose logging, but it can be disabled from the registry.
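
As a rough illustration of the sort of tweak involved – the actual key and value names used by VIRUSfighter are not reproduced here and will differ; those below are purely hypothetical – a logging flag of this kind might be cleared as follows:

```python
# Purely illustrative - the key and value names below are hypothetical and
# will not match the product; this only shows the general shape of clearing
# a verbose-logging flag in the Windows registry.
import winreg

KEY_PATH = r"SOFTWARE\ExampleVendor\ExampleAV"  # hypothetical branch
VALUE_NAME = "VerboseLogging"                    # hypothetical value

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, 0)  # 0 = disabled
print("Logging flag cleared; restart the product for the change to take effect")
```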

Speeds were much as expected: fairly reasonable in most areas with overheads a touch on the high side, while resource usage was not extravagant and our set of activities completed in good time. Detection rates closely mirrored the rest of the batch, all run on the same day, meaning that RAP scores were on the low side with a sharp downturn, and Response scores no more than respectable. The core sets were dealt with properly however, and a VB100 award is earned.

The product has done well lately, with five passes from five attempts in the last six tests; longer term things are less serene, with three fails and six passes from nine entries in the last two years. Stability was OK in the main, but all large scans caused the interface to suffer major freak-outs, as usual: after more than a few hundred detections the entire system became essentially unusable, with control returning at the end of the scan but the product GUI frozen up until a reboot. Although these problems only occurred in high-stress situations, they were serious enough to move the product into the ‘Fair’ category for stability.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

SureGuardian Antivirus Premium

Product version: 3.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

Yet another member of the Preventon club, SureGuardian has not previously been seen on our test bench, but promised few surprises – although the ‘Premium’ in this one’s name sparked immediate worries. As could have been predicted, the installer was around 89MB, set-up fast and simple, updates averaged eight minutes, speeds were reasonable, overheads a little high, resource use not bad and impact on our set of tasks respectable, while detection scores were unimpressive for the most part and rather depressing in places.

The usual clutch of bugs were apparent – experience teaching us to watch out for them and work around them where possible – and once again, part of the functionality had to be disabled to enable tests to continue.

A VB100 award is granted thanks to a decent showing in the core areas, giving SureGuardian certification at first attempt, but the problems put it well inside the ‘Buggy’ category.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

Total Defense Inc. Internet Security Suite

Product version: 8.0.0.7

Update versions: 1.6.01891/5724.0.0.0, 5733.0.0.0, 5742.0.0.0

The latest iteration of Total Defense’s consumer solution is new to our test bench, and apparently does not function without access to the Internet – meaning it was not included in this month’s RAP tests. This design would also make it unsuitable for clean-up of sensitive systems – most experts’ advice still being to disconnect manually from the Internet the moment a malware infection is suspected. The installer was a sizeable 172MB, and after initial unpacking there was a ‘mandatory’ update run, which only lasted 20 seconds or so. Then things got moving properly, with a slide show to keep us entertained during the install process. The whole thing lasted no more than two minutes with a reboot required at the end. On at least one install, a message appeared after this restart claiming that the system had shut down unexpectedly.

The GUI is little changed from previous versions – slightly over-styled, with something of a learning curve to negotiate before it can be navigated easily, and still some of the controls are hidden behind confusing labels. Options are fairly limited anyway. Scanning speeds were excellent, as usual, starting at a decent pace and speeding up hugely in the warm runs. Overheads were perhaps a little above average in some areas, but resource use was unexceptional and our set of tasks was completed in decent time.

Detection measures proved more tricky to piece together, with most scans falling over at some point, and the system was generally completely unusable during scanning. The clean sets proved particularly hard to get through in a single run, and text logging (which, mercifully, is provided) fell over a few times, apparently when writing a filename containing unusual characters. Results were still stored in the main log database though, and were fairly straightforward to pull out using SQL-to-text conversion tools. Detection rates started quite well in the Response sets, but headed fairly sharply downward in the latter half. No RAP scores are available at the request of the developers. The core certification sets were properly dealt with, and a VB100 award is earned, putting the product – which generally only enters desktop comparatives – on two passes and one fail in the last six tests; three passes and three fails in the last two years. With a number of issues of minor to medium significance observed, both under high stress and in everyday use, stability edges into the ‘Buggy’ category, presumably mostly due to the newness of the product.
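
By way of illustration, the Python sketch below shows the kind of SQL-to-text conversion used to pull records out of a scan-log database. It assumes an SQLite file and invents the table and column names; the real Total Defense log store is not necessarily laid out this way.

```python
# Hypothetical sketch of a SQL-to-text conversion: the real log store may not
# be SQLite, and the table and column names below are invented for illustration.
import csv
import sqlite3

conn = sqlite3.connect("scanlog.db")
rows = conn.execute(
    "SELECT file_path, threat_name, detected_at FROM detections ORDER BY detected_at"
).fetchall()
conn.close()

with open("detections.txt", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out, delimiter="\t")
    writer.writerow(["file_path", "threat_name", "detected_at"])
    writer.writerows(rows)

print(f"Wrote {len(rows)} detection records to detections.txt")
```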

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Total Defense Inc. Total Defense r12

Product version: 12.0.0.832

Update versions: 5418.0.0.0, 5672.0.0.0, 5733.0.0.0

The business offering from Total Defense continues to bear the old CA branding in a number of places, and as usual the submitters insisted on installation and online updating on the deadline day – thus making unnecessary extra work for us, as the product clearly has the facility to update from a local file (as one would expect from an enterprise-grade solution). Installation is run from a DVD image measuring over 3GB, but much else beyond the basic endpoint product is included there. Initial installation requires the .NET framework, which adds several minutes to the process if the right version is not already in place, then alongside the usual steps are requests for personal details and information about management servers. A reboot is needed to complete, and again on one run we noted an error message claiming that the system had not expected to restart.

Updates were rather slow, averaging 30 minutes for the initial runs, and on one of them we saw a message in the log suggesting that another reboot was needed to get things moving properly, although there was no more active notification of this to the user. Licensing also proved rather fiddly, with the input boxes for licence codes not properly designed and requiring each section to be pasted in separately, while previously activated products frequently claimed they had lost their activation – the whole process was a bit of a headache.

Once up and running though, all went well. The GUI itself is fairly clear and generally responsive, with a decent level of options, although once again, settings which appear to offer archive scanning on access had no effect whatsoever. Speeds were impressive – extremely fast even on the first run and managing to improve further in the warm runs. Overheads were pretty light, resource use a little above average in both RAM and CPU measures, but our set of tasks got through in quite good time.

As ever, detection results had to be ripped out of unfriendly SQL format storage – although, presumably, in full enterprise installs some form of reporting is available at the management server level (hopefully providing text options rather than the pointless PDFs, which was all we could find the last time we looked at it). Response scores were notably lower than the product’s consumer cousin, again showing a distinct downward trend into the more recent days, and RAP scores were on the low side too, dropping away fairly steeply. The core sets were properly dealt with though, with no issues in the WildList or clean sets, thus earning Total Defense a second VB100 award this month. The business line’s test history shows four passes and one fail with a single no-entry in the last six tests; six passes and three fails in the last two years. There were no serious stability issues, but a couple of minor bugs were observed, earning the product a ‘Stable’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Troppus Software Digital Life Now Anti-Virus Premium

Product version: 3.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

Perhaps one of the more unusual product names we’ve encountered for a while, Digital Life Now Anti-Virus will be abbreviated to DLNA from here on, as it is in places in the product itself. This is yet another member of the Preventon/VirusBuster crowd, and another first-timer on our test bench. The product presented no surprises in the set-up from an 87MB installer, which ran quickly with no need to reboot, updates averaging seven minutes. On one occasion an update attempt failed after five minutes and needed to be re-run, however.

The interface actually looks quite nice in DLNA’s chosen colour scheme, but this is once again the ‘Premium’ version, including all the buggy components. These slowed down testing once again, but knowing where to expect them helped somewhat, and we soon had results safely recorded. Scanning speeds were as expected: reasonable on demand, a little heavy on access, with average resource use and average impact on our set of tasks. The Response and RAP sets showed fairly mediocre scores, dropping quite steeply through the RAP weeks, but the core sets were properly managed and a VB100 award is earned at first attempt. With all the predicted problems, stability is rated as ‘Buggy’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

TrustPort Antivirus 2012

Product version: 12.0.0.4857

Update versions: 12.0.0.4860

A much more familiar name, TrustPort is a regular in our tests, with a strong record of high scores. The latest version was provided as a 226MB installer, which ran through the standard set of questions, completing in under a minute with a reboot required at the end. Updates were a little unpredictable, their duration increasing the further we got from the date the installer was put together. The first run took less than five minutes, but later ones were much longer, with an overall average of over 20 minutes. The interface is a little unusual, with scattered components and no real main control tool, but after a little exploration it soon becomes simple to operate, providing a thorough set of controls.

We have noted a few oddities with the product in the past, including some strange window behaviour which can make the system rather tricky to use. This month we also had a problem with some of the controls: having turned off the on-access component to replace a broken sample set, we found it could not be reactivated until a reboot returned things to normal. Scanning speeds were fairly slow, with overheads a little high on access; RAM use was around average, but CPU use was fairly low, and our set of activities was completed in good time.

Detection rates, as ever, were awesome, with the Response sets totally destroyed and some excellent scores in the RAPs too, only the proactive week showing a slight decline. The clean sets were handled well despite the double risk from the product’s dual-engine approach, but in the WildList sets a single item in the Extended list was not picked up in either mode, thus denying TrustPort certification this month. Closer investigation by the developers brought to light an issue with certificate approval and a difference between behaviour in real and virtual environments, highlighting the problems of performing QA procedures on virtual machines.

This breaks a solid run for the company, putting it on three passes and a single fail in the last six tests; seven passes from eight attempts in the last two years. With a few minor interface problems spotted, stability can be rated no higher than ‘Fair’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 99.93%

ItW Extd (o/a): 99.93%

False positives: 0

UnThreat AntiVirus Professional

Product version: 4.3.31.11543

Update versions: 11543, 11685, 11700, 11713

With only a single appearance under its belt, UnThreat’s product is still something of an unknown quantity, despite being based on GFI’s VIPRE engine. Its previous entry gave us quite some trouble, but it was not clear how much of that was down to known issues with the underlying engine. The current version came as a fairly slimline 94MB installer, which ran through quickly with minimal clicks required, requesting a reboot after less than half a minute. Updates were also zippy, averaging just four minutes.

The interface is a lot more standard in design than GFI’s own, and provides a decent level of control in a fairly approachable manner. It looked good and generally responded well. Scanning speeds were distinctly slow in the archive sets, but not bad elsewhere, showing some good improvements in most of the warm runs. Overheads were a little high, but resource use wasn’t bad and our set of tasks didn’t take too long either.

Detection rates were excellent in the Response sets, but rather disappointing in the RAPs, implying that perhaps the definitions provided with the submission were not as fresh as they ought to have been. The core sets were dealt with well though, and a VB100 award is earned. This is the product’s first appearance in the last six tests, and it now has two passes from two entries in the last two years. We did observe a few stability issues, with scans crashing and freezing, but as these occurred only in high-stress situations we give it a ‘Stable’ rating.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

UtilTool AntiVirus Premium

Product version: 3.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

The final member of the Preventon clan, and another double submission, UtilTool was the only one of the set to provide a Premium product in the last test, and failed thanks to a couple of items missed on access – almost certainly due to the ‘Safety Guard’ component which has caused so many issues this month. The vendor’s approach is to provide the full Premium version as a free trial for a limited period, hence the differences in the submissions.

The set-up runs along very familiar lines: the 87MB installer takes only a few steps and not much more than a minute to complete, with no reboot, and updates averaged seven minutes. All the usual issues were observed, and everything else was as expected too, with mid-range speeds, overheads and resource usage, and a fair but not too heavy impact on our set of tasks.

Detection rates were unimpressive, but the core sets were handled properly and a VB100 award is earned. With much effort, we managed a run through the WildList sets with the ‘Safety Guard’ enabled, and found it did indeed lead to a few more misses, but only on access and only in the Extended set – meaning that a pass would still have been earned with this feature activated. The product now has one pass and one fail from its two entries, both in the last six tests. Stability is once again rated at the outer edge of ‘Buggy’.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

UtilTool Antivirus Pro

Product version: 3.0.70

Update versions: 14.1.218, 14.1.267, 14.1.278, 14.1.283

This is the same product with a different licence key applied, and the experience here was pretty similar, minus a number of the bugs.

The interface is fairly clear and usable, with a good basic range of controls, and the set-up ran at the same speed, with updates again averaging seven minutes. Speed measures were more or less identical, the additional components active in the Premium version not affecting any of our tests, and detection scores were once again disappointing, but not terrible. A second VB100 award goes to UtilTool this month – the first for the Pro edition, which is also rated ‘Buggy’ despite having slightly fewer issues.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 99.93%

False positives: 0

VirusBuster Professional

Product version: 7.3.33

Update versions: 5.4.1/14.1.219, 5.4.1/14.1.266, 5.4.0/14.1.270, 5.4.1/14.1.282

Although VirusBuster’s is the final product to be discussed this month, its core technology has already appeared in some 16 other products, so we were able to make some pretty clear predictions before even looking at it. The installer measured 70MB with an additional update package of 76MB, and ran through the usual stages, completing in a minute or so with no need to reboot. Updates were a little flaky if run from the main GUI, but using the system tray option seemed more reliable, with initial runs completing in 15 minutes on average. The interface has been polished a little but the basic design is unchanged – a little clunky, but providing a decent range of controls. Once again, on-access archives were not scanned despite the option to do so clearly being enabled.

Scanning speeds were fairly average and overheads pretty light, but our set of tasks took a little longer than most to complete, with resource use around average too. Detection rates were not great, with a steep drop in the RAP sets and a more gradual, but still noticeable decline through the Response sets. There were no problems in the certification sets though, and a VB100 award is earned. VirusBuster maintains a flawless record of 12 passes in the last two years. Stability is rated ‘Fair’ thanks to a few fairly minor issues occurring during general use.

ItW Std: 100.00%

ItW Std (o/a): 100.00%

ItW Extd: 100.00%

ItW Extd (o/a): 100.00%

False positives: 0

Untestable products

In addition to the products discussed above and detailed in our results charts, several other products were submitted for testing this month, but after much hard work (all of them were given at least three installs and subjected to a selection of tests) were found to be too unstable to be worth the effort of further attempts. These were: CMC Internet Security, ESTSoft ALYac Enterprise, Maya Software N360 Internet Security 2012, and Roboscan Enterprise, the last being essentially a clone of ESTSoft’s product. In all cases, every effort was made to include the products in the test.

Results tables

Conclusions

This was another epic month for us, running well past our predicted deadline. The usual issues with illness in the lab team, overcrowding and overheating in the test lab, and the extra work required by our new multi-part testing process all played their part in the extra-long duration of the test, but a major part was played by the many, many issues with the products under test. This ongoing problem has been the driving force behind the introduction of our stability rating system and the decision to name and shame those found to be too unstable to work with (in the past such products have routinely been dropped quietly from the tests). Hopefully this will put pressure on developers to ensure their products are up to scratch before inflicting them on us (and, presumably, on a paying public) – something that repeated complaints and criticisms in the written reports over the last few years have failed to achieve. It is quite remarkable that some issues which have been brought up here test after test have still not been fixed.

In general, it was a good month for certifications, with a much higher than usual pass rate – although much of this can be put down to the fact that almost a third of the field of entries were based on the same engine. With no complex polymorphic samples on the Standard WildList to challenge the products, and some flexibility still required regarding the Extended list, the clean set was the main area in which we expected to see problems. Relatively few false alarms were observed though – perhaps hinting that we need to work harder on expanding the coverage of the sets (although there is always the chance that things could really be improving across the industry in this area). We did see a few products failing to fully cover the WildList sets, to our surprise, and in several cases this appeared to be due to problems with much-vaunted cloud protection systems.

We also saw continued improvement in our still fairly new Response tests, with adjustments to the way we compile the daily sets bringing results a little closer to what we would expect – most products now showing a general trend towards better performance in the older days and slightly (in some cases very much) poorer coverage in the more recent ones. The RAP sets have also benefited from this process, with only the freshest items at the time of compilation going into the sets, and this is reflected in a much tougher challenge in the proactive week. We look forward to seeing whether this trend is maintained in future tests. We also hope to squeeze in some work on the set of activities we use for our performance measures in time for the next test – both expanding the range of jobs performed and tuning them to reflect normal usage as closely as possible.
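
As a rough illustration of the date-based bucketing described above, the sketch below groups samples into daily Response sets and a proactive bucket according to when each sample was first seen relative to the product submission deadline. The field names and window sizes are assumptions made for the example, and this is not the actual lab tooling – just the general idea.

    # A minimal sketch, assuming each sample record carries a 'first_seen'
    # date; the ten-day Response window and the 'proactive' cut-off are
    # illustrative values, not the exact parameters used in the lab.
    from collections import defaultdict
    from datetime import date

    def bucket_samples(samples, deadline, response_days=10):
        """Group samples by age (in days) at the submission deadline."""
        buckets = defaultdict(list)
        for s in samples:
            age = (deadline - s["first_seen"]).days
            if age < 0:
                buckets["proactive"].append(s)  # first seen after the deadline
            elif age < response_days:
                buckets["response_day_%d" % age].append(s)  # freshest daily sets
            # older samples would feed the reactive RAP weeks instead (not shown)
        return buckets

    # Example with made-up records
    samples = [
        {"sha256": "aa" * 32, "first_seen": date(2012, 3, 20)},
        {"sha256": "bb" * 32, "first_seen": date(2012, 3, 28)},
    ]
    print({k: len(v) for k, v in bucket_samples(samples, date(2012, 3, 26)).items()})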

We will be keeping our fingers crossed that the products entered for the next test – on a server platform – will be less prone to wobbliness than those seen here.

Technical details

Test environment. All products were tested on identical machines with AMD Phenom II X2 550 processors, 4GB RAM, dual 80GB and 1TB hard drives, running Microsoft Windows XP Professional, with Service Pack 3. For the full testing methodology see http://www.virusbtn.com/vb100/about/methodology.xml.

Any developers interested in submitting products for VB's comparative reviews, or anyone with any comments or suggestions on the test methodology, should contact [email protected]. The current schedule for the publication of VB comparative reviews can be found at http://www.virusbtn.com/vb100/about/schedule.xml.
