VB100 comparative review on Windows XP SP3

2011-04-01

John Hawes

Virus Bulletin
Editor: Helen Martin

Abstract

With a staggering 69 products on this month's VB100 test bench, the VB lab team hoped to see plenty of reliability and stability. But, while the majority of products were well behaved, the team was woefully disappointed by a handful of unruly participants. John Hawes has all the details.


Table of contents
Introduction
Platform, test sets and submissions
Results
Agnitum Outpost Security Suite Professional 7.1
AhnLab V3 Internet Security 8.0.4.6
Antiy Ghostbusters 7.1.5.2760
ArcaBit ArcaVir 11.2.3205.1
AvailaSoft AS Anti-Virus 1.0.0.1
Avast Software avast! Free Antivirus 6
Avertive VirusTect 1.1.48
AVG Internet Security Business Edition 2011
Avira AntiVir Personal 10.0.0.611
Avira AntiVir Professional 10.0.0.976
BitDefender Antivirus Pro 2011
Bkis BKAV Professional Internet Security 3245
Bullguard Antivirus 10.0.172
CA Internet Security Suite Plus 7.0.0.115
CA Total Defense r12 Endpoint Protection Client
Central Command Vexira Antivirus Professional 7.1.38
Check Point Zone Alarm Security Suite 9.3.037.000
Clearsight Antivirus 2.1.48
Commtouch Command Anti-malware 5.1.10
Comodo Internet Security Premium 5.3.176757.1236
Coranti 2010
Defenx Security Suite 2011
Digital Defender 2.1.48
eEye Digital Security Blink Professional 4.7.1
EmsiSoft Anti-Malware 5.1.04
eScan Internet Security Suite 11.0.1139.924
ESET NOD32 Antivirus 4
Filseclab Twister AntiVirus V7 R3
Fortinet FortiClient 4.1.3.143
Frisk F-PROT Antivirus for Windows 6.0.9.5
F-Secure Client Security 9
F-Secure Internet Security 2011
G DATA AntiVirus 2011
Hauri ViRobot Desktop 5.5
Ikarus T3 virus.utilities 1.0.258
iolo System Shield 4.2.1
K7 Total Security 11.1.0025
Kaspersky Anti-Virus 6.0 for Windows Workstations
Kaspersky Internet Security 2011
Kaspersky PURE
Keniu Antivirus 1.0
Keyguard Internet Security Antivirus 1.1.48
Kingsoft Internet Security 2011 Advanced
Kingsoft Internet Security 2011 Standard-A
Kingsoft Internet Security 2011 Standard-B
Lavasoft Ad-Aware Total Security
Logic Ocean GProtect 1.1.48
McAfee VirusScan Enterprise + AntiSpyware Enterprise 8.8
Microsoft Forefront Endpoint Protection 2010
Nifty Corporation Security 24
Norman Security Suite 8.00
Optenet Security Suite V. 10.06.69
PC Booster AV Booster 1.1.48
PC Renew Internet Security 2011
PC Tools Internet Security 2011
PC Tools Spyware Doctor with AntiVirus 8.0.0.624
Preventon Antivirus 4.3.48
Qihoo 360 Antivirus 1.1.0.1316
Quick Heal Total Security 2011
Returnil System Safe 2011
Sofscan Professional 7.2.27
Sophos Endpoint Security and Control 9.5
SPAMfighter VIRUSfighter 7.100.15
GFI/Sunbelt VIPRE Antivirus 4.0.3904
Symantec Endpoint Protection 11.0.6200.754
Trustport Antivirus 2011
UnThreat Antivirus Professional 3.0.17
VirusBuster Professional 7.0.44
Webroot Internet Security Complete 7.0.6.38
Results tables
Conclusions
Technical details

Introduction

When Windows XP first came out, George W. Bush was still in his first year as president. The 9/11 attacks took place between the platform’s release to manufacturing and its arrival on retail shelves, as did the launch of the first-generation iPod. Wikipedia was less than a year old, Google was just starting to turn a profit, while the likes of Facebook, Skype, YouTube and World of Warcraft were yet to come. Computers themselves were not too different from today, of course, although the Pentium 4 was the hottest chip on the block and x64 was still a couple of years away. Skip forward almost a decade, and XP is still with us – not just hanging on by its fingertips but firmly remaining the most popular desktop platform (some estimates put it on over half of all desktop systems, and most agree that it runs on at least 40%). It is familiar, cheap, (comparatively) reliable and very popular. To most of the world’s computer users, it’s just the way computers work.

The operating system’s popularity with users is, if anything, surpassed by its popularity with developers, so it was almost inevitable that we would be deluged with products of all shapes and sizes for this month’s comparative, from the old and familiar to the new and scary. We knew there would be more than enough to keep us busy this month.

Of course, the platform’s maturity and stability also mean there has been plenty of time for refinement and quality control, so we hoped that we might see a trend in products towards the sort of stability and reliability that has been woefully lacking in some quarters of late.

Platform, test sets and submissions

Setting up Windows XP has become such a familiar and oft-repeated task that it requires very little effort these days. In fact, we simply recycled bare machine images from the last run on the platform a year ago, tweaking and adjusting them a little to make them more at home on our current hardware and network set-up, and re-recording the snapshots ready to start testing. As usual, no updates beyond the latest service pack were included, and additional software was kept to a minimum, with only some network drivers and a few basic tools such as archivers, document viewers and so on added to the basic operating system.

With the test machines ready good and early, test sets were compiled as early as possible too. The WildList set was synchronized with the January 2011 issue of the WildList, released a few days before the test set deadline of 16 February. This meant a few new additions to the core certification set, the bulk of which were simple autorun worms and the like. Most interesting to us were a pair of new W32/Virut strains, which promised to tax the products, and as usual our automated replication system churned out several thousand confirmed working samples to add into the mix.

The deadline for product submission was 23 February, and as usual our RAP sets were built around that date, with three sets compiled from samples first seen in each of the three weeks before that date, and a fourth set from samples seen in the week that followed. We also put together entirely new sets of trojans, worms and bots, all gathered in the period between the closing of the test sets for the last comparative and the start of this month’s RAP period. In total, after verification and classification to exclude less prevalent items, we included around 40,000 samples in the trojans set, 20,000 in the set of worms and bots, and a weekly average of 20,000 in the RAP sets.
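
To make the RAP bucketing concrete, the sketch below shows roughly how samples fall into the three reactive weeks and one proactive week around the 23 February deadline. This is an illustrative sketch only; the exact cut-off times are a detail of our internal tooling, and the example dates are invented.

```python
from datetime import date

DEADLINE = date(2011, 2, 23)  # product submission deadline for this test

def rap_week(first_seen: date):
    """Return the RAP bucket for a sample, or None if outside the window."""
    delta = (first_seen - DEADLINE).days
    if -21 <= delta < -14:
        return "reactive week -3"
    if -14 <= delta < -7:
        return "reactive week -2"
    if -7 <= delta < 0:
        return "reactive week -1"
    if 0 <= delta < 7:
        return "proactive week +1"
    return None  # outside the four-week RAP window

print(rap_week(date(2011, 2, 20)))  # -> reactive week -1
print(rap_week(date(2011, 2, 25)))  # -> proactive week +1
```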

The clean set saw a fairly substantial expansion, focusing on the sort of software most commonly used on home desktops. Music and video players, games and entertainment utilities dominated the extra 100,000 or so files added this month, while the retirement of some older and less relevant items from the set kept it at just under half a million unique files, weighing in at a hefty 125GB.

Some plans to revamp our speed sets were put on hold and those sets were left pretty much unchanged from the last few tests. However, a new performance test was put together, using samples once again selected for their appropriateness to the average home desktop situation. This new test was designed to reproduce a simple set of standard file operations, and by measuring how long they took to perform and what resources were used, to reflect the impact of security solutions on everyday activities. We selected at random several hundred music, video and still picture files, of various types and sizes, and placed them on a dedicated web server that was visible to the test machines. During the test, these files were downloaded, both individually and as simple zip archives, moved from one place to another, copied back again, extracted from archives and compressed into archives, then deleted. The time taken to complete these activities, as well as the amount of RAM and CPU time used during them, was measured and compared with baselines taken on unprotected systems. As with all our performance tests, each measure was taken several times and averaged, and care was taken to avoid compromising the data – for example, the download stage was run on only one test machine at a time to avoid possible network latency issues. We hope to expand on this selection of activities in future tests, possibly refining the selection of samples to reflect the platforms used in each comparative, and perhaps also recording the data with greater granularity.
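
For readers curious about the shape of the new test, the following minimal sketch shows the general pattern: run a batch of copy/archive/extract/delete operations over a sample set, repeat and average the timings, then compare against a baseline taken on an unprotected system. The paths, run count and sample layout are placeholders, not our actual harness, and the real test also covers the download stage and records RAM and CPU use.

```python
import shutil
import time
import zipfile
from pathlib import Path
from statistics import mean

SAMPLES = Path("samples")   # placeholder: pre-fetched music, video and picture files
WORK = Path("work")
RUNS = 3                    # each measure is taken several times and averaged

def activities():
    """One pass of the standard file operations: copy, archive, extract, delete."""
    WORK.mkdir(exist_ok=True)
    extracted = WORK / "extracted"
    for f in SAMPLES.iterdir():                 # copy files into the work area
        shutil.copy2(f, WORK / f.name)
    archive = WORK / "bundle.zip"
    with zipfile.ZipFile(archive, "w") as z:    # compress into a zip archive
        for f in SAMPLES.iterdir():
            z.write(f, f.name)
    with zipfile.ZipFile(archive) as z:         # extract from the archive again
        z.extractall(extracted)
    shutil.rmtree(WORK)                         # delete everything

times = []
for _ in range(RUNS):
    t0 = time.perf_counter()
    activities()
    times.append(time.perf_counter() - t0)

print(f"mean over {RUNS} runs: {mean(times):.2f}s")
# The product's impact is this mean compared with the same mean on a bare baseline system.
```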

We had also hoped to run some trials of another new line of tests, looking at how well products handle the very latest threats and breaking somewhat with VB100 tradition by allowing both online updating and access to online resources such as real-time ‘cloud’ lookup systems. However, when the deadline day arrived and we were swamped with entrants, it was clear that we would not have the time to dedicate to this new set of tests, so they were put on hold until next time.

The final tally came in at 69 products – breaking all previous records once again. Several of these were entirely new names (indeed, a couple were unknown to the lab team until the deadline day itself). Meanwhile, all the regulars seemed to be present and correct, including a couple of big names that had been missing from the last few tests. With such a monster task ahead of us, there was not much we could do but get cracking, as usual crossing all available digits and praying to all available deities for as little grief as possible.

Results

Agnitum Outpost Security Suite Professional 7.1

Version 3415.320.1247

Agnitum kicks off this month’s comparative in its usual solid style. This is the full ‘Pro’ version of the suite solution, which has recently been joined by a slightly pared-down free edition, still offering a good range of protection layers. The installer came as a 94MB executable, with the latest updates thoughtfully built in, and the set-up process followed the usual steps of language selection, EULA and so on; it took a couple of minutes to get through, and a reboot was needed to complete.

The GUI hasn’t changed much for a while, remaining clear and simple with not much in the way of fancy frills to get in the way of things. The product includes a comprehensive set of firewall, HIPS, web filtering and anti-spam components. Configuration is not hugely in-depth (for the anti-malware component at least), but a good basic set of controls is provided. Testing ran smoothly, unhindered by unexpected behaviour or difficulties operating the solution. We were once again impressed by some judicious use of result caching to ensure items that had already been checked were not processed again, and this efficiency helped us keep the overall test time well within the expected bounds (when planning our testing schedule we roughly allocate 24 hours to each product for full testing).

Scanning speeds and on-access lags were decent to start with, both speeding up hugely in the warm sets, and while RAM and CPU consumption were perhaps a little above average, impact on our new sets of standard activities was minimal.

Detection rates were decent as ever, with solid scores in most areas, and the WildList caused no problems. The clean sets were also handled well, with only a single item labelled as adware, and a VB100 award is duly earned by Agnitum. This brings the company’s tally in the past two years to seven passes and one fail, with four tests not entered – all of the last six entries having resulted in passes.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 88.57%
Worms & bots: 96.89%
Polymorphic: 100.00%
False positives: 0

AhnLab V3 Internet Security 8.0.4.6

Build 925; engine version 2011.02.23.31

AhnLab is a pretty regular participant in our comparatives, and the company’s product is generally well behaved (the occasional wobbly month notwithstanding). This month’s submission was a 155MB executable, including latest updates, and ran through its installation process fairly uneventfully. An option to apply a licence was declined in favour of a trial version, and we were also offered the choice of including a firewall – this was not enabled by default, so was ignored. The process completed in under a minute and needed no reboot.

The product is reasonably clean and efficient-looking, although some of the configuration was a little hard to find. Thankfully, past experience had taught us to search thoroughly to make sure all configuration options were checked. Intrusion prevention and firewalling is provided in addition to the anti-malware component, and there are some extra tools as well. Testing ran through smoothly without any major problems – even the log viewer, which has caused some pain in the past, proved solid and stable.

Scanning speeds were not super fast, but lag times were low, with fairly low use of RAM too. CPU use was a little higher though, and the time taken to complete our set of tasks was around average.

Detection rates were very good, continuing an upward trend observed in recent tests, and the WildList and clean sets presented no problems at all. AhnLab earns a VB100 award, making six passes and four fails in the last two years, with two tests not entered – five of the vendor’s last six entries have passed.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 94.05%
Worms & bots: 98.35%
Polymorphic: 99.99%
False positives: 0

Antiy Ghostbusters 7.1.5.2760

Version 2011.02.23.20

Antiy was an interesting newcomer to our line-up this month. We have been in contact with the company for some time now, and have long looked forward to the product’s debut in our comparatives. Antiy Labs hails from China, with branch offices in Japan and the US, and has been operating for over a decade. It makes its scanning engine available as an SDK, which sees it used in various firewalls, UTMs and other security devices, according to the company’s website.

The product was sent in as a 50MB executable, which had some fairly recent updates included, but for optimum performance we installed and updated the product online on the deadline date. This was not as simple as it might have been, as the product is only available in Chinese; however, a thorough usage guide was kindly provided, and once Chinese support had been added to the test system it was fairly straightforward to figure out what to click and when. The set-up process took only a few minutes, including updating, with no need to reboot.

The main product GUI looks slick and professional (although of course much of the actual content was unintelligible to us), and navigating wasn’t too difficult thanks to a combination of the guide provided, basic recognition of some characters, and a general sense of where things tend to be in anti-malware product interfaces. The initial stages of testing ran through very nicely, with all on-demand tests zipping through without difficulty, but the on-access component proved elusive. We could find no evidence of the on-access scanner in initial trials of our archive tests, but this was inconclusive since we found that the on-demand component did not detect the EICAR test file either. Various other attempts, including checking that files were detected by the on-demand scanner before copying them around the system and even executing them, produced no results, and a request for information from the submitters went unanswered. Whether or not the product even has an on-access component thus remains a mystery, but either way, as it does not appear to be enabled by default, it could not be included in our official tests.
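
As an aside, the simplest of the checks mentioned above can be reproduced with the industry-standard EICAR test string: drop it to disk and see whether anything intervenes when the file is written, copied or read back. This is a rough sketch only; our real checks also used live samples and execution.

```python
import shutil
from pathlib import Path

# The standard 68-byte EICAR anti-virus test string.
EICAR = (r"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
         r"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*")

src, dst = Path("eicar.com"), Path("eicar_copy.com")
src.write_bytes(EICAR.encode("ascii"))
try:
    shutil.copy(src, dst)           # an on-access scanner should intercept this
    dst.read_bytes()
    print("file copied and readable - no on-access interception observed")
except OSError as e:
    print(f"access blocked - resident protection likely active: {e}")
```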

This also meant there was no point in running our standard performance measures, but on-demand scanning speeds were pretty zippy, and the product powered through the infected sets in good time too. The logs showed some fairly disappointing scores, with coverage of polymorphic items particularly poor, but the RAP sets showed a steady, if not super-high detection rate.

The WildList showed a fair few misses, with a handful of false alarms in the clean set too, and of course no obvious on-access capability was found, giving us several reasons to deny Antiy a VB100 award for the time being. However, the product impressed the team and looks like a good bet for some rapid improvements.

ItW: 87.02%
ItW (o/a): N/A
Trojans: 23.91%
Worms & bots: 72.88%
Polymorphic: 19.82%
False positives: 4

ArcaBit ArcaVir 11.2.3205.1

Update 2011.02.24.12:54:56

ArcaBit has made a few appearances in our comparatives over the last few years, and has shown some steady improvements both in performance and stability.

The install package weighed in at 95MB and needed no additional updates; it ran through in good time with no surprises. The product is a full suite including firewall, anti-spam, mail and web monitors, and some intrusion prevention components.

The interface has been adjusted and improved a little of late, and is now looking complete and polished. The layout is fairly usable and it responded well even under pressure; no stability problems of any kind were observed during the full test run, which completed in good time.

Scanning speeds were pretty good, and on-access lags were not bad either, while use of RAM and impact on our activities set were about average and CPU use was fairly low. Scores were just about reasonable, with a fairly notable step down mid-way through the RAP sets.

The WildList was handled without problems, but both the clean set and the speed sets threw up a handful of false alarms, including items from Microsoft, a component of MySQL and the popular Joomla content management system. This was enough to deny ArcaBit a VB100 award this month, leaving it with just one pass in the last two years, from a total of five attempts.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 63.06%
Worms & bots: 72.11%
Polymorphic: 93.63%
False positives: 7

AvailaSoft AS Anti-Virus 1.0.0.1

When newcomer AvailaSoft first came to our attention we noted some interesting test results quoted on the company’s website – two testing labs, previously unknown to us, were quoted as rating the product very highly indeed. So far our attempts to contact these labs to find out more about their methodology – and encourage them to join testing community endeavours such as AMTSO – have gone unanswered. AvailaSoft itself is based in Duluth, GA, USA, with offices in several other regions, and was founded in 1996.

The install package weighed in at a very reasonable 61MB, and after the minimum number of set-up stages it zipped through its activities in double-quick time, with a reboot at the end. Getting the interface up proved less speedy, however, as on boot-up the test machine seemed to take rather a long time to get its act together (we hope to add some boot speed checks to our test suite in the very near future to get more accurate measures of this kind of thing). The GUI eventually opened, though, and proved reasonably pleasant to operate, if a little drab and grey. Options were a little odd in places, with the list of possible actions to take on detection being ‘auto-treat’, ‘clean’ or ‘delete if disinfection fails’, which seemed to overlap each other and provide no actual choice. The interface was generally responsive, but prone to odd spells of slowing down, where buttons would take some time to elicit a response.

Scanning was similarly sluggish but generally well-behaved, although handling large quantities of infected items proved a heavy burden and many scans had to be aborted after slowing to a point of unbearable drag. On occasion, scans simply stopped with no sign of any results or logs. By breaking up the sets into smaller chunks we managed to get through most of the tests in the end, although it took several times the allotted 24-hour time period to do so.

Scanning speeds were very slow, and on-access lag times enormous, with one particular job – which usually takes less than a minute on a bare system – dragging out to several hours. This seemed to improve somewhat on warm runs. Impact on our activities suite was fairly heavy, and while RAM use was around average, CPU use actually showed a large reduction over the same job running on a bare system – this suggests that much of the extra time added to the various jobs was spent with the processor idle.

Looking over the results we saw some confusing variation, with no apparent correlation between the scores recorded and those of other products using the same engine, or even with the same product in different detection modes. So we went back and repeated the tests, finding them once again slow and prone to sudden and unexplained death. Each time a scan failed to complete and report results, it was necessary to repair the affected sample set and re-run the job in smaller chunks.

Eventually we managed to get at least one set of scan logs for each area of the test sets, by running on up to six of our test systems for several further days, but even combining all the various runs together showed far lower scores than anticipated. With no further time available, we finalized the results as the combined best of all jobs. The results for the WildList set, after more than 20 runs through in both modes, seemed to be reliably accurate at least, showing good coverage of the polymorphic items but a fair number of other samples not detected. As a result, no VB100 award can go to AvailaSoft just yet, despite an enormous amount of work on our part.
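
For clarity, the ‘combined best of all jobs’ simply counts a sample as detected if any one of the repeated runs managed to log it – in effect a set union. A trivial sketch, with invented sample IDs:

```python
# Each run yields the set of sample IDs it managed to log as detected.
run1 = {"sample-001", "sample-002"}
run2 = {"sample-002", "sample-003"}   # a re-run of a crashed job
run3 = {"sample-001", "sample-004"}

combined = run1 | run2 | run3         # union: best of all jobs
print(len(combined))                  # -> 4 distinct detections
```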

ItW: 91.43%
ItW (o/a): 91.43%
Trojans: 37.35%
Worms & bots: 46.13%
Polymorphic: 71.09%
False positives: 0

Avast Software avast! Free Antivirus 6

Version 6.0.1000; engine and virus definitions version 110223-1

Avast made some serious cosmetic and technical improvements with its version 5, released not long ago, and was heartily praised in these pages (see VB, January 2010, p.17). Version 6 popped out rather unexpectedly, and we were intrigued to see what further strides had been made.

The 62MB install package, provided with all updates included, looked fairly similar to previous submissions, involving only a few steps, one of which is an offer to install the Google Chrome browser. A brief scan is also included as part of the set-up, but the whole thing is still complete in under half a minute. No reboot is required, although the application sandboxing system – which seems to be the main addition in this new version – does require a restart to become fully functional.

The interface remains much as before, the colours looking perhaps a little less bold and impressive, but the layout is generally sensible and easy to operate. The new sandbox caused a few problems in our speed tests, as prompts regarding innocent packages with potentially suspect capabilities interrupted our measurements. Eventually, the settings were tweaked to apply the sandbox automatically rather than prompting every time. However, we had a few further issues using this setting, with the machine becoming unresponsive a few times and needing to be reimaged to a clean operating system before we could continue – all this before even engaging in any malware tests. When these issues were not blocking progress, things zipped along with their customary speed, and even with the issues we still got all the necessary jobs done in under 24 hours.

As usual, scanning speeds were fast, and lag times very low, with low use of memory and a small effect on the time taken to complete our set of activities, although CPU use was closer to the average for this month’s test.

With the final results processed we saw some stomping good scores, highly impressive in all sets. The WildList and clean sets were handled without a glitch, earning Avast another VB100 award for its free product; the company boasts an impeccable 12 out of 12 record in the last two years of our comparatives.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 96.80%
Worms & bots: 98.95%
Polymorphic: 100.00%
False positives: 0

Avertive VirusTect 1.1.48

Definitions version 13.6.215

Avertive has appeared in a couple of tests recently, being part of a set of products derived from the same toolkit, a front end and SDK to the VirusBuster engine developed by Preventon, whose own version first took part in late 2009 (see VB, December 2009, p.16). The number of these entries continues to grow, with Avertive already one of the more familiar names on the list.

The product comes as a 67MB installer and runs through a very standard set of steps, with no reboot needed to complete installation. An Internet connection is needed to apply a licence key, without which much of the configuration is inaccessible, but even with the time taken to complete this step, only a minute or so is needed in total to get things up and running.

The interface is pared-down and simple, but provides a decent range of controls covering most of the standard bases. The only issue that has troubled us in the past is a lack of control over the logging system, which defaults to overwriting logs once they have reached a certain size: 10MB for on-demand scans and 2MB for on-access activity. This presents an obvious problem for us in gathering the results of our large scans, but it could also pose issues for real-world users: since the default setting is to log every file scanned, it would be easy to run a scan job that turned up an infection but was unable to tell you at the end what was found or where (with the default settings some action would have been taken to combat the threat, but it’s usually best to know what has been done to your system, even in the name of good security). Fortunately, after some trial and error, we managed to increase the log sizes with some simple registry tweaks.
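
For the curious, the tweak was of the following general shape. The key path and value names below are invented placeholders, not the vendor’s real ones, which we don’t reproduce here.

```python
import winreg  # Windows-only standard library module

# Placeholder key path and value names - NOT the product's actual registry layout.
KEY = r"SOFTWARE\ExampleVendor\ExampleAV\Logging"

with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    # Raise the caps (10MB on demand, 2MB on access by default) to 100MB each.
    winreg.SetValueEx(key, "MaxScanLogSize", 0, winreg.REG_DWORD, 100 * 1024 * 1024)
    winreg.SetValueEx(key, "MaxMonitorLogSize", 0, winreg.REG_DWORD, 100 * 1024 * 1024)
```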

The product proved as solid and stable as on previous occasions, with a nice simple process to get all the tests complete. Slow scanning of some polymorphic samples – which were heavily represented in some of our sets – dragged out the testing process somewhat, but with careful organization we just about got it all done in not much over a day.

Scanning speeds were fairly average and on-access lag times a little lighter than many, with low use of CPU cycles and RAM use. Impact on our activities suite was not excessive either.

Detection rates were pretty decent, with an interesting two-step pattern in the RAP scores, and after a few unlucky months, in which settings of the on-access component denied Avertive certification, this time all went well in the WildList, in both modes. With no problems in the clean sets either, the company can finally claim its first VB100 award after two previous failed attempts.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.49%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

AVG Internet Security Business Edition 2011

Version 10.0.1204; virus database version 1435/3463

AVG continues to consolidate its position as a well-known and widely trusted security brand, expanding and diversifying its capabilities with regular acquisitions, and its premium products have established a solid record in our tests.

The current version came as a 149MB installer package, including updates, and the install process is reasonably rapid and straightforward – the only incidents of note being the offer of a browser toolbar and the choice of joining a feedback scheme. With no reboot required, the process is complete within a couple of minutes.

The interface has a sober and sensible feel to it, and somehow seems a little less cluttered than previous entries. On top of the standard anti-malware protection are extras including a rootkit scanner and AVG’s LinkScanner safer browsing system. Configuration for all is exemplary in its clarity and comprehensiveness. Stability was rock-solid, with a nice simple scheduler helping to ensure time was well used, and all tests were completed well within the allotted 24 hours.

This was helped by some good use of result caching to speed up repeat scans of previously checked items, and the product powered through the speed tests in excellent time, showing very light lag times on access too. With RAM use not too high and CPU drain fairly noticeable, the set of standard tasks ran through almost as quickly as on the baseline systems.

Scores were excellent across the board, with impressive reliability throughout the reactive part of the RAP sets and only a slight decrease in the proactive week. The WildList and clean sets presented no problems, and AVG easily earns another VB100 award – making 11 passes in the last two years, with just one test not entered.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.55%
Worms & bots: 98.61%
Polymorphic: 99.99%
False positives: 0

Avira AntiVir Personal 10.0.0.611

Virus definition file 7.11.03.177

Avira continues to thrive with its combination of efficiency and superb detection rates, its free product snapping at the heels of a couple of others already looked at this month. The product has a soothing longevity of design, with changes introduced slowly to give an impression of evolution rather than sudden and drastic renewal.

The current iteration of the free-for-home-use personal edition was supplied as a 48MB installer, with an additional 38MB of updates, which were simple to apply using a standard built-in process. The set-up is straightforward and rapid, with (obviously) no licence code or file to apply, although there is an offer to register online. With no reboot required the process is complete in less than a minute.

The interface is clean and simple, with little by way of additional modules, but comprehensive configuration controls are easily accessed via an ‘expert mode’ button. Stability was generally as solid as ever, although a couple of scan jobs in our speed tests seemed to linger long after they had completed and been told to close – simply ending the task with a right-click was all it took to get things moving again, though. Tests were completed in excellent time: a few hours at the end of an afternoon plus some jobs run overnight cut several hours from the expected day of testing.

Scanning speeds were very fast, as usual, and on-access measures showed a light footprint too, with low use of RAM and CPU and a light impact on our set of tasks.

Detection rates were pretty hard to beat, as is usual for Avira, and even the proactive RAP set was more than 90% covered. The WildList was demolished, and no issues emerged in the clean sets beyond a couple of items alerted on as adware. Avira thus earns another VB100 award quite comfortably. This free version of the product has only entered four tests in the last couple of years, but has aced all of them.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 98.51%
Worms & bots: 99.45%
Polymorphic: 100.00%
False positives: 0

Avira AntiVir Professional 10.0.0.976

Virus definition file 7.11.03.177

Avira’s Pro edition is fairly similar to the free version on the surface, and although the installer package is a few MB larger, the same update bundle was used. The install process felt fairly similar, although the application of a licence key file took the place of the registration step. Again, no reboot was needed and everything was over with in under a minute.

The interface looks and feels much the same. Configuration was excellent, and stability again generally solid, although we saw the same occasional minor snags when closing the ‘Luke Filewalker’ scanner module. We were happy to see another product out of the way in considerably less than 24 hours, freeing up more time for other, less zippy solutions.

Scanning speeds were again super fast, with very low impact on file accesses, and performance measures closely mirrored the free version.

Scores were identical to the free edition, as might be expected given that both used the same detection data, and the lab team once again nodded their approval as set after set was demolished with remarkably high scores throughout – setting a high bar for others to aim for. A VB100 award is earned with style, giving Avira’s Pro product line 10 passes out of 12 entries in the last two years, and a 100% record in the last six tests.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 98.51%
Worms & bots: 99.45%
Polymorphic: 100.00%
False positives: 0

BitDefender Antivirus Pro 2011

Version 14.0.28.351 of branch 14.24; engine version 7.3681

BitDefender is another major firm whose reputation continues to grow with the company itself. As usual it is well represented in OEM and rebranded products, with some half a dozen of this month’s list including the company’s engine in some form or other.

The current mainline product came in as a fairly large 265MB package, with all updates included. The set-up process was considerably longer and more involved than most. A ‘quick’ scan early on took close to five minutes, and a large number of steps followed, including options to remove other solutions already present on the system, to disable the Windows Defender system, and to share updates with other BitDefender users (presumably some kind of Torrent-style data-pooling system). After what seemed to be the last of many steps, listing the included features as anti-virus and identity protection, a ten-second pause was followed by another option: whether or not to contribute anonymous data to a feedback system. There was then another ten seconds of silence, then the offer of a demo video – fortunately we didn’t have Flash Player installed on the test systems, so we could finally get testing under way.

As we have come to expect with BitDefender products, the interface has multiple personalities, with different degrees of complexity depending on the skills and knowledge of the operator. We opted for the most advanced mode, of course, which we found to provide an excellent level of controls in a nice usable style. The simpler versions also seemed clear and well laid out, and the styling is attractive. Stability was generally decent, although during one large scan of infected items there was a scanner crash, with no log data to be found, so nothing to show for several hours’ worth of machine time. Nevertheless, decent progress was made elsewhere and the full test suite was completed in good order.

Scanning speeds were OK, with caching of results apparently no longer in effect on demand, where it is perhaps of less use than on access. On access itself, lag times were very light indeed, and sped up enormously in the warm runs. CPU use was a little higher than many this month, but memory consumption was low, as was slowdown of our set of tasks.

Detection rates were splendid, as usual, with excellent scores in the main sets and a very slow decline across the weeks of the RAP sets – the proactive week a whisker short of keeping all scores above 90%. The WildList caused no difficulties, and without a single alert in any of the clean sets BitDefender proves well worthy of a VB100 award. The company has a respectable record of seven passes and two fails in the last two years, with three comparatives not entered; four of the last six tests have been passed, from five entries.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 95.48%
Worms & bots: 99.53%
Polymorphic: 100.00%
False positives: 0

Bkis BKAV Professional Internet Security 3245

Definition version 3245; engine version 3.5.6; pattern codes 3.337.949

Bkis first appeared on the VB radar around a year ago, and has rapidly gone from a fairly rocky start to achieving several VB100 awards and some superb scores in recent months.

The company’s current ‘Pro’ product came as a 212MB install package, with no need for further updates. The installation process was remarkably rapid, with only one step covering the install location and creation of desktop shortcuts. A reboot was needed after the fast copy process, but nevertheless the whole thing was completed in excellent time.

The interface is a hot and fruity orange colour, and provides fairly simple access to a reasonable level of options covering the basic requirements but not much more. As in recent tests, stability was excellent, with no problems even under the heaviest strain, and despite rather sluggish scanning times all tests completed within 24 hours as hoped.

On-access lag times were fairly heavy, and scanning speeds not too zippy except in the archive set where things were not being probed too deeply. While RAM usage was fairly low, and impact on our suite of activities similarly good, CPU use when busy was pretty high.

Detection rates were once again excellent, with stunning scores across the sets. Judging by the rapid improvements since earlier entries, however, it seems likely that heuristic strength has been tweaked upwards to improve scores, and at last this seems to have gone a step too far, with a handful of false alarms generated in our clean sets, including components of a common freeware file compression tool and an obscure part of Microsoft Office. Bkis thus misses out on a VB100 award this month, despite an impressive performance; the Pro edition had passed all three of its previous entries in the last year.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.48%
Worms & bots: 99.59%
Polymorphic: 100.00%
False positives: 3

Bullguard Antivirus 10.0.172

Bullguard is an occasional to semi-regular participant in our testing, with its iconoclastic approach to interface design and general reliability making the product a welcome sight on any roster of submissions.

The latest edition came in as a 137MB install package, with no further updates needed, and ran through in very rapid time, with just a couple of clicks required. The whole thing was done within a minute, with no reboot needed.

The GUI design is somewhat on the wacky side, somehow blending cool and functional with warm and friendly, but after a little exploration it proved perfectly usable. Large buttons lead to an unexpected selection of main areas, with asymmetry another odd addition to the mix. Controls are fairly plentiful however, once dug out, and stability was excellent. Logging was in a rather gnarly XML format – nice for displaying to the user, but awkward to process with our standard scripts. However, some smart result caching meant that many tests powered through in excellent time and the full test suite was completed in under a day.

Scanning speeds were quite good, and on-access lags a little heavy at first but much quicker once the product had familiarized itself with things. RAM use was higher than most, but CPU use was very low, and impact on our activities was quite low too.

Detection rates were superb, with only the slightest decrease through the reactive weeks of the RAP sets, and the proactive week achieving the enviable heights of more than 90%. The WildList and clean sets caused no problems, and a VB100 award is duly earned; Bullguard now has four passes from four entries in the last two years, having skipped the other eight tests, with two of those passes coming in the last six tests.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.07%
Worms & bots: 99.63%
Polymorphic: 100.00%
False positives: 0

CA Internet Security Suite Plus 7.0.0.115

AM SDK version 1.4.1.1512; signature file version 4209.0.0.0

CA’s project to outsource the bulk of the work on its anti-malware solutions seems to be more or less complete, with the release of the fully reworked corporate product. The consumer version, ISS+, has become a familiar sight in recent tests, and has presented few major headaches to the test team.

As usual, installation was performed online at the request of the submitters. The main installer weighed in at 154MB, and online updating took a few minutes. The rest of the set-up process was fairly brisk and straightforward, and could be dealt with within a few minutes with little effort.

The interface is snazzy and stylish, if a little baffling at times; there are several components, including firewalling, intrusion prevention and parental controls, but the configuration is scattered and often less than clear. Stability seems fine though, with no crashes or hangs at usual levels of pressure. When running the seriously strenuous tests for our malware detection measures though, some cracks began to show. Like several others of late, the developers seem to have decided that it would be a good idea to store all detection results in memory, only writing out to file at the end of a scan. Presumably, in everyday usage there is some marginal performance gain from this approach, although it seems unlikely to be much given the size of most real-world results logs. In a testing environment this almost invariably causes problems. On this occasion scans began at a lightning pace (as we have come to expect from the excellent engine underlying the CA product range), but steadily grew slower and slower as RAM was eaten up with gusto. A first attempt at scanning the main test sets only (without even the RAP sets) ran for 18 hours and was consuming over 500MB of RAM before it froze out, leaving us with no option but to reboot the machine and abandon all the data not saved to disk. Scans were run in smaller chunks, each one carefully measured to hit the happy zone where speed hadn’t slowed down too much and crashes were unlikely.
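
The difference between the two logging strategies is easy to illustrate. The following is a schematic sketch of the trade-off described above, not CA’s actual code:

```python
def scan_buffered(samples, scan, log_path):
    """Hold every result in memory and write once at the end: marginally faster
    per item, but RAM grows with the set and a crash loses all the data."""
    results = [scan(s) for s in samples]
    with open(log_path, "w") as log:
        log.writelines(f"{r}\n" for r in results)

def scan_streaming(samples, scan, log_path):
    """Append each result as it is produced: constant memory, and everything
    logged so far survives a crash."""
    with open(log_path, "w") as log:
        for s in samples:
            log.write(f"{scan(s)}\n")
            log.flush()
```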

As a result of the extra work and time involved in running over 20 jobs in place of one, testing took rather more than the 24 hours we had allocated each product; although not too much more thanks to good speeds in the on-access run over the infected set.

Thanks to smart caching of results over the clean sets, scanning speeds went from good in the cold measures to excellent in the warm, while file access lag times were not bad either. In the performance measures we saw a fairly low addition to our activities’ run time, while CPU and RAM use were both fairly high.

Detection rates were fairly respectable in general, with impressive reliability in the trojans and RAP sets – the team behind the detection part of the product seem to be maintaining things quite nicely. A single item of adware was identified in the clean set, and there were no problems in the WildList, earning CA a VB100 award for its consumer product. The solution has been having a rather tough time of late, with only two passes from six attempts in the last two years; this is the first pass in the last six tests, three of which were not entered – hopefully this will mark the start of a new chapter for CA.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 80.18%
Worms & bots: 96.96%
Polymorphic: 99.96%
False positives: 0

CA Total Defense r12 Endpoint Protection Client

Product version 12.0.528; signature version 4209

CA’s business solution has had a major revamp, which we were first exposed to in the last Windows comparative in late 2010 (see VB, December 2010, p.27). This was not the most pleasant experience, and we hoped a degree of familiarity would help things along this month.

With the installer package recycled from the previous encounter, there was fortunately no need to repeat the lengthy process of downloading the 4GB DVD iso image we were asked to use. The time saved in avoiding this chore was quickly used up though, as the install requested on the deadline day revealed that the product cannot be installed on Windows XP from the package provided. Instead, it must be set up on a supported platform (Windows 7 or a recent server edition) and deployed from there. In great haste (as we needed to run an online update before the deadline day expired), a precious machine was taken away from its usual duties and set up with Windows 7. Installing the management system is a rather complex process with a number of dependencies, a guide tool helping by listing those not yet met. These included the IIS web server, Flash Player for the interface, and some changes to the firewall, as well as the local password, which didn’t come up to the product’s safety standards. With the system installed we then faced further hurdles with the licensing scheme, which appears to need 2 a.m. to pass before it accepts new licences, and then running updates, which proved rather baffling and was not helped by the progress window being hidden in some kind of secured zone, having been reported as ‘not fully compatible with Windows’. We finally managed to get the latest definitions in place just as the deadline came to an end.

Next day, safely isolated from further updates, we tried deploying to the machine which would be used for the test proper, having navigated the pretty, but not entirely intuitive, management interface in what we hoped was the right way. A discovery job found our test machines readily enough, but try as we might, remote installation seemed unwilling to run. Urgently requesting help from the submitters, we were put in touch with a support operative, who promised some details of changes to the WMI system which might help, but when no further advice was forthcoming we resorted to further experimentation. As usual in such circumstances, Google was our friend, leading us to the murky world of CA user forums. Here we learned that a simple install bundle, including all required updates etc., can easily be created on the management system and copied over to clients manually (perhaps it would have been easier had the original submission been provided in this format).

With this figured out, the install actually proved rather simple, with the standard half-dozen steps of any normal installer and a reboot at the end. All this was complete in under a minute – albeit more than two days after first starting the process. The client interface is clean and crisp, a huge improvement over the previous edition, with a good range of options laid out in a simple and accessible manner. Despite the Flash underpinnings, it seemed stable and responsive at all times, and with the zippy scanning engine under the hood it made short work of most of our tests.

Again, scanning speeds were quite good and file access lag times light, but the performance measures showed quite a lot of RAM usage, a fairly heavy impact on our activities suite and a massive amount of CPU use. These figures looked so out of place when compiling the final graphs that we re-ran the tests to confirm them, but got almost identical results on a second run through.

In the infected sets things were also a little problematic. Although the on-access run went by in a flash, on-demand scans were once again hampered by the storage of all data in memory, the overworked test system slowly grinding to a halt as its resources were eaten up. One attempt at running the standard scan of the full sets froze up with more than 1GB of memory taken up. Resorting once more to running multiple smaller jobs, and ripping results out of the raw SQL database files created at the end of each scan, we finally got the required data, which showed some perfectly respectable scores, with admirable consistency across the RAP sets. The WildList and clean sets were well handled, and a VB100 award could finally be granted, after several days of hard work. Over the longer term, CA’s business solutions have a rather better record than its consumer ones, with seven passes and three fails in the last two years, two tests having been skipped; the last six tests show two passes, two fails and two no-entries.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 78.26%
Worms & bots: 96.06%
Polymorphic: 99.96%
False positives: 0

Central Command Vexira Antivirus Professional 7.1.38

Virus scan engine 5.2.0; virus database 13.6.217

Vexira has become a regular participant in our tests over the last few years, since starting up a highly successful partnership with the ubiquitous VirusBuster.

The installer submitted measured 65MB, with an additional 69MB archive of updates to add in. The set-up process included all the usual steps, split over rather more stages than most, with the option to join a feedback system rather deviously hidden on the same screen as the EULA and checked by default. Running through it all took less than a minute though, with the final screen somewhat confusingly declaring completion while leaving its progress bar stalled at around 70% of the way across. No reboot was needed to complete the process, but we restarted anyway after the manual application of updates, just to be safe.

The interface is very familiar after dozens of appearances on the test bench in recent years, enlivened somewhat by Vexira’s gaudy red colour scheme. The layout is a little unusual but generally usable once one has got to know its quirks. However, the scheduler system proved beyond our limited powers: jobs failed to run because we had apparently not set the user/password details properly – ideally these would be validated by the product before a job is accepted. Despite this minor setback, things mostly went smoothly and there were no issues with stability.

Scanning speeds were not super fast but on-access lags seemed OK, with impressively low measures in all of our performance drain tests.

With everything looking set to be completed comfortably inside the allocated time slot – the on-access run over the main sets taking somewhat longer than average but not too much – the on-demand scan threw a heavy and ugly spanner in the works. Long a popular product with the test team, Vexira has flung itself firmly into our bad books by leaping headfirst onto the bandwagon of storing detection data in memory rather than writing it incrementally to a proper log file; this meant yet more agonizing waiting, watching RAM consumption steadily rise, with no certainty that results would be safe until all was complete. The full job did, in fact, run without incident, but took just over 56 hours – considerably more than the five or six we would have expected of this product in its previous iterations.

Having survived this trial, results were decent, with good scores in general and a stronger than usual showing in the RAP sets. The WildList and clean sets caused no problems, and a VB100 award is granted despite our grumblings. Since reappearing on our radar just over a year ago, Central Command has achieved an excellent record of seven passes in the last seven tests.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 88.90%
Worms & bots: 97.01%
Polymorphic: 100.00%
False positives: 0

Check Point Zone Alarm Security Suite 9.3.037.000

Anti-virus engine version: 8.0.2.48; anti-virus signature DAT file version: 1045962880

Check Point’s Zone Alarm is a bit of a classic name in security, the free firewall offering having been a common sight for many years. The premium suite version – with anti-malware based on the solid Kaspersky engine – has been around a good while too, and has been a regular, if infrequent, participant in our comparatives for several years.

The current version came as a 148MB installer, with 81MB of updates provided separately. The set-up process includes the option to install a browser toolbar, to subscribe to an email newsletter, and to disable protection after install for those users installing in conjunction with another anti-malware product. A reboot is needed to complete the process.

The interface is plain and unfussy, with small icons and lots of text. The suite includes the firewall, of course, plus ‘Program control’, mail and identity protection, and parental control modules, alongside the anti-malware component. Operation is a little fiddly and unintuitive in places, but generally usable, with a good level of options. Stability was good with no issues in any of the tests, and everything was done in less than a day.

Scanning speeds were fairly slow, but lag times were quite light, and while RAM use was around average and additional time taken to perform our set of tasks fairly insignificant, CPU use when busy was sky high – a result confirmed by a repeat run of the full set of measures.

Detection rates were excellent, with rock-solid scores in the RAP sets; on-access scores in the main sets seemed oddly lower than on demand, but the WildList was handled fine in both modes, and there were no problems in the clean sets either, earning Check Point another VB100 award. The company’s infrequent submission pattern, generally only targeting our annual XP test, means only two passes and one fail in the last 12 tests, with the rest not entered.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 92.68%
Worms & bots: 99.17%
Polymorphic: 100.00%
False positives: 0

Clearsight Antivirus 2.1.48

Definitions version 13.6.215

Another in the family of solutions based on the Preventon SDK and VirusBuster engine, Clearsight returns for only its second attempt at VB100 certification, having been denied it last time thanks to a minor technicality in what was clearly a solid product.

The latest version, as expected, was supplied fully updated in a 67MB installer. Setting up followed a simple pattern of welcome screen, EULA, install location, go, with no reboot needed. An Internet connection was required to activate the product and access full controls, but all was over in under a minute.

The interface is highly familiar by now, this version being in a cool blue-and-white colour scheme. Its clear and simple operation made it easy to use and test – the only extra task being a registry tweak to enable full logging. Stability was not an issue even under heavy strain, and the tests took just about the full 24 hours allotted.

Scanning speeds closely mirrored those of others from this range, being a little slower than average over most types of files, but not too much. On-access lag times were around average and performance measures showed low use of resources and minimal impact on activities.

Detection results were also no big surprise: solid scores averaging around the 90% mark, with a slight decline towards the more recent parts of the RAP sets. The WildList and clean sets were handled nicely, and Clearsight earns its first VB100 certification on its second attempt.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.49%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

Commtouch Command Anti-malware 5.1.10

Engine version: 5.2.12; DAT file ID: 201102232246

The Command product name has a long history in VB100 testing, dating all the way back to 1998. The company name may have changed with the acquisition of Authentium by Commtouch, but not much seems to have changed in the product itself.

The installer is an ultra-compact 12MB, with only 28MB extra by way of updates. The installation process is pretty simple – although it includes an option to detect ‘potentially unwanted’ items – and needs no reboot to complete. The product interface is similarly abrupt and to the point, with a stark simplicity and minimal configuration, but it manages to get the job done effectively. The solution has a ‘cloud’ component, which had to be disabled for the purposes of the main test suite.

A few problems were encountered during the running of the tests, with several blue screens observed when under heavy pressure. This, along with a tendency to lose or overwrite logs, held us back a little; indeed, even when logging seemed to have run happily, the process of opening and exporting logs in the main interface regularly took so long that we gave up on it. All log data is stored in Access database format – clearly not the most efficient choice, as even the most basic log with only a handful of detections recorded could take several minutes to convert into a displayable format. For the most part, we took the raw database files and ripped the data out ourselves. With these issues slowing us down, testing took perhaps 36 hours – not too troublesome.
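
Pulling rows straight out of an Access-format log is straightforward with any ODBC-capable scripting tool. The sketch below uses Python’s pyodbc with the stock Windows Access driver; the file path and table name are guesses for illustration, not the product’s real ones.

```python
import pyodbc  # third-party ODBC bridge: pip install pyodbc

conn = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"Dbq=C:\logs\scan_log.mdb;"                 # placeholder path to the log file
)
cursor = conn.cursor()
for row in cursor.execute("SELECT * FROM ScanResults"):  # placeholder table name
    print(row)
conn.close()
```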

Scanning speeds were on the slow side, with file access lag times fairly high, and although RAM usage was perhaps just a fraction above average, CPU use was fairly high too. Impact on our set of standard jobs was around average for the month though.

Detection rates, when full results were finally gathered and analysed, proved respectable, with an interesting upturn in the last week of the RAP sets. A couple of items in the clean sets were alerted on as packed with Themida, while another was labelled adware, but there were no problems and the WildList was handled smoothly too. A VB100 is duly earned, improving Command’s record to three passes and three fails in the last 12 tests, with six tests not entered.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 78.39%
Worms & bots: 87.12%
Polymorphic: 100.00%
False positives: 0

Comodo Internet Security Premium 5.3.176757.1236

Virus signature database version: 7793

Comodo is a relative newcomer to our comparatives, although the company and the product have been around for some time.

The company’s top-of-the-line suite solution came as a 34MB installer, but required full online updating on the deadline day. The set-up process was rather lengthy, partly because it included an extra component called ‘Geek Buddy’ – a support and troubleshooting system covering all aspects of the computer, with live chat and remote control by support staff. Once the initial install and required reboot were out of the way, this component had its own separate update process, which seemed to require a full re-download and re-install, just moments after the initial one. Then another update process began... Eventually everything seemed fully set up and up to date though, and a snapshot of the system was taken for later testing.

The product interface is quite attractive with its near-black background and hot red highlights. As well as the anti-malware and firewall components the suite includes a well-regarded HIPS system, ‘Defense+’, and much else besides. Controls lean towards the text-heavy rather than the blobby icons favoured by many, which makes them less easy to get lost amongst, and an excellent level of configuration is provided throughout. Stability seemed good in general, with some slowdowns in early runs attributed to the cloud component. This was disabled for on-demand scans but as far as we could tell it could not be switched off for the on-access module. Simply disconnecting from the lab network solved this little snag, and the rest of the test suite powered through in good time.

Scanning speeds were on the low side of average, with light lag times on access, very low use of system resources and no great impact on the run time of our activities set.

Detection rates were excellent, and declined only very slightly across the RAP sets. The WildList was handled nicely, and with only two entirely permissible ‘suspicious’ alerts in the clean sets, Comodo earns its first VB100 award on its third attempt. We look forward to welcoming the vendor to the test bench again.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 92.23%
Worms & bots: 96.03%
Polymorphic: 90.63%
False positives: 0

Coranti 2010

Version 1.003.00001

Coranti has had something of a rollercoaster ride in its first year of VB100 testing, with some excellent results tempered by the occasional false positive problem (as is always a danger with the multi-engine approach). The name of the product has seen a few changes as well, with the original ‘Multicore’ moniker dropped in favour of a simple ‘2010’ – somewhat odd given that earlier products had been labelled ‘2011’.

The latest version marks something of a departure, as the Norman engine that appeared in earlier tests has been phased out in favour of what is referred to as the ‘Lavasoft Ad-Aware scanning engine’ – this is presumably a combination of Lavasoft’s in-house anti-spyware expertise and the GFI (formerly Sunbelt) VIPRE engine also included in Lavasoft’s mainline Ad-Aware solutions. In addition to the Frisk and BitDefender engines retained from earlier incarnations, this should make for a formidable product, although the lab team did have some concerns based on stability issues encountered with Ad-Aware, and other solutions based on the same engine, in recent tests.

The installer was a lightweight 47MB, but online updates were also required on the deadline date. The install process was fast and simple, taking less than 30 seconds to complete and not demanding a reboot at the end. However, on opening the GUI the bulk of the controls were greyed out and it was clear that no scanning or protection was available. It may be that it simply needed some time to settle down, but in our haste a reboot was initiated, which soon solved things. With the interface fully functional, the online update ran in reasonable time (given that over 260MB of detection data was being fetched).

The interface is something of a joy, being designed for maximum clarity and simplicity while at the same time providing an impeccably detailed set of configuration controls to satisfy the most demanding power user. Deep scanning of archives on access was the only area where we could claim anything was lacking. The scheduler received particular praise from the lab team for its nifty design. Despite our earlier fears, the product proved rock-solid as far as stability goes, and although the multi-pronged approach inevitably affected speed over our large test sets, it still got everything done and dusted in excellent time.

Scanning speeds over clean samples were a little on the slow side, as were lag times on access. Although RAM usage was a little higher than most and CPU use also fairly high, our set of standard tasks ran through in good time.

As predicted, detection rates were stratospheric, with barely a thing missed anywhere, and even the proactive week of the RAP sets was covered extremely well. The clean sets threw up a few detections, but as these were only for Themida-packed items and possible adware there were no problems here. With the WildList also powered through effortlessly, Coranti easily earns another VB100 award after a truly excellent performance. This makes three passes out of five entries in the vendor’s first year of competition, with only the one (Linux) comparative not entered.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.00%
Worms & bots: 99.83%
Polymorphic: 100.00%
False positives: 0

Defenx Security Suite 2011

Version: 2011 (3390.519.1247)

Defenx has become something of a fixture in our comparatives over the past year or so, and has always been a welcome sight thanks to a record of good behaviour and reliability.

The version entered this month came as a 94MB installer, including updates, and took only a couple of clicks to install. The process continued for a couple of minutes after that, mainly taken up with firewall-related steps and checks, and a reboot was needed at the end. The interface reflects the company’s Swiss origins with its red-and-white colour scheme, and looks efficient and businesslike without seeming unfriendly or intimidating. Configuration is not over-generous for the anti-malware component (the full suite also includes anti-spam and several other modules), but provides enough controls for most purposes and is easy to navigate and operate. Stability was excellent, with no problems at any point, and the caching of results, even for infected items, meant that the tests were sped through in excellent time.

Aided by the caching, scanning speeds were lightning fast, lag times feather-light, and performance measures stayed well within acceptable bounds.
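
None of the products employing this trick document their internals, but the principle behind such result caching is simple enough to sketch: key each verdict on the file’s content and the current definition version, and skip the expensive engine call when both are unchanged. A rough illustration, with engine_scan standing in for the real scanning call:

    # Sketch of verdict caching: a file already examined is not rescanned
    # until either its content or the definitions change.
    import hashlib

    class VerdictCache:
        def __init__(self, definitions_version):
            self.definitions_version = definitions_version
            self._cache = {}

        def _digest(self, path):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def scan(self, path, engine_scan):
            # Rescan only if the file's content or the definitions changed.
            key = (self._digest(path), self.definitions_version)
            if key not in self._cache:
                self._cache[key] = engine_scan(path)  # the expensive bit
            return self._cache[key]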

Scores were solid, as we have come to expect from the VirusBuster engine underlying things, with decent levels across all sets. The WildList and clean sets were handled perfectly, and a VB100 is awarded to Defenx for its efforts. The vendor’s history in our comparatives is impeccable, with seven entries and seven passes, the recent Linux test the only one not entered since the product’s first appearance in last year’s XP comparative.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 88.54%
Worms & bots: 96.78%
Polymorphic: 100.00%
False positives: 0

Digital Defender 2.1.48

Definitions version 13.6.215

Another member of the Preventon club, Digital Defender has been around longer than most, with more than a year’s worth of comparatives under its belt. The install process for the familiar 67MB package held no surprises, with a few stages and online activation all dealt with in a minute or so, and no reboot required. The interface has a pleasant minty green hue, its layout once again giving us little to worry about, with the same simple design and reasonable set of options. No stability issues were noted, and testing went according to plan, completing within 24 hours.

Scanning speeds were slowish and lag times not too zippy, but resource consumption was low and our set of jobs was not too heavily impacted. Detection rates closely matched those of the rest of the product family, with little to complain about, and the core certification sets were handled without fuss. Digital Defender thus earns a VB100 award, its first since this time last year thanks to a string of bad luck; we fully expect the product to continue to do well.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.49%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

eEye Digital Security Blink Professional 4.7.1

Rule version 1603; anti-virus version 1.1.1257

Having initially entered VB100 tests only once a year, in the annual XP comparative, eEye’s Blink has recently become a more regular participant, and the product has grown quite familiar to the test team. Its most notable feature is the vulnerability monitoring system which is the company’s speciality, and which sits alongside anti-malware protection provided by Norman.

The product arrived as a fairly sizeable 157MB install package with an additional 94MB of updates. The installation process is not complex but takes a minute or two, starting off with the installation of some supporting packages and ending with no need to reboot. After installation the firewall seems to be switched off by default, but the anti-malware component – included alongside the vulnerability management and intrusion prevention system – is up and running from the off. The interface is of fairly standard design, with status and configuration sections for each module, and controls are limited but provide the basic requirements. We encountered no problems with stability, and managed to use the scheduler system without any trouble, running the bulk of the testing over a weekend to make the best use of time.

This proved to be a good thing since the product has a somewhat languorous approach to scanning, dawdling dreamily along and showing no sign of urgency. Scanning speeds were very slow, and file access lag times very high, with heavy use of CPU cycles when busy, but RAM was not too heavily drained and our set of jobs did not take much longer than normal to complete.

Detection rates were respectable but not jaw-dropping, with decent coverage in all the sets, the proactive week of the RAP sets showing a slight upturn over the previous week. A couple of suspicious detections in the clean sets were allowable, and the WildList was covered in its entirety, earning eEye a VB100 award. The product’s recent test history has not been great, with a string of problems including missed polymorphic samples and false positives in the last year; it now has three passes and five fails in the last two years, having skipped four tests. The last six tests show a slightly better picture, with two passed, two failed, two not entered.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.73%
Worms & bots: 89.16%
Polymorphic: 99.98%
False positives: 0

EmsiSoft Anti-Malware 5.1.04

EmsiSoft dropped its widely recognized ‘A-Squared’ name in favour of a more sober title some time ago, but the product remains familiar and includes references to the old name in several folders and files used by the installed product. Much of the detection is provided by the Ikarus engine.

This month’s submission measured a little over 100MB, including all updates, and ran through the standard steps followed by a lightning-fast installation. With this complete (no reboot was required), a configuration wizard ran through some set-up stages including licensing, updates, joining a feedback system, and an initial system scan. The interface is quite appealing, adorned with a rotating Trojan horse image, and has a few quirks of design but is generally clearly laid out and not too difficult to operate. Configuration is reasonable, but provides no option to simply block access to infected items in the on-access module – something which often causes problems in large-scale testing.

Scanning speeds were fairly slow, but on-access lag times were extremely low, with low use of system memory. CPU cycle use was a little higher than average though, and our suite of standard jobs took a little longer than usual to complete.

Once we got onto the infected sets the need to disinfect or quarantine all samples, or else respond to a pop-up for each and every one, soon caused the expected problems, with the product freezing up entirely and refusing to respond to anything. Even after a reboot it proved unusable, and we had to resort to reinstalling on a fresh machine image. Eventually, by chopping jobs up into smaller chunks, we managed to get a full set of results, which showed some splendid figures. Coverage of core certification sets, however, was not so splendid, with a handful of items missed in the WildList set, and some false alarms in the clean sets. These included one file flagged as the infamous Netsky worm and another as the nasty polymorphic Virut – both were, in fact, innocent PDF handling software. This was plenty to deny EmsiSoft a VB100 award this month, leaving it on a 50-50 record of two passes, two fails in the last six tests.

ItW: 99.33%
ItW (o/a): 100.00%
Trojans: 95.06%
Worms & bots: 98.88%
Polymorphic: 95.58%
False positives: 2

eScan Internet Security Suite 11.0.1139.924

The eScan product range has a long and solid history in our comparatives, dating back to 2003 and covering a wide selection of platforms. Since dropping an OEM engine from the product not long ago, the vendor has put in some excellent performances.

The current version of the premium suite solution came as a 156MB installer, no further updates required, and installed in three or four clicks, with no reboot needed. After the main install came some standard initial set-up stages, and things were soon moving along.

The product interface is a rather funky affair, with a panel of glitzy cartoon icons along the bottom and a slightly more sober display of status information in the main window. Configuration is comprehensive and detailed with good attention paid to a logical, intuitive layout, and testing moved along nicely. Scanning speeds were rather sluggish at first, but once files had been looked at for the first time some result caching came into play and the process sped up nicely. On access, lag times were impressively low, and memory use was fairly low too, with CPU drain and impact on our suite of standard jobs around average.

Detection rates were highly impressive in all sets – a slight decline towards the newer end of the RAP sets still not taking things below 90%. The WildList and clean sets threw up no issues, and eScan comfortably earns another VB100 award – having not missed a single test in the last two years, it now has nine passes to only three fails: a very respectable record of achievement.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.06%
Worms & bots: 99.70%
Polymorphic: 100.00%
False positives: 0

ESET NOD32 Antivirus 4

Version 4.2.71.2; virus signature database: 5901 (20110223)

ESET has an even more illustrious history in our tests, still holding the record for the longest unbroken run of certification passes – and indeed of comparatives entered – the vendor not having missed a test since 2003.

The current product has been in stable form for some time. This month’s submission, a nice small 44MB executable, was installed with the standard steps, enlivened as usual by the enforced choice of whether or not to detect ‘potentially unwanted’ software – the ‘next’ button is greyed out until a selection is made. It doesn’t take long and no reboot is needed, just a short pause before the protection is in place.

The interface is simple and unfussy, but provides a wealth of fine-tuning controls. There is so much here that some of it seems to be a little superfluous and in places overlapping, and we have long had trouble figuring out the controls for scanning inside archives on access. However, it is generally solid and intuitive. Occasionally the interface tends to get itself in a bit of a twist after a scan job, but it invariably sorts itself out within a few moments, and the only other issue noted was the occasional scan display screen not finishing properly, lingering at 99% when logs showed the scan had already completed without problems.

Scanning speeds were OK, and on-access lag times fairly low too, with low use of resources. Impact on our set of activities was a little higher than most, but not too much.

Detection rates were excellent as usual, with most of the sets demolished and superb regularity shown across the reactive part of the RAP sets. A couple of items were flagged as unsavoury in the clean sets, one of them being packed with Themida and another a toolbar, but no problems arose there or in the WildList – thus earning ESET yet another VB100 award to maintain the 100% record it has held for the best part of a decade.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 89.29%
Worms & bots: 98.13%
Polymorphic: 100.00%
False positives: 0

Filseclab Twister AntiVirus V7 R3

Version 7.3.4.9985; definition version 13.35.42143

Filseclab first took part in our comparatives just over two years ago, and has been gamely regular in its appearances ever since, though as yet without success in achieving certification. The vendor’s solution is an interesting and unusual one, but provides all the usual features one would expect from an anti-malware product.

The main installer is 53MB, with a 54MB updater also freely available to download from the company’s website. The set-up process is completed in three clicks and about ten seconds, although the updater program is a little less zippy – apparently doing nothing for a minute or so before a window appears showing progress. The interface is quirky but not unclear, with a wide selection of options crammed into a small area. We noted with interest that the support email address shown in the ‘about’ window is at hotmail.com.

Running through the tests is always a little fiddly as the product only records on-access detections when set to erase or clean automatically – otherwise, a pop-up appears noting the detection and asking for a decision as to what to do about it, but no entry is made in the product log until the choice is made. Nevertheless, it seemed to cope with the heavy workload and got through the tests in good time.

Scanning speeds were not incredibly fast, but file access lags were very low and processor cycle use was low too, although memory consumption was fairly high. The set of standard jobs completed in average time.

Detection rates were not too bad in general, but there were quite a few misses in the WildList set (many more polymorphic samples missed on access than on demand), and a fairly large smattering of false alarms in the clean sets. As a result, the product is denied certification once again, but it seems to be showing steady improvement in both solidity and coverage – and it seems likely that Filseclab will reach the VB100 standard in the not too distant future.

ItW: 97.62%
ItW (o/a): 92.81%
Trojans: 66.86%
Worms & bots: 68.29%
Polymorphic: 63.35%
False positives: 19

Fortinet FortiClient 4.1.3.143

Virus signatures version: 10.7; virus engine version: 4.2.257

Fortinet’s main business is in the appliance market, but its client solutions have long been regulars in VB100 tests, with some strong improvement in detection seen over the last few tests.

The installer is a tiny 9.8MB, supplemented considerably by 132MB of updates. The set-up process starts with a choice of free or premium versions, then after a couple more clicks and a pause of 20 seconds or so it’s all ready to go without a reboot. Applying the updates is a simple and similarly speedy process.

The interface is efficient and businesslike, with an intuitive layout and an excellent level of configuration – as one would expect from a primarily corporate solution. Operation proved generally easy and stable, although at one point a considerable amount of work was wasted when the product appeared to delete all logs from the previous day, despite having explicitly been told not to. Even with this delay, testing did not overrun by more than a few hours. We also noted, rather confusingly, that merely browsing folders containing infected items with the scanner module fired the on-access scanner.

Speeds and lag times were fairly average, as were other performance measures, with CPU use perhaps slightly higher than most. Detection rates were highly impressive, showing a continuation of the gradual upward trend noted in recent tests. This appears for the most part to be due to the enabling of ever stronger heuristics, which used to be mainly switched off by default.

Of course, stronger heuristics always come with associated risks, and this month it looks like things have been taken a fraction too far: a single item in the clean sets, from Canadian software house Corel, was flagged as a Krap trojan. This false alarm denies Fortinet a VB100 award this month, despite a good showing and flawless coverage of the WildList set. The vendor’s two-year record shows seven passes and now three fails, with only the Linux comparatives not entered; the last six tests show a slightly rosier picture, with four passes and only one fail from five entries.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.07%
Worms & bots: 98.08%
Polymorphic: 100.00%
False positives: 1

Frisk F-PROT Antivirus for Windows 6.0.9.5

Scanning engine version number 4.6.2; virus signature file from 22/02/2011 14:06

Frisk is a pretty long-serving company: its first VB100 appearance was in 1999, and it hasn’t missed a comparative since 2007. The product hasn’t seen any major changes in that time either, sticking to its tried and trusted formula.

The installer is a compact 30MB, with an extra 30MB zip file containing the latest updates. The set-up process requires three or four clicks and a ten-second wait, then a reboot is demanded to complete the installation. The interface is minimalist but provides a basic set of options, among them the choice to detect only Microsoft Office-related malware – something of a throwback to the past. Operation is not difficult and stability is generally good, but as usual during large scans of weird and wonderful malware the scanner occasionally died, presenting its own friendly crash screen – from which several sets of debug info were saved – each time it expired mid-task.

Scanning speeds were fairly good, and lag times fairly low. RAM consumption was a little above average, but other performance measures showed a lighter touch.

Detection results were gathered easily enough after repeating several jobs, and showed decent if not exactly mind-blowing scores across the sets. Once again there was a slight upturn in the proactive week of the RAP sets. The WildList and clean sets were properly managed, and Frisk comfortably earns VB100 certification once again. The company’s record has been somewhat patchy over the last few years, with seven tests passed out of a potential 12.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 75.14%
Worms & bots: 90.77%
Polymorphic: 100.00%
False positives: 0

F-Secure Client Security 9

9.01 build 122; anti-virus 9.20 build 16701

F-Secure routinely submits a brace of products these days: one its standard desktop suite, and the other from the ‘client’ branch – presumably a more business-focused effort – but there is usually little difference between the two. This client edition had a 58MB installer and a 125MB update bundle, which was shared by the two solutions.

The set-up process went through several stages including some questions about management systems and which components to install, and needed a reboot to complete. The interface is dominated by a large green tick to indicate all is well, and has a very simplified design which is somewhat awkward to navigate in places. There is little by way of fine-tuning controls. Stability seemed a little suspect, with some scans freezing and reboots required to restore functionality to the product. Running over infected sets was even more rocky, with age-old logging issues rearing their ugly heads once more. A run over the clean sets reported a number of detections, urgently labelled ‘infection’, but on trying to display the log we were instead shown one from a previous scan over the archive sample set. This sort of disinformation could be extremely troubling to a user.

Speeds were very fast once files had been checked out for the first time, and this effect had an even more notable impact on lag times. The batch of standard jobs completed rapidly and resource consumption remained low throughout.

Logging problems continued in the main infected sets, where a large job was left to run overnight only to find that no details could be shown the following morning. The task was repeated using the command-line scanner included with the product, with options tuned to approximate the GUI scanner as closely as possible. The scores turned up in the end were uniformly excellent – more than sufficient to cheer us up after a rather dismal testing spell; RAP scores were particularly impressive. The clean sets were found to contain only a ‘riskware’ item, which is allowed, and the WildList set was covered without problems, thus earning F-Secure a VB100 award without difficulty. This product line has been entered in all desktop tests since late 2009, passing every time.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 96.45%
Worms & bots: 99.61%
Polymorphic: 100.00%
False positives: 0

F-Secure Internet Security 2011

1051 build 106; anti-virus 9.30 build 400

Despite slightly different version details and a change of product title, this looks like a pretty similar product to the last. The 55MB installer and that same 125MB updater install slightly more simply – at least when choosing the automatic rather than step-by-step mode. After a minute or so copying files around and so on, it requests the opportunity to validate itself online, but no reboot is needed to finish things off.

The interface is much like the previous product: simple with a bare-bones set of options under the hood, but it proved reasonably easy to make our way through our tests, helped along by blink-and-you’ll-miss-it scanning speeds in the ‘warm’ scans. Once again we saw some wobbliness in the scanner set-up, with some scan jobs disappearing silently after being set up, and others failing to produce final reports – we saw the same confusion covering the clean set, where the scan progress indicated a detection had been found but the final report could not enlighten us further. Again the command-line tool was used for the more hefty jobs, and proved much more reliable.

With scan speeds and lag times similar to the client solution, memory use seemed a little higher, and a slightly heavier impact on our set of activities was observed.

Detection rates were again superb, with over 90% everywhere. The core certification requirements were comfortably met, and F-Secure picks up a second award this month. The company’s main product line has an exemplary record of ten passes in the past two years, with only the annual Linux tests not entered.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 96.60%
Worms & bots: 99.63%
Polymorphic: 100.00%
False positives: 0

G DATA AntiVirus 2011

Program version: 21.1.1.0 (9/22/2010)

G DATA is always a welcome sight on our test bench thanks to an excellent record of stability and good behaviour, to say nothing of invariably impressive detection levels. The vendor’s dual-engine approach also manages to avoid the excessive sluggishness which is so often associated with this kind of product.

The latest version came as a not too huge 189MB installer, including all the required data, and took only a few straightforward steps to get set up, although a reboot is required. The interface is simple but efficient, concealing a wealth of control beneath its pared-down exterior, and is a delight to operate. At one point we experienced something of an oddity during our performance tests, but this seemed to be related to the automation scripts (or possibly a behavioural monitor objecting to what they were doing), and the product itself remained stable and solid. All jobs were out of the way within a single working day, well within the allotted 24 hours.

This was partly thanks to the excellent use of results caching to avoid repeating work, which made for some good speed measures. On-access lags looked higher than some in our graph thanks to very thorough checks with scanning depth and breadth turned up high. Resource use was pleasingly low, with our standard jobs running through in reasonable time.

Detection rates were uniformly excellent, with only the tiniest number of samples not spotted and even the proactive week of the RAP sets covered superbly. The WildList was demolished in short order and the only alerts in the clean sets were for password-protected archives, thus G DATA earns another VB100 award with some ease. The vendor’s recent record is pretty strong: eight passes and only a single fail in the last two years, with three tests not entered; four of the passes, as well as that one unlucky fail, have been in the last six tests.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.52%
Worms & bots: 99.88%
Polymorphic: 100.00%
False positives: 0

Hauri ViRobot Desktop 5.5

Engine version 2011-02-22.00(6659169)

Hauri has a somewhat sporadic history in our comparatives, entering several tests in a row and then vanishing for a few years. The company’s current product is a combination of the BitDefender engine with some additional detection of its own.

The installer is a sizeable 300MB, but it gets to work fairly rapidly, even taking into account the scan of running processes performed before it gets going. No reboot is required to complete. The interface is clear and sensible, simple to navigate even for an unfamiliar user, and our lab team found it pleasant both to look at and to use. The product generally ran stably, but logging was a bit of an issue, the process of going from the end-of-scan dialog to a saved log taking anything from ten minutes to three hours, depending on the size of the log being exported. We also found the scheduler a little irritating: despite having been set simply to log all detections, it stopped at the first sample spotted and asked if it should continue with the scan. As this detection took place at 8PM on a Friday, and we had hoped to get a few hundred thousand more in the bag by Monday morning, it was a bit of a disappointment to find it sitting there waiting for our decision when we got back after the weekend. Repeating this job meant it took up more than double the expected 24-hour period, even excluding the time we were out of the office.

Scanning speeds were pretty sluggish even with the fairly light default settings, and turning on full scanning of archives resulted in a truly exhaustive and lengthy scan time. On-access measures showed some pretty heavy lag times too, although memory use was low and other performance measures around average.

Detection rates were rather disappointing given the OEM engine included, and we had to repeat things later on to reassure ourselves we had not made some mistake. A second run showed the exact same set of scores however. These were not too dismal, but well short of what was expected, and although the clean set seemed to be handled without problems, a handful of items in the WildList went undetected, and a VB100 award remains just out of reach for Hauri. The product has been entered twice in the last year with a similar lack of success on each occasion.

ItW: 99.33%
ItW (o/a): 99.33%
Trojans: 65.04%
Worms & bots: 64.96%
Polymorphic: 100.00%
False positives: 0

Ikarus T3 virus.utilities 1.0.258

Virus database version 77801

Ikarus earned its first VB100 award last summer, having first taken part in a comparative as long ago as 2001, but then disappearing for several years. The achievement was repeated on Windows 7 in the autumn, and now Ikarus returns to try to make it a hat-trick.

The product is provided as a complete CD iso image, weighing in at 206MB, with an extra 69MB of updates to apply as well. The installation process includes adding the Microsoft .NET framework, if not already available. This is handily bundled into the install package but adds several minutes to an already fairly lengthy task.

The interface has traditionally been a little wobbly, particularly when first trying to open it, but it seemed a little more responsive on this occasion. It is pretty basic, with not many menus or buttons, but manages to provide a rudimentary set of controls to fill most needs. When running under heavy pressure it is particularly ungainly, flickering and juddering like a mad thing, and often needs a reboot after a big job to get back to normal operation. After one fairly reasonable job scanning our set of archive files, things took a turn for the worse, and even a reboot couldn’t help. With the on-access module munching up RAM, the interface refusing to open and several standard Windows functions failing, we had no choice but to wipe the system and start again with a fresh operating system image. This time it kept going despite the heavy load, getting to the end in reasonable time.

Scanning speeds were OK, and lag times fairly light, with RAM use below average and CPU use a little above average, while the set of activities completed very quickly indeed.

Detection rates were excellent, with splendid scores across the board. However, a single item in the WildList set was missed – a closer look showed this was an exceptionally large file, which has upset some other products of late, implying that Ikarus imposes some limit on the size of files scanned by default. Further investigation confirmed that there was a cap, sensibly set to 8MB, which was considerably smaller than the file in question. However, removing this limit still did not result in detection, even when the file was scanned on its own. Finding this a little odd, we tried re-running the job with the limit left in place, but increased to a size that covered the file in question. This successfully enabled detection, hinting that the controls are less than fully functional. Of course our rules insist on default settings for our official scores, so the eventual detection cannot be counted. In addition, a handful of false alarms were generated in the clean sets, including a Virut alert on a piece of office software, thus Ikarus doesn’t quite make the grade for certification and will have to wait for its third award.
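
The behaviour is consistent with a simple size gate at the front of the scanning pipeline – not Ikarus’s actual code, of course, but the general shape of such a default cap looks something like this:

    # Illustration only -- not Ikarus's implementation. A default cap means an
    # oversized sample is skipped outright, so it can never be detected unless
    # the limit is raised (or, in theory, removed) in the settings.
    import os

    DEFAULT_MAX_SCAN_BYTES = 8 * 1024 * 1024  # the 8MB default mentioned above

    def should_scan(path, max_bytes=DEFAULT_MAX_SCAN_BYTES):
        # None would mean 'no limit' -- a setting which, as we found,
        # may not behave as advertised in practice.
        if max_bytes is not None and os.path.getsize(path) > max_bytes:
            return False  # skipped: no detection possible
        return True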

ItW: 99.83%
ItW (o/a): 99.83%
Trojans: 97.27%
Worms & bots: 99.43%
Polymorphic: 95.58%
False positives: 3

iolo System Shield 4.2.1

Specializing in the optimization, clean-up and recovery spheres, iolo has been active in security for a while too, with occasional VB100 entries dating back to 2007. The company achieved its first VB100 award in the last Windows 7 test (see VB, December 2010, p.27), with its current security offering based on the F-Prot engine.

The install process requires an Internet connection, with the initial installer no more than a downloader – only 450KB in size. This fetches the main installer, which is also fairly small at 3MB, and which proceeds to fetch the other components required. The process is not too long or taxing, but a reboot is needed at the end.

The interface is attractive and simply laid out, with minimal clutter, and provides a decent level of configuration in a pleasantly accessible style. The only things missing were a setting to simply block or record detections without any automatic action, and an option to save log data to a file – the latter omission leaving us wrangling an ugly and ungainly database format into shape to retrieve results. Occasionally scans seemed to stop at random, and the awkward log format made it difficult to see how far they had gone, or even whether any results had been saved. We also saw some scans claiming to have completed but clearly not having covered the full area requested. In the end, however, we managed to pull together what looked to be a complete set of results.

Speed measures were a little slow on demand, with some fairly heavy lag times on access; RAM use was about average and impact on our suite of tasks average too, while CPU use was fairly high.

Our deciphering of the logs we gathered showed some fairly respectable scores in most areas, with no problems in the clean sets or with the on-demand scan of the WildList set. On access, however, the same large file which has tripped up a couple of other products was not spotted – probably due, once again, to a cap imposed on the size of files scanned, although we could find no visible information on this limit and no clear way to change it if desired. This missed detection was enough to deny iolo its second VB100 award, by a whisker. From three entries in the last two years the vendor now has one pass and two fails.

ItW: 100.00%
ItW (o/a): 99.83%
Trojans: 74.39%
Worms & bots: 86.46%
Polymorphic: 100.00%
False positives: 0

K7 Total Security 11.1.0025

Malware definition version: 9.90.3942

K7 Computing has become a regular in our tests over the last few years, building up a solid record of success and keeping the lab team happy with simple, reliable products.

The latest version was provided as a 71MB installer complete with all required definition data. The install process seems to consist only of a welcome screen and a EULA – in the blink of an eye everything is done and set up, with the product asking if it can be activated. No reboot is required and the process is all over in under ten seconds. This gives instant access to the interface, which is truly something to behold in an eye-watering collection of bright and gaudy reds, yellows, oranges and pinks. The layout, at least, is pleasant and simple, with good navigation, although it is somewhat wordy in places and we found it easy to click in the wrong place where a lot of options were clustered close together.

Running through the tests proved reasonably straightforward, although a couple of scan jobs seemed to have trouble traversing directory structures, occasionally only covering the first of several subfolders of the selected region.

We also hit a problem in the on-access test where a single item seemed to be tripping up the engine, causing a blue screen – several runs over the same batch of samples brought the same result, so the set was split into small chunks to get as much coverage as possible.
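
Assuming a single reproducibly bad file, a binary chop over the batch is one way to pin it down quickly – a sketch of the approach, with scan_batch() as a hypothetical harness call reporting whether a run completed (each run going to a disposable machine image, given that the failure mode here was a blue screen):

    # Sketch: isolate a sample that reliably kills the scanner by halving the
    # batch until one file remains. scan_batch(files) is assumed to return
    # True if the run completed and False if it crashed.
    def isolate_culprit(samples, scan_batch):
        while len(samples) > 1:
            mid = len(samples) // 2
            left = samples[:mid]
            # Keep whichever half still crashes; the other half is safe.
            samples = left if not scan_batch(left) else samples[mid:]
        return samples[0]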

Scanning speeds were not very fast, but lag times were not very heavy, and system resource use was low, with a low impact on our set of activities. In the end detection results proved pretty solid too, with respectable scores in all sets, a gradual downturn through the RAP weeks and a slight rally in the proactive week – an unusual pattern that K7 has repeated in three comparatives in a row now.

Both the WildList and the clean set were handled well, and another VB100 award is earned by K7 this month. The company now has a solid record of seven passes and one fail in the last 12 tests, with four not entered; in the last year, K7 has three passes from three entries.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 84.52%
Worms & bots: 95.92%
Polymorphic: 100.00%
False positives: 0

Kaspersky Anti-Virus 6.0 for Windows Workstations

Version 6.0.4.1212 (a)

This month sees a trio of entries from Kaspersky Lab – which, until it skipped last year’s Linux test, was the only vendor with a 100% record of participation in our comparatives since the VB100 award was introduced in 1998.

The product family has evolved considerably over the years. The rather modest title of this, the vendor’s business-focused solution, conceals the multi-faceted nature of what is really a fairly complete suite, including anti-spam, device control and intrusion prevention alongside the anti-malware. The installer is not too huge though, at just under 150MB, and is accompanied as usual by a large archive containing all updates for the company’s wide range of products. The set-up process is fairly lengthy, going through a number of stages including disabling the Windows Firewall, the option to set a password to protect the product settings, and analysis of applications allowed to connect to the network, alongside more standard items like licensing, updates and so on. It requests a reboot to finish things off.

The interface is cool and stylish, with perhaps a little too much emphasis on the funkiness – an odd approach to blending text links and buttons is occasionally confusing, but as a whole it is generally workable, improving greatly with a little familiarity. Fine-tuning is provided in exhaustive depth, with detailed reporting as well, and things were generally smooth and stable. At one point we observed the product crashing, having snagged on a single file in the RAP sets, but when the offending item was removed everything ran through without problems.

File access lags were low, and scanning speeds pretty good, improving immensely in the warm runs. Memory usage was also low, with CPU use a little higher than most, and in the activity test a fairly high impact was observed on the time taken to complete the task.

Detection rates, when finally analysed after the very slow process of exporting log files, proved to be excellent, with only a very slight decline across the RAP sets. The WildList and clean sets were handled expertly, comfortably earning Kaspersky a VB100 award for its business solution. The product’s recent record is pretty solid, with nine passes and two misses in the last two years, with just the one test not entered. The last six tests show five passes.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 91.04%
Worms & bots: 99.24%
Polymorphic: 100.00%
False positives: 0

Kaspersky Internet Security 2011

Version: 11.0.2.5556 (a)

Kaspersky’s consumer suite solution will be a familiar sight to anyone who frequents retail software outlets, with its metallic green packaging. It has been a semi-regular participant in our comparatives for a couple of years now, usually appearing alongside the business variant already discussed here.

The installer is somewhat smaller at 115MB, and the set-up process is considerably simpler, with only a few standard steps, a few seconds processing and no reboot to complete. The interface looks much like the business version, and the usage experience is pretty similar. We found it occasionally slow to respond, and once again found some of the buttons less than clear to use. However, the level of control available was excellent and stability was generally fine, with the known-bad file removed from the RAP sets in advance to ensure a steady run through. Once again, exporting logs was slow but sure.

Memory consumption was fairly low, and CPU use not too high either, while scanning speeds were pretty fast, again speeding up massively in the warm runs. Once again there was a fairly significant impact on the time taken to complete our suite of activities.

Detection rates were splendid, with excellent scores in all sets. Perfect coverage of the WildList and clean sets comfortably earns Kaspersky a second award this month. Our records for the consumer product line look pretty good, with seven passes, a single fail and one test skipped since first appearing in December 2009. Five of the last six entries have earned certification.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 90.73%
Worms & bots: 97.52%
Polymorphic: 100.00%
False positives: 0

Kaspersky PURE

Version: 9.1.0.124 (a.b)

PURE is a fairly new arrival from Kaspersky, an extension of the suite concept promising an even broader range of protection. This is its first appearance on our test bench.

Much like the standard suite offering, the installer is around 120MB and, coupled with the same update package shared by its stablemates, it runs through very rapidly, the whole job being over in less than half a minute with no restart needed. The GUI eschews the company’s traditional deep greens, opting instead for a pale, minty turquoise, and has a somewhat simpler and clearer layout – although it sticks to the practice of blending buttons and links in places. Again, an enormous amount of fine-tuning is provided under the hood, with the controls generally easy to find and use, and the overall experience felt nimbler and more responsive than the previous offering.

Scanning speeds closely mirrored those of the rest of the range, while on-access lags were a little heavier. RAM usage was on the low side and CPU use a little high, with impact on the set of activities quite high too.

Detection rates were very similar to the I.S. product, with superb scores in all sets. A clear run through the core certification sets earns PURE a VB100 award on its first attempt.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.43%
Worms & bots: 99.43%
Polymorphic: 100.00%
False positives: 0

Keniu Antivirus 1.0

Program version: 1.0.5.1142; virus definition version: 2011.02.23.1008

Keniu has been a regular participant in the last few tests, having first entered in the summer of last year. The company has recently formed an alliance with fellow Chinese security firm Kingsoft, but so far there have been no signs of a merging of their solutions, with Keniu still based on the Kaspersky engine.

The install package is a fraction under 100MB, including all required updates, and the set-up process is fast and simple, with only a few steps, no need to reboot and everything done in less than a minute. The interface is bare and minimalist, with two basic tabs, a few large buttons and a basic set of configuration controls. With sensible defaults and smooth stable running the tests were out of the way in no time.

Scanning speeds were somewhat on the slow side, especially in the archives set, with archives probed very deeply by default. RAM and CPU usage were on the low side, and impact on our activities bundle was not too high.

Detection rates were excellent, as expected from the solid engine underpinning the product, with very high figures in all sets. The clean set threw up no problems, and the WildList was handled fine on demand, but in the on-access run a single item was marked as missed by our testing tool. Suspecting an error, we reinstalled and repeated the test, this time finding several dozen items missed, including the one not spotted the first time, and the product’s internal logs matched those of our testing tool. Running a third install showed another selection of misses – even more this time. In the end, no changes to the product settings or the way the test was run could prod the product into functioning properly. This rather baffling result denies Keniu a VB100 award this month; the vendor’s record shows three consecutive passes in its earlier three entries.
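
For what it’s worth, the cross-check behind such verdicts reduces to a set comparison between the samples placed on the system and the detection records gathered – roughly, something like:

    # Sketch of the cross-check: which expected samples have no matching
    # detection record? Both inputs are assumed to be iterables of file
    # paths normalised to the same form.
    def find_misses(expected_samples, logged_detections):
        return sorted(set(expected_samples) - set(logged_detections))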

ItW: 100.00%
ItW (o/a): 99.83%
Trojans: 93.57%
Worms & bots: 99.45%
Polymorphic: 100.00%
False positives: 0

Keyguard Internet Security Antivirus 1.1.48

Definitions version 13.6.215

Another from the family of products based on the Preventon set-up, Keyguard was a last-minute addition to this month’s list, our first contact with the company coming on the submission deadline day itself.

The familiar 67MB installer was pushed through its set-up in good order, with the usual connection to the Internet required to activate and access controls. The Keyguard version of the interface has a pleasant spring green colour scheme, with the usual simple but lucid and usable layout and solid levels of stability.

Speeds and overheads were all on the decent side, with low impact on file accesses and activities and low use of resources, while detection rates were respectable. With no problems in the certification sets, Keyguard proves worthy of a VB100 award on its first attempt.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.49%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

Kingsoft Internet Security 2011 Advanced

Program version: 2008.11.6.63; engine version: 2009.02.05.15; data stream: 2007.03.29.18; virus definitions: 2011.02.24.02

Kingsoft is a major player in the Chinese market, and has been a regular in our comparatives since its first appearance in 2006. The vendor came into this month’s test looking for a change of fortune, after a string of tricky tests upset by problems with polymorphic viruses in our WildList sets.

The vendor’s ‘Advanced’ version came as a compact 68MB installer, which runs through simply in a handful of standard steps with no reboot required. The product interface is bright and cheerful – not the most visually appealing, but clean and simply laid out, with a basic but functional set of configuration controls. Operation was stable and solid throughout, and the tests were completed in good time.

Scanning speeds were not outstanding, but on-access lag times were not bad, and while RAM use was a little higher than some, CPU use was below average, as was impact on our suite of standard activities. Detection rates were far from stellar, with low scores in all our sets. The trojans set was particularly poorly covered, and RAP scores fluctuated unpredictably but never achieved anything close to a decent level. Nevertheless, the core certification requirements were met, with no problems in the WildList or clean sets, and a VB100 award is duly earned. The last two years show six passes and four fails, with only the two Linux comparatives not entered; three of those fails were in the last six tests.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 16.70%
Worms & bots: 39.45%
Polymorphic: 96.04%
False positives: 0

Kingsoft Internet Security 2011 Standard-A

Program version: 2008.11.6.63; engine version: 2009.02.05.15; data stream: 2007.03.29.18; virus definitions: 2011.02.23.08

Kingsoft has routinely entered its ‘Standard’ product alongside the ‘Advanced’ one, and this time offers two separate variants on the theme (‘Standard-A’ and ‘Standard-B’), although as usual they are hard to tell apart.

The install process is again fast and simple, and the interface clean, responsive and easy to navigate, with good stability allowing us to get through all the tests in good time.

Scanning speeds and lag times closely matched those of the ‘Advanced’ edition, while RAM use was a little higher and CPU use a little lower, with impact on our activity set a little higher too. As expected, detection rates were even worse, with some truly terrible scores in the RAP sets – the proactive week score bizarrely some way better than the others.

Despite this poor showing, the WildList set was covered fully and there were no issues in the clean sets, so a VB100 award is earned, just about. That makes for four passes and four fails in the last dozen tests, with four not entered; in the last year the product has had two passes and two fails, with two tests skipped.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 8.47%
Worms & bots: 35.68%
Polymorphic: 96.04%
False positives: 0

Kingsoft Internet Security 2011 Standard-B

Program version: 2008.11.6.63; engine version: 2009.02.05.15; data stream: 2007.03.29.18; virus definitions: 2011.02.23.08

There’s not much more to say about the third entry from Kingsoft: in terms of user experience there is very little to distinguish it from the other two, with an identical install process and interface. Even the fine detail of the version information is unchanged.

Scanning speeds were a little slower, and lag times a little higher in some cases, with more RAM consumed than either of the others, but fewer CPU cycles, while the impact on our activity suite was much the same.

Detection rates were fairly abysmal, a fraction lower than the other ‘Standard’ edition, but the core certification requirements were met and a VB100 award is earned.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 8.46%
Worms & bots: 35.66%
Polymorphic: 96.04%
False positives: 0

Lavasoft Ad-Aware Total Security

Anti-virus version 21.1.0.28

Lavasoft first entered our comparatives in 2010, and has submitted both its standard product, based on the GFI/Sunbelt VIPRE engine, and this one, combining the might of G DATA with its own anti-spyware expertise, in several recent tests. The Total version has had some unlucky results recently, and has yet to achieve a VB100 award, despite some very strong performances. This month the standard product is absent pending fixes to some issues coping with the heavy stresses of our tests, but we were pleased to see the Total offering return for another stab.

The installer is something of a beast at over 450MB, but that includes all required update data for all the engines. The set-up process runs through a number of stages, including the options to include parental controls and a data shredder system, and setting up some scheduled scanning and backup tasks, before the main installation. This runs for a minute or so, followed by a reboot.

The interface is very similar to G DATA’s, with a few extras and a little rebranding, and as such proved a delight to operate, with its excellent level of controls and solid, reliable running even under heavy pressure. All tests were out of the way well within the allotted 24 hours.

Scanning speeds were not super fast to start with but benefited hugely from the smart caching of previous results, and on-access lag times were not too heavy either. Use of RAM and CPU cycles, and impact on our set of activities, were perhaps slightly above average, but not too heavy.

Most users would consider the reasonable system impact more than made up for by the superb detection levels achieved by the product, which destroyed our test sets with barely a crumb left behind. The RAP set closely approached complete coverage in the earlier two weeks, dropping off only very slightly thereafter. The WildList presented no difficulties, and the clean set was handled without incident either. Lavasoft’s Total product earns its first VB100 award at its third showing.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.60%
Worms & bots: 99.71%
Polymorphic: 100.00%
False positives: 0

Logic Ocean GProtect 1.1.48

Definitions version 13.6.215

Yet another entry from the Preventon family, based on the VirusBuster engine, GProtect was another last-minute arrival, turning up right at the end of the submission deadline day.

This version of the solution had the same 67MB installer, running through the same handful of steps to get set up rapidly with no need to restart, although an Internet connection is needed to activate. The interface is a rather sickly blend of greens, oranges, purples and pastel blues, but with the screen brightness turned down it is just about bearable, and it provides the usual solid, if basic, set of controls. Stability remained excellent, with no problems getting through the full test suite within the expected time.

Scanning times were OK and lag times not too heavy, while RAM use and impact on our set of tasks were fairly low and CPU use not too high either. Detection rates were respectable, with no problems in the core sets, and Logic Ocean duly earns a VB100 award on its first attempt.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.49%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

McAfee VirusScan Enterprise + AntiSpyware Enterprise 8.8

Scan engine version: 5400.1158; DAT version: 6266.0000

McAfee has recently been having a bit of a tough time handling some of the polymorphic strains replicated in large numbers for our test sets. However, with a lot of work having been put into ironing out these issues, things looked good for a return to form.

The product came as a 37MB installer with the DAT package measuring 85MB, and the set-up process was simple and straightforward, with a reboot not demanded but subtly recommended. The GUI remains grey and sober but efficient and simple to use. A full and complete range of controls is provided, as one would expect from a major corporate solution.

Running through the tests proved no great chore, as stability was rock-solid throughout and everything behaved just as expected. Scanning times were pretty good to start with and sped up enormously in the warm scans. Overheads were not bad either, and there was low drain on CPU cycles and minimal impact on our set of activities. Detection rates were pretty good, with a step down in the second half of the RAP sets, but the WildList was handled fine and the clean sets threw up only a handful of adware alerts – presumably from the anti-spyware component which has been added to the product title since previous entries.

A VB100 award is duly earned, doubtless to great relief at McAfee, making two passes and two fails from four entries in the last six tests; the two-year picture is much brighter, with eight passes and two fails, with two tests not entered.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 85.04%
Worms & bots: 94.76%
Polymorphic: 100.00%
False positives: 0

Microsoft Forefront Endpoint Protection 2010

Version: 2.0.657.0; anti-malware client version: 3.0.8107.0; engine version: 1.1.6502.0; anti-virus definition version: 1.97.2262.0

Microsoft has generally alternated its Forefront and Security Essentials products in our server and desktop tests respectively, but this pattern is shaken up a little this month with the corporate product appearing.

The installer is compact at 19MB, with 63MB of updates also provided. The set-up process is fairly simple, with a half-dozen steps to click through and no reboot required, and everything is done in under a minute. The product interface is similarly brief and to the point, providing only a minimal set of controls and in some places so terse as to be rather confusing. However, it is generally usable and it ran stably throughout the test suite. From past experience we knew to expect long scanning times over large sets of infected samples, but leaving this over a weekend proved a successful tactic and no testing time was wasted.

Over clean files scan times were not too bad, and on-access measures proved fairly light, with low use of resources and one of the fastest average times taken to complete our set of tasks. Detection rates were pretty solid, with a very gradual decline across the RAP sets, and the WildList set proved no problem at all. Our clean set threw up only a handful of adware alerts, hinting that we may want to clean out some of the less salubrious items from the popular download sites, and a VB100 is thus comfortably earned. Forefront has taken part in only five tests in the last two years, only two of the last six comparatives, but has an excellent record with every entry achieving certification.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 90.96%
Worms & bots: 99.12%
Polymorphic: 100.00%
False positives: 0

Nifty Corporation Security 24

Version 3.0.1.50; client 5.63.2

Nifty has become a regular participant in our comparatives, the Japanese brand providing its own quirky interface over the Kaspersky engine, and showing no signs of adding translated versions. As usual, therefore, we relied heavily on usage instructions included with the submission, aided somewhat by our limited ability to understand the markings on the interface.

The install process seemed to require that Japanese language support be added to the system, rather sensibly, but even then much of the display was garbled and not usable as a guide. It ran through half a dozen or so incomprehensible steps before rebooting the system. On boot up, we found the GUI as strange and interesting as ever, with quirks both in layout and operation; it frequently fades into semi-transparency when not focused on. Nevertheless, it seemed fairly stable, and proved OK to operate as long as no complex configuration was needed.

As in previous tests, on-demand scans over infected sets took an enormously long time. No real reason could be found for this; the main cause of such slowdowns elsewhere is the foolish attempt to store all log data in RAM until the end of the scan, but here the standard Windows event system is used as the only available logging, and memory use did not seem to increase too dramatically. Scans would simply start off very rapidly and gradually slow to a crawl. So, having prepared for this, we set the product up on several systems at once and ran various jobs over the weekend, with most of them finished by Monday. In total around five full machine-days were used up getting through the tests – considerably more than the allotted 24 hours.

No such problems were encountered when scanning clean files though, with a light touch in the on-access lag measures and initially sluggish on-demand scans speeding up hugely for the warm runs. CPU use was perhaps a little higher than average, but RAM use and impact on our activities were fairly standard.

As expected from the Kaspersky engine, detection rates were excellent across the board, with little missed anywhere, and with no issues in the core certification sets Nifty earns another VB100 award. The product has taken part in all six desktop tests in the last two years, failing only once; the last six tests show three passes from three entries.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.53%
Worms & bots: 99.45%
Polymorphic: 100.00%
False positives: 0

Norman Security Suite 8.00

Product Manager version 8.00; anti-virus version 8.00; scanner engine version 6.07.03; NVC version 8.1.0.88

Norman has hit its stride again recently after a run of difficulties, with no problems encountered in the last few tests. The vendor returned this month doubtless hoping to continue this streak of success.

The Suite solution was provided as a 112MB installer, including all the required updates, and it ran through in only a handful of steps. The process was all over in good time, but needed a reboot to complete. The interface is a little bizarre at times, for a start being a little too large for the browser-based window it is displayed in, thus requiring a pair of scroll bars which only move a tiny way. The window size is locked so the issue cannot be fixed by the user. The layout is unusual and sometimes confusing, with a limited set of options and a quirky approach to just about everything – but with practice and patience it is just about usable. Less forgivable is its disregard for instructions, with samples routinely removed or disinfected despite all settings being firmly set to avoid such behaviour. Otherwise stability seemed good, with no hitches to prevent us completing the set of tests in good time.

What did impede things somewhat was the scanning speed, which was slow in the extreme, mainly thanks to the sandbox component looking at things in great depth. As we have suggested here before, this might benefit from some sort of memory of what it’s already run to avoid such unnecessary duplication of work. On-access lag times were also fairly high, and use of CPU cycles was well up too, although RAM use was not much above average and our set of tasks was completed in reasonable time.
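As a rough illustration of the kind of shortcut we have in mind – purely our own sketch, not anything Norman has described – a scanner could remember a digest of each file already analysed in depth and skip the expensive sandbox run for identical content:

    import hashlib

    class VerdictCache:
        """Remember verdicts for file contents already analysed in depth."""

        def __init__(self):
            self._verdicts = {}  # SHA-256 digest -> verdict

        def _digest(self, path):
            h = hashlib.sha256()
            with open(path, 'rb') as f:
                for chunk in iter(lambda: f.read(65536), b''):
                    h.update(chunk)
            return h.hexdigest()

        def scan(self, path, analyse):
            """Return a cached verdict, or run the slow analysis just once."""
            key = self._digest(path)
            if key not in self._verdicts:
                self._verdicts[key] = analyse(path)  # e.g. a full sandbox run
            return self._verdicts[key]

Hashing a file is vastly cheaper than re-running emulation on it, so repeated encounters with the same unchanged files – the bulk of any speed test, and of most real systems – would cost almost nothing.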

Detection rates were not bad, with respectable scores throughout the sets, and once again the WildList was handled well. The clean sets threw up only a single suspicious alert, on a rather bizarre piece of software which claimed to be an entertaining game but in fact seemed to simulate the experience of driving a bus. Readily forgiven this result, Norman earns a VB100 award once again, making a total of four passes and two fails in the past six tests, with the longer view showing six passes and four fails, with two tests not entered, in the last two years.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.80%
Worms & bots: 89.16%
Polymorphic: 99.98%
False positives: 0

Optenet Security Suite V. 10.06.69

Build 3304; last update 21 February 2011

Optenet first entered our tests at the end of last year with a successful run on Windows 7, and returns for more of the same. Based on the ever popular Kaspersky engine, its chances looked good from the off.

The product installer was 105MB including updates, and ran through a series of set-up steps including the provision of a password to protect the settings and a request for online activation, before a reboot was requested to complete the process.

The interface is another browsery affair, which can be a little slow and occasionally flaky, but it is at least clearly laid out and provides a reasonable level of fine-tuning. From a tester’s point of view the most annoying aspect is the tendency to log out and require a password every time it is revisited after more than a few moments.

Scanning speeds were reasonable, but on-access lag times seemed a little high, and while resource use was fairly low our suite of standard jobs took a while to run through as well. Detection rates were pretty solid, with a lot of partial detections ruled out under our rules thanks to being labelled as ‘suspicious’ only. The clean set threw up none of these alerts though, and certainly no full detections, and with the WildList covered admirably Optenet earns another VB100 award, making it two passes from two attempts.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 76.74%
Worms & bots: 91.82%
Polymorphic: 100.00%
False positives: 0

PC Booster AV Booster 1.1.48

Definitions version 13.6.215

This is the second time on the test bench for PC Booster, whose product is another in the Preventon line. The vendor’s previous entry, in last December’s Windows 7 test, was thrown off course by an unlucky technicality, with the on-access component not checking packed files on read or on write. This month, given the results of a plethora of similar products, all seemed to be on course for a smoother run.

The installer was once again 67MB and completed in a few simple steps, with no reboot but a brief spell online required to activate a licence for full functionality. The interface has a crisp, cool blue-and-white colour scheme, with the layout unchanged from the rest of the range; tests ran through according to a well-oiled schedule, completing in good order with no stability issues.

Speeds were average on demand, and reasonable on access, with no outrageous drain on system resources, and our set of jobs ran through in decent time. Detection rates were respectable, with decent coverage in all areas. With no issues in the main certification sets PC Booster qualifies for its first VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.49%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

PC Renew Internet Security 2011

Version 1.1.48; definitions version 13.6.215

Yet another from the same stable, PC Renew – appearing for the first time this month – makes rather cheeky use of the standard phrase ‘internet security’, generally used to imply a multi-layered suite product but here providing little more than standard anti-malware protection, based on the common VirusBuster engine.

With no change in the set-up process or interface, the only other area worth commenting on is the colour scheme, which here stuck to a fairly standard blue and white, with a touch of warmth in the orange swirl of the logo. For some reason some of the speed tests seemed a fraction slower than other similar products, but only by a few seconds a time, and on-access measures reversed the trend by coming in a touch lighter. Resource use was also fairly similar to the rest of the range, being reasonably light in all areas and not impacting too heavily on our set of standard tasks.

Not surprisingly, detection rates were not bad either, with no serious complaints in any of the sets, and with the core sets covered without problems another newcomer joins the list of VB100 award winners.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.49%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

PC Tools Internet Security 2011S

Version 2011 (8.0.0.624); database version 6.16970

PC Tools’ products have been fairly regular participants in our tests since 2007, although the Internet Security line has only taken part since 2009, following the company’s takeover by Symantec. After a slightly wobbly start the product has amassed a good run of passes of late. Although the underlying detection technology has changed considerably, the look and feel remains much as it did when we first tested it several years ago.

The install package was fairly large, at 209MB, and ran through a fairly standard set of stages. Towards the end, the machine froze completely, not responding to any stimulus, and a hard restart was required. After that all seemed fine though, and a subsequent reinstall did not reproduce the problem. The interface is clear and friendly, with large status indicators covering the firewall, anti-spam and various ‘guard’ layers, but configuration of the latter is fairly basic, generally limited to on or off. Tests proceeded rapidly, although at one point while scanning the main clean set the scanner – and indeed the whole system – froze once again and a push of the reset button was required, but even with this interruption and the re-run it necessitated, the complete set of tests was finished within the allotted 24 hours.

Scanning speeds were fairly slow to start off with but sped up hugely on repeat runs. On-access overheads were light in some areas but heavy in others, notably our sets of media and documents and miscellaneous file types. Here, no sign of smart caching was evident – which is odd, given that it would be most useful in this mode. We could find no way of persuading the product to scan more than a defined list of extensions on access. Use of system resources was fairly high in all areas, and our suite of standard activities was quite heavily impacted, taking noticeably longer than usual to complete.
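For readers unfamiliar with extension-based filtering, the logic amounts to something like the following sketch – our own illustration, with an invented extension list; we do not know exactly which types PC Tools includes:

    import os

    # Illustrative list only - not PC Tools' actual set of extensions.
    SCAN_EXTENSIONS = {'.exe', '.dll', '.scr', '.com', '.sys'}

    def should_scan(path):
        """An on-access hook like this never inspects unlisted file types."""
        return os.path.splitext(path)[1].lower() in SCAN_EXTENSIONS

The obvious weakness is that a sample carrying an unlisted extension is never inspected on access at all, which is why we prefer to see at least an option to check all files regardless of name.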

Detection results showed very good scores in most areas, with some excellent figures in the first half of the RAP sets, dropping off notably in the later two weeks. No problems cropped up either in the WildList set or (other than the one-off system freeze) in the clean sets, and PC Tools earns another VB100 award. The vendor’s two-year history shows entries in all desktop tests, with five passes and a single fail from six entries; all three entries in the last six tests have resulted in passes.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.85%
Worms & bots: 98.44%
Polymorphic: 100.00%
False positives: 0

PC Tools Spyware Doctor with AntiVirus 8.0.0.624

Database version 6.16970

The second entry from PC Tools is the company’s well-known Spyware Doctor brand, which has a long history in the anti-spyware field. This also has a rather longer history in our tests than the I.S. version, dating back to 2007.

The product itself is fairly similar in look and feel, with the installer somewhat smaller at 185MB, and the set-up process running through the same set of stages – successfully this time – with no reboot requested at the end. The interface is also similar, although with fewer modules than the full suite edition, and provides fairly basic configuration controls.

Speeds and performance measures were pretty comparable, with slow cold speeds in the on-demand scans and much faster in the warm runs. Fairly heavy lag times were observed in the same sets as for the I.S. product, but less so in the sets of archives and executables, and there was high use of memory and processor cycles and a fairly heavy slowdown when carrying out our set of tasks.

Detection rates were just about identical, with solid scores in the main sets and decent coverage of the RAP sets, declining from high levels in the earlier part to lower but still respectable levels in the latter half. The core sets proved no problem, and a second VB100 award goes to PC Tools this month. The Spyware Doctor line has an identical record to the suite, with six entries in the last dozen tests, the last five of them passes.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.85%
Worms & bots: 98.44%
Polymorphic: 100.00%
False positives: 0

Preventon Antivirus 4.3.48

Definitions version 13.6.215

The daddy of them all, Preventon’s own product has been entering our tests since late 2009, with a record of strong performances occasionally upset by minor technicalities.

The install and user experience is much like the rest of the range, with the installer a fraction larger at 69MB but the process unchanged, completing quickly with no reboot but needing a connection to the web to apply a licence and to access full configuration. The GUI remained stable and usable throughout our tests, with its simple set of options allowing us to progress rapidly through them, completing within 24 hours as usual.

Speeds were (unsurprisingly) fairly similar to the rest of the group, perhaps a fraction slower but no more than can be attributed to rounding errors and so on. Performance measures showed the expected light use of resources and a nice low impact on our suite of tasks. Detection rates were fairly steady across the sets and there were no issues in the clean or WildList sets, thus Preventon earns another VB100 award. Having entered five of the last nine tests, Preventon now has three passes under its belt, with one pass and two unlucky fails in the last year.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.49%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

Qihoo 360 Antivirus 1.1.0.1316

Signature date 2011-02-19

Qihoo (apparently pronounced ‘Chi-Fu’) is another of the wealth of solutions active in the bustling Chinese market space – this one based on the BitDefender engine. Having entered our tests on several occasions in the last couple of years, the product has a decent record of passes – but has also put us through some rather odd experiences.

The latest version came as a 110MB install package, including signatures from a few days before the submission deadline. Set-up was fast and easy, with no need to restart, and the process was complete in half a minute or so. The interface is fairly attractive, with bright colours and clear icons, a decent level of configuration options and a sensible approach to usability.

Stability seemed OK, and the oddities noted in previous tests were kept to a minimum. However, once again we noted that, although the on-access component claimed to have blocked access to items, this was not the experience of our opener tool, and often the pop-ups and log entries would take some time to appear after access was attempted (and apparently succeeded) – implying that the real-time component runs in something less than real time.
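Our opener tool is essentially a simple loop that tries to read each sample and records whether the open succeeds; something along these lines (a simplified sketch, not the lab’s actual code) is enough to show the difference between genuinely blocking access and merely alerting after the fact:

    import os

    def try_open(path):
        """Return True if the file could be read despite protection."""
        try:
            with open(path, 'rb') as f:
                f.read(1)
            return True
        except OSError:
            return False  # the on-access scanner denied the read

    def check_protection(sample_dir):
        results = {name: try_open(os.path.join(sample_dir, name))
                   for name in os.listdir(sample_dir)}
        opened = sum(results.values())
        print('%d of %d samples were readable' % (opened, len(results)))
        return results

A genuinely real-time component should make every one of these reads fail; pop-ups and log entries appearing seconds after a read has already succeeded offer much weaker protection.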

This approach probably helped with the on-access speed measures, which seemed very light, while on-demand scans were on the slow side. RAM consumption was high, although CPU use was about average, and impact on our set of everyday jobs was not heavy.

Detection rates, when finally pieced together, proved just as excellent as we expect from the underlying engine, with very high scores in all areas, and with no issues in the core sets a VB100 award is duly earned. Since its first entry in December 2009, Qihoo has achieved six passes and a single fail, with three tests not entered; the last six tests show three passes and a fail from four entries.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.25%
Worms & bots: 99.65%
Polymorphic: 100.00%
False positives: 0

Quick Heal Total Security 2011

Version: 12.00 (5.0.0.2), SP1

Quick Heal is one of our more venerable regulars, with entries dating back to 2002 – and the vendor hasn’t missed a test since way back in August 2006.

The current product revels in the now popular ‘Total Security’ title and offers a thorough set of suite components, including all the expected firewalling and anti-spam modules. As such, the installer package weighs in at a sizeable 205MB. The set-up process is fast and easy though, with only a couple of steps to click through and less than a minute run time, with no reboot needed.

The interface is glitzy and shiny without overdoing things, and has a slightly unusual, but not unusable, design. Options – once they have been dug out – are fairly thorough, and stability was good, allowing us to zoom through most of the tests in good time. We had some problems with some of our performance measures, where some of the automation tools were apparently being blocked by the product, and at one point a scheduled job we had prepared to run overnight failed to activate. However, it is possible that we missed out some important step in the set-up procedure. Nevertheless, we got everything done in reasonable time.

Scanning speeds were OK in some areas but a little on the slow side in others, while on-access lag times were a little heavy. Memory use was a little on the high side, but CPU use was not too bad, and our set of tasks was completed in good time.

Detection rates proved pretty decent across the sets, and had ‘suspicious’ detections been included in the count they would have been considerably higher. The core certification sets were well handled, and a VB100 is well deserved by Quick Heal. The vendor’s record shows ten passes and two fails in the last two years, with all of the last six tests passed.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 82.64%
Worms & bots: 92.72%
Polymorphic: 100.00%
False positives: 0

Returnil System Safe 2011

Version 3.2.11937.5713-REL12A

We first looked at Returnil’s offering last summer (see VB, August 2010, p.21), when it went by the name ‘Virtual System’ in reference to the sandboxing/virtualization set-up that is at the core of its protective approach. It also includes the Frisk malware detection engine, which is the main aspect we looked at on this occasion.

The installer is compact at only 40MB, and takes only a few moments to complete, with a reboot requested after 30 seconds or so. The interface is bright and colourful, and fairly easy to use, although the configuration section seems mainly focused on the virtualization system and on providing feedback on incidents, with little by way of actual options for the scanning or protection. With sensible defaults and good stability though, testing progressed nicely and was completed in short order.

Scanning speeds were rather slow, and on-access lags a little heavy, with low use of memory and minimal impact on our suite of tasks, but very heavy use of CPU cycles.

Detection rates were pretty decent in most sets, with a slow decline in the RAP sets and once again that slight and unexpected upturn in the proactive week. The core sets were handled well, and Returnil earns another VB100 award. Having entered four of the last five tests, skipping only the recent Linux test, Returnil can now boast three passes and only a single fail.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 78.88%
Worms & bots: 91.46%
Polymorphic: 100.00%
False positives: 0

Sofscan Professional 7.2.27

Virus scan engine 5.2.0; virus database 13.6.217

Another new name but not such a new face, Sofscan was another last-minute arrival with its product closely modelled on some others taking part this month, and the test’s most popular detection engine once again driving things. The installer package measured 66MB, with an extra 62MB zip file containing the latest updates.

The set-up process featured all the usual steps including, as we have observed with a few others this month, the option to join a community feedback system and provide data on detection incidents. This was disguised as the ‘accept’ box for a EULA and was pre-selected by default. It doesn’t take long to get set up, and no reboot is needed to complete.

The interface is a familiar design, dating back many years now and showing its age slightly, with rather awkward and fiddly controls in some areas, but providing a decent depth of control once its oddities have been worked out. Operation seemed a little wobbly at times, with some tests throwing up large numbers of error messages from Windows, warning of delayed write fails and other nasties. We also experienced problems with logging to memory rather than disk once again, with our large tests slowing to a crawl and taking days to get through. Worried by the repeated write warnings, we broke things up into several jobs and re-imaged the test system in between runs, and eventually got everything done, after about four full days of run time.

Scanning speeds came in rather slow, and lags were pretty heavy, with high use of system resources – processor drain was particularly high. Impact on our suite of activities was not too significant though. Detection rates were pretty good, tailing off somewhat in the RAP sets but showing good form in the core certification tests and earning Sofscan its first VB100 certification.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 88.90%
Worms & bots: 96.33%
Polymorphic: 100.00%
False positives: 0

Sophos Endpoint Security and Control 9.5

Sophos Anti-Virus 9.5.5; detection engine 3.16.1; detection data 4.62G

Sophos is another of our most regular participants, with a history going all the way back to 1998 and only two tests not entered, both of which were over five years ago.

The vendor’s main product is provided as a 75MB installer, with additional, incremental updates in a svelte 4MB package. Set-up follows the usual path, with a few corporate extras such as the removal of ‘third-party products’ (i.e. competitor solutions), and the option to install a firewall component, which is unchecked by default. No reboot is needed to finish the process, which is completed in under a minute.

The interface is stern and sober with little unnecessary flashiness, providing easy access to standard tasks and settings, with great depth of fine-tuning also available if required. HIPS and ‘live’ online lookups are included, but not covered by our testing at the moment – the live component had to be disabled to avoid delays in our tests. Stability was solid, with no problems under heavy pressure, and testing ran through in decent time.

Scanning times and on-access lags were good with default settings, where only a preset list of extensions is covered. With a more in-depth set of settings only the archive set was heavily affected, the others still getting through in good time. Resource consumption was low and our suite of standard tasks ran through quickly with little time added.

Detection rates were solid, with good coverage across the sets and a slow decline into the most recent parts of the RAP sets. The core certification sets proved no problem, and Sophos comfortably earns another VB100 award. The company’s recent records show only a single fail and 11 passes in the last two years, with all of the last six tests passed with flying colours.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 91.70%
Worms & bots: 87.69%
Polymorphic: 100.00%
False positives: 0

SPAMfighter VIRUSfighter 7.100.15

Definitions version 13.6.215

The people behind VIRUSfighter specialize in fighting spam (as the company name makes admirably clear), but have been producing anti-malware products for some time too. When we first looked at their solutions they were using the Norman engine, but of late they have been based on VirusBuster, using the Preventon SDK but adding a fair amount of their own work to things.

The installer came in at 68MB including all updates, and the set-up process was zippy and to the point, with a request for the user’s email details the only notable aspect. Everything is done in under a minute, with no need to reboot. The interface is a khaki green, the logo adorned with military chic, and the layout fairly clear and simple. Some options were a little baffling though – checkboxes marked ‘turn on/off’ raise the question of whether checked means on or off. Although the layout is different, much of the wording is similar to other products based on the same SDK, with perhaps a few extra settings over and above those provided by the others. We also found that registry entries used elsewhere to ensure logs were not thrown out after a certain time were missing, or at least not where we expected, so we had to run tests in smaller jobs to ensure all data was kept for long enough for us to harvest it.

Speeds were much as expected: fairly average on demand but pleasantly light on access, with fairly low use of resources and little impact on standard tasks. Detection rates were also respectable, with a decline into the later parts of the RAP sets but no serious issues, and full coverage of the WildList and clean sets. A VB100 award thus goes to SPAMfighter, its second from five entries in the last seven tests.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 86.37%
Worms & bots: 95.91%
Polymorphic: 100.00%
False positives: 0

GFI/Sunbelt VIPRE Antivirus 4.0.3904

Definitions version: 8516; VIPRE engine version: 3.9.2474.2

The VIPRE product has been taking part in our tests for 18 months or so now, with some decent results and, in recent tests at least, signs of overcoming some nasty issues with stability which made its earlier appearances something of a chore.

The installer is pretty small at only 16MB, but contains no detection data initially, requiring the extra 62MB of the standard installer bundle to get things fully set up. The initial job is thus very rapid, with just a couple of clicks required, and all is complete in about ten seconds, before a reboot is demanded. After the reboot a set of set-up stages must be run through, and a demo video is offered to guide one through using the product. This is probably not necessary for most users, with the GUI fairly simple and clearly laid out, and with little by way of fine controls to get lost in – most areas seem limited to little more than on or off. Thankfully stability was generally good, even in the on-access runs which have given us some problems in the past. However, it remains unclear what the product’s approach to actions on detection is, with some runs seeming to go one way and others another.

Scanning times were very slow over some sets, such as our collection of media and document files, but surprisingly quick over executables, which one would expect to be looked at most closely. On-access lag times showed a similar pattern, with some good speed-up in the warm runs improving things considerably. Resource use was low in terms of memory but perhaps a fraction above average in CPU use, and impact on our suite of activities was barely noticeable.

Detection rates were excellent, continuing a steady upward trend noted over several months. The RAP scores were very high in the reactive weeks, with something of a drop in the proactive week as expected. The clean sets were covered without problems, and after double-checking a selection of files which were not initially denied access but were alerted on slightly behind real time, the WildList set proved to be well handled too. A VB100 award is thus well earned, making for four passes and a single fail in the vendor’s five entries so far; the last year shows three passes and three no-entries.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.99%
Worms & bots: 99.67%
Polymorphic: 99.79%
False positives: 0

Symantec Endpoint Protection 11.0.6200.754

Definitions: 21 February 2011 r2

Symantec is another long-standing regular in VB100 testing, but has been somewhat unpredictable in its entries of late, with its last appearance as long ago as August 2010. With the product finally back on our list, we expected to see a solid performance.

The installer seemed to cover the entire corporate product range, with multiple platforms supported and management tools etc. included, so weighed in at a chunky 1.3GB. For the standalone anti-malware solution the set-up process was fairly short and simple though, running through a standard set of stages for a business product, and offering to reboot at the end, but not demanding one immediately. The interface is fairly bright and colourful for a corporate offering, with large, clear status displays. A hugely detailed range of controls can be found under the hood, again with a clear layout and good usability.

Scanning speeds were good in most areas, although slower than most over archive files thanks to internal archive scanning being enabled by default, while on-access lag times were perhaps a little on the heavy side but nowhere near some of the extremes seen this month. Resource usage was a little above average, but a good time was recorded over our suite of standard tasks.

Detection rates were pretty good, with a fairly sharp drop through the RAP sets but solid coverage in most areas, and the core certification sets caused no unexpected issues, thus comfortably earning Symantec a VB100 award this month. After several tests skipped, the company’s test history now shows six passes and a single fail over the last two years, with two entries (both passed) in the last six tests.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.13%
Worms & bots: 98.25%
Polymorphic: 100.00%
False positives: 0

Trustport Antivirus 2011

11.0.0.4606

Trustport is one of the handful of multi-engine products that routinely vies for the highest set of scores in our tests, marking out the top right corner of our RAP quadrant as its own. We have been testing the vendor’s products since June 2006, during which time a range of engines have been used, but of late the company seems to have settled on a fairly winning combination of BitDefender and AVG.

The twin cores make for a fairly large install package, although not too huge at 188MB including all required updates. The set-up process is fairly speedy, with no deviations from standard practice, and all is done in a minute or so with no need to restart.

The interface is a little unusual, with multiple mini-GUIs rather than a single unified console, but it proves reasonably simple to operate with a little exploring, and provides a solid set of controls, as one would expect from a solution aimed at the more demanding type of user. Under heavy pressure the interface can become a little unstable, occasionally doing strange things to general windowing behaviour too, and we observed log data being thrown away a few times despite having deliberately turned the limits up to the (rather small) maximum possible. We had no major problems though, and testing took not too much more than the assigned 24 hours.

Scanning speeds were a little on the slow side, particularly over archives, thanks to very thorough default settings, and on-access lag times were fairly heavy too. Although resource usage looked pretty good, we saw quite some impact on our set of standard activities.

This heaviness was more than counterbalanced by the detection rates though, which barely dropped below 99% in most areas, with even the proactive week of the RAP sets showing a truly superb score. The WildList was brushed aside, and perhaps most importantly the clean set was handled admirably, easily earning Trustport another VB100 award. The company’s recent test record is excellent, with nine passes in the last dozen tests, the other three not entered; the last year shows four passes from four entries.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.16%
Worms & bots: 99.85%
Polymorphic: 100.00%
False positives: 0

UnThreat Antivirus Professional 3.0.17

DB version: 8516

Yet another new name, and another last-minute arrival on the test bench, UnThreat is based on the VIPRE engine, which gave us a few worries as we prepared to try it out for the first time.

The installer was pretty compact at under 8MB, although 60MB or so of updates were needed in addition. The set-up process presented a very large window but didn’t have much to fill it with, zipping through in no time at all and requesting a final reboot after only 10 seconds or so. The interface is fairly attractive, in a dappled grey shade with large, clear buttons and icons, and the layout is lucid and sensible. The family relationship was clear in some areas, with some sets of controls closely mirroring those in VIPRE, but in other areas we actually found more detailed configuration available, which was a pleasant surprise.

Speed measures ran through safely, with an appealing animated graphic to keep the user entertained during the scanning process. The expected slow times were observed over most file types, although executables were well handled. Lag times were pretty hefty too, again with good improvements in the warm runs, and with low RAM use and CPU drain not too high either, the impact on our activities was pretty slight.

Detection tests proved rather more of a challenge though. An initial run over the main sets was left overnight. When it still hadn’t finished at the end of the following day, it was left for another night. It eventually took 42 hours to complete, by which point the scanning process was using 1.2GB of RAM, the test machine just about holding its own and remaining responsive. Unfortunately, the scan seemed to have frozen at the moment of completion and failed to write any logs out to disk. Scans were re-run in a dozen or so smaller chunks, each taking from four to eight hours, and this approach produced much better results, with no repeats of the earlier logging failure. Moving on to the on-access tests, we saw similar problems to those experienced with other OEM versions of the same engine, with any kind of stress causing an immediate collapse.
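Splitting a scan job into batches like this is a standard fallback for us when a product cannot survive a full run. In outline, it looks something like the following sketch, with run_scan standing in for whatever product-specific mechanism launches a scan – a hypothetical name, not a real interface:

    def chunks(items, size):
        """Yield successive batches of at most `size` items."""
        for i in range(0, len(items), size):
            yield items[i:i + size]

    # Each batch is scanned and its log harvested separately, so one hang
    # or lost log costs a few hours rather than the whole multi-day run:
    # for batch in chunks(sample_paths, 5000):
    #     run_scan(batch)  # hypothetical, product-specific launcher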

Detection seemed to hold up for a few hundred samples, then either switched itself off silently, or stopped detecting but continued to delay access to any file for a considerable period. The set was broken into smaller and smaller chunks, each one being run separately with the product given plenty of breaks in between to recover from the ordeal of having to look at a few dozen files. Testing continued for several more days, and in the end a complete set of results was obtained, closely matching those of the VIPRE product, which with the same engine and updates had got through in massively less time.

This meant solid scores across the board, with a great showing in the RAP sets and no problems in the core certification sets, earning UnThreat a VB100 award at its first attempt. A lot of work was involved, with perhaps 15 working machine-days devoted to getting it through the full suite of tests – we have to hope GFI/Sunbelt passes on the improvements it has made to its own product to its OEM partners sometime soon.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.99%
Worms & bots: 99.67%
Polymorphic: 99.79%
False positives: 0

VirusBuster Professional 7.0.44

Virus scan engine 5.2.0; virus database 13.6.217

VirusBuster is another old hand at the VB100, with entries running back over a decade and the vendor’s last missed entry way back in 2007. As usual we’ve seen several entries spawned from this engine this month, with most achieving good results, which bodes well for VirusBuster itself. However, those most closely modelled on the original engine have had some nasty issues this month, with scan slowdowns and memory drainage, which left us somewhat apprehensive.

The 69MB installer tripped through rapidly, with nothing too taxing to think about and no reboot needed before applying the 62MB offline update bundle. The interface is very familiar, having barely changed in many years, but somehow still seems to bewilder and baffle with its awkward and non-standard layout and approach to controls, which are actually provided in decent depth once they are dug out.

Running through the speed sets proved simple, with scanning speeds and lag times around average and resource use and impact on everyday tasks fairly low. Getting through the larger infected sample sets proved harrowing as feared though, with several crashes and several scans taking huge amounts of time to complete. After leaving it over several nights – taking it off during the days to get on with more urgent tasks – results were finally put together, showing the expected decent scores across the sets, with a slight decline in the latter half of the RAP sets. The core sets were well handled, and VirusBuster earns another VB100 award. The long view shows passes in all of the last six tests, three fails and nine passes in the last two years.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 88.90%
Worms & bots: 96.33%
Polymorphic: 100.00%
False positives: 0

Webroot Internet Security Complete 7.0.6.38

Security definitions version 1892; virus engine version 3.16.1

Entering into its fifth year of VB100 entries, Webroot has a good record of passes thanks to the Sophos engine that provides the bulk of the malware detection. However, the product has yet to earn much popularity with the test lab team thanks to its control-free interfaces and long run times getting through tests. As usual, we hoped for an improvement but, after an exhausting few weeks, feared more of the same.

The installer provided measured close to 300MB, but was a custom build for testing including a wide range of extras and several versions of the virus data. Some special steps were involved in the set-up too, but the main process ran through the basic simple steps, completing fairly rapidly and needing a reboot at the end.

Performance tests proved somewhat difficult, as a number of our scripts and tools seemed to be blocked from running. No warnings were displayed by the product, however, and no log entries could be found referencing the actions carried out. Delving into the controls, we eventually found some settings to whitelist applications, and added everything used by our tests, but still they were not allowed to function properly. In the end, we had to disable the firewall portion of the product completely to get a simple job like fetching files with wget to work.

With this done, we saw some decent scanning speeds, especially in warm runs where unchanged files are ignored. Lag times were very low too, and resource use and impact on tasks were also kept to a minimum.

None of this lightness helped with our larger jobs, but some special controls disabling the default quarantining action promised to speed things through, and with these enabled we set off the main detection task with high hopes. Close to 60 hours later, it all seemed finished, and we moved on to the on-access tests. These were performed on-write, as on-read protection appeared not to be present. Again, it took several days to complete the process of copying the main sample sets from one place to another. Logs were at least comprehensive and complete though, and results were finally harvested, showing the expected solid scores, declining slightly in the newer parts of the RAP sets. A fine showing in the core sets earns Webroot a VB100 award, the vendor’s fourth from four entries in the last two years, and perhaps its most exhausting (for us) so far.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.05%
Worms & bots: 98.47%
Polymorphic: 100.00%
False positives: 0

Results tables

Conclusions

Another giant test, with another record-breaking roster of products entered, and once again we considerably overshot our target completion date. However, the main cause of this was not the large number of products. Nor was it the perhaps rather ambitious plan to introduce some new, untried and rather time-consuming performance measures into the busiest test of the year – nor the absence of half the lab team through illness for the bulk of the testing period. The main issue was with a handful of unruly, unstable, slow and unreliable products – perhaps a dozen or so taking up the whole lab for a full two weeks. The other 55 or so were completed in less than three weeks and, had all products behaved as well as we hoped – or, indeed, as well as the majority did – we could easily have squeezed in another 30 or so in the time we had available.

The bulk of wasted time was the result of inadequate or unreliable logging facilities, and lack of user controls. Products which insist on quarantining, disinfecting and so on by default are fairly commonplace – it’s a fairly sensible approach given the lack of interest most users display in their own security. However, even if most users are not interested in controls, and would be unlikely to set their products to log only, or deny access only, when submitting products for a large-scale comparative it seems fairly obvious that this would be a useful thing to have available. Presumably many of the companies producing security solutions these days, putting products together based on engines developed elsewhere, do not have access to malware samples to use for QA, but that is a pretty poor excuse for not getting the QA done. Stability of a piece of security software should not be undermined by having to work a little harder than usual, and passing that instability on to the entire machine is likely to be pretty unforgivable to most users.

Logging is another area of difficulty, and another one where testers perhaps have somewhat special needs. However, this is something else which is made very clear when submissions are called for testing, and one which is clearly going to be important in a large-scale test. Inaccurate or incomplete logs of what has been observed and carried out on a machine would be utterly unacceptable in a business environment, and most consumers would be unhappy to find that their security solution had fiddled with their system but couldn’t tell them anything about what it had done or why. The growing popularity of logging to memory, and only outputting to file at the end of a scan, seems targeted specifically at irritating testers. The benefits are presumably faster scanning times and less use of disk, but most normal users would see little of this benefit, as they would rarely have much to write to logs anyway. The only people with large amounts of data to log are precisely those for whom holding it all in memory causes horrible side effects: the slowdowns, collapses and failures we have seen so many of this month.
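The distinction is easy to see in outline. A minimal sketch – ours, with invented names – of the two approaches:

    def log_streaming(events, path):
        """Append each record as it happens; memory use stays flat and the
        log survives a crash part-way through the scan."""
        with open(path, 'a', encoding='utf-8') as log:
            for event in events:
                log.write(event + '\n')

    def log_buffered(events, path):
        """Hold everything in RAM until the scan ends - harmless for a
        handful of detections, pathological over hundreds of thousands."""
        held = list(events)  # grows without bound on large infected sets
        with open(path, 'w', encoding='utf-8') as log:
            log.write('\n'.join(held))

For a typical user the two are indistinguishable; it is only when hundreds of thousands of records accumulate that the buffered approach starts to swallow gigabytes of memory and, as we saw repeatedly this month, drag the whole system down with it.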

Having dealt with the dark side, there are also good things to report this month. We saw a good ratio of passes, with only a few products not qualifying for certification – partly of course thanks to our extreme efforts in the face of difficulties, but mainly due to good detection rates and low rates of false alarms. Those that did not make it were generally denied by a whisker, with only a few showing fair numbers of false positives or significant numbers of samples not covered. In a couple of unlucky cases, selection of default settings led to items being missed which could otherwise easily have been detected. In general though, performances were good. As well as the simpler measure of certification passes, we saw some excellent scores in our RAP sets, with a general move towards the upper right corner of the quadrant. We saw several new names on this month’s list, a few of whom had some problems, but several put in strong showings and there are a number of proud new members of the VB100 award winners’ club.

We also saw some interesting results in our performance measures, which we’ll continue to refine going forward, hopefully making them more accurate and reliable as we fine-tune the methodology over the next few tests. We also hope, now that the lab has a little breathing space, to get back to work on plans to expand coverage of a wide range of protective layers and technology types. The overheating, overworked lab hardware may need a little downtime first though – as might the similarly hot and tired lab team – to recover from what has been quite an ordeal.

Technical details

Test environment. All products were tested on identical machines with AMD Phenom II X2 550 processors, 4GB RAM, dual 80GB and 1TB hard drives running Windows XP Professional SP3.

For the full testing methodology see http://www.virusbtn.com/vb100/about/methodology.xml.

Any developers interested in submitting products for VB's comparative reviews should contact [email protected]. The current schedule for the publication of VB comparative reviews can be found at http://www.virusbtn.com/vb100/about/schedule.xml.
