2014-06-02
Abstract
‘We hope soon to be able to provide a better reflection of the growing diversity of the security solution market.’ John Hawes
Copyright © 2014 Virus Bulletin
We are at a fairly major milestone for VB, with this month’s issue not only the 300th, but also the last in the current monthly magazine-style format.
Last month, my colleague Martijn discussed some of the changes that will accompany the new format, expanding the scope and diversity of the material we cover as well as increasing the frequency of publication of new articles. That diversification will also, we hope, extend to the content of the VB conference, opening up lines of communication and information-sharing on a wider range of topics and including a wider portion of the security community in the debate.
Alongside the conference and the magazine, there is of course a third string to VB’s bow, namely our testing and certification activities, in which we have been engaged from the very beginning. A glance at the first issue of Virus Bulletin, from July 1989, reveals a (rather damning) technical review of Dr Solomon’s Anti-Virus Toolkit; the first VB100 comparative review appeared in 1998, and our first public comparative review of anti-spam solutions was published in 2009.
The idea of sharing information unites all of these activities. In the magazine of the past, the web content of the future, and both the presentations and the person-to-person, company-to-company networking opportunities of the conference, VB acts as a facilitator for sharing amongst others. In the testing arena, it is VB itself doing the sharing – sharing information both with our readers and with the participants in the tests.
The results of our tests provide in-depth information for users and potential users of the products we look at, but just as importantly, testing provides product developers with information on how well they are doing, what issues their products may have, and even how they should go about improving things.
We see the role of testing not merely as highlighting good points and inadequacies, but also as providing concrete and actionable information that can help make products better. As a small but hard-working test team producing large amounts of data, we have always done our best to render that data digestible for the general audience, but we have also always endeavoured to provide product developers with more detailed information where required, and where possible.
This is not always easy. Not so many years ago, when polymorphic viruses were a more common sight, we often had vendors missing single samples from our test sets thanks to tiny and rarely occurring errors in their detection methods. Our policy was, and remains, to avoid sharing the official test set samples of such items, instead providing fresh copies replicated from them – if we simply sent the single freak sample, we could not be sure that detection had been fixed properly, as opposed to bodged into place for that one instance.
Of course, the replicated copies would not always (indeed hardly ever) combine the exact set of features that caused the original miss, and we would keep producing new ones until we found another that did. Occasionally, this meant churning out over a million replications before we found one that would allow the developers to figure out where they had gone wrong. A lot of work for a small team, but we did it, and we still do where necessary.
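The replicate-and-verify loop described above can be sketched roughly as follows. Everything here is illustrative: the function names, the toy "scanner" with a rare blind spot, and the abstract feature encoding are stand-ins for VB's actual replication and testing tooling, which is not public.

```python
import random

def replicate(seed_sample, rng):
    # Stand-in for a polymorphic replication engine: each run produces a
    # fresh copy of the seed sample with a different combination of
    # features. Here a copy is abstracted to a 16-bit feature bitfield.
    return rng.getrandbits(16)

def scanner_detects(features):
    # Stand-in for the vendor's scanner, with a tiny, rarely triggered
    # flaw: it misses only copies whose low 10 feature bits match one
    # rare pattern (roughly 1 in 1024 replications).
    return features & 0x3FF != 0x2A5

def find_shareable_miss(seed_sample, max_tries=1_000_000, rng_seed=0):
    # Keep replicating from the seed until we find a fresh copy that the
    # scanner also misses -- that copy can be shared with the developers
    # instead of the official test set sample.
    rng = random.Random(rng_seed)
    for attempt in range(1, max_tries + 1):
        copy = replicate(seed_sample, rng)
        if not scanner_detects(copy):
            return copy, attempt
    return None, max_tries

copy, attempts = find_shareable_miss(seed_sample=None)
```

Because the flaw-triggering feature combination occurs only rarely, the loop may run through very large numbers of replications before one qualifies, which is why producing a shareable sample could mean churning out over a million copies.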
The changes within VB are set to include a significant expansion of the test team, which should give us more time to devote to improving the data our tests provide. Of course, much of the extra manpower will rapidly be absorbed by a range of new tests already in the pipeline, as well as adjustments and expansions to the current set of tests, but we hope soon to be able to provide a better reflection of the growing diversity of the security solution market, and the diversity of the threat landscape, with more comprehensive tests looking at protection in general, regardless of the technology providing it. We also hope to be able to combine data from multiple testing approaches to measure the effectiveness of different combinations of layered protection, and much more besides.
Information sharing is not a one-way street, of course, and we extend our gratitude to all our readers, correspondents, test participants and conference attendees for their feedback, advice, criticism and support over the years.