Virus Bulletin
Copyright © 2019 Virus Bulletin
In this test – which forms part of Virus Bulletin's continuously running security product test suite – 11 full email security solutions and five blacklists of various kinds were assembled on the test bench to measure their performance against various streams of wanted, unwanted and malicious emails.
The news in these test reports tends to be good: email security products are an important first line of defence against the many email-borne threats and, especially against the bulk of opportunistic threats, they perform really well. The news in this report is no exception, with all 11 full solutions obtaining a VBSpam award and six of them performing well enough to earn a VBSpam+ award.
However, it is important to look beyond the spam catch rates: block rates of malware and phishing emails, though still high, were significantly lower than the block rates of ordinary spam emails.
As in the previous report, we include details of the products’ performance against 'malware' and 'phishing' emails, defined as emails with a malicious attachment and emails with a malicious link, respectively. It should be noted that the distinction between these categories isn't always clear – for example when an email has a PDF attachment that includes a link to a phishing website (we classify this as 'phishing', arguing that the attachment itself isn’t malicious).
One thing we continue to note in these tests, and in our lab throughout the year, is that block rates for malware and phishing emails are significantly lower than those for spam in general. This isn't so much because malicious spam is significantly harder to block, but because these campaigns tend to be sent in a more professional manner, which helps them stay under the radar.
We always emphasise the point that the numbers in this test are to be taken within the context of the test. It is especially important to understand that the 'spam catch rates' as seen in this test reflect ordinary spam only, and readers should pay extra attention to the malware and phishing catch rates.
Spam catch rates continued to be high, with many products blocking 99.9% or more of the spam, but block rates of malware and phishing were significantly lower. All participating full solutions achieved a VBSpam award, with six vendors – Axway, Bitdefender, ESET, Fortinet, IBM and Safemail – performing well enough to achieve a VBSpam+ award.
Bitdefender, FortiMail and IBM were the only products that didn’t miss a single email with a malicious attachment. No product attained a perfect score in the phishing category, with ESET and Libra Esva performing best, each missing only two emails.
Janusmail is new to the VBSpam test lab. The hosted solution is run from Italy, a country that has delivered several well-performing products in these tests. Janusmail proved to be no exception, and with a spam catch rate of more than 99.9% and a final score of 99.60 the product easily achieves a VBSpam award on its debut.
Also new is Kaspersky's DNS-based blocklist, which checks the IP address of every incoming email.
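Such DNS-based lists are typically queried by reversing the octets of the sending IP address and appending the list's zone; a listed address resolves to an answer in 127.0.0.0/8, while an NXDOMAIN response means the address is not listed. A minimal sketch (the zone name `dnsbl.example` is a placeholder, not Kaspersky's actual zone):

```python
def dnsbl_query_name(ip: str, zone: str) -> str:
    """Build the DNS name queried to check an IPv4 address on a DNSBL.

    The octets are reversed and prepended to the blocklist zone, so
    192.0.2.99 checked against 'dnsbl.example' becomes
    '99.2.0.192.dnsbl.example'.
    """
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

print(dnsbl_query_name("192.0.2.99", "dnsbl.example"))
# prints 99.2.0.192.dnsbl.example
```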
Axway
SC rate: 99.72%
FP rate: 0.00%
Final score: 99.71
Malware catch rate: 93.33%
Phishing catch rate: 97.84%
Project Honey Pot SC rate: 99.69%
Abusix SC rate: 99.73%
Newsletters FP rate: 0.5%
Bitdefender
SC rate: 99.96%
FP rate: 0.00%
Final score: 99.96
Malware catch rate: 100.00%
Phishing catch rate: 98.80%
Project Honey Pot SC rate: 100.00%
Abusix SC rate: 99.95%
Newsletters FP rate: 0.0%
ESET
SC rate: 99.99%
FP rate: 0.00%
Final score: 99.99
Malware catch rate: 99.26%
Phishing catch rate: 99.52%
Project Honey Pot SC rate: 100.00%
Abusix SC rate: 99.99%
Newsletters FP rate: 0.0%
FortiMail
SC rate: 99.97%
FP rate: 0.00%
Final score: 99.94
Malware catch rate: 100.00%
Phishing catch rate: 97.84%
Project Honey Pot SC rate: 99.99%
Abusix SC rate: 99.97%
Newsletters FP rate: 1.1%
IBM
SC rate: 99.89%
FP rate: 0.00%
Final score: 99.89
Malware catch rate: 100.00%
Phishing catch rate: 97.60%
Project Honey Pot SC rate: 99.97%
Abusix SC rate: 99.88%
Newsletters FP rate: 0.0%
Janusmail
SC rate: 99.91%
FP rate: 0.05%
Final score: 99.60
Malware catch rate: 99.26%
Phishing catch rate: 98.56%
Project Honey Pot SC rate: 99.70%
Abusix SC rate: 99.95%
Newsletters FP rate: 1.6%
Libra Esva
SC rate: 99.97%
FP rate: 0.02%
Final score: 99.86
Malware catch rate: 99.26%
Phishing catch rate: 99.52%
Project Honey Pot SC rate: 100.00%
Abusix SC rate: 99.96%
Newsletters FP rate: 0.5%
Safemail
SC rate: 99.95%
FP rate: 0.00%
Final score: 99.95
Malware catch rate: 99.26%
Phishing catch rate: 99.28%
Project Honey Pot SC rate: 99.96%
Abusix SC rate: 99.94%
Newsletters FP rate: 0.0%
Spamhaus DQS
SC rate: 99.54%
FP rate: 0.05%
Final score: 99.27
Malware catch rate: 88.15%
Phishing catch rate: 77.46%
Project Honey Pot SC rate: 99.71%
Abusix SC rate: 99.50%
Newsletters FP rate: 0.0%
Spamhaus rsync
SC rate: 99.23%
FP rate: 0.00%
Final score: 99.23
Malware catch rate: 87.41%
Phishing catch rate: 70.50%
Project Honey Pot SC rate: 99.24%
Abusix SC rate: 99.23%
Newsletters FP rate: 0.0%
ZEROSPAM
SC rate: 99.93%
FP rate: 0.12%
Final score: 99.27
Malware catch rate: 99.26%
Phishing catch rate: 98.56%
Project Honey Pot SC rate: 99.96%
Abusix SC rate: 99.93%
Newsletters FP rate: 1.6%
Abusix Mail Intelligence
SC rate: 99.52%
FP rate: 0.09%
Final score: 99.08
Malware catch rate: 84.44%
Phishing catch rate: 86.09%
Project Honey Pot SC rate: 98.31%
Abusix SC rate: 99.72%
Newsletters FP rate: 0.0%
IBM X-Force Combined
SC rate: 98.28%
FP rate: 0.00%
Final score: 98.28
Malware catch rate: 88.15%
Phishing catch rate: 83.45%
Project Honey Pot SC rate: 98.93%
Abusix SC rate: 98.17%
Newsletters FP rate: 0.0%
IBM X-Force IP
SC rate: 97.76%
FP rate: 0.00%
Final score: 97.76
Malware catch rate: 88.15%
Phishing catch rate: 80.82%
Project Honey Pot SC rate: 97.45%
Abusix SC rate: 97.81%
Newsletters FP rate: 0.0%
IBM X-Force URL
SC rate: 65.62%
FP rate: 0.00%
Final score: 65.62
Malware catch rate: 2.96%
Phishing catch rate: 47.96%
Project Honey Pot SC rate: 92.75%
Abusix SC rate: 61.01%
Newsletters FP rate: 0.0%
Kaspersky DNSBL
SC rate: 88.18%
FP rate: 0.00%
Final score: 88.18
Malware catch rate: 56.30%
Phishing catch rate: 53.96%
Project Honey Pot SC rate: 95.24%
Abusix SC rate: 86.98%
Newsletters FP rate: 0.0%
Product | True negatives | False positives | FP rate | False negatives | True positives | SC rate | Final score | VBSpam | |
Axway | 5681 | 0 | 0.00% | 737 | 267213.8 | 99.72% | 99.71 | |
Bitdefender | 5681 | 0 | 0.00% | 108.6 | 267842.2 | 99.96% | 99.96 | |
ESET | 5681 | 0 | 0.00% | 13.4 | 267937.4 | 99.99% | 99.99 | |
FortiMail | 5681 | 0 | 0.00% | 78.4 | 267872.4 | 99.97% | 99.94 | |
IBM | 5681 | 0 | 0.00% | 287.6 | 267663.2 | 99.89% | 99.89 | |
Janusmail | 5678 | 3 | 0.05% | 231.8 | 267719 | 99.91% | 99.60 | |
Libra Esva | 5680 | 1 | 0.02% | 82 | 267868.8 | 99.97% | 99.86 | |
Safemail | 5681 | 0 | 0.00% | 143.6 | 267807.2 | 99.95% | 99.95 | |
Spamhaus DQS | 5678 | 3 | 0.05% | 1245.6 | 266705.2 | 99.54% | 99.27 | |
Spamhaus rsync | 5681 | 0 | 0.00% | 2061.4 | 265889.4 | 99.23% | 99.23 | |
ZEROSPAM | 5674 | 7 | 0.12% | 174.4 | 267776.4 | 99.93% | 99.27 | |
Abusix Mail Intelligence* | 5676 | 5 | 0.09% | 1295.8 | 266655 | 99.52% | 99.08 | N/A |
IBM X-Force Combined* | 5681 | 0 | 0.00% | 4596.8 | 263354 | 98.28% | 98.28 | N/A |
IBM X-Force IP* | 5681 | 0 | 0.00% | 5996.8 | 261954 | 97.76% | 97.76 | N/A |
IBM X-Force URL* | 5681 | 0 | 0.00% | 92134.6 | 175816.2 | 65.62% | 65.62 | N/A |
Kaspersky DNSBL* | 5681 | 0 | 0.00% | 31670.4 | 236280.4 | 88.18% | 88.18 | N/A |
*These products are partial solutions and their performance should not be compared with that of other products.
(Please refer to the text for full product names and details.)
Product | Newsletters: false positives | Newsletters: FP rate | Malware: false negatives | Malware: SC rate | Phishing: false negatives | Phishing: SC rate | Project Honey Pot: false negatives | Project Honey Pot: SC rate | Abusix: false negatives | Abusix: SC rate | STDev† |
Axway | 1 | 0.5% | 9 | 93.33% | 9 | 97.84% | 120.2 | 99.69% | 619.2 | 99.73% | 0.51 |
Bitdefender | 0 | 0.0% | 0 | 100.00% | 5 | 98.80% | 1 | 99.997% | 110 | 99.95% | 0.17 |
ESET | 0 | 0.0% | 1 | 99.26% | 2 | 99.52% | 0 | 100.00% | 13.4 | 99.99% | 0.05 |
FortiMail | 2 | 1.1% | 0 | 100.00% | 9 | 97.84% | 2 | 99.99% | 76.4 | 99.97% | 0.08 |
IBM | 0 | 0.0% | 0 | 100.00% | 10 | 97.60% | 10.2 | 99.97% | 279.8 | 99.88% | 0.37 |
Janusmail | 3 | 1.6% | 1 | 99.26% | 6 | 98.56% | 117.8 | 99.70% | 114 | 99.95% | 0.21 |
Libra Esva | 1 | 0.5% | 1 | 99.26% | 2 | 99.52% | 1.2 | 99.997% | 80.8 | 99.96% | 0.12 |
Safemail | 0 | 0.0% | 1 | 99.26% | 3 | 99.28% | 14.2 | 99.96% | 131.8 | 99.94% | 0.2 |
Spamhaus DQS | 0 | 0.0% | 16 | 88.15% | 94 | 77.46% | 111.6 | 99.71% | 1134 | 99.50% | 0.61 |
Spamhaus rsync | 0 | 0.0% | 17 | 87.41% | 123 | 70.50% | 294 | 99.24% | 1767.4 | 99.23% | 0.74 |
ZEROSPAM | 3 | 1.6% | 1 | 99.26% | 6 | 98.56% | 15.2 | 99.96% | 159.2 | 99.93% | 0.22 |
Abusix Mail Intelligence* | 0 | 0.0% | 21 | 84.44% | 58 | 86.09% | 658 | 98.31% | 637.8 | 99.72% | 1.7 |
IBM X-Force Combined* | 0 | 0.0% | 16 | 88.15% | 69 | 83.45% | 416.6 | 98.93% | 4182.6 | 98.17% | 1.42 |
IBM X-Force IP* | 0 | 0.0% | 16 | 88.15% | 80 | 80.82% | 991.8 | 97.45% | 5007.4 | 97.81% | 1.64 |
IBM X-Force URL* | 0 | 0.0% | 131 | 2.96% | 217 | 47.96% | 2821.6 | 92.75% | 89315.4 | 61.01% | 15.59 |
Kaspersky DNSBL* | 0 | 0.0% | 59 | 56.30% | 192 | 53.96% | 1852.6 | 95.24% | 29820.2 | 86.98% | 5.94 |
*These products are partial solutions and their performance should not be compared with that of other products. None of the queries to the IP blacklists included any information on the attachments; hence their performance on the malware corpus is added purely for information.
† The standard deviation of a product is calculated using the set of its hourly spam catch rates.
(Please refer to the text for full product names and details.)
Speed
[Colour-coded delivery-speed table: for each full solution, delivery times at the 10th, 50th, 95th and 98th percentiles are shown as colours rather than numbers and cannot be reproduced here. Legend: green = 0-30 seconds; yellow = 30 seconds to two minutes; orange = two minutes to 10 minutes; red = more than 10 minutes.]
(Please refer to the text for full product names.)
Products ranked by final score | |
ESET | 99.99 |
Bitdefender | 99.96 |
Safemail | 99.95 |
FortiMail | 99.94 |
IBM | 99.89 |
Libra Esva | 99.86 |
Axway | 99.71 |
Janusmail | 99.60 |
Spamhaus DQS | 99.27 |
ZEROSPAM | 99.27 |
Spamhaus rsync | 99.23 |
(Please refer to the text for full product names and details.)
Hosted solutions | Anti-malware | IPv6 | DKIM | SPF | DMARC | Multiple MX-records | Multiple locations |
Janusmail | ClamAV; others optional | √ | √ | √ | √ | √ | √ |
Safemail | ClamAV; proprietary | √ | √ | √ | √ | √ | √ |
ZEROSPAM | ClamAV | √ | √ | √ |
(Please refer to the text for full product names.)
Local solutions | Anti-malware | IPv6 | DKIM | SPF | DMARC | Interface | |||
CLI | GUI | Web GUI | API | ||||||
Axway | Kaspersky, McAfee | √ | √ | √ | √ | ||||
Bitdefender | Bitdefender | √ | √ | √ | √ | ||||
ESET | ESET Threatsense | √ | √ | √ | √ | √ | √ | ||
FortiMail | Fortinet | √ | √ | √ | √ | √ | √ | √ | |
IBM | Sophos; IBM Remote Malware Detection | √ | √ | √ | |||||
Libra Esva | ClamAV; others optional | √ | √ | √ | √ | ||||
Spamhaus DQS | Optional | √ | √ | √ | √ | ||||
Spamhaus rsync | Optional | √ | √ | √ | √ |
(Please refer to the text for full product names.)
(Please refer to the text for full product names.)
The full VBSpam test methodology can be found at https://www.virusbulletin.com/testing/vbspam/vbspam-methodology/.
The test ran for 18 days, from 12am on 17 August to 12am on 4 September 2019 [1].
The test corpus consisted of 273,992 emails. 268,126 of these were spam, 38,928 of which were provided by Project Honey Pot, with the remaining 229,198 spam emails provided by Abusix. There were 5,681 legitimate emails ('ham') and 185 newsletters, a category that includes various kinds of commercial and non-commercial opt-in mailings.
219 emails in the spam corpus were considered 'unwanted' (see the June 2018 report [2]) and were included with a weight of 0.2; this explains the non-integer numbers in some of the tables.
Moreover, 135 emails from the spam corpus were found to contain a malicious attachment while 417 contained a link to a phishing or malware site; though we report separate performance metrics on these corpora, it should be noted that these emails were also counted as part of the spam corpus.
Emails were sent to the products in real time and in parallel. Though products received each email from a fixed IP address, all products had been set up to read the original sender's IP address as well as the EHLO/HELO domain sent during the SMTP transaction, either from the email headers or through an optional XCLIENT SMTP command [3].
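How the identity override might be passed on can be sketched as follows. XCLIENT (a Postfix SMTP extension) is issued before MAIL FROM and tells the receiving MTA to treat the supplied ADDR, NAME and HELO attributes as if the original client had connected directly; `[UNAVAILABLE]` marks an attribute the relay cannot supply. The addresses and hostnames below are illustrative only:

```python
def build_xclient(addr: str, helo: str, name: str = "[UNAVAILABLE]") -> str:
    """Build an XCLIENT command overriding the apparent client identity."""
    return f"XCLIENT ADDR={addr} NAME={name} HELO={helo}"

# With Python's smtplib this could be issued on an open session as, e.g.:
#   smtp.docmd(*build_xclient("192.0.2.54", "mail.sender.example").split(" ", 1))
print(build_xclient("192.0.2.54", "mail.sender.example"))
```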
The products hosted in our lab all ran as virtual machines on a VMware ESXi cluster. As different products have different hardware requirements – not to mention those running on their own hardware, or those running in the cloud – there is little point in comparing the memory or processing power the products were provided with; we followed the developers' requirements and note that the amount of email we receive is representative of that received by a small organization.
Although we stress that different customers have different needs and priorities, and thus different preferences when it comes to the ideal ratio of false positives to false negatives, we created a one-dimensional 'final score' to compare products. This is defined as the spam catch (SC) rate minus five times the weighted false positive (WFP) rate. The WFP rate is defined as the false positive rate of the ham and newsletter corpora taken together, with emails from the latter corpus having a weight of 0.2:
WFP rate = (#ham false positives + 0.2 x min(#newsletter false positives, 0.2 x #newsletters)) / (#ham + 0.2 x #newsletters)
while in the spam catch rate (SC), emails considered ‘unwanted’ (see above) are included with a weight of 0.2.
The final score is then defined as:
Final score = SC - (5 x WFP)
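The two formulas above can be combined into a short sketch (the function name and signature are ours; using Janusmail's rounded numbers from this test, it reproduces the published final score):

```python
def final_score(sc_rate_pct: float, ham_fps: int, ham_total: int,
                nl_fps: int, nl_total: int) -> float:
    """Final score = SC rate - 5 x WFP rate, both expressed in per cent.

    Newsletter false positives carry a weight of 0.2, and their
    contribution is capped at 0.2 x the size of the newsletter corpus.
    """
    wfp = (ham_fps + 0.2 * min(nl_fps, 0.2 * nl_total)) \
          / (ham_total + 0.2 * nl_total)
    return sc_rate_pct - 5 * (wfp * 100)

# Janusmail in this test: SC rate 99.91%, 3 ham false positives out of
# 5,681, 3 newsletter false positives out of 185.
print(f"{final_score(99.91, 3, 5681, 3, 185):.2f}")  # prints 99.60
```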
In addition, for each product, we measure how long it takes to deliver emails from the ham corpus (excluding false positives) and, after ordering these emails by this time, we colour-code the emails at the 10th, 50th, 95th and 98th percentiles:
green = up to 30 seconds
yellow = 30 seconds to two minutes
orange = two to ten minutes
red = more than ten minutes
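The banding can be expressed as a small helper (a sketch; the function name is ours):

```python
def speed_colour(delivery_seconds: float) -> str:
    """Map a delivery time to the colour band used in the speed table."""
    if delivery_seconds <= 30:
        return "green"    # up to 30 seconds
    if delivery_seconds <= 120:
        return "yellow"   # 30 seconds to two minutes
    if delivery_seconds <= 600:
        return "orange"   # two to ten minutes
    return "red"          # more than ten minutes
```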
Products earn VBSpam certification if the value of the final score is at least 98 and the ‘delivery speed colours’ at 10 and 50 per cent are green or yellow and that at 95 per cent is green, yellow or orange.
Meanwhile, products that combine a spam catch rate of 99.5% or higher with a lack of false positives, no more than 2.5% false positives among the newsletters and ‘delivery speed colours’ of green at 10 and 50 per cent and green or yellow at 95 and 98 per cent earn a VBSpam+ award.
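The two sets of criteria can be sketched as follows, assuming the colour at each measured percentile is already known (the percentile-to-colour mapping itself comes from the delivery-time measurement described above; the function name and signature are our own):

```python
def vbspam_award(final_score: float, colours: dict,
                 sc_rate: float, ham_fps: int,
                 newsletter_fp_rate: float):
    """Return 'VBSpam+', 'VBSpam' or None for a full solution.

    `colours` maps each percentile (10, 50, 95, 98) to the
    delivery-speed colour at that percentile; rates are in per cent.
    """
    # VBSpam+: SC rate >= 99.5%, no ham false positives, at most 2.5%
    # newsletter FPs, green at the 10th and 50th percentiles, and
    # green or yellow at the 95th and 98th.
    if (sc_rate >= 99.5 and ham_fps == 0 and newsletter_fp_rate <= 2.5
            and colours[10] == "green" and colours[50] == "green"
            and colours[95] in ("green", "yellow")
            and colours[98] in ("green", "yellow")):
        return "VBSpam+"
    # VBSpam: final score >= 98, green or yellow at the 10th and 50th
    # percentiles, and green, yellow or orange at the 95th.
    if (final_score >= 98
            and colours[10] in ("green", "yellow")
            and colours[50] in ("green", "yellow")
            and colours[95] in ("green", "yellow", "orange")):
        return "VBSpam"
    return None
```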
[1] Due to a hardware issue, we decided to run the test for two days longer than usual.
[2] https://www.virusbulletin.com/virusbulletin/2018/06/vbspam-comparative-review