Starting this month, Malwarebytes began participating in the comparison test of antivirus software for Windows performed by AV-Test.org. This is uncharted territory for us, as we have refrained from participating in these types of tests since our inception. Although recent testing results show Malwarebytes protecting against more than 97 percent of web vector threats and detecting and removing 99.5 percent of malware during a scan on any machine, we still maintain reservations about the entire testing process.
Why participate now?
In the past, we’ve avoided AV comparison tests because we felt their methods did not allow us to demonstrate how our product works in a real environment. By testing only a small portion of our product’s technologies, AV comparison tests are often unable to replicate Malwarebytes’ overall effectiveness. However, we understand the importance of independent reviews for those considering a Malwarebytes purchase, so we decided to participate.
Malwarebytes is not a traditional antivirus, and detecting files based on signatures, which is what the testing companies review, is only one of the methods we use to protect our customers from threats. We will probably never be the best performer in this category; it simply isn’t our focus. We rely mostly on other methods, such as hardening, application behavior, and vector-blocking defenses that disrupt malware earlier in the attack chain.
What did the test miss?
Some of our best technologies block malware before it has the chance to execute. Our application behavior and web protection modules, for example, stop threats earlier in the attack—at the point of delivery instead of the point of execution. However, the URLs tested only represent the final stage of an attack (i.e. the URL pointing to the final payload EXE).
In addition, testers often do not replicate the original infection vector used by malware campaigns, such as malspam, exploits, or redirects. Instead, they download the malware directly, bypassing typical delivery methods. By doing this, they’re controlling the environment, but also missing out on the trigger for many of our detections.
What exactly is checked in these monthly AV-Test.org tests?
- Detections, in two forms:
  - Detection of URLs pointing directly to malware EXEs (i.e. “web and email threats” test)
  - On-demand scan of a directory full of malware EXEs (i.e. “widespread and prevalent malware” test)
- Performance impact, such as browsing slowdown, application load slowdown, slowdown of file copy operations, etc.
- Usability test, with focus on false positives
Unsolicited tests
A number of times in the past, Malwarebytes has been included in tests that we were not aware of or in which we didn’t choose to participate. Some even compared our free, limited scanner against fully functional AVs. No surprises there: while the other vendors may have scored higher in their detections, our free scanner still outperformed them in remediation and removal.
Change the tests
If the tests miss our best protection modules, you would expect us to try to change the testing methods altogether, right? We did look into this, and it’s not entirely off the table. We are confident that using live malware or replicating real-life attacks would demonstrate our strengths, but these conditions are hard to reproduce in a controlled and equal testing environment.
What we would like to see is a test of zero-day effectiveness, rather than one based on relatively old samples and infection vectors. But again, we understand that this is hard to achieve for a testing organization that needs to maintain control over the environment in order to create a level playing field.
When and where can we expect to see your test results?
As of November 27, 2018, AV-Test.org will include results for our flagship consumer product, Malwarebytes for Windows versions 3.5 and 3.6. AV-Test.org publishes their results publicly every two months. The November 2018 results are the summary of tests performed during September and October. Our participation is only in the “Windows Antivirus” test for home users.
We still do not believe in the “pay-to-play” model, and especially the “pay-to-see-what-you-missed” model that some organizations use. (AV companies, for an additional fee, can see the samples they did not catch in the test and develop fixes in the product for future tests/use.) Nonetheless, we want to give our customers some idea of what we are capable of, even when the playing field is skewed.
We would just like you to keep in mind that, when reviewing our scores, these tests only show part of the whole picture. Many of our best protection modules have been left out of the test entirely, which means the results miss much of what Malwarebytes is truly capable of.
So what would you rather have: a product that does well on AV tests, or a product that detects, blocks, and cleans up threats in the real world?