18th November 2014, 06:53 #1
VB100 comparative review on Windows 8.1
"2014-11-14 John Hawes
The VB test team put 14 corporate products and 34 consumer products through their paces on Windows 8.1 "
VB100 comparative review on Windows 8.1: https://www.virusbtn.com/virusbullet...tive#id4258595
Direct-link to report (PDF): https://www.virusbtn.com/pdf/magazin...omparative.pdf
18th November 2014, 15:54 #2
Among the known names only Norman Security Suite and PC Pitstop PC Matic Home Security appear to have flunked the test...
18th November 2014, 21:21 #3
Hi Andy!
Look at the RAP-quadrant, almost at the bottom of the report.
Kaspersky are low on proactive detection.
Outpost are low on both reactive & proactive.
Both of them usually perform better.
Summary information about RAP testing
"The unique RAP (Reactive and Proactive) tests measure detection rates using the freshest samples available at the time products are submitted to the test, as well as samples not seen until after product databases are frozen.
This provides a measure of both the vendors' ability to handle newly emerging malware,
and their accuracy in detecting previously unknown malware.
The four-test RAP averages quadrant allows at-a-glance comparison of detection rates by these criteria."
"The RAP tests are run according to the following procedures:
RAP samples are split into four sets. The set known as 'week +1' is gathered in the period from one to seven days after the product submission deadline.
The 'week -1' set covers the deadline day itself and the six previous days.
The 'week -2' set includes samples gathered eight to 14 days before the deadline,
and the 'week -3' set consists of samples gathered 15 to 21 days before the deadline."
"For each product entered for a review, we measure detection using our standard on-demand scanning procedure;
this uses default product settings and ignores detections labelled as 'suspicious' only.
Scores used in the per-test RAP quadrants are labelled 'Proactive' (the 'week +1' score),
and 'Reactive' (the average of the scores for weeks -1, -2 and -3).
Scores used in the four-test RAP averages quadrant are the averages of each score over the last four tests.
In the per-test quadrants, products with false positives in the test in question are marked by striking through the product identifier.
For the four-test RAP averages quadrant, such scores are excluded when calculating averages."
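The scoring procedure quoted above boils down to a simple calculation. Here is a rough sketch of it in Python (the function names, sample numbers, and data layout are invented for illustration; this is not Virus Bulletin's actual code):

```python
# Sketch of RAP score derivation, following the quoted procedure:
# Proactive = 'week +1' detection rate;
# Reactive  = average of 'week -1', 'week -2' and 'week -3' rates;
# four-test averages exclude tests where the product had false positives.

def rap_scores(weekly_rates):
    """weekly_rates: dict mapping set name -> detection rate (0.0-1.0)."""
    proactive = weekly_rates["week +1"]
    reactive = (weekly_rates["week -1"]
                + weekly_rates["week -2"]
                + weekly_rates["week -3"]) / 3
    return reactive, proactive

def four_test_average(scores, had_false_positives):
    """Average a score over the last four tests, skipping tests
    in which the product generated false positives."""
    valid = [s for s, fp in zip(scores, had_false_positives) if not fp]
    return sum(valid) / len(valid) if valid else None

# Example: strong reactive detection, weaker proactive detection.
rates = {"week -3": 0.95, "week -2": 0.93, "week -1": 0.90, "week +1": 0.78}
reactive, proactive = rap_scores(rates)
print(f"Reactive: {reactive:.2%}, Proactive: {proactive:.2%}")

# Four tests of reactive scores; the third test had false positives.
avg = four_test_average([0.92, 0.93, 0.88, 0.91],
                        [False, False, True, False])
print(f"Four-test reactive average: {avg:.2%}")
```

A product that scores high on both the reactive and proactive measures ends up in the upper-right of the quadrant.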
Full details of the RAP scheme: here
The X-axis (horizontal) is detection of "new" malware ("0-day"), and the Y-axis (vertical) is detection of "old" malware.
So a good antimalware-program should be at the upper-right corner.
Last edited by hackerman1; 18th November 2014 at 21:48.
19th November 2014, 03:18 #4
The report is huge! Btw, which products, according to the report, appear to be among the best performers?
19th November 2014, 06:02 #5
Yes, it's a lot to read.
I haven't had time yet, I only read the part about my antimalware program, Emsisoft Anti-Malware (EAM)....
But as I said above: "a good antimalware-program should be at the upper-right corner."
Take a look at the RAP-image....
20th November 2014, 11:45 #6
The following vendors achieved a VB100 award: Agnitum, Arcabit, Avetix, AVG, Avira, Bitdefender, BluePex, BullGuard, Check Point, Defenx, Emsisoft, eScan, ESET, ESTsoft, Faronics, Fortinet, G Data, Ikarus, iSheriff, K7, Kaspersky Lab, Kingsoft, Kromtech, Lavasoft, Microsoft, Optimal Software, Panda, Qihoo 360, Quick Heal, Roboscan, Tencent, ThreatTrack, Total Defense, TrustPort, ULIS and Wontok.
Eight products, including Avast, Norman, PC Pitstop and CYREN Command, did not get a VB100 award.
20th November 2014, 20:12 #7
You don't have to read the report to find out which antimalware programs achieved a VB100.
For just a quick overview take a look at the summary: https://www.virusbtn.com/vb100/archive/summary