
Thread: VB100 comparative review on Windows 8.1

  1. #1
    hackerman1 (Moderator)

    VB100 comparative review on Windows 8.1

    "2014-11-14 John Hawes
    Virus Bulletin

    The VB test team put 14 corporate products and 34 consumer products through their paces on Windows 8.1 "

    VB100 comparative review on Windows 8.1: https://www.virusbtn.com/virusbullet...tive#id4258595

    Direct-link to report (PDF): https://www.virusbtn.com/pdf/magazin...omparative.pdf

  2. #2
    HappyAndyK (Site Administrator)

    Among the well-known names, only Norman Security Suite and PC Pitstop PC Matic Home Security appear to have flunked the test...

  3. #3
    hackerman1 (Moderator)

    Hi Andy!

    Look at the RAP quadrant, near the bottom of the report.
    Kaspersky is low on proactive detection.
    Outpost is low on both reactive and proactive.
    Both of them usually perform better.

    http://www.virusbtn.com/virusbulleti...umer-large.jpg


    Summary information about RAP testing

    "The unique RAP (Reactive and Proactive) tests measure detection rates using the freshest samples available at the time products are submitted to the test, as well as samples not seen until after product databases are frozen.
    This provides a measure of both the vendors' ability to handle newly emerging malware,
    and their accuracy in detecting previously unknown malware.

    The four-test RAP averages quadrant allows at-a-glance comparison of detection rates by these criteria."

    "The RAP tests are run according to the following procedures:

    RAP samples are split into four sets. The set known as 'week +1' is gathered in the period from one to seven days after the product submission deadline.
    The 'week -1' set covers the deadline day itself and the six previous days.
    The 'week -2' set includes samples gathered eight to 14 days before the deadline,
    and the 'week -3' set consists of samples gathered 15 to 21 days before the deadline."

    "For each product entered for a review, we measure detection using our standard on-demand scanning procedure;
    this uses default product settings and ignores detections labelled as 'suspicious' only.
    Scores used in the per-test RAP quadrants are labelled 'Proactive' (the 'week +1' score),
    and 'Reactive' (the average of the scores for weeks -1, -2 and -3).
    Scores used in the four-test RAP averages quadrant are the averages of each score over the last four tests.
    In the per-test quadrants, products with false positives in the test in question are marked by striking through the product identifier.
    For the four-test RAP averages quadrant, such scores are excluded when calculating averages."
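    To make the arithmetic concrete, here is a minimal sketch in Python of how those scores combine. The function names and all of the detection rates below are made up for illustration; they are not from the report.

    # Sketch of the RAP score arithmetic described above.
    # Detection rates are fractions (0.0 - 1.0) per sample set;
    # every number here is invented for illustration only.

    def rap_scores(week_minus3, week_minus2, week_minus1, week_plus1):
        """Per-test RAP scores: 'Reactive' is the average of the three
        pre-deadline weeks, 'Proactive' is the post-deadline week alone."""
        reactive = (week_minus3 + week_minus2 + week_minus1) / 3
        proactive = week_plus1
        return reactive, proactive

    def four_test_average(tests):
        """Four-test RAP averages: tests in which the product had a
        false positive are excluded from the averaging."""
        valid = [t for t in tests if not t["false_positive"]]
        reactive = sum(t["reactive"] for t in valid) / len(valid)
        proactive = sum(t["proactive"] for t in valid) / len(valid)
        return reactive, proactive

    # One hypothetical product across the last four comparatives.
    tests = [
        {"reactive": 0.92, "proactive": 0.85, "false_positive": False},
        {"reactive": 0.90, "proactive": 0.80, "false_positive": False},
        {"reactive": 0.88, "proactive": 0.78, "false_positive": True},  # excluded
        {"reactive": 0.93, "proactive": 0.84, "false_positive": False},
    ]
    print(rap_scores(0.95, 0.93, 0.90, 0.85))  # per-test (reactive, proactive)
    print(four_test_average(tests))            # point in the averages quadrant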

    Full details of the RAP scheme: here


    The X-axis (horizontal) shows detection of "new" malware ("0-day"), and the Y-axis (vertical) shows detection of "old" malware.
    So a good anti-malware program should be in the upper-right corner.

  4. #4
    HappyAndyK (Site Administrator)

    The report is huge! Btw, which ones, according to the report, appear to be among the best performers?

  5. #5
    hackerman1 (Moderator)

    Yes, it's a lot to read.
    I haven't had time yet, I have only read the part about my anti-malware program, Emsisoft Anti-Malware (EAM)...
    But as I said above: "a good anti-malware program should be in the upper-right corner."
    Take a look at the RAP image...

  6. #6
    HappyAndyK (Site Administrator)

    The following vendors achieved a VB100 award: Agnitum, Arcabit, Avetix, AVG, Avira, Bitdefender, BluePex, BullGuard, Check Point, Defenx, Emsisoft, eScan, ESET, ESTsoft, Faronics, Fortinet, G Data, Ikarus, iSheriff, K7, Kaspersky Lab, Kingsoft, Kromtech, Lavasoft, Microsoft, Optimal Software, Panda, Qihoo 360, Quick Heal, Roboscan, Tencent, ThreatTrack, Total Defense, TrustPort, ULIS and Wontok.

    Eight products, including Avast, Norman, PC Pitstop and CYREN Command, did not get a VB100.

  7. #7
    hackerman1 (Moderator)

    You don't have to read the report to find out which anti-malware programs achieved a VB100.
    For a quick overview, take a look at the summary: https://www.virusbtn.com/vb100/archive/summary
