Anti-Spam Vendor Analysis

November 2003
MiaVia, Inc.
Not for Distribution
Table of Contents

The Players
Anti-Spam Solution Deployment Methods
Customer Acquisition
Partnerships
Spam Identification Methods
Filtering Accuracy Comparisons
Message Processing Speed
Other Anti-Spam Solution Features
Pricing
Appendix 1: Solution Deployment Methods of Anti-Spam Vendors
Appendix 2: List of Customers Mentioned by AS Vendor
Appendix 3: Spam Identification Methods by Vendor
Appendix 4: Spam Filtering Accuracy Test Results
Anti-Spam Vendor Analysis, November 2003
The Players
Existing anti-spam solution providers include software giants such as Microsoft (embedding anti-spam features in their Exchange server products) and major anti-virus firms that have added new and richer anti-spam functions to complement their anti-virus offerings.
Leading anti-spam specialty firms include Brightmail, Postini and MessageLabs.
There are eight well-financed newcomers who received venture capital financing in 2003.
Beyond these twenty or so players, there are at least several dozen lower-tier companies, including anti-spam specialty firms and other software providers who have added some form of anti-spam solution to their product line. These include a few companies that have built proprietary offerings around SpamAssassin, the highly prevalent open-source solution.
Email server companies, such as Sendmail, Mirapoint, Rockliffe, Stalker and others have partnered with
various anti-spam vendors and have embedded those offerings as part of their email server suite.
Some of the more unconventional anti-spam approaches include trusted/bonded email sender programs,
such as those offered by Habeas and Ironport, which provide information to filters rather than serving as
filtering systems.
Software Giants

IBM Corporation (Lotus/Domino)
Microsoft
Network Associates, Inc.
Sophos (ActiveState)
Symantec Corporation
Trend Micro, Inc.
The major email server and anti-virus vendors include some form of anti-spam functions in their products or as add-on products. Generally speaking, the biggest of these offer the weakest solutions. Symantec
owns a significant interest in Brightmail and therefore may have less incentive than other powerhouses to
invest directly in its own anti-spam technology. Trend Micro has licensed Postini’s engine, while Network
Associates and Sophos have both acquired third party solutions (Deersoft and ActiveState, respectively).
While Microsoft will provide support for third-party solutions, it can be expected to compete vigorously in
the anti-spam arena and is reputed to have filed as many as 40 patents in the field. Network Associates,
having acquired the SpamAssassin development team through its purchase of Deersoft, has avowed that
it will continue investing in the space, through internal means and possibly further acquisitions.
Major Anti-Spam Incumbents

Brightmail, Inc.
CipherTrust
Clearswift
MessageLabs
Postini Corporation
SurfControl plc
Tumbleweed Communications
Vircom, Inc.
Zix Corp.
The clear leaders in terms of market presence among the anti-spam specialists are Brightmail, Postini and MessageLabs. Each has reported impressive sales growth, and Postini recently raised $10 million in Series D venture financing.
Well-Financed Newcomers
The spam problem has attracted numerous new challengers to the incumbent players. Eight new anti-spam companies have received a combined total of nearly $50 million in investment in 2003, on top of the $10 million raised by incumbent Postini this year.
Anti-Spam Venture Funding

Newcomers:

Cloudmark: $4.5 million Series A (7/2003). Prior funding: $1.1 million angel round (2001). Investors: Ignition Partners.

Commtouch, Inc. (CTCH): $1.6 million private placement. Prior funding: IPO. Investors: Israel Seed Partners, Argos Capital Management, a private individual investor.

Corvigo: $5.5 million Series A. Investor: Sequoia Capital.

FrontBridge Technologies: $8.0 million Series C (8/2003). Prior funding: $6.5 million Series B (12/2002). Investors: Bank of America Venture Partners, Sierra Ventures, Waitte Ventures.

MailFrontier, Inc.: $10.0 million Series B (8/2003). Prior funding: $5 million. Investors: Menlo Ventures, New Enterprise Associates.

MessageGate: $5 million Series A (8/2003). Investors: Boeing Ventures, Polaris Ventures, Northwest Venture Associates, Newbury Ventures.

MX Logic Inc.: $5.5 million Series A (10/2003). Investors: Adams Street Partners, Grayhawk Venture Partners, Vista Ventures.

Proofpoint Inc.: $9.0 million Series B (7/2003). Prior funding: $7 million Series A. Investors: RRE Ventures, Benchmark Capital, Mohr Davidow Ventures, inVentures Group, Stanford University.

Total, newcomers: $49.1 million in 2003; $19.6 million prior.

Incumbent Financing:

Postini: $10 million Series D (9/2003). Investors: Bessemer Venture Partners, August Capital, Mobius Venture Capital, Sun Microsystems, Summit Accelerator Partners, AltoTech Ventures, Pacifica Fund.

Brightmail: no new funding in 2003. Prior funding: $35 million. Investors: Accel Partners, Crosslink Capital, Technology Crossover Ventures, Symantec.
Other Anti-Spam Vendors
There are probably many other anti-spam companies in addition to the firms listed below. A few of those
listed have licensed an AS solution of one of the above vendors, as noted below. Some of the vendors
are anti-spam specialists, while others offer other products. A few have assembled proprietary AS
solutions built on SpamAssassin, indicated in the table below by asterisks.
Vendor
Products/Services
Anti-Spam Specialists
* Barracuda Networks
AV/AS appliance; AS based on SpamAssassin
Block All Spam
Email hosting & AS service
ePrivacy Group
Traffic throttling appliance for AS
Eridani Star System
UK-based AS specialist
Mail-Filters.com
AS software and service, signature-based
Mailshell
OEM AS supplier
Roaring Penguin Software Inc.
Open source and commercial AS
Sendio
Appliance version of Mailblocks challenge/response
Singlefin
AV/AS/attachment/content (service & appliance)
* SpamCop
The Titan Key
Client AS based on SpamAssassin
Challenge response system
Vendors Offering Multiple Products, Including Anti-Spam
Aladdin Knowledge Systems Ltd.
Public co. Content & software security products
* Alien Camel Pty Ltd
Anti-virus; anti-spam based on SpamAssassin
* Barbedwire Technologies
Network security; anti-spam based on SpamAssassin
Bluecat Networks
Appliances for managing DNS, AV, AS
Cobion
German Web content filtering & AS
Computer Mail Services, Inc.
AV/AS & email content security solutions
Easylink Services Corporation
AV/AS/content filtering; large line of other products
* Fortress Systems Ltd.
AV/AS software; AS based on SpamAssassin
GFI Software Ltd.
Security and messaging software
Gordano Ltd
Messaging software
* Inova
Brazilian sub. Int’l co. AS based on SpamAssassin
IntelliReach
Email management solutions
Ipswitch, Inc.
Messaging, network monitoring, file transfer software
Lightspeed Systems, Inc.
Network control, AS, porn filtering, file share control
* Lyris
Email marketing/AS; AS based on SpamAssassin
NEMX
AV/AS/secure content management for MS Exchange
NetIQ Corporation
Systems/security management, web analytics
Net-Sieve
AV/AS/content filtering
Nokia, Inc.
Diversified mobile communications conglomerate
Solid Oak Software
Client-based web content filtering
Sunbelt Software
Diverse line of Windows software tools
Sybari Software Inc.
AV/AS email security
Webwasher AG
Web content filtering, Mailshell AS OEM
MiaVia, Inc.
Not for Distribution
3
Anti-Spam Vendor Analysis, November 2003
Anti-Spam Solution Deployment Methods
Anti-spam solutions began as client software applications but have evolved into server-based solutions offering greater control to system administrators and relieving end users of some or all of the burden of spam management. Three server-based approaches are employed:

- Server software deployed as an email gateway, relay, or MTA add-on at the customer site.
- An appliance: server software provided on a pre-configured device, which may then be inserted into the customer's network as an email gateway or relay in front of one or more mail servers.
- A hosted service that accepts the customer's email flow, performs filtering operations at the vendor's site, then forwards mail to customers according to its spam and virus evaluations and to customer preferences.
Although a few vendors offer more than one of these options, most restrict themselves to a single delivery method. About 70% of the vendors offer server software, 40% offer a hosted service and 30% offer an appliance. This distribution holds both for the roughly five dozen companies included in this report and for a selection of companies considered among the top twenty due to their market presence or funding:
Solution Deployment Methods of Major AS Vendors

Server Software: Brightmail, Proofpoint, Clearswift, Sophos (ActiveState), Cloudmark, SurfControl, Commtouch, Symantec Corporation, MailFrontier, Trend Micro, MessageGate, Tumbleweed, Network Associates, Vircom

Appliance: CipherTrust, Corvigo, Network Associates, Tumbleweed

Service: FrontBridge, MessageLabs, MX Logic, Postini Corporation
Customer Acquisition
According to an April 2003 report by Ferris Research,1 global business anti-spam seat deployment will
increase from 11 million in mid-2003 to over 500 million seats in 2008 with revenues from anti-spam
services topping $1 billion. Corporate anti-spam services will total about $55 million in 2003 and over
$850 million in 2008. At the same time, the global ISP anti-spam seat deployment will increase from
about 175 million in mid-2003 to nearly 1.2 billion seats in 2008. In terms of revenues, this translates to
$66 million in 2003 and over $200 million in 2008.
Various industry analysts have estimated that between 50% and 75% of enterprises have adopted some
form of anti-spam solution already. Judging from the types of customers listed by some of the anti-spam
vendors in this report, there is no sector or industry that is immune from spam as the types of companies
who have purchased AS solutions span all industry categories.
Some anti-spam vendors appear to be achieving significant market penetration. Appendix 2 provides a
list of customers mentioned by AS vendors on their web sites or in their press releases. Additionally, the
following vendors claim notable customer acquisition achievements:
Postini: "Over 1,700 customers"
Sunbelt Software: “Over 1,000 installations in 4 months.” Lists over 700 customers on its web site.
SurfControl: has more than 20,000 customers worldwide and signed up another 1,400 new customers in the first quarter, including the DTI and Norwich Union.
Brightmail: announced an additional 200 new enterprise customer wins in the third quarter of 2003.
MessageLabs: increased FY 2003 revenues nearly 100% over the previous year, with North American
market revenues alone increasing nearly 300% and Asia Pacific revenues increasing more than 200%.
The company expanded its client base to more than 7,000 businesses worldwide, signing new global and
enterprise clients including Andrews & Kurth, Arnold Worldwide, The Bank of New York, Bertelsmann,
BP, Diageo, EMI Music, ICI, Random House and StorageTek.
MX Logic: "more than 500 customers"
1 InternetNews.com, “Online Portals Won't Defeat Spam,” April 29, 2003, www.internetnews.com/xSP/article.php/2198791
Partnerships
Distribution and technology partnerships play a significant role in revenue generation for the more
established anti-spam vendors. MessageLabs, for example, reported that 60% of its revenues are derived
through resellers.
As seen in the list below, partnerships with anti-spam vendors are flourishing. Undoubtedly much of the venture capital recently infused into the AS sector will be used by fledgling anti-spam vendors to forge new partnerships.
Anti-Spam Vendor Partnerships

Brightmail, FrontBridge and Postini count among their partners: Borderware, Symantec, Ironport, CriticalPath, Openwave, Edox, IBM, Syntegra, Oracle, Sun, Sprint, Sophos, Cable & Wireless, AT&T, Trend Micro, McAfee, Cloudmark, ZoneLabs, Sendmail, Mail-Filters.com, Sybari, FilterLogix and Elron Software.

MessageLabs partners include BT, Cable & Wireless, Easynet GmbH and MCI.

Sophos partners include McAfee, Ositis and Rockliffe.

SurfControl partners include Microsoft, Check Point, Cisco, IBM and Nokia.
Spam Identification Methods
While anti-spam solutions can be compared on a variety of features, including how spam messages are
handled, what reporting and other administrative controls are provided, which operating systems are
supported, etc., the foremost feature is generally considered to be the level of filtering accuracy. Accuracy
is determined by the spam identification method. While this report will review spam accuracy
measurements in a subsequent section, this section summarizes the spam identification methods
employed by the various vendors. The information sources used to produce this review consisted
primarily of public information, such as vendor web sites, white papers and published articles.
The “Cocktail” Approach
In general, most vendors rely on multiple spam identification techniques in order to compensate for the
inability of any one technique to produce an adequate level of filtering by itself. For the 57 vendors
profiled in this report, the average number of spam identification methods used per vendor was
approximately six, and only five vendors employ fewer than four methods.
The techniques most frequently combined include:

- Blacklists (51 vendors)
- Whitelists (46 vendors)
- Message header attributes indicating spamming behavior (53 vendors)
- Keyword or key-phrase matching (47 vendors)
- Statistical model, usually Bayesian, but not always (41 vendors)
- Some form of digital “fingerprinting” (32 vendors)
Less frequently employed spam identification methods include:

- Comparing embedded links to a spam link library (7 vendors)
- Machine learning or artificial intelligence aimed at processing natural language, such as Corvigo’s “intent-based filtering” based on natural language processing, Proofpoint’s “machine learning algorithms” and SurfControl’s “neural network” approach
- Specific attached file types (Barracuda Networks, Symantec)
- Foreign language character sets (Sunbelt Software)
- Image file content (Aladdin Knowledge Systems)
- OCR text detection in image files (Cobion)
- Number of message recipients per message (Clearswift)
- Programming code detection (Cobion)
- Statistical analysis of header information (Commtouch)
- Number of user “spam” votes (MailFrontier)
- Challenge/response (Block All Spam, Easylink Services, Singlefin, The Titan Key)
Seven of the profiled vendors incorporate or base their anti-spam product on SpamAssassin: Alien Camel, Barbedwire Technologies, Barracuda Networks, Inova, Lyris, Roaring Penguin Software and SpamCop.
The ways in which various spam identification methods are combined into an anti-spam cocktail are not usually clear from each vendor’s product information and are often kept secret. In some cases an overall score is derived by assigning a weight to each method, while in others a series of tests is performed and any spam-signifying result may be taken as conclusive evidence that a message is spam.
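To illustrate the weighted-score variant, a cocktail filter might combine per-method verdicts as follows. This is a minimal sketch: the method names, weights and threshold are hypothetical, not values used by any vendor profiled here.

```python
# Hypothetical weighted-score "cocktail": each identification method
# contributes a result between 0 and 1, each result is weighted, and
# the weighted total is compared to a spam threshold.

WEIGHTS = {                      # illustrative weights only
    "blacklist_hit": 3.0,
    "header_anomaly": 1.5,
    "keyword_match": 1.0,
    "bayesian_score": 2.5,       # 0..1 output of the statistical model
    "fingerprint_match": 3.5,
}
THRESHOLD = 4.0                  # illustrative threshold

def cocktail_score(results: dict) -> float:
    """Combine per-method results into a single weighted score."""
    return sum(WEIGHTS[method] * value for method, value in results.items())

def is_spam(results: dict) -> bool:
    return cocktail_score(results) >= THRESHOLD

# Example: strong Bayesian signal plus a fingerprint match
verdicts = {"blacklist_hit": 0.0, "header_anomaly": 0.4,
            "keyword_match": 1.0, "bayesian_score": 0.9,
            "fingerprint_match": 1.0}
print(is_spam(verdicts))
```

The alternative design mentioned above (any single spam-signifying result is conclusive) would replace the weighted sum with a simple `any()` over the method results.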
Vendors Using Singular Spam Identification Methods

A few vendors have rejected the “cocktail” approach, or at least claim to do so, and have built systems using one or two spam identification methods:

- Cloudmark uses only a statistical model of spam messages.
- Corvigo uses a machine-learning approach, which can also be thought of as a model of spam based on natural language processing.
- Mail-Filters.com uses a spam signature database in combination with header analysis.
- Net-Sieve uses Bayesian analysis in combination with heuristics.
No vendor among the group studied in this report appears to rely completely on a message-body fingerprinting approach to spam identification, and none appears to subject each sample from which fingerprints are derived to a manual message inspection process.
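For contrast, a pure message-body fingerprinting check can be sketched as below. This is a simplified illustration, not any vendor’s implementation; production systems use fuzzy signatures that survive small mutations, whereas this sketch hashes exactly after basic normalization.

```python
import hashlib

def fingerprint(body: str) -> str:
    """Hash a normalized message body so trivial whitespace and
    case changes still map to the same fingerprint."""
    normalized = " ".join(body.lower().split())
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

# A library of fingerprints derived from known spam samples (hypothetical)
spam_library = {fingerprint("BUY   Cheap Meds Now!!!")}

def is_known_spam(body: str) -> bool:
    return fingerprint(body) in spam_library

print(is_known_spam("buy cheap meds now!!!"))   # matches after normalization
print(is_known_spam("Lunch at noon tomorrow?"))
```

The weakness noted throughout this report applies here: a spammer who mutates each message body even slightly defeats an exact-hash library, which is why fingerprinting is usually one ingredient in a cocktail rather than the whole filter.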
Topical Classification

Most anti-spam vendors provide a binary classification system that labels each message either “spam” or “not spam.” A few vendors attempt a more granular approach so that users can customize the operation of the filter to differing tastes. Postini, for example, grades messages according to broad categories such as “Adult,” “Offers too good to be true,” and “Make Money Fast.” Since all of these categories exemplify content most recipients wish to avoid, the categories are probably not terribly useful. Cobion also attempts to judge the categories of messages. Again, the categories are broad and tend to be of types universally disliked (such as pornography, hate and criminal content), so there is probably little value in such a classification scheme. Further, the categories to which messages belong are determined by analyzing passages of text, which can fail whenever messages do not contain text passages (such as image-based messages).
Filtering Accuracy Comparisons

With so many vendors to choose from, it is understandably difficult for potential customers to evaluate the different vendors, their methodologies and their claims. Short of trying products that seem like a fit, customers must rely on third-party tests to help them sort the good from the bad. Third-party tests suggest significant accuracy differences among competing solutions and point to a need for improvement both in identifying spam and in avoiding false positive errors.

Unfortunately it is quite difficult for third-party testers to submit anti-spam solutions to truly rigorous and meaningful tests. The challenge arises from possible biases involving sample messages, how the tested products are configured and how accuracy is measured. The biggest challenge is probably obtaining a good message sample set: how can a filter tester acquire enough messages, representing both spam and non-spam, to simulate actual operating conditions? If too few messages are used as samples, or if they are not sufficiently representative, then filter accuracy comparisons may be quite biased.
With these limitations in mind, this report reviews three anti-spam comparison tests that have been
performed during 2003: two tests conducted by PC Magazine and one test for Network World Fusion by
the consulting firm Opus One. In each case the results are somewhat suspect, particularly with respect to
the false positive rates, because relatively small sets of non-spam messages were used (fewer than
10,000 messages per test). The tests also used somewhat different methodologies and did not include
the same products in each test.
The three tests were:

Test 1: PC Magazine, 25-Feb-03, “Corporate Antispam Tools…”, http://www.pcmag.com/article2/0,4149,849390,00.asp. Tested 2,500 messages per day over several days during February 2003.

Test 2: Opus One (for Network World Fusion), “Buyer's Guide: Anti-spam Test: Spam in the Wild,” http://www.nwfusion.com/reviews/2003/0915spam.html. Tested 7,840 spam messages and 3,648 non-spam messages during June 2003.

Test 3: PC Magazine, Nov. 11, 2003, “Enterprise Spam Tools…”, http://www.pcmag.com/article2/0,4149,1357946,00.asp. Tested "4 days worth of messages from a catch-all account."

If the tests are considered indicative, the results indicate extremely wide variation in both false positive error rates and spam identification rates. The results also suggest substantial room for improvement with regard to filtering accuracy. This finding may help explain why a recent InfoWorld survey of 902 IT managers revealed that anti-spam software was considered the most over-hyped technology of the year.2

The highest spam detection rates across all three tests were produced by MessageLabs (96%) and Postini (94%); however, the false positive rates associated with these results ranged from 4 to 5 messages per thousand, and Postini’s false positive rate was measured at 13-14 messages per thousand in the other two tests.
Most of the notable anti-spam vendors’ tools were included in at least one of the three tests, but three of the companies that received VC backing in 2003 were not included in any of them (Commtouch, MessageGate and Proofpoint). Neither McAfee nor Symantec was included.
Only Postini appeared in all three tests, while Brightmail and Frontbridge appeared in both PC Magazine
tests. The table below presents a summary of the test results, while details from all three tests are
included in Appendix 4.
2 InfoWorld, July 25, 2003
Summary of Anti-Spam Accuracy Test Results*

Vendor                    False + %   Spam %   Comment
ActiveState (Sophos)      2.9%        80.5%    Test 2
Brightmail                0%          88.9%    Test 3; improved greatly over Test 1
CipherTrust               0.24%       94.1%    Test 1
Clearswift                2.3%        48.5%    Test 2
Cloudmark                 1.3%        82.0%    Test 2
Computer Mail Services    23.4%       83.4%    Test 2; over-reliance on keywords
Corvigo                   0.7%        77.9%    Test 2
EasyLink                  20.5%       23.1%    Test 2
Frontbridge               0.3%        62.1%    Test 3; test excluded use of blacklists
GFI                       56.3%       3.6%     Test 2
iHateSpam                 2.9%        74.4%    Test 3
MailFrontier              0.5%        93.0%    Test 3; improvement over Test 2
MessageLabs               0.48%       96.0%    Test 1
MX Logic                  0.5%        77.0%    Test 2
Postini                   1.4%        84.9%    Test 3; Test 2: 0.4% false +, 94.0% spam
Singlefin                 2.9%        86.2%    Test 2
SurfControl               3.3%        76.5%    Test 2; worsened from Test 1
Trend Micro               0.8%        60.3%    Test 3; improved over Test 2
Tumbleweed                1.2%        81.3%    Test 2

*The most recent test data are shown if a product appeared in more than one test. In the most recent test Postini’s results declined and Frontbridge’s spam detection rate fell. Full results for all three tests are in Appendix 4.
From the above data it is clear that some systems performed much better than others in terms of filtering
accuracy. The poorest results were noted among companies that are not anti-spam specialists,
suggesting a comparative advantage of specialization.
Since there are two dimensions to filter accuracy measurements, it is helpful to arrange the data graphically in an x-y chart showing how each vendor’s solution performed relative to others along axes representing false positive and false negative error rates. The chart below provides this view, after first eliminating the worst performers, whose results are poor enough that they would leave the better performers’ results too tightly clustered on a chart to discern their differences. Vendor results were excluded from the chart if their false negative rate exceeded 20% or their false positive rate exceeded 2%. The chart includes data from all three tests where the results met these thresholds, so in some cases more than one data point is plotted for a single vendor, indicating different results in different tests.
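The two error rates plotted in such a chart come from simple arithmetic on the test counts. The sketch below uses Test 2’s sample sizes (7,840 spam, 3,648 non-spam); the per-vendor catch and flag counts are hypothetical.

```python
# False positive rate = legitimate messages wrongly flagged / total legitimate.
# False negative rate = spam messages missed / total spam.

def error_rates(spam_total, spam_caught, ham_total, ham_flagged):
    """Return (false positive rate, false negative rate) from raw counts."""
    false_negative = (spam_total - spam_caught) / spam_total
    false_positive = ham_flagged / ham_total
    return false_positive, false_negative

# Hypothetical vendor result against Test 2's sample sizes
fp, fn = error_rates(spam_total=7840, spam_caught=6430,
                     ham_total=3648, ham_flagged=15)
print(f"false positive rate: {fp:.2%}")
print(f"false negative rate: {fn:.2%}")
```

Note the asymmetry of the denominators: with only 3,648 legitimate messages, a single extra wrongly flagged message moves the false positive rate by nearly 0.03%, which is one reason the report treats the published false positive figures as suspect.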
Of the solutions that performed within the 20% and 2% error thresholds, substantial accuracy differences are still apparent. Brightmail’s zero false positive error result is admirable, but came at a cost of misidentifying 11% of the spam messages. CipherTrust, Postini and MessageLabs appeared to offer the best overall compromise between false positives and false negatives, yet the tradeoff still requires a level of false positives that may be too high for many users (2.4 to 4.8 messages per thousand). It should be noted that variability across the different tests could cause significant biases, which may explain the substantially different results observed for Postini among the three tests.
The definition of “false positive” can be a significant factor influencing test results. Some bulk email items can be identified as spam but might not be considered spam by the intended recipient. As a test of this phenomenon, the author of this report signed up to receive a variety of opt-in email messages using a Hotmail address. (Hotmail’s email is filtered by Brightmail.) The subscriptions included general news (such as ABC News and MSNBC News), various IT publications and academic journals. After receiving over 4,000 of these messages, the result was that Brightmail marked approximately 25% of the opt-in bulk email messages as spam. This finding, while anecdotal, casts doubt on the accuracy of the false positive measurements reported in the PC Magazine and Network World Fusion tests.
Message Processing Speed
The Network World Fusion test also included measurements of message processing speed. The results
showed a significant range of speeds, which might be a decisive factor for some customers. The processing speed of software-based systems can be increased by augmenting the hardware on which they run, while appliance-based systems are more rigid. Filtering services, such as Postini’s, are a different matter, with speed determined by the vendor’s infrastructure and general Internet latency.
Anti-Spam Message Delivery Speed
(messages per second)

Trend Micro*              20
Cloudmark*                20
MailFrontier*             20
Tumbleweed                10
Corvigo                   7.25
Clearswift                6.7
Postini                   6
EasyLink                  3.8
MX Logic                  3.6
ActiveState               3.2
SurfControl               3
Vircom                    2.6
GFI                       2.4
Singlefin                 1.25
Computer Mail Services    0.5

*The test setup delivered messages at 20 per second. Products marked with an asterisk theoretically could deliver at higher speeds.
Other Anti-Spam Solution Features
The most important filtering features are undoubtedly filtering accuracy, speed and deployment method.
However, a variety of secondary features may influence purchase decisions. These additional features
are generally of an administrative nature or provide options for handling messages identified as spam.
There also are differences among filtering software vendors as to which operating systems are supported and whether software products are directly integrated with other products, such as mail servers or anti-virus systems. A detailed analysis of these aspects is beyond the scope of this report. The Network World Fusion buyers guide (http://www.nwfusion.com/bg/2003/spam/index.jsp) provides rich detail on these subjects.
Some of the more distinctive message handling and administrative features that were found in compiling
this report include the following:
Customization of spam identification for groups and individual end users, enabling users to adjust the
threshold used to identify spam.
End user quarantine for Exchange: Administrators can choose to quarantine spam on their Exchange
server in a spam folder that end users can view from Microsoft Outlook. The Exchange quarantine allows
end users to access the quarantine without having to authenticate to a web site or learn a new interface.
Emailed quarantine digests that enable end users to retrieve valid messages blocked by spam filters. The
digests provide end users with convenient access to their quarantined messages through their email
inbox, without administrative intervention or additional tools.
Logs and reports on email statistics: system administrators can view logs or generate reports at any time
from the desktop using a Web interface.
Log file rollover for easy archiving and management of log files.
Custom rules editors enable administrators to adapt filtering to handle a particular spam problem that is
not addressed by the filter’s default settings or capabilities. Whitelisting important sender domains, for
example, can provide a good measure of protection against false positive errors.
Error-reporting plug-in for Outlook: With the click of a button, users can submit missed spam (or false
positives) to the filter vendor for analysis. Administrators can receive a copy of all junk and false positive
submissions, providing insight into how effectively and accurately the solution is performing from an end-user perspective.
Graphical installation lets administrators choose which components to install, sets default settings and
guides the administrator through the registration process. The installer also supports uninstallation and
upgrading by installing a new version in place over an existing version.
Graphical administration console enables administrators to configure logging levels, create and view
reports, monitor performance statistics, start/stop services, provide automatic failover, and support remote
administration.
Fail-over enables multiple filter clients to fail over to multiple servers and keeps mail flowing in the event of complete filter failure.
Pricing
Pricing observations obtained for this report came from figures supplied by vendors themselves to
Network World Fusion for their vendor listings. There appears to be wide variation in pricing, although
some general patterns are evident:
Anti-spam server software tends to be priced on a per-user basis, between $10 and $30 per user per year, although some of the less prominent vendors offer flat-fee software pricing ranging from several hundred dollars up to $8,000 per CPU. NetIQ offers a hybrid price structure with a flat $2,000 fee plus a relatively modest $7.50 per user per year.
Anti-spam appliances also tend to be priced on a per-user basis, either directly or by selling units rated for different mail volumes and pricing them accordingly. Prices quoted ranged from as low as $2,500 to as high as $36,000, with most clustered within a range of $10,000 - $20,000.

Anti-spam services tend to be paired with anti-virus as well, and generally are priced in the range of $10 - $30 per user per year.
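A little arithmetic shows how the hybrid structure compares with straight per-user pricing. The figures below are the ones quoted above; the breakeven calculation is illustrative only.

```python
# Compare NetIQ's quoted hybrid pricing ($2,000 flat + $7.50/user/year)
# with a straight per-user price at the low end of the $10-$30 range.

def hybrid_cost(users, flat=2000.0, per_user=7.50):
    """Annual cost under the flat-fee-plus-per-user structure."""
    return flat + per_user * users

def per_user_cost(users, rate):
    """Annual cost under simple per-user pricing."""
    return rate * users

users = 1000
print(hybrid_cost(users))             # 2,000 + 7,500
print(per_user_cost(users, 10.0))     # low end of the per-user range

# Breakeven against a $10/user/year offering: 2000 / (10 - 7.5) = 800 users,
# so the hybrid structure is cheaper above roughly 800 users.
```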
Appendix 1
Solution Deployment Methods of Anti-Spam Vendors
Vendor
Aladdin Knowledge Systems
Alien Camel
Barbedwire Technologies
Barracuda Networks
Block All Spam
Bluecat Networks
Brightmail
CipherTrust
Clearswift
Cloudmark
Cobion
Commtouch
Computer Mail Services
Corvigo
Easylink Services
ePrivacy Group
Eridani Star System
Fortress Systems
FrontBridge
GFI Software
Gordano
IBM (Lotus/Domino)
Inova
IntelliReach
Ipswitch
Lightspeed Systems
Lyris
Mail-Filters.com
MailFrontier
Mailshell
MessageGate
MessageLabs
Microsoft
MX Logic
NEMX
NetIQ Corporation
Net-Sieve
Network Associates
Nokia
Postini Corporation
Proofpoint
Roaring Penguin Software
Sendio
Singlefin
Solid Oak Software
Sophos (ActiveState)
SpamCop
Sunbelt Software
SurfControl
Sybari Software
Symantec Corporation
The Titan Key
Trend Micro
Tumbleweed
Vircom
Webwasher AG
Zix Corp.
Appendix 2
List of Customers Mentioned by AS Vendor
Brightmail
Accel Partners
Bechtel Corporation
Borland
Booz-Allen Hamilton
Cisco
ConEdison
Cypress Semiconductor
Deutsche Bank
Eastman Chemical
eBay
John Hancock
Latham and Watkins
Motorola
Lawrence Berkeley
Lawrence Livermore
Terra Lycos
U.S. Department of Housing
and Urban Development
Williams-Sonoma
AT&T Worldnet
Bell Canada
BellSouth
BT Openworld
Cincinnati Bell
Comcast
Critical Path
Demon UK (Thus plc)
Earthlink
Hotmail
MSN
ViaWest
Verizon
Cloudmark
Johns Hopkins Univ.
EMI
Documentum
Dolby Laboratories
Fidelity
Restoration Hardware
University of Nebraska
Nolte
Kelly Paper
Rainmaker
Bertelsmann
Frontbridge
Alan Matkins
Gray Cary
Pinkerton
Sprint
Virgin
Bausch & Lomb
Kyocera
Rand
Sunkist
Waterpik
Brown-Forman
Ogilvy
Sovereign Bank
Sybase
Xilinx
GFI
Microsoft
NASA
European Central Bank
US Navy
Telstra
BMW
Siemens
Volkswagen
Fujitsu
The Body Shop
Triumph Adler
Berliner Investment Bank
Bayersdorff A.G.
First National Bank and Trust
British Midlands
Peugeot
Toyota
Caterpillar
PerotSystems
MailFrontier
Wyndham International
Pier 1 Imports
CDW
Public Service of New Mexico
Peet’s Coffee & Tea
Viasystems
Bellingham Public Schools
Mailshell
Access US
Digicon Communications
NewSouth Communications
PMBx
Boston College
George Washington University
Hood College
Lafayette College
Ohio State University
University of British Columbia
University of California, San Francisco
University of Groningen
University of Wisconsin
Federal Trade Commission
Ontario Financial
Alza Corporation
Avigen Inc.
AirNet Systems
Arrow Trucks
Exel Transportation
MHF Incorporated
Linear Technology Corporation
Puffer-Sweiven Inc.
Tower Engineering
American Red Cross
Big Brothers/Big Sisters
National Kidney Foundation
Ronald McDonald House
United Way
MessageGate
Tribune Company
Postini
Accredited Home Lenders
AmSouth Bank
BRE Properties Inc.
Brodeur Worldwide
Cable & Wireless West Indies
Canon Development Americas, Inc.
Capital Partners Management
Circuit City
Cooley Godward
DPR Construction
Farm Bureau Financial Services
Fenwick & West LLP
Foley & Lardner
Hormel Foods Corporation
Invesco - National Asset
JL Audio
Lord, Abbett & Co.
Merrill Lynch
MidMark Corporation
Morrison & Foerster LLP
Packeteer
Peapod, Inc.
Pullman & Comley
Rand McNally
SkyWest Airlines
Sterling Capital Mortgage
Sutherland Asbill & Brennan LLP
Vertis Inc.
Virchow Krause
Sophos
Canadian Space Agency
Cingular Wireless
CitiStreet
City of Richmond
Cornell University
Davis Vision Inc. (Highmark Blue Cross Blue Shield)
Duke University
HP
Indiana University
Macrovision
Pulitzer Inc.
Stanford University
State of Indiana
University of California, Berkeley (CCS Department)
University of Washington
Vignette Corp.
WestJet
Appendix 3
Spam Identification Methods by Vendor
[Table: each vendor below is marked under the identification-method columns Blacklist, Whitelist, Header, Keyword, URL, Bayesian, Linguistic, Fingerprint, Heuristics, Rate Limit, and Other; the individual checkmarks are not recoverable in this copy.]
Aladdin Knowledge Systems
Alien Camel
Barbedwire Technologies
Barracuda Networks
Block All Spam
Bluecat Networks
Brightmail
CipherTrust
Clearswift
Cloudmark
Cobion
Commtouch
Computer Mail Services
Corvigo
Easylink Services
ePrivacy Group
Eridani Star System
Fortress Systems
FrontBridge
GFI Software
Gordano
IBM (Lotus/Domino)
Inova
IntelliReach
Ipswitch
Lightspeed Systems
Lyris
Mail-Filters.com
MailFrontier
Mailshell
MessageGate
MessageLabs
Microsoft
MX Logic
NEMX
NetIQ Corporation
Net-Sieve
Network Associates
Nokia
Postini Corporation
Proofpoint
Roaring Penguin Software
Sendio
Singlefin
Solid Oak Software
Sophos (ActiveState)
SpamCop
Sunbelt Software
SurfControl
Sybari Software
Symantec Corporation
The Titan Key
Trend Micro
Tumbleweed
Vircom
Webwasher AG
Zix Corp.
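Several of the identification methods above, notably the Bayesian column, score a message by combining per-word probabilities learned from known spam and legitimate mail. A minimal sketch of that idea follows; the word probabilities and the names SPAM_PROB, HAM_PROB, and spam_score are hypothetical illustrations, not any vendor's implementation, and real products combine this signal with many others.

```python
# Minimal naive-Bayes scoring sketch. The per-word likelihoods
# P(word | spam) and P(word | ham) would normally be learned from a
# training corpus; here they are made up for illustration.
import math

SPAM_PROB = {"viagra": 0.90, "free": 0.60, "meeting": 0.05}
HAM_PROB  = {"viagra": 0.01, "free": 0.30, "meeting": 0.60}

def spam_score(words, prior_spam=0.5):
    """Return P(spam | words) via naive Bayes, computed in log space."""
    log_spam = math.log(prior_spam)
    log_ham = math.log(1.0 - prior_spam)
    for w in words:
        if w in SPAM_PROB:  # ignore words with no learned statistics
            log_spam += math.log(SPAM_PROB[w])
            log_ham += math.log(HAM_PROB[w])
    # Normalize back from log space to a probability.
    m = max(log_spam, log_ham)
    p_spam = math.exp(log_spam - m)
    p_ham = math.exp(log_ham - m)
    return p_spam / (p_spam + p_ham)

print(spam_score(["viagra", "free"]))  # scores close to 1 (spam-like)
print(spam_score(["meeting"]))         # scores close to 0 (ham-like)
```

Working in log space avoids numeric underflow when many word probabilities are multiplied together, which is the standard practice in Bayesian filters.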
Appendix 4
Spam Filtering Accuracy Test Results
Vendor
ActiveState (probable folder only)
ActiveState (probable and maybe folders)
Brightmail
Ciphertrust
Clearswift
Cloudmark (spam and potential spam folders)
Cloudmark (spam folder only)
Computer Mail Services
Corvigo (junk and bulk folders)
Corvigo (junk folder only)
EasyLink
Frontbridge
GFI
iHateSpam
MailFrontier
MessageLabs
MX Logic
Postini
Singlefin
SurfControl
Trend Micro
Tumbleweed (spam hi + spam folder)
Tumbleweed (spam Hi folder only)
[Table: for each vendor, three columns per test (False + %, False - %, Spam %) across Tests 1–3; False + % is legitimate mail wrongly flagged, False - % is spam missed, and Spam % is the share of spam caught. The per-vendor figures are not recoverable in this copy.]
Test 1: PC Magazine, Feb. 25, 2003, “Corporate Antispam Tools…”
Test 2: Opus One (for Network World Fusion), Sept. 15, 2003, “Buyer's Guide: Anti-spam Test: Spam in the wild”
Test 3: PC Magazine, Nov. 11, 2003, “Enterprise Spam Tools…”
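The three percentages reported in these tests follow directly from raw message counts. The sketch below shows the arithmetic; the function name accuracy_metrics and the message counts are hypothetical, not taken from any of the cited tests.

```python
# How the reported percentages derive from raw counts:
#   False + % = legitimate messages wrongly flagged / all legitimate
#   False - % = spam messages missed / all spam
#   Spam %    = spam messages caught / all spam (= 100 - False - %)

def accuracy_metrics(spam_total, spam_caught, ham_total, ham_flagged):
    """Return (False + %, False - %, Spam %) from raw message counts."""
    false_pos = 100.0 * ham_flagged / ham_total
    false_neg = 100.0 * (spam_total - spam_caught) / spam_total
    spam_pct = 100.0 * spam_caught / spam_total
    return false_pos, false_neg, spam_pct

# Hypothetical run: 1,000 spam messages with 960 caught;
# 500 legitimate messages with 2 wrongly flagged.
fp, fn, caught = accuracy_metrics(1000, 960, 500, 2)
print(f"False + %: {fp:.1f}  False - %: {fn:.1f}  Spam %: {caught:.1f}")
```

Note that False - % and Spam % always sum to 100, which explains the complementary pairs of figures seen throughout the published tests.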