ESG REPORT
Commercial Computing Market Dynamics
Predicting the Future by Observing the Past
By Steve Duplessie
May, 2009
Copyright 2009, The Enterprise Strategy Group, Inc. All Rights Reserved.
Table of Contents
Executive Summary
The Transactional Computing Era
Transactional Era Computing Characteristics
The Distributed Computing Era
A New Market
The Internet Computing Era
Creating, Accessing, Storing, and Finding Data in the Cloud
The Situation is Only Going to Get Worse
New Infrastructure Demands
Conclusion
All trademark names are property of their respective companies. Information contained in this publication has been obtained from sources The
Enterprise Strategy Group (ESG) considers to be reliable but is not warranted by ESG. This publication may contain opinions of ESG, which
are subject to change from time to time. This publication is copyrighted by The Enterprise Strategy Group, Inc. Any reproduction or
redistribution of this publication, in whole or in part, whether in hard-copy format, electronically, or otherwise to persons not authorized to
receive it, without the express consent of the Enterprise Strategy Group, Inc., is in violation of U.S. copyright law and will be subject to an
action for civil damages and, if applicable, criminal prosecution. Should you have any questions, please contact ESG Client Relations at
(508)482-0188.
Executive Summary
Since its inception in the 1960s, commercial computing has gone through three distinct periods. The
Transactional Computing Era led to the Distributed Computing Era, which continued until the 1990s. The newest
era, the Internet Computing Era, will dominate the next 50 years or more. Commercial computing capabilities
and the people who are responsible for computing operations will have to adapt quickly to the demands of this
new age.
The Transactional Computing Era
The foundation of modern commercial computing was the business transaction. As soon as machines were
capable of executing complex calculations faster than people, those machines became the tools of business.
Scientific advancements only drove the "lesser" capabilities of commercial computing; while molecular modeling
or space telemetry were exciting fields of science, the universal appeal of the relatively simple business
transaction is what propelled (then and now) commercial computing. Science is a noble cause, but money is
what drives the modern world and, as such, it also drove the advancement of commercial computing. As soon
as business advantages could be gained by utilizing technology instead of people, the treadmill of technological
advancement started and never stopped.
The principles of business are fairly easy to understand: if you take more money in than you pay out, you have a
winner. A business function has a cost and when the cost can be lowered—by improving functional efficiency,
for instance—the laws of business dictate that it must be done.
As business functions migrated from humans to computers in the 1960s, it was almost always due to a
combination of cost savings and efficiency improvements of tactical human tasks. If you had 300 accountants getting paid $10,000 per year and you could automate 90% of the services they provided via a computer, then as long as the cost of doing so was lower than $2,700,000 per year (eventually), it made sense to do it. If, along the
way, you discovered that on top of cost reduction, the computer enabled incremental output improvements, so
much the better. Since the birth of commercial computing, task automation was the goal—whether anyone
realized it or not. Removing inconsistent, limited human beings from tactical tasks has proven to yield business
benefit. It has often been messy, sometimes sad, but as long as the basic principles of business remain the
same, it will always be that way.
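To make the break-even arithmetic above concrete, here is a minimal sketch using the illustrative headcount, salary, and automation figures from the paragraph (these are example numbers, not ESG research data):

```python
# Break-even sketch for the accountant automation example (illustrative figures only).
accountants = 300
salary_per_year = 10_000      # dollars per accountant per year
automated_share = 0.90        # fraction of the work the computer can absorb

current_labor_cost = accountants * salary_per_year              # $3,000,000 per year
max_viable_system_cost = automated_share * current_labor_cost   # $2,700,000 per year

def worth_automating(annual_system_cost: float) -> bool:
    """Automation pays off only if the system costs less than the labor it displaces."""
    return annual_system_cost < max_viable_system_cost

print(worth_automating(1_500_000))  # True: cheaper than the displaced labor
print(worth_automating(3_000_000))  # False: costs more than the entire payroll
```

Any incremental output improvement on top of that, as the paragraph notes, only strengthens the case.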
Early adoption of commercial computing technologies differed from scientific environments in two key ways.
First, science was typically forward-looking—trying to solve the mysteries of what might be next. Business was
typically backward-looking—how do we get better at the things we have always done? Second, science normally
has not been encumbered by a quest for profit, only answers. Businesses seeking answers for the sake of those
answers do not remain viable businesses for very long.
The second wave of the Transactional Era really began in the 1970s. That is when the commercial computing
environment evolved from one primarily interested in cost savings via task automation to one that leveraged
technology to try to model new ways of generating profits. Instead of settling for automating only tactical
functions, early forward-thinking companies began to leverage technology for strategic and competitive
advantages. That is when the IT industry moved from a handful of big players to thousands of hopeful
participants. If all anyone wanted to do was count things or do payroll, no one could have ever competed with
IBM. It took different ways of thinking on the part of businesses and the industry to try to solve far different—and
often more complex—problems, such as how to extend business advantages, create competitive separation, and
apply intelligence and analytics to find new market opportunities.
Transactional Era systems share common market characteristics—they exist within the very core of the
business. That means transactional systems have the highest level of corporate visibility, pose the greatest
potential risk, and, while they once offered the greatest strategic potential, they rapidly became a case of business necessity. You either kept pace with the technology advancements of your competition or your cost structure would lead to peril. Transactional systems also tended to have the most well-defined cost-to-value relationship, meaning
businesses have always been able to attribute hard dollar values to the functions of those systems. You knew
how much each accountant made, so the return on investment was easy to calculate. If you were able to then
improve intangible factors such as increasing customer satisfaction, lowering business risk (hiring well qualified
accountants was not always easy), and increasing revenues (without the same increase in costs), then everyone
was happy. Knowing the exact measurable value of some business unit makes it relatively simple to calculate a
positive return on investment.
IT industry competition developed as it always does: with imitation first. Others attempted to build mainframes
cheaper, faster, and better, but by the time they posed any legitimate threat to IBM, the game was already over.
Building new value was the key. Trying to steal someone else's victory without adding any true new value rarely (if ever) works without some massive environmental shift. The very nature of the business transacted on those systems—and the data itself—has such obvious value that, as long as the incumbent remains in the game, it is almost assured of continued dominance in that customer account. The risk of change, be it real or
perceived, is much too high for the business to find a positive return on investment substantive enough to merit
actually taking that risk.
The second wave of the Transactional Era was predicated upon new value being created—most often by new
companies applying new usage models on top of, or next to, investments that had already been made. From
Oracle to SAP, no one posed a significant long-term threat to unseat IBM in the mainframe space. Instead, they
leveraged new usage models on the mainframe, which provided incremental value to the business by generating
new value on top of the existing infrastructure.
Within the three primary layers of commercial computing infrastructure, IBM once owned 90%+ of the core
computing (server/processor layer) market value even when competition was at its highest. Beating someone
who dominates a market they created is a losing proposition—again, until some significant environmental shift
occurs that alters the very dynamics of that market. It simply wasn't going to happen at the core. That is not to
say that others were not able to chip away at other areas. EMC was able to convince the entire marketplace that
the storage component of the transactional system should be considered an independent, primary acquisition
instead of an afterthought peripheral. By doing so, EMC created a new industry and spawned many new
imitators—who ultimately spent time fighting a battle that had already been won (core storage, within the data
layer). Like IBM in the very core of computing, EMC has defended its position almost perfectly over the decades.
Without a direct requirement to make a change, such as the inability to keep up with processing demand, a
highly visible business market will almost never move from something that works (no matter how badly) to
something that presents a potential risk.
Even "free" products have been rebuffed by consumers, who would rather continue to spend millions of dollars on incumbent infrastructure and applications when the "business is at stake." In essence, the more visible the
market, the more dominant the incumbent vendor has been.
When the transaction system is considered the heart of the business, few consumers are going to take any
substantial risk on an upstart or even an established brand, regardless of how great the technology is.
Due to visibility and businesses' ability to attach hard values to the functions and transactions performed on core
transactional systems, the players who dominated each sub-sector were able to effectively control pricing in the
market. As long as it was possible to articulate a clear return on investment, the business would continue to
make those investments. No sensible business person would refuse to spend $1 if they were all but assured
they could make a return of $1.50.
Transactional Era Computing Characteristics
The attributes required by the early transactional consumers of IT infrastructure were easily determined.
Consumers demanded non-stop, bulletproof equipment and services. Since the cost of a transaction was
known, the cost of downtime was also known. Downtime has always had a known hard cost in the core—one
that goes directly against the bottom line. The soft costs of downtime continue to escalate as the business
becomes more reliant on these systems, but those costs remain elusive in the justification of capital
expenditures. The hard costs of downtime, however, also continue to escalate and as such, consumers have
been willing to spend significantly above generic averages for technologies that offer the real or perceived ability
to deal with a mission-critical environment. Thus, reliability and fault-tolerance were considered the primary
attributes of the transactional computing era.
The next critical attribute of this "core" market has been the ability to scale transactions. Buyers need to know
that they can keep up with demand in a world of ever-increasing unknowns. Being able to add users and
workloads in a predictable fashion has been paramount. Competitors are always quick to point out how they
might offer greater performance, but convincing buyers that they can do so while improving overall uptime has
proven disastrous. The same has held true for those who bring higher availability systems to the market but
force the buyer to risk limited transaction scale.
These characteristics created the foundational methodologies still employed by the vendors supplying goods to
this market. In order to meet market demands, these organizations incur heavy engineering, test, quality
assurance, and support costs. They are often restricted in bringing out new innovations as they are saddled with
legacy architecture and engineering elements that cannot be replaced without impacting perpetual uptime and
scale bonds with the customer. This keeps efficiencies in design, manufacture, and functionality on the part of the vendor from reaching the customer in the short term, which in turn keeps costs high for all involved. That is
why it appears as though the core transactional world is the last market to receive advancements that have long
been mainstreamed in other areas. It is also why massive environmental change, such as the onslaught of
distributed computing, was thought to be the catalyst leading to the death of the entire core transactional
computing market. History shows us, however, that this is not the case.
TABLE 1: CORE DATA CENTER/MAINFRAME STORAGE IN THE CORE TRANSACTIONAL COMPUTING ERA

Year   Data Growth (+26% CAGR)   Cost per GB (-24% CAGR)
1955   20 GB                     $7,000,000/GB
1965   250 GB                    $2,000,000/GB
1975   20,000 GB                 $140,000/GB
1985   200,000 GB                $20,000/GB
1995   1,800,000 GB              $900/GB
2005   17,000,000 GB             $8/GB

Source: Enterprise Strategy Group, 2008
If you plot out the growth of core transactional system capacity from its advent to today—with countless threats and changes over almost 50 years—it has grown at approximately 26% CAGR, while the cost of capacity has declined at a similarly steady annual rate of roughly 24%.
After almost 50 years and endless technological, social, philosophical, and economic threats, the
core transactional systems born in the early 1960s have grown in capacity at an average of 26% per
year. The stock market isn’t even close to that growth rate.
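For readers who want to reproduce growth figures like those in Table 1, the compound annual growth rate is simply (end/start)^(1/years) - 1. A minimal sketch using two rows of the table (the exact result depends on which endpoints are chosen):

```python
# Compound annual growth rate (CAGR) computed from two points in Table 1.
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Average annual growth rate implied by a start value, an end value, and a span of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Capacity: 20,000 GB in 1975 to 17,000,000 GB in 2005.
print(f"{cagr(20_000, 17_000_000, 30):.1%}")   # ~25.2%, in line with the ~26% figure cited
# Cost per GB: $7,000,000 in 1955 to $8 in 2005.
print(f"{cagr(7_000_000, 8, 50):.1%}")         # ~-23.9%, matching the -24% in the table header
```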
The purported end of the Transactional Computing Era in the late 80s and early 90s was marketed with loud
celebration. The mainframe was too rigid, too expensive, and too "old." The positive results of such perceived
negative capabilities—such as discipline, process, planning, and a known outcome—were not valued in the
1980s by the new generation until systemic failure became apparent. Core transactional systems have historical
limitations in terms of physical connectivity—which was viewed as a negative at one point. With these systems, access was denied by default, and IT had to provide both a physical and a logical means to enable a user to reach them. The operational burden on IT was centered on getting users attached to the system. Security—both physical
and electronic—along with physical maintenance, had been easier simply because the systems and data assets
were all typically in one physical location.
To create digital content in a transactional computing system, you had to be invited, connected,
and allowed to do so.
As the Transactional Computing Era matured, new use (application) models, greater connectivity (accessibility to
more functional groups and employees), and widespread acceptance of commercial computing as a core
business function have carried the market steadily upward—even as newer capabilities at lower price points
entered. The primary incumbent vendors continue to dominate this overall segment despite the passing of 50
years—the visibility and perceived risk inherent in making any real change within that environment continue to far
outweigh any potential benefits in the eyes of the business.
The Distributed Computing Era
Technology advancements in commercial computing rapidly accelerated in the early 1980s. The application of
computing technologies outside of commercial computing spawned intense development in smaller and more
powerful processors and specialized interconnects. Specialized computing platforms succeeded by providing
specific turnkey application functions to specific industry—or departmental—needs. Wang built word processing
systems, DEC built manufacturing systems, Prime built engineering systems, and so on. At that point, systems
manufacturers followed the same method of attacking the market laid out by IBM many years earlier. The only
difference was they were attacking adjunct markets and not going directly after the core. They did not, however,
understand that simply finding a new market opportunity wasn't good enough to sustain a growing business. The minicomputer companies believed their key was to be first—and if they were, they would dominate as IBM had in the core. What they failed to realize was that the customer did not place anywhere near as much value on departmental systems as they did on mainframe systems. Mainframes were boardroom talk; VAXes weren't.
This batch of next generation commercial computing companies wisely specialized in areas outside of the core—
areas that were becoming more important to the business. The principles applied by the business to core
computing were founded on the same simple and basic business theory: cut costs and increase efficiency. This
was not restricted to the transactional systems alone; these companies, and many others, were able to create
brand new tangential industries by applying specialization to other areas of the business such as
design/engineering, manufacturing, and office automation.
What these companies did wrong was assume that, because it was necessary to design and build every single
component of the solution initially, they would have to continue down that path forever. They believed that
everything they did was a critical piece of intellectual property and eliminating any piece put them at competitive
risk. They certainly were not interested in gaining any leverage by using commodity components that could also
be used by their competitors—until it was too late.
IBM had the resources and depth of talent to participate in all areas of the computing spectrum. It was effectively
a monopoly and controlled prices in the core compute markets. Customers bought almost every piece of their
systems from IBM, so IBM was fine building its own processors, packaging, memory, and every other major
component. IBM was equally fine building all of the software it could. Since it controlled the operating system
and the hardware, who better to bring new value and new income to systems than IBM?
As the mini-computer folks reached their peak—each engineering and developing their own processors, drives, memory, etc., along with the applications written to execute on their architectures alone—they suddenly found
themselves trapped. Lacking the sheer size and revenue base IBM had to support all of its development efforts
and facing a rapidly advancing environmental shift with a new market emerging in the shape of the workstation
and PC, they were doomed.
By 1988, the Unix workstation and PC markets were firmly established. These markets required commoditization
in order to meet the economic requirements of the consumer—who, it turns out, didn't value departmental
applications with the same zeal as the company valued core business applications. Companies entering this
space quickly learned that building all the components and software themselves was not feasible—even for IBM.
Instead, those who embraced commoditization were the victors early on. People didn't care as much about
mission-critical or bulletproof systems in that market; they cared about cost and application availability. Both
concepts were completely at odds with the philosophies of the core transaction systems and their smaller
brethren in the mini-computer space.
The PC technology wars were very short. Intel and Microsoft grabbed the lead and never looked back. Once
that occurred, hardware value shifted from component engineering to who had the best supply chain,
manufacturing, and distribution. The embedded software market was replaced by a slew of new companies that
could now afford to develop portable applications since Microsoft and Intel made their interfaces widely known
and open. These giants knew that the key to their ultimate success and dominance lay in having a vast library of
application software available. If enough good software was available to run on the PC, then they could ride the
demand wave of each successful software program—whether they invented it or not.
Traditional computer companies entered the PC market, usually begrudgingly, and immediately attempted to
control it by pushing the same philosophy of designing, developing, and controlling all of the pieces. While IBM
may have had all the right resources, it didn't have the right philosophy, nor did it have the willingness to adapt to
clear market changes. The company did better than most, but it was those without historical encumbrances who
won the day.
As the PC revolution took off, some hybrid players took the more open, commoditized concept of the PC and
applied it to the commercial world where the mini-computer ruled. Apollo and Sun created workstations that, in
essence, combined the benefits of commoditization and open architectures and applied them to specific tasks
within the business community. These workstations had significantly lower cost structures than mini-computers
and often outperformed them. They also opened up their operating systems and processor architectures to
encourage others to write software that could take specific advantage of what they brought to the party.
Accounting systems were the purview of the mainframe. Engineering was still manual. Just as the accountant
was a highly paid, highly skilled, and difficult to find employee years ago, engineers were even more so. Sun
and Apollo were able to build systems with tremendous compute performance characteristics at a fraction of the
cost of a time-shared mini-computer implementation. That allowed engineers to leverage software tools such as AutoCAD, which in turn made them so much more efficient and productive that the business cycle of 'follow the leader' played out all over again. Soon, every engineer in business was armed with tools that made them so much more effective—while unknowingly reducing business risk and increasing business value—that within a few short years, a generation's worth of manual engineering tools and processes were all but forgotten.
The functionality and market that the mini-computer once owned had been decimated. IBM and others built
applications to attack those functions (time shared) and Sun/Apollo built distributed systems that put the power in
the hands of the creator. There was no room for those in the middle, who had the worst of all possible worlds:
the cost and rigidity of a monolithic systems company with less application value to the user.
A New Market
The Distributed Computing Era began with a brand new market. None of the early participants had any desire or
need to compete with core transactional systems. As more and more PCs and workstations showed up on
employees' desks, the end of the first wave of the Distributed Era (mini-computers) drew nearer. It was clear that
individual productivity enhancement could be gained by putting semi-open compute power on each desk and
applying software tools to increase individual efficiency. No one even considered replacing a core business
system with a pile of PCs. Not yet, anyway.
All technology revolutions—whether market/usage- or tech-driven—create new sub-market opportunities. EMC
chipped off the storage in the core to create an entirely new industry. Once disk became a separate market, so
did tape, and so on. As PCs and workstations propagated like wildfire, the software industry came into its own.
For a while, imitators existed—those who wanted to beat Microsoft or Intel—but as before, once established, those markets were closed without another revolutionary, turbulent event or need. New market opportunities happen when usage kicks in—and causes unforeseen problems along the way.
A ten-person accounting firm was not going to buy a mainframe. When the PC market came into its own,
businesses were able to arm their accountants with systems and tools that rivaled the big companies. There was
no parallel in engineering, so companies large and small alike benefited from the workstation phenomenon
concurrently.
Commercial computing follows the same fundamental paths to market success—or failure.
Almost all commercial computing technology is adopted to create better individual task
efficiencies and to lower the cost per task.
This was true at the genesis of core computing and remains true today.
Realized value of investments made in commercial computing technology accelerates when
individual tools are connected to create larger “group” efficiencies.
Enhancing the productivity of an individual is critical, but accelerated value occurs when a collection of
individuals can provide greater efficiency and productivity gains (along with cost reductions) as a group than
each could alone. In the case of PCs and workstations, that meant enabling collaboration, asset sharing, and
repurposing—and that meant networking.
The quest for increasing or stealing share in existing markets kept Intel and Microsoft pushing ahead.
Companies such as Lotus and Word Perfect brought the world new reasons to invest in desktop technologies.
As businesses began adopting these tools in volume, new market opportunities emerged to solve the unplanned
problems that invariably occur with any new widespread technology wave: operating issues.
The business was capable of supporting core transactional systems, but was not ready to deal with the issues
created by these new individual nodes of compute and data resources. An immediate "us vs. them" mentality
occurred in many commercial business environments. Transactional systems were run by professionals, while
distributed systems were run by mere mortals. Those mortals, especially engineers, were capable of supporting
themselves for the most part—or so they thought. Problems arose when the business realized that the individual
tools provided to individual contributors generated a new corporate asset—data—that had value (albeit nebulous
value) and therefore needed to be secured and protected. Those issues had been solved within the centralized
computing environment, but were brand new in the distributed world. Most corporate IT departments avoided
having anything to do with the new distributed world unless forced to do so.
The second principle of improved group efficiencies meant that once the business realized the benefit of
increased individual productivity, it became natural to extend those benefits beyond the individual. The business
gained huge efficiency by automating the tasks of individuals, but could gain even more by enabling the sharing
of tasks, functions, and information within and across groups. If one engineer developed a new and better way
to do something, it only stood to reason that others within the group would also benefit. But for this to work,
individual machines and data needed to be connected. That is when the networking industry exploded.
IP networking became the eventual de facto standard for connecting all of these pieces together. Centralized
computing centers required batch jobs to share assets, such as printers. IP networking enabled workgroups to
share smaller assets with each other. IP networking prevailed over all of the individual proprietary alternatives
because in the new business world, companies no longer used products from just one manufacturer. IBM
controlled the mainframe, but not the workstation or PC markets. Unless you remained committed to a single
vendor environment, you had to adopt a much more heterogeneous network infrastructure—and that meant IP.
The only time an imitating product can gain a foothold over an incumbent is when either an environmental shift creates a use case the incumbent can no longer satisfy or the technology/product is commoditized and standardized to the point where price is the primary differentiator.
The Distributed Computing Era created many multi-billion dollar industries, given opportunity by new technology use cases, and was met with resistance from traditional transactional vendors and users alike. Problems were created that didn't exist in the previous world, which also resulted in brand new opportunities. The IP networking
business found its opportunity because it was paramount to connect all distributed assets in order to best
leverage them. That created industry opportunity for products and expertise. It also pushed the model of
commoditization, which in turn created easier-to-use products and plummeting costs—further accelerating
adoption, and so on.
Distributed computing is a good idea, but distributed data is not. Having the output of this significant corporate
investment in tools scattered all around the enterprise made for lower operating efficiency and increased risk.
Consolidating that data on more centralized servers became critical in order to best address these newfound
issues. In a way, it is ironic that the answer to the unforeseen problems that occurred due to the unpredicted
success of the Distributed Era ended up being a consolidation effort. We refer to this phenomenon as a "second wave" market. Second wave markets are often much larger than the originating market—such as the networking
market versus the workstation market.
The client-server wave of distributed computing was a hybrid of individual function execution at the user‘s
desktop combined with central servers housing group data and processing group functions.
Network Appliance (NetApp) became a runaway success by enabling the consolidation of distributed data in the
Distributed Era. Veritas eventually dominated the distributed backup industry. Oracle and SAP were able to
offer their functionality on both mainframe and "open systems" servers in the distributed world, eventually gaining access to previously unattainable parts of the market. Storage competitors from the transactional/block
space competed fiercely to move downstream to apply their wares to a net new market opportunity.
In the Distributed Computing Era, corporate computing and data spread to workgroups and individuals connected
via a network. Anyone with the asset on their desk had the ability to create digital content—which evolved from
disconnected individuals, to networked workgroups, to wide LANs, to campus-wide LANs, to corporate-wide
LAN/WANs, to Internet-connected VLANs, to eventually becoming "Internet connected."
The Distributed Era has evolved from a connectivity perspective, but remains in the control of
the corporation—99% of the digital content created by corporations on distributed systems is
created by employees, business partners, or other trusted users within the business. Unlike
transactional systems built upon block data architectures, distributed systems were built on file-based data. Distributed systems have become accessible over ever-widening geographies, with
content creation occurring at both the client and server.
Growth in the Distributed Era far exceeded transactional data growth—but not at the expense of transactional
systems or the marketplace. Instead, distributed computing brought entirely new use cases for IT and the
business at large; an entirely new data paradigm (files vs. blocks); an entirely new usage and support paradigm
(decentralized vs. glass house); and countless issues ranging from security, to protection, to attempts to keep up
with the demands being set by newly exposed users.
TABLE 2. OPEN SYSTEMS STORAGE CAPACITY IN THE DISTRIBUTED COMPUTING ERA

Year   Data Growth (+132% CAGR)   Cost per TB (-37% CAGR)
1985   < 1 TB                     $20,000,000/TB
1995   19,000 TB                  $300,000/TB
2005   21,000,000 TB              $2,400/TB

Source: Enterprise Strategy Group, 2008
As Table 2 shows, the compound annual growth rate for distributed data in the corporate world (132%) has been five times that of transactional data, while the cost of that capacity has declined by only 37% annually. The decline in pricing is at a higher rate than in transactional core systems, but the demand curve has more than offset the discrepancy. This table makes it easy to see why the overall revenue of the commercial IT market more than doubled when the distributed computing market was added to the transactional computing market.
The new usage models that evolved in the Distributed Era altered the distribution of revenue (and capital market
value) from the systems providers to the networking, management, and application providers.
In a relatively short period of time, the world of corporate computing shifted from 100% block-based transactional
data to a world where 85% of all the data generated and utilized within a commercial entity was now file-based—
and born of the distributed computing age.
FIGURE 1. A NEW DATA CENTER PARADIGM
[Pie chart of 2006 commercial computing data by type: distributed data (file), 85%; core transactional data (block), 15%.]
Source: Enterprise Strategy Group, 2008
The Internet has extended connectivity and enabled communication and collaboration pathways between users
both inside and outside the corporate domain. From a corporate "business" perspective, the Internet has been
used primarily as a new forum through which a company could create or extend its brand, transact business, and
reduce the cost of connecting trusted users. Forward-looking companies also have leveraged the Internet by
using the corporate domain to enable customer-driven communications, but bi-directional communication and content creation remain a very small minority of overall digital content in the corporate world. Customer forums, service and support portals, ordering portals, etc. have all been mostly positive steps designed to improve customer satisfaction and relations, but they remain largely private and single-threaded.
Today, the Internet is the connective fabric that ties us all together, but corporate content creation, access, and
control continues to reside almost exclusively within the corporation itself. The non-corporate blogosphere is one
of the few places where content is created, debated, shared, accessed, and manipulated almost entirely by the
"community" itself—typically with very limited participation by the company that founded or sponsored the
community. That is going to change; that change has ramifications that will challenge not only how IT and the IT
industry operate, but the way business itself is conducted.
The Internet Computing Era
Creating, Accessing, Storing, and Finding Data in the Cloud
Any device connected to the Internet now has the potential to create, access, move, find, manipulate, delete,
store, or manage digital content. In this era, everyone—via almost every conceivable device—can be connected
to everyone else, eliminating geographic boundaries. As in the previous eras, corporations will be responsible
for creating the policies and methods that ensure the proper use and protection of digital assets. Unlike the
previous eras, however, any benefit associated with limited physical connectivity as a defense mechanism has
been effectively eliminated.
The Internet Era of commercial computing has been fueled by several factors:
1. The Internet itself has reached every corner of the world, effectively creating a flat global network where
there are no longer any barriers to connectivity.
2. Technology has enabled the creation of digital content to occur at every point of connection—from the
core transaction system to your desktop, in handheld devices, cell phones, laptops, smart cards,
household appliances, and countless other devices. Anything and everything that can connect to the
network has the potential to create, access, and move data.
3. The ability to share digital content with the world now occurs in real-time.
In previous eras, corporate information assets have been created and controlled almost exclusively by the
corporation itself. Corporate data may be displayed and accessed via the Internet, but rarely does the corporate
world allow non-trusted users to create or manipulate corporate information. In fact, most consider even the attempt to access that data to be a cyber-crime or hack. In the Internet Computing Era, we will see a rapid
increase in data creation and access points for trusted users. Eventually, we will open parts of ourselves and our
systems to the rest of the world. Customers and other interested parties will change the way business is
fundamentally done—by providing content, aiding in product development and support, making sales, and calling
you on your mistakes.
This runs counter to hundreds of years of business principles—namely, controlling messages, keeping secrets secret, and controlling customers' and competitors' spheres of influence.
Innovative technologies, global connectivity, and new usage models have enabled a new generation of people to
create and share digital content faster and easier than was even conceivable just five years ago. Corporate
computing environments, while lagging behind the consumer markets, are slowly but steadily moving into the
realm of Web 2.0. Whether through online communities, social networking sites, new media, or collaboration, the
commercial computing world must ready itself to participate in the rapidly evolving new realities of business.
Tools such as document collaboration portals, blogs, wikis, streaming media, and a host of other digital content
creation and management applications are enabling organizations to redefine themselves in almost real-time.
New media content is being created for everything from training to marketing and becoming a mandatory
component of everyday business. Whether it's blogs or video, content is easier than ever to create—and
management of that content will become harder than ever without significant changes.
The very nature of data has changed. No longer primarily concerned with block-based transactional data, new IT
initiatives must deal with this new breed of data—which is almost exclusively file-based. Furthermore, the files
themselves are changing—becoming larger and larger as their richness increases. With the ability to create data
becoming ever easier, it is no wonder that the amount of data we are forced to contend with is growing
explosively. Growth in both the volume of data and the already enormous complexity of enterprise infrastructure
can only lead to an inevitable—and catastrophic—breakdown.
Basic data management has been a never-ending problem for IT organizations for many years. New business
processes—coupled with the assault of new, large file-based digital content—will likely crush existing norms.
There is an absolute need to re-evaluate processes predicated upon the transaction-based computing of the past
and focus on finding new methods of dealing with the criteria of this new era. The ways we handle file storage,
backup, archiving, search and retrieval, content delivery, and collaboration will inevitably have to change—in
terms of both infrastructure and management. In addition, infrastructure files in the form of virtual machine
images, reusable web services, file systems, and management databases continue proliferating widely. IT
managers charged with protecting and managing such a dynamic range of formats face an almost untenable
situation.
The Situation is Only Going to Get Worse
ESG's file archiving market forecast projects that total worldwide file archive capacity will increase from 7,119 petabytes (PB) in 2007 to some 62,749 PB in 2012—a 55% compound annual growth rate (this and all subsequent references to ESG research are from the ESG Research Report: 2007 File Archiving Survey, 2007). These surveys also
indicate that the growth in volume of archived file-based information exceeds all other categories—enterprises
show approximately 10X growth in the volume of unstructured information stored over the past two years, while
SMBs report almost 13X in the same timeframe. Most customers expect the growth of file-based archives to
continue unabated in the future. Over the next several years, 37% expect the size of these types of archives to
grow between 11% and 20% annually, while an additional 37% expect them to grow more than 20% annually.
As challenging as these rates may seem, ESG's analysis indicates that respondents actually underestimated
the real growth rates of files and related storage capacity requirements. For the most part, survey respondents
providing these forecasts focused on traditional file types. They did not generally consider such file types as
virtual machine or desktop images, reusable web services images, or other infrastructure software images and
file types. Furthermore, ESG's analysis shows that many customers significantly underestimated the size of files
created by new digital content formats such as audio, video, image, and streaming media.
When asked to identify the type of application that is responsible for generating the most content, office
productivity applications (22%) and document management systems (20%) are the most frequently mentioned.
As might be expected, more structured applications, such as ERP and CRM solutions, generate less content
because the core data supporting those systems is stored in structured database environments. Images used to document ERP and CRM transactions do contribute somewhat to the pool of unstructured information, but not as much as office productivity and document management solutions.
What IT has not yet realized is that the impact on their operations and infrastructure will not evolve seamlessly—
there is a dramatic difference between scaling Word documents and scaling huge rich media content. Every
company is becoming a media company and large files are going to rule the day. Customers, employees, and
markets are global—the content a business houses and protects needs to be stored and delivered globally. But
now, any predictive abilities we once had around the value, capacity, or scale requirements of this data type are
wild guesses at best.
New Infrastructure Demands
The Internet Computing Era widens the gap between the relatively known growth and cost attributes of the
Transactional Era and the less well understood Distributed Era. It was relatively easy to build products to support
the requirements of the Transactional Era—those requirements and attributes were clearly defined. It was much
more difficult to determine valuable and necessary attributes in the Distributed Era because most of the use
cases simply didn't exist previously. While many of the hardware architectures presented in that era have been nothing other than scaled-down examples of the same technologies from the Transactional Era, this new world
will require completely new architectures to support new demands and entirely new processes and constructs.
As Web 2.0 migrates from a world of questionable business value to mainstream business deliverables, IT
departments are increasingly finding themselves without a plan. How does an IT operation go from taking three
months to plan and provision infrastructure to support a business application to creating and scaling one in near
real-time? Without any realistic method to determine what the extent of interactions will ultimately be or how fast
growth will occur, it is very difficult to make any plans.
The infrastructure attributes required in this era include:
Infinite Scale – in real-time, dynamically, with little to no human intervention.
Self-Management – infrastructure needs to automatically re-balance and optimize itself without human
intervention.
Self-Healing – infrastructure needs to withstand failures and automatically adjust/heal itself.
Perpetually Decreasing Commodity Costs – accelerate and leverage declining costs.
This era requires infrastructural ―scale-out‖ at the server, network, and information infrastructure/data layers in
ways that have rarely been done before. No matter whose data you look at, the growth of capacity is
accelerating. The makeup of that data will continue to change.
FIGURE 2. PERCENTAGE OF OVERALL COMMERCIAL DATA BY TYPE
[Stacked chart, 1975 to 2025: transactional data falls from 100% of overall commercial data in 1975 to a small fraction by 2025, as distributed data and then Internet cloud data make up an ever-larger share.]
Source: Enterprise Strategy Group, 2008
As occurred previously, the new era of Internet computing will ultimately dwarf the previous Distributed and
Transactional Computing eras in the capacity of data generated. It is important to reemphasize that, for the most
part, this will not be in replacement of the data created during those eras, but rather, in addition to it. Like the
Distributed Era prior, this era will compound commercial computing issues—not replace them.
The industry was able to largely react to the Distributed Computing Era by "dumbing down" core transactional
systems to meet cost demands. With no other way to adapt, IT professionals used these same basic
technologies and attempted to force-fit the new usage models of their users into existing processes—again built
upon the knowledge and skills gained in the Transactional Era. It was only after many years that new vendors
embraced new technological implementations designed to deal with the scale issues presented by new models.
Infrastructure scale has always been an issue, as has "people" scale within IT. In the early 2000s, new scale-out capabilities began to arrive that addressed the dynamic and unknown requirements of the day, but continued to
be dwarfed in the market by architectures and processes fundamentally designed decades ago—for applications
and use models that simply don‘t work in the new world order.
Conclusion
1. It is not a matter of if commercial entities will adopt "Web 2.0" technologies and demands, only a matter
of when.
2. In order to prepare for the inevitable, IT professionals and the industry must acknowledge reality: current
methods simply won't work in the new world order.
3. As an information-centric society, we need to acknowledge that there are three (at least) distinct types of
data to contend with—each with differing value (both inside and outside of the business), scale,
connectivity, management, protection, security, performance, and cost requirements.
4. Corporate entities will use—and therefore need to understand—the data generated during all three eras.
Remember, even Google (or pick your favorite Web 2.0 example):
a. runs transaction systems—just like the rest of us.
b. runs distributed systems—just like the rest of us.
c. runs Internet/Cloud infrastructural systems—just like the rest of us will.
Google just got to the breakdown of data types (by percentage) under management faster than the rest of us.
For IT to be successful in the coming age, it will have to demand much more from the industry. Ratios such as "administrators to servers (or storage—or anything physical, really)" have to be completely upended. Function per footprint is the only real variable—i.e., the quantity of a function that can occur in a given amount of time or space, such as how much data can be managed, protected, or recovered within the physical realities you face.
Infrastructure will simply have to scale in any dimension at any time and the need for manual human
management has to be almost completely eliminated. Neither people nor systems can be bottlenecks any
longer—the world is moving on Internet time, like it or not.
So while we have only just begun to leverage the technological compute and connectivity advances of the last
few decades, what has become apparent is that it is naïve to assume we can stop the momentum or ever get
back to "the way it was." The technology adopted by our kids at age four should be a clear indicator of where the
world is heading. Like our generation and those prior, to be successful, we must embrace new realities instead of
attempting to contain or thwart them.
You will have visible, mission-critical, transaction-based systems. You will have distributed/collaborative
systems. You will participate in your business communities in entirely different ways. You will lose the ability to
control certain elements of your world—from who creates and accesses information to what people say about
you and who hears it. You will keep some of your data (maybe most of it) outside of your direct control—in the
"cloud." You will become a media company. You will create, house, stream, push, and pull huge rich content
files to and from every corner of the world.
The only question remaining is: will the vendors who have ruled the first two eras be able to rule—or even
participate—in the new one? If this era evolves as slowly as the Distributed Era did, then incumbent vendors
have plenty of time to make their moves—but if it happens as suspected, seemingly overnight, then we could be
looking at a massive inflection point not dissimilar to the disruption created when the automobile crossed the
chasm, the industrial revolution itself, or the relatively short 50 years of commercial computing. With hundreds of
billions of dollars in play annually (and trillions of market capitalization), the stakes are high. Regardless of who
wins or loses, it is safe to say that the next 10-15 years will make the last 50 seem like any other ancient time our
grandchildren may study in their history books.
20 Asylum Street
Milford, MA 01757
Tel: 508-482-0188
Fax: 508-482-0218
www.enterprisestrategygroup.com