IT Governance Drivers of Process Maturity
ABSTRACT
This study analyses the relationship between key information technology (IT) governance factors and IT
process maturity, as a measure of IT capability, in a multinational field study of 51 organizations. The
study employs the IT processes defined within the COBIT framework as a model of the major processes
that make up IT capability. The COBIT framework includes a set of maturity models based on the Capability
Maturity Model (CMM) from the Software Engineering Institute (SEI). Data is collected on all aspects of
IT governance at each organization as well as the level of maturity for each process in COBIT. Maturity
modeling in COBIT includes six attributes of process maturity. We find that the state of business/IT
alignment at each organization and the level of national development are strongly associated with process
maturity. These findings apply whether we consider the overall level of process maturity or the level by
domain or attribute.
Keywords: IT governance, IT capability, process maturity, resource based view.
IT Governance Drivers of Process Maturity
1. Introduction
This paper analyzes the association between key information technology (IT) governance factors and IT
process maturity, as a measure of IT capability, in a multinational field study of 51 organizations. We
integrate two strands of theory to develop a set of propositions on the relationship between governance
and process maturity. First we draw on the Resource Based View (RBV) and contingency theory to show
how organizations must build IT capability to support organizational goals. Second, we draw on theories
of corporate governance, including agency theory, to understand the role of enterprise governance of IT.
IT governance is the process by which the objectives of the entity that impact on information technology
are agreed, directed, and controlled (Van Grembergen and De Haes 2009). It includes the establishment of decision rights, the setting of objectives and goals, the implementation of these objectives and goals in a series of IT processes, and feedback loops that employ a variety of measurements and metrics. Our theoretical
framework sees IT governance as driving and setting corporate strategy and in turn influencing the
development of IT capability. Development of robust, reliable, and responsive IT processes is an
important aspect of IT governance and capability. Achievement of a level of maturity in IT processes is
arguably a pre-requisite to well-governed and well-managed IT functions.
The Capability Maturity Model (CMM) from the Software Engineering Institute (SEI) has been
particularly influential in development of a structured perspective on the levels of process maturity
(Caputo 1998; Dymond 1995; Raynus 1999). CMM sees a systematic movement from largely chaotic
processes characterized by heroics (Level 1) to managed, measured processes that exhibit continuous
improvement (Level 5). While CMM, and its successor CMMI, arose in software development, there are a
host of CMM-like maturity models in many settings both within IT and elsewhere. Within the IT
governance area, COBIT expands the CMM concept to encompass the complete IT life cycle, which it divides into 34 IT processes grouped into four domains: Plan and Organise (PO),
Acquire and Implement (AI), Deliver and Support (DS) and Monitor and Evaluate (ME). The COBIT
framework is a well-recognized IT governance framework promulgated by the IT Governance Institute
(ITGI 2007a). COBIT adopts and adapts the CMM methodology to devise process maturity levels both
generically, at the framework level, and at the individual process level.
There is a paucity of research on understanding the outcomes of capability development in general, or process maturity in particular. Beyond specialist areas such as software development (Harter et al. 2000) or quality management (Mithas et al. 2011), there is very little research on the relationship between governance and capability, including process maturity. We seek to fill this gap. We adopt the COBIT
control objectives as a measure of the IT business processes and the maturity models in COBIT as the
measure of process maturity. COBIT has a singular advantage for this purpose as it covers the complete
lifecycle of IT investment. The generic maturity model in COBIT comprises six attributes of maturity: “Awareness and Communication,” “Policies, Standards and Procedures,” “Tools and Automation,” “Skills and Expertise,” “Responsibility and Accountability,” and “Goal Setting and Measurement.” This allows analysis of organization-wide factors that lead to process maturity.
We conduct a field study of 51 organizations in Europe, North America, and Asia. We interview the CIO
and elicit answers to a detailed questionnaire on all aspects of IT governance. All process owners are
identified and interviews conducted. We elicit the level of each of the six attributes of process maturity for the process or processes that each owner is responsible for. We find that
the overall level of process maturity is relatively low. The highest average process maturity is three on the
five-level scale of process maturity. Many processes have an average maturity level of a little more than
two. Further, the highest ranked processes are concentrated in the more prosaic “Deliver and Support”
(DS) domain of COBIT. Processes in the more governance- and strategy-oriented “Plan and Organise” (PO) and “Monitor and Evaluate” (ME) domains have relatively and absolutely low levels of process
maturity.
After undertaking a process of data reduction for the IT governance variables, we conduct a regression
analysis on the level of process maturity. We find that business/IT alignment and national development
are strongly associated with process maturity. This finding applies whether we consider the overall level
of process maturity or the level by domain or attribute.
The paper proceeds as follows. In the second section, we review the literature on IT capability,
governance and process maturity. We set out the research questions for the study. In the third section we
review the generation of the IT governance instrument, and the data collection processes. We consider the
particular application of process maturity methods in COBIT. In the fourth section we review the
resulting extensive set of descriptive statistics on governance factors and process maturity levels. We
review our methods for data reduction of the IT governance metrics. We undertake multivariate analyses
to address the research questions. We discuss the implications of our results, set out our conclusions and
call for additional research in the final section.
2. Literature Review
In this section, we discuss two broad themes of literature that inform our study: IT governance and the
nature and maturity of IT processes. These two areas of literature intersect and in the final subsection we
integrate these threads. We set up some key research questions that inform the design of the field study
that we analyze in the next section and the results that we report in the sections that follow.
2.1. Theoretical Foundations
Several theoretical foundations, including the Resource Based View (RBV) (Mahoney and Pandian 1992;
Wernerfelt 1984), dynamic capabilities theory (DCT) (Teece 2009; Teece et al. 1997), and contingency
theory (Hofer 1975; Woodward 1980), provide guidance on the relationship between governance,
strategy, and resource acquisition in order to develop strategic capability designed to lead to outcomes
that support organizational goals. While these three theories of strategic capability development vary in
orientation – for example, dynamic capabilities theory concentrates on how enterprises respond
temporally to highly competitive markets characterized by rapid changes in the technological landscape –
each sees the enterprise as marshaling a portfolio of resources1 to generate capabilities to respond to
competitive forces that generate an entity-level (as distinct from industry-level) comparative advantage.
Resources are transferable and not distinctive to the firm (Makadok 2001). Conversely, capabilities are
firm-specific and designed to achieve organizational goals. Makadok (2001) refers to capabilities as a
type of “intermediate good” that is a necessary input to the production of final goods and services. In the
IT context, Feeny and Willcocks (1998) define capabilities as a “distinctive set” of “skills, orientations,
attitudes, motivations, and behaviors that .. contribute to achieving specific activities and influencing
business performance.” This is in line with Teece’s view that the comparative advantage of firms “lies
with its managerial and organizational processes, shaped by its (specific) asset position” (Teece et al.
1997, 518).
These theoretical foundations have found wide acceptance in the study of an array of IT concerns
including outsourcing (Cullen et al. 2005; Han et al. 2008), customer service (Ray et al. 2005),
infrastructure development (Bhatt et al. 2010), software development quality (Li et al. 2010), and the role
of IT in supporting the development of supply chains (Seggie et al. 2006; Wu et al. 2006), process
innovation (Tarafdar and Gordon 2007), and electronic commerce (Ferguson et al. 2005). Indeed, RBV has, for example, become a standard lens through which we view the contribution of investment in information technology to enterprise value (Masli et al. 2011).

Footnote 1: Teece differentiates between factors of production, which are “undifferentiated inputs” available to all competitors, and resources, which are firm-specific. For the purposes of this discussion, we concentrate on the deployment of resources that generate firm-specific capabilities. We also incorporate externally acquired resources from, for example, outsourcing that are applied to the generation of capabilities. Teece et al. also distinguish between “Organizational routines/competences,” “Core competences” and “Dynamic capabilities.” We refer to these generically as capabilities.
The theoretical foundations provided by RBV, DCT and contingency theory do not, however, address a core institutional factor: the separation of ownership and management so common in modern enterprises (Berle and Means 1932). These theories are essentially silent on the separation of ownership and management, implicitly treating the firm as undifferentiated. In for-profit enterprises, when there is separation of ownership
and management with resulting discordant risk profiles and reward mechanisms, the exercise of
ownership control over management is by a variety of enterprise governance mechanisms (Daily et al.
2003; Shleifer and Vishny 1997; Sundaramurthy and Lewis 2003). We see somewhat similar concerns
and governance mechanisms in the not-for-profit and public sectors. The role of enterprise governance in
oversight, monitoring and direction setting is, then, key. Our theoretical framework is shown in Figure 1.
The framework brings together theories of enterprise governance with theories on strategic capability
development. It shows that governance monitors and guides strategic development. In turn, the enterprise
strategy leads to the acquisition of resources that are transformed into capabilities. These capabilities are
essential inputs to the creation of final goods or services. A feedback loop links the production of
intermediate (capabilities) and final outputs with governance and planning processes. In the following
sub-sections we explore each of these elements of the framework in more detail, and provide a more
expanded framework.
Insert Figure 1 about here
2.2. IT Governance
Given our focus on information technology capabilities, we concentrate on that part of corporate
governance that relates to IT. This subset is termed enterprise governance of information technology (Van
Grembergen and De Haes 2009) or, more simply, IT governance. In broad terms, IT governance is the
tension between the exercise of decision rights, as a subset of corporate governance, and the design and
execution of structures and processes to implement organizational objectives (ITGI 2003, 2007b; Van
Grembergen and De Haes 2008, 2009). IT consumes considerable resources within modern organizations and both creates and mitigates risks. As we will elaborate in more detail shortly, building IT governance
requires outlay of both monetary expenditures and board and managerial time resources. Evidence on the
returns to investment in IT governance is not yet definitive. While focused primarily on the design of
decision rights, Weill and Ross (2004) note that “top performing” enterprises that more effectively bind IT and organizational processes generate significantly higher returns on their IT investments than their competitors.
There has been more recent attention to a broader range of attributes of IT governance including the
determinants of IT investment decision making (Xue et al. 2008), achieving strategic outcomes
(Raghupathi 2007), and business/IT alignment (De Haes and Van Grembergen 2008). In recent years, we
can see the development of common themes on the shape and nature of IT governance. Van Grembergen
and De Haes (2008, 3) define IT Governance as being an “integral part of corporate governance” that
addresses the “definition and implementation of processes, structures, and relational mechanisms in the
organization that enable both business and IT people to execute their responsibilities in support of
business/IT alignment and the creation of business value from IT enabled business investments.”2 IT
governance has become increasingly important as the strategic and tactical role of IT is recognized in
meeting organizational objectives. IT must respond to the need for such organizational attributes as
agility, reliability, and compliance with laws and regulations. The ISO/IEC standard 38500 for “Corporate
governance of information technology” defines IT Governance as: “The system by which the current and
future use of IT is directed and controlled. Corporate governance of IT involves evaluating and directing
the use of IT to support the organization and monitoring this use to achieve plans. It includes the strategy
and policies for using IT within an organization” (ISO/IEC 2008). ISO 38500 establishes six principles
for responsibility, strategy, acquisition, performance, conformance, and human behavior.
An important attribute of IT governance within this view is the development and maintenance of the
capability to perform key IT processes. The IT function working with the rest of the organization must
build a variety of capabilities to meet organizational strategic objectives. These capabilities bring together
internal and external human resources, software applications, hardware, and other resources in a
systematic fashion to achieve a desired outcome. These outcomes may be strategic in nature, such as the
determination of future direction for the IT function; tactical, such as providing customer service from a help-desk or problem management; or operational, such as installing a systematic process for backup and
storage of data.
Footnote 2: Van Grembergen and De Haes and other authors refer to “Enterprise Governance of IT” (Van Grembergen and De Haes 2009; Wilkin and Chenhall 2010). While arguably more indicative of the exercise of corporate governance in an IT setting, the more traditional term “IT governance” is neutral and easier to communicate.
Decision Rights, Structures and Accountability
As shown in Figure 2, a key aspect of IT governance concerns who decides the direction of IT. A
concomitant issue is the structuring of IT at the governance and management layers. Indeed, much of the
discourse on IT governance has been on the exercise of decision-making rights and the forms of IT
organizational structure (Brown and Grant 2005; Weill and Ross 2004). A fundamental issue that must be
addressed at the governance level is who decides on the future direction of IT and the shape of the IT
organization(s) within the enterprise.
Insert Figure 2 about here
Agreement on decision rights at the governance layer (see Figure 2) is arguably the most important task
of IT governance. Identification and agreement of stakeholders’ rights to participate in different levels of
decision making on IT strategy and tactics is critical. Deciding on the organizational form of IT in the
enterprise is a vital element of IT governance that flows from and is concomitant with the agreement on
the exercise of decision rights. Weill and Ross (2004, 12) categorize what they describe as six
“archetypes” of organizational form including “business monarchy” where senior executives including
the CIO collectively make strategic IT decisions; “IT monarchy” where the CIO and senior IT managers
make the essential decisions about IT; “Feudal” where major business units drive their own IT strategies
and tactics; “Federal” where decision making and control over IT is divided between headquarters and
business units; “IT duopoly” that involves dualist decision making between an IT function and a single
group such as a key business process owner; and finally “Anarchy” where IT is a response to business
unit needs. Each of these archetypal responses to the need for IT delivery results in different decision
making structures. This, in turn, leads to alternative sets and design of IT business processes. For
example, Weill and Ross (2004, 89) note that federal approaches often are driven by a desire for shared
data and infrastructure across the enterprise. At the same time, a federal system allows business unit
flexibility in implementation of business, as distinct from IT, processes. Conversely, IT monarchical systems, while maintaining IT control over direction and decision making, are distinguished by strong relationships between business units and IT, thereby balancing competing needs. In essence, then,
the form of IT functions and the way that they are integrated with the rest of the enterprise are core
elements of IT governance.
With decision rights comes accountability, which is a multi-faceted task involving the board, IT, and
operational management (Weill and Ross 2004, 227-228). Notions of accountability by the board to stakeholders and from management to the board are central to corporate governance broadly, to generic corporate governance frameworks such as Cadbury, King III and COSO, and to IT governance frameworks
including ISO 38500 and COBIT. A variety of structural forms for accountability and systems of
performance measurement have been proposed. For example, a common theme in IT governance is the
creation of co-ordinating mechanisms such as an IT Steering Committee (Bowen et al. 2007; Trites 2004).
The Balanced Scorecard (Kaplan and Norton 1992, 1996; Van Grembergen and De Haes 2009, Chapter
4) is a widely accepted measurement system for reporting performance that is embedded throughout
COBIT and observed in the field (Herath et al. 2010; Hu and Huang 2006).
Given the importance of decision rights, structures, and accountability, there is surprisingly little research in the IT context. For example, most of the papers referenced by Wilkin and Chenhall
(2010) in their literature review of IT governance, are drawn from the broader corporate governance
literature. When it comes to the role of CEO and CIO, there is a more substantial literature (Chun and
Mooney 2009; Feeny et al. 1992; Stephens et al. 1992). The evidence shows, unsurprisingly, that top level
management involvement in strategic decision making is a key driver of success with information
technology. This includes enterprise resource planning (ERP) (Al-Mashari et al. 2003; Bradley 2008;
Davenport 2000; Willcocks and Sykes 2000), electronic commerce (Sutton et al. 2008), and knowledge
management (Davenport et al. 1998; Gottschalk 2006). Similarly, despite the perceived importance for
governance mechanisms, such as the IT steering committee, there is surprisingly little research of the
efficacy of such mechanisms (Wilkin and Chenhall 2010, 124).
Strategic Alignment
As the definitions of IT governance we consider above indicate, information technology exists to further
organizational objectives. Achieving alignment between the current and future needs of the organization
and service delivery by IT is a core component of IT governance. For example, the second principle of
ISO/IEC 38500 calls for the Board to “evaluate IT activities to ensure they align with the organization’s
objectives for changing circumstances” when considering “plans and policies” (ISO/IEC 2008).
Achievement of strategic alignment has been an active matter for academic research, arguably predating
any definitive shape of what is now termed IT governance (Jordan and Tricker 1995; Luftman 1996;
Luftman et al. 1999; Powell 1993; Teo and Ang 1999; Venkatraman et al. 1993). Strategic alignment
requires that IT respond dynamically not only to current needs but also to involvement in new business initiatives (Wilkin and Chenhall 2010, 113). This is increasingly important as IT is a necessary but not
sufficient condition for successful development of most business processes within the wider enterprise.
Alignment involves attention to leadership, communication, development of a shared understanding,
structure (e.g. IT steering committee), underlying enterprise architecture (Ross et al. 2006, 119-122), and
methods for resource allocation.
While there is widespread discussion on the importance of strategic alignment, evidence of the impact of
strategic alignment on development of capability and firm performance is mixed (Wilkin and Chenhall
2010). Cragg et al. (2002) studied the impact of business strategy, IT strategy, and IT alignment on organizational performance in the manufacturing context and found a strong correlation between levels of IT alignment and organizational performance. In a study of ERP implementations in
Scandinavia, Velcu (2010) finds that strategic alignment is a clear determinant of ERP success, measured
by on-time, on-budget delivery. Byrd et al. (2006) surveyed privately owned manufacturers, employing previously moderated measures of alignment and outcomes, and found consistent support for the
effect of alignment, particularly co-ordination factors, in improving the relationship between IT
investment and firm performance.
Outsourcing
Outsourcing has been a feature of the IT landscape for at least three decades (Lacity et al. 2009). The
most important reason for outsourcing is cost reduction (Blaskovich and Mintchik 2011; Lacity et al.
2009, 133), although other reasons include the need to improve process capability and business outcomes.
Outsourcing brings risk to the enterprise from supplier credit risk; being locked in to a single vendor; loss
of institutional knowledge; transfer of knowledge to the vendor; reduced internal controls; cultural
differences between vendor and enterprise; and difficulty in managing business processes (Bahli and
Rivard 2003; Lacity et al. 2009, 133). Outsourcing can also reduce risk by providing resources and
capabilities that are beyond the financial and human resources of the enterprise. The level and type of outsourcing are also related to successful outcomes. High levels of outsourcing, for example, are associated with low success rates (Straub et al. 2008). As in many other areas of endeavor, the level of top
management support is also associated with success (Iacovou and Nakatsu 2008) as is governance of
programs and projects (Choudhury and Sabherwal 2003), and trust mechanisms in the outsourcing
relationship (relational governance) (Langfield-Smith and Smith 2003; Poppo and Zenger 2002). The
ability of outsourcing to affect enterprise IT capability has been only moderately researched. In the context of IT consulting, which can be seen as a light, human-resource-based and strategic form of outsourcing, Nevo et al. (2007) show that outsourcing positively affects overall capability when internal IT capability is weak.
Other characteristics – contingency
From a contingency theory perspective, a number of other organizational characteristics have been
considered when viewing the impact of IT capability on firm performance. Examples include firm size
(Mithas et al. 2011) and industry (Devaraj and Kohli 2000; Hendricks and Singhal 1996).
2.3. IT capability and processes
In the opening sub-section of this literature review, we introduced the concept of firm-specific
capabilities. We referred to the catalog of IT-specific attributes of capability developed by Feeny and
Willcocks (1998). Each of these attributes, to which we will return shortly, is essentially a business
process. Davenport (1993, 5) defines a business process as “a specific ordering of work activities across
time and space, with a beginning and an end, and clearly defined inputs and outputs: a structure for
action.” In essence, then, IT capabilities are isomorphic with IT processes. In this subsection, we address
three interrelated characteristics of IT processes. First, overall IT capability is made up of a portfolio of
individual IT processes. Second, individual IT processes within the portfolio have differential impact on
the ability of the enterprise to respond to its competitive environment. Third, standardization of individual
processes is a key consideration in contributing to the success of overall capability (i.e. the portfolio).
An organization’s IT capability is a portfolio of inter-related processes. Feeny and Willcocks (1998) refer to nine core IS capabilities, grouped into three vectors: architecture, business and IT vision, and service delivery. Some of the core capabilities cataloged by Feeny and Willcocks are more intangible in nature
(e.g. Leadership) while others are more straightforward business processes (e.g. Contract monitoring).
Intangible capabilities such as leadership do not happen by accident and require planning and delivery.
These intangible capabilities are, however, of a different character than business processes. Further, while the capabilities in the Feeny and Willcocks catalog are essentially human processes, the importance of IT infrastructure in supporting processes in the portfolio must not be underestimated (Ross et al. 1996).
Seeing IT capability as a portfolio of processes is supported in the research literature. Chen et al. (2008),
in a longitudinal study of business/IT alignment, demonstrate the importance of IT process
interdependence and path dependence in limiting strategic choices. Fink and Neumann (2009) show, in a
large survey, that IT process flexibility and strategic alignment is dependent on managerial knowledge
and skills and not on the availability of physical resources. This finding supports the Feeny and Willcocks
view that capabilities are essentially embedded in processes and in intangible resources – particularly
managerial knowledge and skills – rather than tangible resources. Mithas et al. (2011) catalog a further 10 studies that relate IT capability to firm performance. Taken in the aggregate, these studies show that IT capability is associated with higher firm performance. It should be noted, however, that the definitions of both IT capability and performance vary widely in these studies.
While we can see overall IT capability as a portfolio of IT processes, individual processes provide
unequal benefits for key strategic attributes. Depending on entity-specific contingent factors, strategic
organizational goals may include agility, reliability, and low cost. For example, Weill and Ross (2004)
and Ross et al. (2006) demonstrate the importance of the processes that underpin enterprise architecture,
including data management. They see mature enterprise architecture as facilitating both agility and
reliability.
As the definition set out in the opening paragraph of this sub-section demonstrates, business processes
bring structure to value-adding activities. Standardization of IT processes provides a consistent way in
which work activities are conducted. Standardization provides reliability and predictability and lowers
cost (McDonald 2007). Davenport (2000) points to a seeming conundrum: standardization can also bring
flexibility. Process and data standardization provides a consistent foundation upon which new
applications can be built (Ross et al. 2006). Enterprise Resource Planning (ERP) systems are perhaps the
ultimate of process standardization as they integrate, routinize, and centralize key business processes
(Grabski et al. 2011). McDonald (2007) points out that increasingly process standardization within
organizations follows a dominant design from outside the entity, employing de facto standards or
frameworks such as ITIL and CMM, to which we will return shortly. This lowers the barriers to adoption,
allowing the enterprise to concentrate on building an appropriate portfolio. Kauffman and Tsai (2010)
take process standardization one step further and view it from an inter-organizational perspective. Just as
process standardization within the enterprise tends to follow a dominant design, industry-wide process
standardization follows a similar but much more complex lifecycle. Kauffman and Tsai show that
industry-wide process standardization moves from a small number of de facto standards to a single de jure
standard.
2.4. Process maturity
Theoretical influences on process maturity
In the previous sub-section, we referred to the concept of process standardization to improve reliability,
predictability, lower costs, and, perhaps counter-intuitively, increase flexibility and agility. The concept
of standardization is drawn from manufacturing and the quality literature and particularly from the early
influential writings of Crosby, Deming and Juran (Crosby 1980; Deming 1982; Juran and Gryna 1999). A
leading advocate for process maturity and reliability within the software industry was Humphrey (1989)
whose seminal “Managing the Software Process” reads almost like a primer for the quality movement. It
focuses on not just what we might see as the more mechanical aspects of software development, such as
process definition, project management, standards, defect prevention, and measurement, but also on softer
aspects such as change management and leadership. Improvement in process standardization is typically
associated with the concept of process maturity. There are a number of frameworks for the determination
of process maturity but the most influential is the SEI’s Capability Maturity Model (CMM) and the more
recent Capability Maturity Model Integration (CMMI), both strongly influenced by Humphrey (Ahern et
al. 2004; Kasse 2004; Staples and Niazi 2008; West 2004).
The CMM provides an intellectual foundation for the measurement of the level of capability reliability at
the level of discrete processes. CMM is defined as “a description of stages through which software
organisations evolve as they define, implement, measure, control and improve their software process”
(Paulk et al. 1995). Humphrey draws the concept of discrete stages of process improvement from Crosby
(1980). The CMM sets out a systematic migration strategy that takes organizations to a level where
software development employs clearly defined processes and appropriate performance metrics. When
software organizations reach the highest maturity level, they exhibit continuous improvement of their
managed, repeatable development processes (Subramanian et al. 2007).
The CMM has five levels, viz: Level 1 - Ad hoc; Level 2 – Repeatable; Level 3 – Defined; Level 4 –
Managed; and Level 5 – Optimized. A process at Level 1 will be managed in a largely ad-hoc fashion and
there may be unstructured allocation of resources, little or no defined policies and procedures, and no
performance measurement. Software development organizations at Level 1 typically exhibit process characteristics that depend primarily on the skills of the software development team for success.
The team often employs “heroics” to achieve desired organizational outcomes. Process repeatability
comes at Level 2, with a focus on project management. Level 3 is associated with improved process
repeatability, with an emphasis at this level on achievement of process documentation. The focus at Level
4 is on quantitative management of software development. Finally, a software organization at Level 5 will
draw on industry best practice and be self-improving because of a systematic performance measurement
and analysis feedback loop.
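As a compact restatement of this scale (level names as listed above; the comments paraphrase the descriptions in this subsection), the five CMM levels can be represented as a simple lookup structure:

    # The five CMM maturity levels (Paulk et al. 1995), as summarized above.
    CMM_LEVELS = {
        1: "Ad hoc",      # largely unstructured; success rests on team "heroics"
        2: "Repeatable",  # basic project management; processes can be repeated
        3: "Defined",     # documented, organization-wide process definitions
        4: "Managed",     # quantitative management of the development process
        5: "Optimized",   # continuous improvement driven by measurement feedback
    }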
Process improvement and maturity comes about through a set of so-called Key Process Areas (KPAs).
Each KPA encompasses a set of activities to achieve outcomes consistent with a given maturity level. For
example, a KPA at Level 2 is “Software Project Planning” and at Level 3, “Integrated Software
Management.” Humphrey (1989) and others (e.g. Agrawal and Chari 2007; Garcia and Turner 2007;
McGarry and Decker 2002; Niazi et al. 2005) point to the challenges involved in moving an organization to higher levels of maturity. While CMM has discrete process maturity levels, moving between them requires continuous and management-intensive effort. The concept of process maturity has been applied in
a number of areas including software development where von Wangenheim et al. (2010) identify no fewer
than 52 standards such as ISO/IEC 15504 “Software Process Improvement and Capability Determination”
(SPICE). Other areas include outsourcing (Gottschalk and Solli-Saether 2006) and data warehousing (Sen
et al. 2006). As we will explore shortly, CMM has also been systematically embedded in COBIT and
provides a foundation for this study.
Process maturity and outputs
While CMM, CMMI and similar frameworks have been in existence for many years and several thousand
CMM assessments have been conducted by the SEI, there is surprisingly little published evidence on the
impact of higher levels of process maturity on key organizational objectives (e.g. cost, reliability,
compliance, agility). While not explicitly concerned with process maturity, an early study by Banker et al.
(1998) of the influences of the various attributes of software maintenance on cost showed that project
team experience was significantly and negatively associated with software maintenance efforts, adjusted
for project complexity. Harter et al. (2000) assessed software projects in a longitudinal study of a major
IT enterprise. They found that increases in process maturity lead to increased effort but also to increased
quality. The quality effects outweighed the effort. They conclude that a “1% improvement in process
maturity leads to a 0.32% net reduction in cycle time, and a 0.17% net reduction in development effort.”
Jiang et al. (2004) surveyed members of the IEEE Computer Society, employing an instrument that
incorporated key elements of the CMM. They found process engineering and organizational support activities, which are part of Level 3, to be related to project performance. However, more basic process activities associated with CMM Level 2 were not associated with project performance,
suggesting that the real benefits may not flow until organizations reach Level 3.
Mithas et al. (2011) study the business units in a large conglomerate that collect performance metrics based on the Baldrige quality award. Mithas et al. study what they term information management
capability. Attributes included making data available for management decision making, information
integrity, and hardware and software quality. Mithas et al. associate information management capability
with four self-measured performance metrics that encompass customer, financial, human resources, and
“organizational effectiveness” dimensions. They find that information management capability is
significantly associated with performance management, process management and customer management.
Taken together, these studies show that process maturity is positively associated with performance, whether measured at a micro level (e.g. Banker et al. 1998) or at a more macro level (e.g. Mithas et al. 2011). An
important caveat here is that none of these studies address a complete set of governance or other
institutional characteristics, or outcomes. Each study is limited by data availability.
2.5. COBIT, Process Maturity and Strategy
A broader view of IT processes and maturity
Each of the studies we discuss above incorporates only a partial view of the relationships set out in our
theoretical framework (Figure 1). For example, given the foundation of CMM and CMMI in software
development, studies based around CMM do not allow a clear understanding of capability across the
complete range of activities typically observed within the rubric of IT governance. Conversely, COBIT,
promulgated by the IT Governance Institute (ITGI) (currently version 4.1) (ITGI 2007a; Van Grembergen
and De Haes 2009), is a comprehensive approach to IT governance and management. The major
components of COBIT are the framework, which sets out the overarching principles, and a set of 34
business processes allocated across four domains. These business processes encompass the complete
lifecycle of IT investment, from strategic planning to the day-to-day operations of the IT function. COBIT
defines four primary domains of control that are relevant throughout the lifecycle of information systems
from planning, through development and acquisition of solutions to deployment along with a set of
controls on monitoring and feedback. The four domains within COBIT are Plan and Organise (PO),
Acquire and Implement (AI), Deliver and Support (DS) and Monitor and Evaluate (ME). While COBIT is positioned by the ITGI as an IT governance framework, it comprises both governance and management
components. Major aspects of the PO and ME domains are at the governance layer. Conversely, the AI
and DS domains are almost entirely at the management layer.
Within each control domain there are a series of control objectives that define the elements of control
over a given business process that a well-managed entity would be likely to employ. The 34 control
processes in COBIT are best seen as a comprehensive set of best practices for establishing management
and control over the complexity and change that characterize the modern IT function. In this study, we
make some adjustments to the processes to allow elaboration of processes that subsume a range of important sub-processes (e.g. DS 5 – Ensure Systems Security) or where the sub-processes contain essentially distinct processes (e.g. PO 2 – Define the Information Architecture). These processes are appropriately separated into their distinct components, resulting in 41 processes, which are set out in Appendix 1.
Table 8: Regression Analysis – Overall and by Domain

VARIABLES      ALL         PO          AI          DS          ME
CENTRAL        0.091       0.041       0.227       0.131       -0.191
               (0.144)     (0.159)     (0.160)     (0.159)     (0.217)
DECENTRAL      0.486**     0.369       0.587**     0.527**     0.697**
               (0.205)     (0.300)     (0.235)     (0.203)     (0.297)
OUTSOURCE      0.000       -0.005      0.003       0.006       -0.006
               (0.007)     (0.008)     (0.007)     (0.007)     (0.009)
STRATEGY       0.508***    0.474***    0.476***    0.493***    0.834***
               (0.088)     (0.110)     (0.089)     (0.079)     (0.152)
VISION         -0.048      -0.078      -0.036      -0.063      0.184
               (0.081)     (0.102)     (0.088)     (0.079)     (0.122)
RISK           -0.015      -0.011      -0.034      -0.003      -0.021
               (0.082)     (0.104)     (0.089)     (0.077)     (0.134)
ENVIRON        0.149*      0.130       0.119       0.208**     0.096
               (0.087)     (0.105)     (0.097)     (0.090)     (0.109)
SIZE           0.056       0.081       0.022       0.074       -0.016
               (0.120)     (0.161)     (0.104)     (0.106)     (0.107)
SWARE          0.011       0.006       0.005       0.020       0.010
               (0.014)     (0.020)     (0.012)     (0.013)     (0.018)
HWARE          0.106       0.009       0.175       0.154       0.120
               (0.092)     (0.108)     (0.107)     (0.096)     (0.141)
DEVT           0.740***    0.838***    0.630***    0.703***    0.768***
               (0.160)     (0.198)     (0.156)     (0.143)     (0.234)
Constant       1.348***    1.622***    1.354***    1.104**     1.158**
               (0.426)     (0.553)     (0.424)     (0.413)     (0.532)
Observations   2095        815         324         780         176
R-squared      0.304       0.287       0.359       0.349       0.454
F test:        6.472       3.346       6.210       10.390      6.600

Robust standard errors in parentheses
*** p<0.01, ** p<0.05, * p<0.1
Table 9: Regression Analysis by Attribute

VARIABLES      AWARE       POLICIES    TOOLS       SKILLS      RESP        GOALS
CENTRAL        0.118       0.057       0.014       0.035       -0.052      -0.011
               (0.172)     (0.202)     (0.179)     (0.181)     (0.177)     (0.16)
DECENTRAL      0.526**     0.683**     0.485*      0.36        0.23        0.447
               (0.23)      (0.275)     (0.27)      (0.335)     (0.308)     (0.271)
OUTSOURCE      0.003       0.002       -0.003      -0.004      0.003       -0.008
               (0.009)     (0.008)     (0.007)     (0.007)     (0.009)     (0.009)
STRATEGY       0.468***    0.540***    0.504***    0.437***    0.556***    0.641***
               (0.105)     (0.097)     (0.086)     (0.096)     (0.097)     (0.101)
VISION         -0.049      -0.049      -0.074      -0.082      -0.081      -0.043
               (0.096)     (0.102)     (0.097)     (0.106)     (0.1)       (0.107)
RISK           -0.11       -0.022      -0.07       0.019       -0.088      -0.07
               (0.083)     (0.085)     (0.088)     (0.096)     (0.091)     (0.084)
ENVIRON        0.240**     0.170*      0.143       0.168       0.216*      0.181
               (0.101)     (0.097)     (0.103)     (0.112)     (0.108)     (0.123)
SIZE           0.028       0.071       0.03        0.054       0.006       0.07
               (0.112)     (0.118)     (0.112)     (0.143)     (0.118)     (0.136)
SWARE          0.016       0.017       0.019       0.009       0.018       0.01
               (0.014)     (0.015)     (0.014)     (0.016)     (0.015)     (0.017)
HWARE          0.182*      0.108       0.148       0.144       0.128       0.103
               (0.106)     (0.113)     (0.103)     (0.104)     (0.106)     (0.1)
DEVT           0.692***    0.702***    0.978***    0.554***    0.704***    0.752***
               (0.162)     (0.183)     (0.156)     (0.173)     (0.18)      (0.204)
Constant       1.420***    1.280**     0.73        1.501***    1.487***    1.203**
               (0.48)      (0.513)     (0.47)      (0.471)     (0.5)       (0.51)
Observations   1896        1893        1883        1889        1885        1824
R-squared      0.261       0.282       0.258       0.242       0.269       0.315
F test:        8.543       7.67        10.37       4.198       6.773       9.553

Robust standard errors in parentheses
*** p<0.01, ** p<0.05, * p<0.1
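Read schematically (our notation only; the excerpt does not spell out the estimation details), each column of Tables 8 and 9 reports a linear model of the form:

    \mathit{MATURITY} = \beta_0 + \beta_1\,\mathit{CENTRAL} + \beta_2\,\mathit{DECENTRAL} + \beta_3\,\mathit{OUTSOURCE} + \beta_4\,\mathit{STRATEGY}
        + \beta_5\,\mathit{VISION} + \beta_6\,\mathit{RISK} + \beta_7\,\mathit{ENVIRON} + \beta_8\,\mathit{SIZE} + \beta_9\,\mathit{SWARE}
        + \beta_{10}\,\mathit{HWARE} + \beta_{11}\,\mathit{DEVT} + \varepsilon

with the maturity score measured overall and by COBIT domain (Table 8) or by maturity attribute (Table 9), and robust standard errors reported in parentheses.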
COBIT has a set of structural relationships. For example, at the framework level, there is a posited
relationship between business goals, intermediate IT goals, and business processes. Further, outputs from
each business process (control objectives) are predicated as inputs to other business processes. COBIT has a variety of supporting and ancillary protocols, including the Management Guidelines. The
Management Guidelines are designed to provide a toolset for managers of the IS function to develop the
controls throughout their organization and across the range of activities which the function undertakes.
The Guidelines include sets of Key Performance Indicators (KPIs), Key Goal Indicators (KGIs) and, most
relevant for this study, process maturity models at both the framework and process levels.
The application of Process Maturity in COBIT 4.1
An essential component of COBIT is its use of process maturity modeling. COBIT leverages process
maturity concepts from CMM but does so across the complete lifecycle of IT investment. The COBIT
framework notes that “[t]he assessment of process capability based on the COBIT maturity models is a
key part of IT governance implementation. After identifying critical IT processes and controls, maturity
modelling enables gaps in capability to be identified and demonstrated to management. Action plans can
then be developed to bring these processes up to the desired capability target level” (ITGI 2007a).
Process maturity in COBIT has six levels, viz: 0 – Non-existent; 1 – Initial; 2 – Repeatable; 3 – Defined; 4 – Managed; and 5 – Optimized. COBIT recognizes that fulfilling the objectives of the organization
requires development of systematic capabilities to deliver results on each of the IT processes. These
capabilities require a combination of human, software, and hardware resources bound together in a policy
and procedure structure. Each of these resources requires careful monitoring through a collection of
metrics and review to ensure that any given process is continuing to meet ongoing demands. Maturity
models appear in two ways within COBIT. At the framework level, there is a generic maturity model. In
this model, COBIT recognizes that there are a number of dimensions or attributes of process maturity.
These include management’s “Awareness and Communication” of the process and the “Policies,
Standards and Procedures” and “Skills and Expertise” supporting the process. Each of these attributes has its own maturity level. Then, at the process level, there are individual maturity models, which do not
have the maturity attributes that characterize the generic model.
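Purely as an illustrative sketch (the attribute names come from the generic model described above; the values and layout are ours), an attribute-level assessment of a single process can be pictured as a vector of scores on the 0-5 scale:

    # The six attributes of the COBIT generic maturity model (ITGI 2007a).
    ATTRIBUTES = [
        "Awareness and Communication",
        "Policies, Standards and Procedures",
        "Tools and Automation",
        "Skills and Expertise",
        "Responsibility and Accountability",
        "Goal Setting and Measurement",
    ]

    # Each attribute of a given process is rated on the 0-5 maturity scale.
    # The ratings below are invented, for illustration only.
    example_process_assessment = dict(zip(ATTRIBUTES, [2.0, 2.5, 1.5, 3.0, 2.0, 1.0]))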
2.6. Evidence on Influence of Process Maturity on Meeting Organizational Outcomes
The studies discussed above on the effect of IT process maturity on organizational objectives cover only
partial aspects of information technology (e.g. software development and project management) and have
limited data sets. Equally, we only have a handful of studies that empirically test process maturity within
the rubric of COBIT or other governance and management frameworks and the achievement of
organizational outcomes. Simonsson et al. (2010) assessed the correlation between the level of maturity
for each of the processes in COBIT and a highly summarized measure of governance performance in 37
Scandinavian organizations. They find significant bivariate correlations for 18 of the 34 processes. Prasad
et al. (2010), employing a theoretical framework similar to ours, undertook a mailed survey of global firms. Employing summary measures of IT capability, loosely based on COBIT, they find that IT governance initiatives, such as steering committees, have a positive impact on key objectives such as
customer service. Bowen et al. (2007) conduct a case study of a single large organization, employing both
quantitative and qualitative methods. Their research linked methods and structures of IT governance with
IT project performance. They were inspired by COBIT but did not draw directly thereon. They found that
while the decision rights and formal structures of governance were important, the methods were equally
or more important. These included, for example, communication methods and collaborative management.
2.7. Synthesis and Research Questions
In this section, we have reviewed the literature on the development of capability and IT governance to
develop a theoretical framework. In the theoretical framework laid out in Figure 1 and expanded in Figure
2 we see IT governance driving and setting corporate strategy (or at least that part of corporate
governance that relates to IT), and in turn influencing the development of IT capability. The allocation of
resources is also influenced by governance. We draw on a range of theories including the Resource Based
View (RBV), contingency theory and agency theory to explain the relationship between IT governance
and IT capability. We see the level of process maturity as being reflective of the level of IT capability.
We note that there is very little research that associates the nature and level of IT governance with the
building of IT capability, in general, or IT process maturity in particular.
We ask the following primary research questions: Are there attributes of IT governance that govern the
level of process maturity? If so, which attributes are more or less significant? Are there domains or
processes that are more influential? Are there other control variables such as size or industry that explain
the relationship between IT governance and process maturity? Is process maturity evenly distributed
across domains?
3. Method
3.1. Introduction
In this section we set out the methods employed to address the research questions laid out in the previous
section. These questions can be summarized as: which IT governance attributes are associated with the level of IT process maturity? Given the complexity of both the nature of IT governance and of the processes in COBIT, we employ field research (Bailey 2007; Burgess 1990). The independent variable is the IT governance construct. This construct is complex, with many facets of governance. We design a data collection instrument to capture all these facets. This
instrument is deployed in face-to-face interviews with the CIO or, in some cases, a designated senior
manager.
The dependent variable is the level of process maturity of IT processes typically observed in IT
organizations. We select the COBIT framework as a foundation for the selection of IT processes.
For each of the organizations studied, we collect data on process maturity for the 41 COBIT
processes set out in
Appendix 1. We collect process maturity data disaggregated into the six process maturity attributes
described above.
3.2. IT Governance Instrument
We designed a survey instrument to accommodate the major elements of enterprise governance of
information technology that we discuss in the previous section. Apart from these governance factors,
there are several environmental considerations that prior research has shown to be influential in the
determination of IT capability. Design of the questions is based on guidance from the literature. It was
validated with practitioners and in a previous test study. Data for the instrument is collected in face-to-face interviews with the CIO or a designated representative.
Decision rights and organization: A vital aspect of IT governance is the organization of decision rights.
We ask about the decision rights, with a six level scale that ranges from “Highly centralized with
corporate general management primarily in control” to “Federal: Centralized governance over
architecture, network etc. with decentralized development and operations.” We also ask about the
organization of infrastructure, systems development and operations, using a five level scale. We also ask
about the existence of board level committees (IT Strategy/Investment and IT Risk/Security) and
organizational committees including IT Strategy/Investment/IT Council, IT Architecture, IT Security and
IT Project Management.
Governance Frameworks: The level of adoption of IT governance and management frameworks is
indicative of the likely maturity of IT governance. We ask about the adoption of 39 frameworks,
including corporate governance (e.g. COSO), quality management (e.g. ISO9000), industry and
compliance (e.g. FDIC, HIPAA), and more strictly IT frameworks (e.g. ITIL, CMM, Prince 2).
Business/IT Alignment: The central importance of business/IT alignment to IT governance is clearly
established in the literature. After extensive review of this literature, we ask 24 questions about core
aspects of alignment. The questions cover the dimensions of business/IT alignment: involvement of business representatives in strategic and tactical decision making; feedback to the business through measurement; and alignment mechanisms such as Service Level Agreements. Each is measured with a five-point Likert
scale (Strongly Disagree to Strongly Agree). Examples of questions include “We track the business value
from IT project investment,” “The business is usually involved in determining IT strategic directions (> 3
years),” “We use Service Level Agreements to align the business and IT,” and “The business is very
knowledgeable about IT technological choices we face.”
Outsourcing: The nature and level of outsourcing are, as discussed, key factors in governance. We ask
about the strategic role of outsourcing (Outsourcing is central to the mission of our IT function) and the
reasons for outsourcing. We also ask about the level of outsourcing for application development (three
components), service delivery (four components), and hardware (seven components).
Environmental volatility: The level of stability in the competitive and technological landscape is likely to
influence the level of capability. We ask about the level of change for the technical architecture,
infrastructure and application software portfolio with a five point Likert scale.
Size and complexity: Prior research indicates institutional factors such as the size of the organization and
the complexity of its technical environment influence the development of IT capability. There is
considerable diversity in geographical location and industry among the organizations we survey. More typical metrics, such as the level of spending, are less appropriate. For size, we measure the number of IT
personnel, the number of application systems, servers and clients. For complexity we measure the range
of software in use, and the range of hardware technologies.
3.3. IT Process Frameworks
The COBIT framework encompasses the major IT processes that constitute the lifecycle of IT investment.
There are other frameworks that encompass aspects of the IT function (e.g. ITIL for service provision and
PRINCE 2 for project management). All of these are partial, rather than covering the totality of IT. At a
higher level of abstraction there is the ISO/IEC 38500 IT governance framework. Unfortunately, the ISO/IEC 38500 framework provides high-level guidance only. It is challenging to operationalize the constructs in ISO/IEC 38500. Conversely, the high-level process objectives in COBIT encompass the complete
lifecycle of IT investment.
3.4. Process Maturity Data Collection
Given our use of COBIT as the underpinning for the study, the next important question was how to
measure the process maturity. We determined that the appropriate method was to collect perceptions of
maturity levels from practicing managers within the organizations in our field study. A principal
advantage of this technique was that we were able to cost-effectively visit and collect data at a significant
number of organizations. Each organization identified the process owners for each of the 41 processes in
our study. At the start of each interview, the researcher provided a description of the project and the
interview protocol. The interviewee (the CIO or other senior IT managers) was given a copy of the
generic process maturity attribute table. The columns were the six attributes and the rows were the scale
from 1 to 5. Each cell included a brief description of the characteristics of a specific attribute at a specific level.
Taking one process at a time, the interviewer would introduce the process control objectives and
summarize the detailed control objective (from the COBIT documentation) for the interviewee. The
interviewee would read the attribute descriptions from the process maturity attribute table to him- or herself and then state the maturity level for each of the six attributes for that process. The researcher recorded the interviewee’s responses.
The interviewee stated two numbers for each attribute. The first number represented the current level of
maturity (‘As Is’) and the second number was their expected level of maturity one year in the future (‘To
Be’). We allowed respondents to select a level in between the given discrete levels of process maturity
(e.g. 1.6, 2.4) reflecting that they felt that they had not yet achieved every aspect of the next level. We
found that respondents could normally see that they had reached a particular level, but had not necessarily completely achieved the next discrete level. We also allowed zero as a response, but with a slightly different definition than in the SEI’s CMM. In CMM, zero means non-existent—the process does not exist at all. Some of the subjects in this study wanted to use zero to indicate that the maturity of a particular process was still significantly less than one, even though the process did exist. As such, a zero in our data indicates a very low maturity level, not the complete absence of the process.
The interviewers also sought additional information and challenged the manager on a subset of the processes. When a manager was being interviewed on several processes, it was particularly important to ensure that maturity levels were measured correctly early in the data collection cycle. Examples of validation questions included, “How is management’s awareness of this process communicated to the IT organization?” and “What are some of the tools and technologies supporting this process?” We carried this pattern through the remaining processes for each interviewee. This validation process had to be handled judiciously because data collection would have taken an unacceptable amount of time if every data point had been challenged.
With some exceptions, we found that respondents understood these attributes and the descriptions of the
attribute levels. For some respondents, it took a little time to recognize that the process maturity
statements were to be applied only to that particular process. For example, whilst metrics, tools or
techniques might exist for one process, they might not necessarily apply to another process. Further, some
managers could not initially recognize the concept of goals that related to a particular process. There were
more difficulties with the attribute, “Goal Setting and Measurement,” than with the others. Often
managers would reach for a variety of broad goals rather than process-specific goals. Similarly, managers would refer to generic performance measures or metrics that related to other processes rather than those that
measured performance for the given process under discussion. Each of these issues was resolved by
discussion, using appropriate analogies and by additional questioning of the respondent.
Regarding the ‘To Be’ data, we stressed to the interviewee that they should not give a higher level simply because they hoped capability would improve in the future. Instead, the maturity level should only be expected to change if the IT function had a specific initiative in progress and funding had been obtained.
For each organization, we collected a maximum of 492 data points, which represents both “As Is” and
“To Be” points in time, for six attributes, for 41 processes (2 X 6 X 41). When calculating the overall
maturity level for one process, we took a simple average of the six attributes associated with that process.
When we interviewed more than one manager for a given process, we again took a simple average of their
responses. For a small number of organizations, we collected only overall process maturity level data, not attribute-level data. Where we collected the data from more than one person for a given process, the between-person variation was typically within one level of maturity. This data is, of course, self-reported and subject to bias, and we were not able to independently validate the responses by inspection of policies and procedures or through other techniques. The number of Level 0 and Level 1 responses we received does, however, suggest that respondents were candid in the information they gave us.
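To make the aggregation concrete, the following minimal sketch (in Python) averages hypothetical attribute-level ‘As Is’ ratings into a per-process maturity score, first across the six attributes for each respondent and then across respondents; the data structure, process codes shown, and scores are illustrative only, not data from the study.

    from statistics import mean

    # Hypothetical 'As Is' ratings for one organization:
    # ratings[process][respondent] = six attribute scores on the 0-5 scale,
    # with fractional values (e.g. 1.6, 2.4) permitted between discrete levels.
    ratings = {
        "PO5B": {"mgr_1": [3.0, 3.2, 2.8, 3.0, 3.4, 2.6]},
        "DS5V": {"mgr_1": [3.5, 3.0, 3.2, 3.0, 3.4, 2.9],
                 "mgr_2": [3.2, 3.1, 3.0, 2.8, 3.3, 3.0]},
    }

    def process_maturity(by_respondent):
        # Average the six attributes per respondent, then average the respondents.
        return mean(mean(attrs) for attrs in by_respondent.values())

    overall = {process: process_maturity(resp) for process, resp in ratings.items()}
    print(overall)  # e.g. {'PO5B': 3.0, 'DS5V': 3.11...}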
3.5.
Selection of Organizations
A convenience sample of organizations was employed. As can be expected, obtaining agreement for this
research was challenging. Organizations must invest considerable human resources in assessing process
maturity and working with the CIO. The primary criterion for selection of organizations was the size of
the IT function. We were interested in collecting data on all 34 processes in COBIT. This includes
processes as disparate as application development, security, facilities management and strategy and risk
management. Only IT organizations of a reasonable size are likely to include the complete range of
processes. With some exceptions, we worked to include in the survey organizations that had the
equivalent of more than 25 staff.
4.
Results
This section is divided into three broad areas. We first analyze the results of the extensive questionnaire
completed in a face-to-face session with the CIO. This is followed by an analysis of the results of the
maturity modeling. Finally, we associate core institutional and environmental factors with the maturity
levels, to answer the research questions set out in the second section.
4.1.
Demographics
As the descriptive statistics about the participant organizations in Panel A of Table 1 show, there were 51
organizations that participated in our study from North America, Asia, and Europe. The organizations
were divided between utilities, manufacturing and other capital intensive organizations, and services.
Some 16% of the study participants were from the government and Not-For-Profit sector. Because we
wanted organizations that were actively involved in the 41 IT processes, the 51 organizations were
relatively large. They averaged 172 IT staff and the maximum was 690 IT staff. In terms of clients (or
workstations), the average was 3,156 and the maximum was 15,000. Most organizations in the study had
mixed IT environments, with 98% using Wintel servers and 94% using Unix. Mainframes were also used
to varying degrees by a large minority (34%) of the sample organizations. These are relatively stable organizations. As shown in Panel C, we asked the CIOs to rate, on a five-point Likert scale (1=Strongly Disagree, 5=Strongly Agree), their agreement with six questions on the information and data architecture, technical architecture, technical infrastructure, and application software portfolio. In essence, the CIOs reported moderately strongly that the architecture was defined and relatively unchanging. The result for the question on the application software portfolio was somewhat lower (𝑥̅ = 3.00).
Insert Table 1 about here
4.2.
Governance
One of the most significant aspects of IT governance is the allocation of decision rights and concomitant
organizational structure. As Panel A of Table 2 indicates, three quarters of the organizations had
centralized IT structures. Only 6% (3 organizations) indicated a decentralized structure, and 20%
indicated a federal structure. Among the participants with centralized structures, in two thirds IT rather than management was primarily in control of strategic direction. Panel B of Table 2 shows the decision-making structures of Board and management committees and councils. One fifth of Boards had an IT
strategy committee and, interestingly, one quarter had a Board-level risk or security committee. Five
study participants had both Board committees, so one third of Boards had some type of IT committee.
Committees or councils at the management layer were relatively common, with 62% having a strategy or steering committee. Architecture (46%) and security (54%) committees were also observed relatively frequently.
We were interested in the CIOs’ perceptions of the level of monitoring by the Board and other cognate issues. We asked ten questions about monitoring, each on a five-point Likert scale (1=Strongly Disagree, 5=Strongly Agree). Across the six questions directly about Board and Audit Committee monitoring, the highest level of agreement was on the level of Board oversight of strategic directions for IT (3.14).
Insert Table 2 about here
The level of adoption or utilization of governance or IT frameworks or standards is a potential marker of maturity in governance. Table 3 shows the level of adoption of broad governance frameworks (e.g. COSO, COSO-ERM, the Turnbull guidelines (UK)), quality management standards (ISO 9000/ISO 9001), compliance frameworks (e.g. Sarbanes-Oxley for US listed companies (SOX), the Foreign Corrupt Practices Act (FCPA), the Payment Card Industry Data Security Standard (PCI)), and the many frameworks and standards specifically within IT (e.g. COBIT, ITIL, NIST 800). We asked CIOs to rate each on a four-point scale (0 = Not heard of or Not Applicable; 1 = Influences Internal Standards; 2 = Partially followed; 3 = Followed thoroughly). In general, there was little usage of governance or IT standards or frameworks. There was some usage of COSO (𝑥̅ = 0.62), which reflects the influence of the Sarbanes-Oxley Act of 2002 (SOX). A quarter of the CIOs said that COSO influenced their standards, with 15% saying that they either partially or fully followed it. The importance of SOX is also reflected in the mean response for SOX directly (𝑥̅ = 1.00). For each of these sets of frameworks, we grouped together the highest response
given by the CIO. Only 40% of CIOs noted any usage of a broad governance framework (𝑥̅ = 0.66) and
only five CIOs noted that they “thoroughly followed” such a framework.
As might be expected, there was a range of usage of industry and compliance frameworks and standards.
The previously mentioned SOX requirements (𝑥̅ = 1.00) were the highest scored of these frameworks.
Considering the highest ranked compliance framework (𝑥̅ = 1.42), 45% of CIOs noted no such
framework. Conversely, 40% of CIOs noted that they “thoroughly followed” at least one framework.
In terms of IT governance and IT management frameworks, both COBIT (𝑥̅ = 1.60) and ITIL (𝑥̅ = 1.52)
were quite widely employed in the organizations we studied. Our study used a convenience sample, and
while it was certainly not necessary to employ COBIT in order to participate in the study, it is likely that
COBIT usage was over-represented because most of the leads to organizations came through ISACA
chapters. Only three organizations, 6%, said that they thoroughly followed both COBIT and ITIL. Usage
of other frameworks including CMM/CMMI was less common. Some 85% of CIOs said that they either
partially or thoroughly followed at least one IT framework or standard. Looking at the level of
frameworks overall, we counted the number of frameworks that CIOs indicated that they either partially
or thoroughly followed. The number of frameworks followed (𝑥̅ = 4.59) varied greatly. The maximum number of frameworks followed by any of the participating organizations was 16, and twelve percent of organizations followed ten or more governance, industry, or IT standards or frameworks. Conversely, some CIOs reported following no framework (8%) or only one (6%).
Insert Table 3 about here
The majority of the organizations in the study outsourced some aspects of their IT activities. Although the
range of outsourcing (not tabulated) varied widely, some 76% and 65% of respondents outsourced some
aspects of their software or hardware functions, respectively.
As we discuss in the literature review, the level of alignment between business and IT is an important
aspect of IT governance. We asked 16 questions (see Table 4) on all aspects of alignment. In general, we
noted a relatively high self-reported level of alignment. For example, on a five point Likert scale, business
involvement in IT strategy (𝑥̅ = 3.68) (Business Unit Managers are usually involved in determining IT
strategic directions (> 3 years)), funding of business-facing IT staff (𝑥̅ = 3.82) (We have staff with direct
business-facing responsibilities) and project communication (𝑥̅ = 4.22) (We regularly communicate
IT project status to the business units) were all strongly ranked. Less highly ranked were the tracking of business value from projects (𝑥̅ = 3.04) (We track the business value from IT project investments), the tracking of business value from architectural investments (𝑥̅ = 2.62) (We track the business value from IT architectural investments), and chargeback (𝑥̅ = 2.76) (We use chargeback mechanisms to align the business and IT).
Insert Table 4 about here
4.3.
Process Maturity Levels by Domain
Table 5 presents the overall level of process maturity as well as the level of maturity for each of the
attributes of process maturity (Management Awareness, Policies and procedures, Tools, Staffing,
Responsibility and Goals). In the final column of Table 5 we show the rank for the maturity of the
process. The average level of process maturity is relatively low, with most processes having a mean maturity level between 2 and 2.5 on a scale from zero to 5. The processes with the highest levels of maturity are associated with security, including virus (Rank = 1, 𝑥̅ = 3.12) and network and firewall (Rank = 4, 𝑥̅ = 2.95). The other two components of systems security were ranked much lower, however, with security policy at #14 (𝑥̅ = 2.70) and user access at #11 (𝑥̅ = 2.81).
Insert Table 5 about here
There are clear differences by domain. In general, the Plan and Organize (PO) domain is of much lower maturity. With the exception of project management (PO10PJ) (Rank = 8, 𝑥̅ = 2.89) and budgeting (PO5B) (Rank = 3, 𝑥̅ = 3.00), none of the processes in this domain are in the top quartile of processes; most are in the lowest quartile. An even more distinct result applies to the Monitor and Evaluate (ME) domain, with all four processes being in the lowest quartile of maturity. The more prosaic processes that make up the Deliver and Support (DS) domain have relatively high levels of maturity, with the service desk (DS8) (rank #6), data management (DS11) (rank #5), and management of the physical environment (DS12) (rank #2) all being highly ranked.
4.4.
Data Reduction and Variable Definitions
Prior to multivariate data analysis, data reduction of the organizational data was necessary, given the relatively small sample available for extensive statistical testing. The following procedures were undertaken:
Business/IT alignment: The 16 questions in the business/IT alignment section of the CIO instrument that directly addressed alignment were answered in a highly consistent fashion (Cronbach’s α = 0.902). Principal factor analysis was performed, with the minimum eigenvalue being set at 1.0. This
resulted in two factors that, together, reflect 80% of the variation. The two factors are clearly distinct
(pairwise correlation = -0.006, p = 0.965). After analyzing the correlation of the two factors with each of
the individual questions, we designate the first factor as “Strategy” and the second as “Vision.”
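As a rough illustration of this reduction step, the sketch below computes Cronbach’s α for a block of questionnaire items and counts the factors retained under the minimum-eigenvalue rule; the response matrix is simulated, and eigenvalues of the correlation matrix stand in here for the principal factor extraction performed in our statistical package.

    import numpy as np

    def cronbach_alpha(items):
        # items: respondents x questions matrix of Likert responses.
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    rng = np.random.default_rng(0)
    # Simulated 51 x 16 matrix of alignment responses on a 1-5 Likert scale.
    latent = rng.normal(size=(51, 1))
    responses = np.clip(np.rint(3 + latent + rng.normal(scale=0.7, size=(51, 16))), 1, 5)

    alpha = cronbach_alpha(responses)
    eigenvalues = np.linalg.eigvalsh(np.corrcoef(responses, rowvar=False))
    n_factors = int((eigenvalues > 1.0).sum())  # minimum eigenvalue set at 1.0
    print(f"alpha = {alpha:.3f}, factors retained = {n_factors}")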
Board monitoring, risk management, environment, and size: Using identical reduction procedures, we
analyzed the questions relating to board monitoring and risk management. We grouped the five questions
on Board and Audit Committee monitoring with the questions on CIO membership of the organization’s
core team, influence of organizational corporate governance, and use of metrics for monitoring. Again,
the questions were answered consistently (Cronbach’s α = 0.878). A single factor resulted, reflecting 74%
of the variation. The same process was applied to the four questions in the risk management grouping (Cronbach’s α = 0.878; a single factor explained effectively all of the variation); to environment, where we factorized the three business/IT alignment questions on demand for services and the three questions on change in the architectural environment into a single factor that explained 74% of the variation (Cronbach’s α = 0.636); and to size, where we factorized the number of servers, clients, and personnel (a single factor explained effectively all of the variation).
Outsourcing: The level of outsourcing is measured by a simple count of the number of services
outsourced, weighted by importance.
4.5.
Multivariate Analyses
Model
As laid out above, the research questions are addressed in a series of regression analyses of the overall
level of maturity as well as the level of maturity by maturity attribute. We also analyze the influences on
the level of maturity by domain within the COBIT framework. The regression is of the form:
OVRL = β0 + β1CENTRAL + β2DECENTRAL + β3OUTSOURCE + β4STRATEGY + β5VISION + β6RISK + β7ENVIRON + β8SIZE + β9SWARE + β10HWARE + β11DEVT + ε      (1)
The dependent and independent variables are defined in Figure 3.
Insert Figure 3 about here
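A minimal sketch of how a regression of this form, with robust standard errors clustered by participating organization, might be estimated is given below; the variable names follow Figure 3, but the data file and the organization identifier ORG are hypothetical.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per process-level maturity observation, with the Figure 3
    # variables plus an organization identifier column 'ORG' (hypothetical file).
    df = pd.read_csv("maturity_observations.csv")

    model = smf.ols(
        "OVRL ~ CENTRAL + DECENTRAL + OUTSOURCE + STRATEGY + VISION + RISK"
        " + ENVIRON + SIZE + SWARE + HWARE + DEVT",
        data=df,
    )
    # Robust standard errors clustered by participant (organization).
    result = model.fit(cov_type="cluster", cov_kwds={"groups": df["ORG"]})
    print(result.summary())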
Correlations and Descriptive statistics
The pairwise correlations of both the dependent and independent variables are set out in Table 6. The
table shows that the primary business/IT alignment factor (“Strategy”) (STRATEGY) is strongly
associated with the overall level of maturity and each of the attributes, as well as each of the independent
variables. A number of other independent variables, notably size and software and hardware complexity, are correlated with the overall level of maturity and each of the attributes of maturity. The
summary descriptive statistics are shown in Table 7. The table shows that the distribution of the
independent variables as measured by level of skewness is acceptable, with the possible exception of
decentralization (DECENTRAL). As we note shortly, however, the VIF of the various regressions is
within acceptable levels.
Insert Table 6 and Table 7 about here
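The collinearity check referred to above can be computed along the following lines; this is a sketch only, reusing the hypothetical data frame from the regression example above.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    df = pd.read_csv("maturity_observations.csv")  # hypothetical file, as above
    predictors = ["CENTRAL", "DECENTRAL", "OUTSOURCE", "STRATEGY", "VISION",
                  "RISK", "ENVIRON", "SIZE", "SWARE", "HWARE", "DEVT"]
    X = sm.add_constant(df[predictors])

    # Variance inflation factor for each predictor (the constant is excluded).
    vifs = {name: variance_inflation_factor(X.values, i)
            for i, name in enumerate(X.columns) if name != "const"}
    print({name: round(v, 2) for name, v in vifs.items()})
    print("mean VIF:", round(float(np.mean(list(vifs.values()))), 2))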
Overall level of maturity
Table 8 sets out the multivariate OLS regression analysis for the overall level of maturity for all
processes, clustered by participant. The average VIF was 1.43 (max VIF= 1.81), which is within
acceptable bounds. We then break down the level of maturity by domain. What is striking in this analysis is the importance of business/IT alignment. The STRATEGY variable is associated with the overall level
of maturity for all domains (ALL) and for each of the Plan and Organize (PO), Acquire and Implement
(AI), Deliver and Support (DS), and Monitor and Evaluate (ME) domains. Also significant is the level of
development. Interestingly, the power of the domain-level regressions was highest for the Monitor and
Evaluate (ME) domain, with the STRATEGY variable being particularly influential. In essence, higher
levels of business/IT alignment are consistently associated with higher levels of process maturity. The
other consistently significant variable is the level of development of the relevant country (DEVT), with
the level of development also being associated with higher levels of process maturity. Other explanatory variables, such as size, were not correlated with maturity. We also saw this in the process of data
collection. Some of the medium sized IT functions that we visited were clearly very well organized with
consistently high levels of process maturity.
Insert Table 8 about here
Level of maturity by attribute
We collected data on each of the attributes of process maturity set out in the generic maturity model in
COBIT. The second set of multivariate analyses (Table 9) examined these different attributes. As set out above, there is a significant difference in the level of maturity by attribute. As shown in Table 7, the managerial awareness attribute has the highest level of process maturity (2.928), while goals and metrics has the lowest (2.218). Once again, the level of business/IT alignment (STRATEGY) is particularly significant in each of these regressions, as is the overall level of development (DEVT). In this
set of regressions, the environment in which IT is conducted was significant for managerial awareness
(AWARE) and policies and procedures (POLICIES).
Insert Table 9 about here
5.
Conclusion
In this study, we assess the relationship between key IT governance determinants and the level of process
maturity, as a key marker of IT capability. Using the 34 IT processes described in COBIT, together with the associated descriptions of maturity, we explored maturity levels across 51 organizations. Appendix 1 lists those processes, which we disaggregated into 41 processes. COBIT separates the processes into four domains. The 51 organizations used in this study were located in North America, Europe, and Asia, and ranged widely in size, from the smallest organization with five IT personnel to the largest with 690 IT personnel. They also spanned a wide variety of industries.
Some caution must be exercised in generalizing our findings because the 51 organizations were volunteers—not a random sample—and small in number, particularly when subdivided into various categories (e.g., different industries). With that caveat stated, our field study did find some interesting results that could be explored more deeply in future research. Even with the wide diversity of organizations in this study, there were similarities. Generally, the most mature IT processes were
security and virus detection and prevention (DS5V), manage physical environment (DS12), and IT
investment and budgeting (PO5B). The first two processes deal with daily activities where organizations
are constantly addressing virus, hacking, and physical security issues. Even the smallest organizations have annual budgeting processes, in which IT must prepare its own budget and compete for resources for the next year’s activities.
At the other end of the spectrum, the least mature IT processes were monitor IT performance (ME1),
architecture-classification (PO2D), and IT investment-value management (PO5V). It was particularly
interesting that PO5B (budgeting) was in the top three in terms of maturity and PO5V (value
management) was in the bottom three. This may reflect the fact that it is much easier to determine the IT
budgeted costs (e.g., labor, hardware, software, etc.) than it is to quantify the value or benefits of IT,
which can have both tangible and intangible components and may be realized beyond the one-year time
horizon of the IT budget.
On the other hand, there were significant differences across IT organizations related to different characteristics. As can be gleaned from Table 8, IT processes were more mature in more developed countries, in organizations with closer IT and business alignment, and in organizations with decentralized IT operations.
Interestingly, organization size did not seem to be a significant predictor of IT maturity. The clear
association of business/IT alignment with process maturity is an important finding of the study. The
relationship of alignment with maturity has not been widely addressed in prior research.
Based on our experiences with the CIOs and other managers who participated in our study, it is clear that COBIT’s 34 IT processes and descriptions of maturity attributes and maturity levels do provide a
comprehensive IT lifecycle view of IT organizations. As such, we believe this COBIT material could
provide a standard on which to build future IT process maturity research.
6.
References:
Agrawal, M., and K. Chari. 2007. Software Effort, Quality, and Cycle Time: A Study of CMM
Level 5 Projects. IEEE Transactions on Software Engineering 33 (3):145-156.
Ahern, D. M., A. Clouse, and R. Turner. 2004. CMMI distilled: A practical introduction to
integrated process improvement. 2nd ed. Boston: Addison-Wesley.
Al-Mashari, M., A. Al-Mudimigh, and M. Zairi. 2003. Enterprise resource planning: A
taxonomy of critical factors. European Journal of Operational Research 146:352–364.
Bahli, B., and S. Rivard. 2003. The information technology outsourcing risk: a transaction cost
and agency theory-based perspective. Journal of Information Technology 18 (3):211-221.
Bailey, C. A., ed. 2007. A Guide to Qualitative Field Research. 2nd ed. Thousand Oaks, CA:
Sage Publications, Inc.
Banker, R., G. Davis, and S. Slaughter. 1998. Software Development Practices, Software
Complexity and Software Maintenance Performance. Management Science 44 (4):433-450.
Berle, A. A., and G. C. Means. 1932. The modern corporation and private property. New York:
The Macmillan Company.
Bhatt, G., A. Emdad, N. Roberts, and V. Grover. 2010. Building and leveraging information in
dynamic environments: The role of IT infrastructure flexibility as enabler of
organizational responsiveness and competitive advantage. Information & Management 47
(7-8):341-349.
Blaskovich, J., and N. Mintchik. 2011. Information Technology Outsourcing: A Taxonomy of
Prior Studies and Directions for Future Research. Journal of Information Systems 25
(1):1–36.
Bowen, P. L., M.-Y. D. Cheung, and F. H. Rohde. 2007. Enhancing IT governance practices: A
model and case study of an organization's efforts. International Journal of Accounting
Information Systems 8 (3):191-221.
Bradley, J. 2008. Management based critical success factors in the implementation of Enterprise
Resource Planning systems. International Journal of Accounting Information Systems 9
(3):175-200.
Brown, A. E., and G. G. Grant. 2005. Framing the Frameworks: A Review Of IT Governance
Research. Communications of AIS 2005 (15):696-712.
Burgess, R., ed. 1990. Field Research: A Sourcebook and Field Manual. New York, NY:
Routledge.
Byrd, T. A., B. R. Lewis, and R. W. Bryan. 2006. The leveraging influence of strategic
alignment on IT investment: An empirical examination. Information & Management 43
(3):308-321.
Caputo, K. 1998. CMM implementation guide: Choreographing software process improvement.
Reading, Mass.: Addison-Wesley.
Chen, R.-S., C.-M. Sun, M. M. Helms, and W.-J. Jih. 2008. Aligning information technology and
business strategy with a dynamic capabilities perspective: A longitudinal study of a
Taiwanese Semiconductor Company. International Journal of Information Management
28 (5):366-378.
Choudhury, V., and R. Sabherwal. 2003. Portfolios of control in outsourced software
development projects. Information Systems Research 14 (3):291–314.
Chun, M., and J. Mooney. 2009. CIO roles and responsibilities: Twenty-five years of evolution
and change. Information & Management 46 (6):323-334.
Cragg, P., M. King, and H. Hussin. 2002. IT alignment and firm performance in small
manufacturing firms. The Journal of Strategic Information Systems 11 (2):109-132.
Crosby, P. B. 1980. Quality Is Free: The Art of Making Quality Certain. New York, NY: New
American Library.
Cullen, S., P. B. Seddon, and L. P. Willcocks. 2005. IT outsourcing configuration: Research into
defining and designing outsourcing arrangements. Journal of Strategic Information
Systems 14 (4):357-387.
Daily, C. M., D. R. Dalton, and A. A. C. Jr. 2003. Corporate Governance: Decades of Dialogue
and Data. The Academy of Management Review 28 (3):371-382.
Davenport, T., D. DeLong, and M. Beers. 1998. Successful knowledge management projects.
Sloan Management Review 39 (2):43-57.
Davenport, T. H. 1993. Process innovation, Reengineering Work through Information
Technology. Cambridge, Mass: Harvard Business School Press.
Davenport, T. H. 2000. Mission Critical: Realizing the Promise of Enterprise Systems.
Cambridge, Mass.: Harvard Business School.
De Haes, S., and W. Van Grembergen. 2008. Analysing the Relationship Between IT
Governance and Business/IT Alignment Maturity. Paper read at 41st Hawaii International
Conference on System Sciences, at Kailua-Kona, HI.
Deming, W. E. 1982. Quality, Productivity, and Competitive Position. Cambridge, MA: MIT
Center for Advanced Engineering Study.
Devaraj, S., and R. Kohli. 2000. Information technology payoff in the health-care industry: A
longitudinal study. Journal of Management Information Systems 16 (4):41-67.
Dymond, K. M. 1995. A guide to the CMM: Understanding the capability maturity model for
software. Annapolis, Md.: Process Inc US.
Feeny, D., and L. Willcocks. 1998. Core IS Capabilities for Exploiting Information Technology.
Sloan Management Review 39 (3):9-21.
Feeny, D. F., B. R. Edwards, and K. M. Simpson. 1992. Understanding the CEO/CIO
relationship. MIS Quarterly 16 (4):435-448.
Ferguson, C., F. Finn, and J. Hall. 2005. Electronic commerce investments, the resource-based
view of the firm, and firm market value. International Journal of Accounting Information
Systems 6 (1):5-29.
Fink, L., and S. Neumann. 2009. Exploring the perceived business value of the flexibility
enabled by information technology infrastructure. Information & Management 46 (2):90-99.
Garcia, S., and R. Turner. 2007. CMMI survival guide: Just enough process improvement. Upper
Saddle River, NJ: Addison-Wesley.
Gottschalk, P. 2006. Research propositions for knowledge management systems supporting it
outsourcing relationships. Journal of Computer Information Systems 46 (3):110-116.
Gottschalk, P., and H. Solli-Saether. 2006. Maturity model for IT outsourcing relationships.
Industrial Management & Data Systems 106 (1-2):200-212.
Grabski, S. V., S. A. Leech, and P. J. Schmidt. 2011. A Review of ERP Research: A Future
Agenda for Accounting Information Systems Journal of Information Systems 25 (1).
Han, H.-S., J.-N. Lee, and Y.-W. Seo. 2008. Analyzing the impact of a firm's capability on
outsourcing success: A process perspective. Information & Management 45 (1):31-42.
Harter, D. E., M. S. Krishnan, and S. A. Slaughter. 2000. Effects of process maturity on quality,
cycle time, and effort in software product development. Management Science 46 (4):451-466.
Hendricks, K. B., and V. R. Singhal. 1996. Quality Awards and the Market Value of the Firm:
An Empirical Investigation. Management Science 42 (3):415-436.
Herath, T., H. Herath, and W. G. Bremser. 2010. Balanced Scorecard Implementation of Security
Strategies: A Framework for IT Security Performance Management. Information Systems
Management 27 (1):72-81.
Hofer, C. 1975. Towards a Contingency Theory of Business Strategy. Academy of Management
Review 18 (4):784-810.
Hu, Q., and C. D. Huang. 2006. Using the balanced scorecard to achieve sustained it-business
alignment: A case study. Communications of AIS (17):2-45.
Humphrey, W. S. 1989. Managing the Software Process. New York: Addison-Wesley
Professional.
Iacovou, C. L., and R. T. Nakatsu. 2008. A risk profile of offshore-outsourced development
projects. Communications of the ACM 51 (6):89-94.
ISO/IEC. 2008. ISO/IEC 38500 Corporate governance of information technology. Geneva:
International Organization for Standardization/International Electrotechnical
Commission.
ITGI. 2003. Board Briefing on IT Governance. 2nd ed. Rolling Meadows, IL: IT Governance
Institute.
———. 2007a. Control Objectives for Information and Related Technologies (CobiT) 4.1.
Rolling Meadows, Il: IT Governance Institute.
———. 2007b. IT Governance using CobiT and ValIT. 2nd ed. Rolling Meadows, IL: IT
Governance Institute.
Jiang, J. J., G. Klein, H.-G. Hwang, J. Huang, and S.-Y. Hung. 2004. An exploration of the
relationship between software development process maturity and project performance.
Information and Management 41:279-288.
Jordan, E., and B. Tricker. 1995. Information strategy: alignment with organization structure.
The Journal of Strategic Information Systems 4 (4):357-382.
Juran, M., and F. M. Gryna. 1999. Juran's Quality Handbook. 5th ed. New York: McGraw-Hill.
Kaplan, R. S., and D. P. Norton. 1992. The Balanced Scorecard--Measures that drive
performance. Harvard Business Review 70 (1):71-79.
———. 1996. The balanced scorecard: Translating strategy into action. Boston: Harvard Business
School Press.
Kasse, T. 2004. Practical insight into CMMI. Boston, MA: Artech House.
Kauffman, R. J., and J. Y. Tsai. 2010. With or without you: The countervailing forces and effects
of process standardization. Electronic Commerce Research and Applications 9 (4):305-322.
Lacity, M. C., S. A. Khan, and L. P. Willcocks. 2009. A review of the IT outsourcing literature:
Insights for practice. The Journal of Strategic Information Systems 18 (3):130-146.
Langfield-Smith, K., and D. Smith. 2003. Management control systems and trust in outsourcing
relationships. Management Accounting Research 10 (2):79-96.
Li, S., J. Shang, and S. A. Slaughter. 2010. Why Do Software Firms Fail? Capabilities,
Competitive Actions, and Firm Survival in the Software Industry from 1995 to 2007.
Information Systems Research 21 (3):631-654.
Luftman, J. N. 1996. Competing in the Information Age: Strategic Alignment in Practice.
Oxford: Oxford University Press.
Luftman, J. N., R. Papp, and T. Brier. 1999. Enablers and inhibitors of Business-IT alignment.
Communications of the Association for Information Systems 1:Article 11.
Mahoney, J. T., and J. R. Pandian. 1992. The Resource-Based View Within the Conversation of
Strategic Management. Strategic Management Journal 13 (5):363-380.
Makadok, R. 2001. Toward a Synthesis of the Resource-Based View and Dynamic-Capability
Views of Rent Creation. Strategic Management Journal 22 (5):387-401.
Masli, A., V. J. Richardson, J. M. Sanchez, and R. E. Smith. 2011. The Business Value of IT: A
Synthesis and Framework of Archival Research. Journal of Information Systems 25
(2):RTBD.
McDonald, M. P. 2007. The Enterprise Capability Organization: A Future for IT. MIS Quarterly
Executive 6 (3):179-192.
McGarry, F., and B. Decker. 2002. Attaining Level 5 in CMM process maturity. IEEE Software
19 (6):87-96.
Mithas, S., N. Ramasubbu, and V. Sambamurthy. 2011. How information management
capability influences firm performance. MIS Quarterly 35 (1):137-A115.
Nevo, S., M. Wade, and W. Cook. 2007. An examination of the trade-off between internal and
external IT capabilities. The Journal of Strategic Information Systems 16 (1):5-23.
Niazi, M., D. Wilson, and D. Zowghi. 2005. A framework for assisting the design of effective
software process improvement implementation strategies. Journal of Systems and
Software 78 (2):204-222.
Paulk, M. C., C. Weber, B. Curtis, and M. B. Chrissis. 1995. The capability maturity model:
Guidelines for improving the software process. Reading, Mass.: Addison-Wesley Pub.
Co.
Poppo, L., and T. Zenger. 2002. Do formal contracts and relational governance function as
substitutes or complements? Strategic Management Journal 23:707-725.
Powell, P. 1993. Causality in the alignment of information technology and business strategy. The
Journal of Strategic Information Systems 2 (4):320-334.
Prasad, A., J. Heales, and P. Green. 2010. A capabilities-based approach to obtaining a deeper
understanding of information technology governance effectiveness: Evidence from IT
steering committees. International Journal of Accounting Information Systems 11
(3):214-232.
Raghupathi, W. 2007. Corporate governance of IT: A framework for development.
Communications of the ACM 50 (8):94 - 99.
Ray, G., W. A. Muhanna, and J. B. Barney. 2005. Information Technology and the Performance
of the Customer Service Process: A Resource-Based Analysis. MIS Quarterly 29 (4):625-652.
Raynus, J. 1999. Software process improvement with CMM. Boston: Artech House.
Ross, J. W., C. M. Beath, and D. L. Goodhue. 1996. Develop long-term competitiveness through
IT assets. Sloan Management Review 38 (1):31-42.
Ross, J. W., P. Weill, and D. Robertson. 2006. Enterprise Architecture As Strategy: Creating a
Foundation for Business Execution. Boston, Mass.: Harvard Business School Press.
Seggie, S. H., D. Kim, and S. T. Cavusgil. 2006. Do supply chain IT alignment and supply chain
interfirm system integration impact upon brand equity and firm performance? Journal of
Business Research 59 (8):887-895.
Sen, A., A. P. Sinha, and K. Ramamurthy. 2006. Data Warehousing Process Maturity: An
Exploratory Study of Factors Influencing User Perceptions. IEEE Transactions on
Engineering Management 53 (3):440-455.
Shleifer, A., and R. W. Vishny. 1997. A Survey of Corporate Governance. Journal of Finance 52
(2):737-783.
Simonsson, M., P. Johnson, and M. Ekstedt. 2010. The Effect of IT Governance Maturity on IT
Governance Performance. Information Systems Management 27 (1):10 – 24.
Staples, M., and M. Niazi. 2008. Systematic review of organizational motivations for adopting
CMM-based SPI. Information and Software Technology 50 (7-8):605-620.
Stephens, C. S., W. N. Ledbetter, A. Mitra, and F. N. Ford. 1992. Executive or functional
manager? The nature of the CIO's job. MIS Quarterly 16 (4):449-467.
Straub, D., P. Weill, and K. Schwaig. 2008. Strategic dependence on the IT resource and
outsourcing: a test of the strategic control model. Information Systems Frontiers 10
(2):195–211.
Subramanian, G. H., J. J. Jiang, and G. Klein. 2007. Software quality and IS project performance
improvements from software development process maturity and IS implementation
strategies. Journal of Systems and Software 80 (4):616-627.
Sundaramurthy, C., and M. Lewis. 2003. Control and Collaboration: Paradoxes of Governance.
The Academy of Management Review 28 (3):397-415.
Sutton, S. G., D. Khazanchi, C. Hampton, and V. Arnold. 2008. Risk Analysis in Extended
Enterprise Environments: Identification of Critical Risk Factors in B2B E-Commerce
Relationships. Journal of the Association for Information Systems 9 (4):151-174.
Tarafdar, M., and S. Gordon. 2007. Understanding the influence of information systems
competencies on process innovation: A resource-based view. The Journal of Strategic
Information Systems 16 (4):353-392.
Teece, D. J. 2009. Dynamic Capabilities and Strategic Management: Organizing for Innovation
and Growth. Oxford: Oxford University Press.
Teece, D. J., G. Pisano, and A. Shuen. 1997. Dynamic Capabilities and Strategic Management.
Strategic Management Journal 18 (7):509-533.
Teo, T. S. H., and J. S. K. Ang. 1999. Critical success factors in the alignment of IS plans with
business plans. International Journal of Information Management 19 (2):173-185.
Trites, G. 2004. Director responsibility for IT governance. International Journal of Accounting
Information Systems 5 (2):89-99.
Van Grembergen, W., and S. De Haes. 2008. Implementing information technology governance:
Models, practices, and cases. Hershey, PA: IGI Pub.
———. 2009. Enterprise Governance of Information Technology: Achieving Strategic
Alignment and Value. New York, NY: Springer.
Velcu, O. 2010. Strategic alignment of ERP implementation stages: An empirical investigation.
Information & Management 47 (3):158-166.
Venkatraman, N., J. C. Henderson, and S. Oldach. 1993. Continuous strategic alignment:
Exploiting information technology capabilities for competitive success. European
Management Journal 11 (2):139-149.
von Wangenheim, C. G., J. C. R. Hauck, A. Zoucas, C. F. Salviano, F. McCaffery, and F. Shull.
2010. Creating Software Process Capability/Maturity Models. IEEE Software 27 (4):92-94.
Weill, P., and J. W. Ross. 2004. IT governance: How top performers manage IT decision rights
for superior results. Boston: Harvard Business School Press.
Wernerfelt, B. 1984. A Resource-Based View of the Firm. Strategic Management Journal 5
(2):171-180.
West, M. 2004. Real process improvement using the CMMI. Boca Raton, Fla.: Auerbach
Publications.
Wilkin, C. L., and R. H. Chenhall. 2010. A Review of IT Governance: A Taxonomy to Inform
Accounting Information Systems. Journal of Information Systems 24 (2):107.
Willcocks, L. P., and R. Sykes. 2000. The role of the CIO and IT Function in ERP.
Communications of the ACM 43 (4):32-38.
Woodward, J. 1980. Industrial Organization: Behavior and Control. 2nd. ed. Oxford: Oxford
University Press.
Wu, F., S. Yeniyurt, D. Kim, and S. Cavusgil. 2006. The impact of information technology on
supply chain capabilities and firm performance: A resource-based view. Industrial
Marketing Management 35 (4):493-504.
Xue, Y., H. Liang, and W. R. Boulton. 2008. Information Technology Governance in
Information Technology Investment Decision Processes: The Impact of Investment
Characteristics, External Environment, and Internal Context. MIS Quarterly 32 (1):67-96.
Figure 1: Theoretical Structure

Figure 2: Patterns of Governance, Processes and Delivery
Variable       Description
Dependent Variables
  OVRL         Overall process maturity
  AWARE        Maturity of Management Awareness attribute
  POLICIES     Maturity of Policies and procedures attribute
  TOOLS        Maturity of Tools attribute
  SKILLS       Maturity of Staffing attribute
  RESP         Maturity of Responsibility attribute
  GOALS        Maturity of Goals and metrics attribute
Independent Variables
  CENTRAL      Centralized IT organizations
  DECENTRAL    Decentralized IT organizations
  OUTSOURCE    Level of outsourcing
  STRATEGY     Business/IT alignment – “strategy”
  VISION       Business/IT alignment – “vision”
  RISK         Management of risk
  ENVIRON      IT environment
  SIZE         Size: factorized from personnel, servers and clients
  SWARE        Software complexity
  HWARE        Hardware complexity
  DEVT         Level of development of country
Figure 3: Variable Definitions
Table 1: Descriptive Statistics - Organizations

Panel A: Location and Industry

Location         Freq.   Percent
Europe             13      25%
Canada              3       6%
Mexico              4       8%
Philippines         8      16%
Singapore           4       8%
United States      18      35%
Total              51     100%

Industry                             Freq.   Percent
Manufacturing/Capital Intensive        11      22%
Utilities                               9      18%
Services                                9      18%
Financial Services                     14      27%
Government & NFP                        8      16%
Total                                  51     100%

Panel B: Size and Complexity

Variable                         Mean      Median    Std Dev    Min      Max
Application Software
 - Count                           9.8       10.0       2.1       5       15
 - Weighted by complexity         19.6       19.0       5.9      10       33
Hardware Types
 - Count                           2.5        3.0       0.7       1        4
 - Weighted by complexity          4.9        5.0       1.0       3        7
Size
 - # IT Personnel                172.3      120.0     161.6       5      690
 - # Servers                     193.8      163.3     196.0      10    1,100
 - # Clients                   3,156.6    2,050.0   3,056.0     150   15,000

Panel C: Environment

Variable                                    Mean   Median   Std Dev   Min   Max
Information and data architecture
 - Explicitly defined                       3.38     4.00      1.03     1     5
 - Stable and relatively unchanging         3.02     3.00      1.15     1     5
Technical architecture
 - Explicitly defined                       3.66     4.00      1.02     1     5
 - Stable and relatively unchanging         3.26     3.00      1.08     1     5
Technical infrastructure
 - Stable and relatively unchanging         3.26     3.50      1.08     1     5
Application software
 - Stable and relatively unchanging         3.00     3.00      1.12     1     5
Table 2: Governance

Panel A: Decision Rights

Governance over IT                        Freq.   Percent
Highly Centralized
 - Management primarily in control          11       22%
 - IT primarily in control                  27       53%
 Subtotal                                   38
Highly Decentralized
 - Management primarily in control           1        2%
 - IT primarily in control                   2        4%
 Subtotal                                    3
Federal                                     10       20%
Total                                       51      100%

Panel B: Committees

Board Level Committees:
 IT Strategy/Investment                22%
 IT Risk/Security                      24%
Organizational Committees or Boards:
 IT Strategy/Investment/IT Council     62%
 IT Security                           54%
 IT Project Management                 44%
 IT Governance                         32%
 Risk management                       26%
 IT Architecture                       46%
 Compliance (e.g. SOX, HIPAA)          32%

Panel C: Monitoring

Variable                                            Mean   Std Dev
Board monitors:
 - Strategic directions for IT                      3.14     1.29
 - Tactical directions for IT                       2.54     1.18
 - IT governance                                    2.70     1.28
Audit Committee monitors:
 - Strategic directions for IT                      2.80     1.29
 - Tactical directions for IT                       2.46     1.18
 - IT governance                                    2.90     1.36
IT is:
 - Strongly influenced by compliance environment    3.32     1.46
 - Works closely with Internal Audit                3.59     1.35
 - Performance has clearly understood metrics       2.94     1.32
 - Influenced by Corporate Governance standards     3.26     1.52
CIO member of core management team                  3.82     1.30
Table 3: Governance and IT Frameworks

Variable                            Mean   Std Dev
Corporate Governance
 COSO                               0.62     0.97
 COSO-ERM                           0.24     0.62
 Sentencing Guidelines              0.14     0.61
 Blue Ribbon Committee              0.00     0.00
 Turnbull Guidelines                0.04     0.20
 King Guidelines                    0.00     0.00
 OECD Code                          0.12     0.44
 Governance-Max                     0.66     0.98
Quality Management
 ISO 9000/ISO 9001                  1.00     1.14
Industry and Compliance
 SOX                                1.02     1.35
 Basel II                           0.32     0.71
 HIPAA                              0.46     1.01
 FDIC                               0.12     0.59
 FCPA                               0.16     0.65
 PCI                                0.50     1.05
 ITSEC                              0.04     0.20
 TickiT                             0.06     0.31
 Industry and Compliance-Max        1.42     1.36
IT and Related Frameworks
 ITGI SOX Control Objectives        0.56     0.99
 CobiT                              1.60     0.88
 ITIL/ISO 20000                     1.52     0.86
 ISO 17799/27001                    1.12     1
 NIST 800                           0.22     0.65
 CMM/CMMI                           0.66     0.80
 Prince 2+PMBOK                     0.96     0.99
 IT - Max                           2.10     0.76
Frameworks in use                   4.59     3.47
Table 4: Business/IT Alignment
Variable
Mean
StdDev
Business involved in strategy
3.68
1.30
Business involved in tactics
3.48
Business involved
in IT Architecture Exceptions
Business involve IT in
process improvements
Business involve IT in
product & service
Business involved in decisions on
investment in IT projects
Business involved in decisions on
investment in IT architecture
Track business value from
IT project investments
Track business value from
IT architectural investments
Staff with business-facing resp.
Communicate IT project status
to the business units
Undertake post-project
implementation reviews
Variable
Mean
StdDev
Chargeback mechanisms
2.76
1.49
1.25
Service Level
3.00
1.32
3.34
1.21
IT & business share common vision
3.34
1.17
3.72
1.09
Business highly knowledgeable
2.64
1.08
3.26
1.21
3.74
0.88
3.64
1.27
3.18
1.22
2.74
1.27
IT has infrastructural capacity
to meet business requirements
IT has HR capacity to meet
business requirements
Demand on IT is intensive
3.90
1.20
3.04
1.29
Demand on IT is extensive
4.00
1.16
2.62
1.26
2.86
1.39
3.88
1.30
Business requests for
services are prioritized
IT core element corporate strategy
3.58
1.30
4.22
0.84
3.45
1.23
3.36
1.31
IT's primary responsibility
is to provide foundation technologies
Primary responsibility for
identifying IT value is with IT
3.30
1.09
Table 5: Process Maturity by Process – Overall and by Attribute§
Process
Process Name
PO Plan and Organize Domain
PO1
Define a Strategic IT Plan
PO2A
Architecture-Architecture
PO2D
Architecture-Classification
PO3
Technological Direction
PO4P
IT Processes-Processes
PO4O
IT Processes-Organization
PO5B
IT Investment-Budgeting
PO5V
IT Investment-Value Mgt
PO6
Management Aims & Direction
PO7
Manage IT Human Resources
PO8
Manage Quality
PO9
Assess & Manage IT Risks
PO10PG
Manage Projects-Program
PO10PJ
Manage Projects-Projects
AI Acquire and Implement Domain
AI1
Identify Automated Solutions
AI2
Application Software
AI3
Technology Infrastructure
AI4
Enable Operation & Use
AI5
Procure IT Resources
AI6
Manage Changes
AI7
Install Solutions Changes
Stats
O
A
P
T
S
R
G
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
2.37
0.80
2.21
0.98
2.07
0.97
2.35
0.95
2.34
1.02
2.49
0.95
3.00
0.96
1.93
1.03
2.37
1.02
2.78
0.90
2.17
0.98
2.18
0.95
2.42
0.92
2.89
0.83
2.89
0.96
2.71
1.16
2.34
1.14
2.88
1.02
2.87
1.06
2.95
0.98
3.41
1.02
2.26
1.16
2.77
1.17
3.10
0.95
2.40
1.11
2.62
1.05
2.71
1.15
3.28
0.92
2.26
0.90
2.17
1.11
2.05
1.12
2.39
1.06
2.42
1.16
2.55
1.12
3.03
1.03
1.93
1.10
2.50
1.12
2.98
1.01
2.26
1.19
2.18
1.00
2.35
1.08
3.06
1.00
1.72
0.98
1.95
1.11
2.00
1.25
1.68
1.15
1.70
1.38
1.93
1.23
2.81
1.15
1.49
1.13
2.10
1.21
2.49
1.20
1.87
1.06
1.75
1.00
2.10
1.16
2.63
1.05
2.29
0.99
2.30
1.03
2.16
0.94
2.51
0.99
2.29
0.98
2.48
0.92
2.86
1.08
1.89
1.08
2.32
0.97
2.67
0.94
2.25
0.98
2.21
1.07
2.28
0.86
2.72
0.83
2.69
0.97
2.39
1.25
2.21
1.06
2.53
1.17
2.65
1.14
2.81
1.06
3.15
1.15
2.09
1.18
2.54
1.18
3.01
0.96
2.35
1.05
2.38
1.05
2.66
1.02
3.03
0.96
2.36
0.95
1.74
1.18
1.62
1.15
2.04
1.06
2.12
1.13
2.17
1.11
2.73
1.16
1.94
1.12
1.94
1.07
2.39
1.06
1.88
1.02
1.91
1.06
2.39
0.96
2.58
0.84
27
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
2.52
0.91
2.91
0.85
2.71
0.82
2.50
0.94
2.83
0.82
2.64
0.80
2.69
0.80
2.95
0.92
3.29
0.89
3.15
0.95
2.87
1.02
3.34
0.88
3.08
0.80
3.12
0.76
2.67
1.15
3.04
1.00
2.79
0.99
2.60
1.06
3.10
0.94
2.92
1.03
2.81
0.93
2.08
1.01
2.52
1.03
2.25
0.96
2.15
1.09
2.52
1.23
2.50
1.08
2.39
0.94
2.45
0.96
2.87
0.86
2.60
0.86
2.51
0.90
2.64
0.87
2.46
0.77
2.65
0.92
2.68
1.00
3.20
0.87
2.98
0.94
2.71
1.17
3.07
0.89
2.73
0.89
2.92
0.83
2.23
1.06
2.59
1.02
2.37
1.00
2.12
1.02
2.24
1.00
2.04
0.98
2.20
1.03
21
45
Rank
35
40
30
31
25
3
41
27
12
37
36
26
8
7
13
24
10
16
15
Process
Process Name
DS Deliver and Support Domain
DS1
Manage Service Levels
DS2
Manage Third-party Services
DS3
DS4
Manage Performance &
Capacity
Ensure Continuous Service
DS5P
Security-Policy
DS5U
Security-User access
DS5NF
Security-Network & Firewall
DS5V
Security-Virus
DS6
Identify & Allocate Costs
DS7
Educate & Train Users
DS8
Service Desk & Incidents
DS9
Manage Configuration
DS10
Manage Problems
DS11
Manage Data
DS12
Manage Physical Environment
DS13
Manage Operations
O
A
P
T
S
R
G
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
Mean
Std Dev
2.17
0.87
2.57
0.92
2.51
0.82
2.57
1.06
2.70
0.88
2.81
0.86
2.95
0.95
3.12
1.03
2.59
1.21
2.51
1.04
2.92
0.91
2.36
0.95
2.56
0.97
2.94
0.90
3.10
0.97
2.86
0.80
2.54
0.98
2.97
0.95
2.82
0.91
2.94
1.13
2.97
1.05
3.14
0.96
3.27
0.94
3.37
0.97
2.94
1.40
2.66
1.14
3.22
0.95
2.60
1.03
2.80
0.93
3.16
0.94
3.53
0.90
3.19
0.87
2.13
0.99
2.65
1.05
2.29
0.90
2.59
1.18
2.86
1.09
2.88
1.00
2.93
1.11
3.15
1.09
2.54
1.42
2.55
1.18
2.76
0.95
2.34
1.00
2.52
1.05
3.04
1.03
3.14
1.07
2.91
0.85
1.88
1.15
2.12
1.23
2.40
0.91
2.38
1.21
2.30
1.12
2.63
1.01
3.00
1.14
3.40
1.14
2.43
1.39
2.24
1.16
2.98
1.05
2.25
1.09
2.50
1.14
3.04
1.08
3.06
1.22
2.72
1.07
2.15
0.91
2.56
0.95
2.52
0.87
2.52
1.08
2.68
0.92
2.72
0.97
2.97
0.96
2.98
1.15
2.59
1.21
2.60
1.09
2.79
0.97
2.43
0.95
2.54
1.01
2.77
0.94
3.00
1.10
2.67
0.86
2.26
1.07
2.88
1.00
2.78
0.96
2.69
1.16
3.03
0.89
3.07
0.99
3.06
1.06
3.20
1.19
2.84
1.27
2.70
1.15
3.09
1.03
2.64
1.13
2.78
0.97
3.11
0.95
3.33
1.00
3.16
0.82
1.94
1.03
2.24
1.06
2.20
1.04
2.29
1.14
2.32
1.03
2.40
1.06
2.48
1.14
2.59
1.30
2.09
1.26
2.26
1.07
2.65
1.12
1.88
1.18
2.18
1.18
2.42
0.91
2.50
1.20
2.46
0.97
Rank
37
18
22
18
14
11
4
1
17
22
6
29
20
5
2
9
ME Monitor and Evaluate Domain
ME1
Monitor IT Performance
39
ME2
33
ME3
ME4
§
Stats
Mean
2.16 2.59 2.12 1.73 2.10 2.38 2.00
Std Dev
0.97 1.04 1.10 0.98 1.02 1.16 1.27
Monitor Internal Control
Mean
2.28 2.61 2.42 1.96 2.17 2.47 2.01
Std Dev
1.10 1.18 1.35 1.14 1.16 1.12 1.26
Regulatory Compliance
Mean
2.26 2.65 2.36 1.78 2.10 2.50 2.16
Std Dev
1.14 1.15 1.27 1.22 1.16 1.30 1.26
IT Governance
Mean
2.29 2.61 2.42 1.77 2.27 2.63 2.04
Std Dev
1.05 1.14 1.13 1.10 1.18 1.10 1.22
Key to Maturity Attributes: O=Overall maturity level, A=Management Awareness P=Policies T=Tools
S=Staffing R=Responsibility G=Goals
46
34
32
Table 6: Pairwise Correlation – Dependent and Independent Variables
OVRL
AWARE
1
1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
0.894
0.000
POLICIES
2
TOOLS
3
SKILLS
4
RESP
GOALS
CENTRAL
5
6
7
DECENTRAL
8
OUTSOURCE
9
STRATEGY
VISION
RISK
ENVIRON
SIZE
SWARE
10
11
12
13
14
15
HWARE
16
DEVT
17
0.918
0.821
0.000
0.000
0.875
0.714
0.769
0.000
0.000
0.000
0.891
0.760
0.779
0.734
0.000
0.000
0.000
0.000
0.918
0.801
0.819
0.743
0.796
0.000
0.000
0.000
0.000
0.000
0.877
0.719
0.752
0.719
0.746
0.793
0.000
0.000
0.000
0.000
0.000
0.000
0.081
0.087
0.061
0.029
0.078
0.062
0.047
0.000
0.000
0.008
0.205
0.001
0.007
0.043
0.059
0.065
0.106
0.098
0.050
0.032
0.075
-0.419
0.007
0.005
0.000
0.000
0.030
0.169
0.002
0.000
0.046
0.033
0.069
-0.014
-0.001
0.063
-0.010
0.072
0.033
0.149
0.003
0.551
0.968
0.006
0.681
0.001
0.002
0.376
0.294
0.370
0.252
0.333
0.354
0.395
0.184
-0.140
0.297
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
-0.096
-0.138
-0.111
-0.094
-0.121
-0.141
-0.099
-0.223
-0.016
-0.070
-0.049
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.453
0.001
0.026
0.210
0.107
0.191
0.114
0.202
0.137
0.173
0.195
-0.090
0.125
0.531
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.054
0.151
0.082
0.084
0.131
0.132
0.097
0.222
-0.184
-0.296
-0.197
-0.339
-0.096
0.013
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.178
0.163
0.183
0.133
0.164
0.152
0.181
0.007
-0.082
0.217
0.226
0.020
0.060
-0.031
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.759
0.000
0.000
0.000
0.372
0.006
0.162
0.142
0.141
0.141
0.150
0.124
0.133
0.148
0.171
0.005
-0.221
0.067
-0.068
0.074
0.031
0.177
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.804
0.000
0.002
0.002
0.001
0.151
0.000
0.177
0.173
0.153
0.182
0.179
0.138
0.164
-0.103
0.030
-0.188
0.061
0.197
0.190
-0.068
0.345
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.168
0.000
0.005
0.000
0.000
0.002
0.000
0.000
0.198
0.222
0.190
0.295
0.166
0.191
0.201
-0.123
0.147
-0.126
-0.318
0.054
-0.122
0.084
-0.030
-0.008
0.172
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.000
0.013
0.000
0.000
0.171
0.710
0.000
47
-0.069
0.161
0.231
Table 7: Descriptive Statistics – Variables in Multivariate Analysis

Variable        Mean      Median    Std Dev   Skewness
Dependent Variables
 OVRL           2.517      2.500      0.967      0.126
 AWARE          2.928      3.000      1.045     -0.232
 POLICIES       2.609      2.500      1.103      0.079
 TOOLS          2.283      2.000      1.184      0.254
 SKILLS         2.512      2.500      1.000      0.229
 RESP           2.772      3.000      1.078     -0.028
 GOALS          2.218      2.000      1.103      0.269
Independent Variables
 CENTRAL        0.721      1.000      0.449     -0.983
 DECENTRAL      0.064      0.000      0.244      3.573
 OUTSOURCE     12.889     12.000     11.709      1.138
 STRATEGY       0.052      0.203      0.959     -0.182
 VISION         0.025      0.024      0.871     -0.257
 RISK           0.009     -0.065      0.968      0.100
 ENVIRON       -0.040      0.090      0.855      0.515
 SIZE           0.041     -0.263      0.876      1.675
 SWARE         19.924     19.000      5.796      0.426
 HWARE          2.572      3.000      0.733     -0.041
 DEVT           0.769      1.000      0.421     -1.279
Table 8: Regression Analysis – Overall and by Domain

VARIABLES       ALL          PO           AI           DS           ME
CENTRAL         0.091        0.041        0.227        0.131       -0.191
               (0.144)      (0.159)      (0.160)      (0.159)      (0.217)
DECENTRAL       0.486**      0.369        0.587**      0.527**      0.697**
               (0.205)      (0.300)      (0.235)      (0.203)      (0.297)
OUTSOURCE       0.000       -0.005        0.003        0.006       -0.006
               (0.007)      (0.008)      (0.007)      (0.007)      (0.009)
STRATEGY        0.508***     0.474***     0.476***     0.493***     0.834***
               (0.088)      (0.110)      (0.089)      (0.079)      (0.152)
VISION         -0.048       -0.078       -0.036       -0.063        0.184
               (0.081)      (0.102)      (0.088)      (0.079)      (0.122)
RISK           -0.015       -0.011       -0.034       -0.003       -0.021
               (0.082)      (0.104)      (0.089)      (0.077)      (0.134)
ENVIRON         0.149*       0.130        0.119        0.208**      0.096
               (0.087)      (0.105)      (0.097)      (0.090)      (0.109)
SIZE            0.056        0.081        0.022        0.074       -0.016
               (0.120)      (0.161)      (0.104)      (0.106)      (0.107)
SWARE           0.011        0.006        0.005        0.020        0.010
               (0.014)      (0.020)      (0.012)      (0.013)      (0.018)
HWARE           0.106        0.009        0.175        0.154        0.120
               (0.092)      (0.108)      (0.107)      (0.096)      (0.141)
DEVT            0.740***     0.838***     0.630***     0.703***     0.768***
               (0.160)      (0.198)      (0.156)      (0.143)      (0.234)
Constant        1.348***     1.622***     1.354***     1.104**      1.158**
               (0.426)      (0.553)      (0.424)      (0.413)      (0.532)
Observations    2095         815          324          780          176
R-squared       0.304        0.287        0.359        0.349        0.454
F test:         6.472        3.346        6.210       10.390        6.600
Robust standard errors in parentheses
*** p<0.01, ** p<0.05, * p<0.1
Table 9: Regression Analysis by Attribute

VARIABLES      AWARE       POLICIES    TOOLS       SKILLS      RESP        GOALS
CENTRAL        0.118       0.057       0.014       0.035      -0.052      -0.011
              (0.172)     (0.202)     (0.179)     (0.181)     (0.177)     (0.16)
DECENTRAL      0.526**     0.683**     0.485*      0.36        0.23        0.447
              (0.23)      (0.275)     (0.27)      (0.335)     (0.308)     (0.271)
OUTSOURCE      0.003       0.002      -0.003      -0.004       0.003      -0.008
              (0.009)     (0.008)     (0.007)     (0.007)     (0.009)     (0.009)
STRATEGY       0.468***    0.540***    0.504***    0.437***    0.556***    0.641***
              (0.105)     (0.097)     (0.086)     (0.096)     (0.097)     (0.101)
VISION        -0.049      -0.049      -0.074      -0.082      -0.081      -0.043
              (0.096)     (0.102)     (0.097)     (0.106)     (0.1)       (0.107)
RISK          -0.11       -0.022      -0.07        0.019      -0.088      -0.07
              (0.083)     (0.085)     (0.088)     (0.096)     (0.091)     (0.084)
ENVIRON        0.240**     0.170*      0.143       0.168       0.216*      0.181
              (0.101)     (0.097)     (0.103)     (0.112)     (0.108)     (0.123)
SIZE           0.028       0.071       0.03        0.054       0.006       0.07
              (0.112)     (0.118)     (0.112)     (0.143)     (0.118)     (0.136)
SWARE          0.016       0.017       0.019       0.009       0.018       0.01
              (0.014)     (0.015)     (0.014)     (0.016)     (0.015)     (0.017)
HWARE          0.182*      0.108       0.148       0.144       0.128       0.103
              (0.106)     (0.113)     (0.103)     (0.104)     (0.106)     (0.1)
DEVT           0.692***    0.702***    0.978***    0.554***    0.704***    0.752***
              (0.162)     (0.183)     (0.156)     (0.173)     (0.18)      (0.204)
Constant       1.420***    1.280**     0.73        1.501***    1.487***    1.203**
              (0.48)      (0.513)     (0.47)      (0.471)     (0.5)       (0.51)
Observations   1896        1893        1883        1889        1885        1824
R-squared      0.261       0.282       0.258       0.242       0.269       0.315
F test:        8.543       7.67       10.37        4.198       6.773       9.553
Robust standard errors in parentheses
*** p<0.01, ** p<0.05, * p<0.1
Appendix 1: Business Processes in COBIT- Modified
Plan and Organize (PO) Domain
PO1-Define a Strategic IT Plan
PO2A-Architecture-Architecture
PO2D-Architecture-Classification
PO3-Technological Direction
PO4P-IT Processes-Processes
PO4O-IT Processes-Organisation
PO5B-IT Investment-Budgeting
PO5V-IT Investment-Value Management
PO6-Management Aims & Direction
PO7-Manage IT Human Resources
PO8-Manage Quality
PO9-Assess & Manage IT Risks
PO10PG-Manage Projects-Program
PO10PJ-Manage Projects-Projects
Acquire and Implement (AI) Domain
AI1-Identify Automated Solutions
AI2-Application Software
AI3-Technology Infrastructure
AI4-Enable Operation & Use
AI5-Procure IT Resources
AI6-Manage Changes
AI7-Install Solutions Changes
Deliver and Support (DS) Domain
DS1-Manage Service Levels
DS2-Manage Third-party Services
DS3-Manage Performance & Capacity
DS4-Ensure Continuous Service
DS5P-Security-Policy
DS5U-Security-User access
DS5NF-Security-Network & Firewall
DS5V-Security-Virus
DS6-Identify & Allocate Costs
DS7-Educate & Train Users
DS8-Service Desk & Incidents
DS9-Manage Configuration
DS10-Manage Problems
DS11-Manage Data
DS12-Manage Physical Environment
DS13-Manage Operations
Monitor and Evaluate (ME) Domain
ME1-Monitor IT Performance
ME2-Monitor Internal Control
ME3-Regulatory Compliance
ME4-IT Governance