CSPA readiness report

1. Executive Summary
2. Background and objectives
2.1 Rationale for this study
While still in an early “ramping up” phase,
• the specification of CSPA (Common Statistical Production Architecture) has reached a level of maturity (V1.1) which supports practical application, and
• CSPA compliant services are actively being developed, shared and catalogued.
Complementing this progress on the “supply” side, at its face-to-face meeting in April 2015 the
Modernisation Committee on Production and Methods (MC P&M) concluded it was appropriate to
review in more detail how ready agencies are to adopt, and realise practical benefits from, services
being developed and shared via the CSPA initiative.
The ability of agencies within the official statistics industry to
• adopt and integrate CSPA compliant services, and
• realise practical advantage through doing this (rather than choosing to address their needs in other ways)
is critical to the success of CSPA and, more broadly, to realising value from the international strategy
for modernising statistical production and services.
More fully understanding readiness to adopt and integrate CSPA compliant services, including the
barriers that exist, should assist in
• addressing the risk of over-investing in producing individual CSPA compliant services while under-investing in facilitating adoption and integration of those services
• identifying opportunities for the architectural design standards applied to individual CSPA services to more fully facilitate adoption and integration of those services within agencies
• more clearly describing an architectural “target state” which would allow an agency to efficiently and successfully adopt and integrate CSPA compliant services
• understanding how the “current state” of an agency relates to the “target state”, including whether some dimensions are already close to target while others are significantly holding back the ability to adopt and integrate CSPA services successfully
• prioritising strategic support to address the most challenging dimensions, leading to maximum gains in overall readiness (in other words, greatest reduction in barriers)
• identifying how particular agencies have addressed barriers to adoption and integration of CSPA services and sharing the ideas and practices which have proved successful
While this initial study initiated by MC P&M on readiness/maturity is small and exploratory, a
number of findings are documented in this report together with suggestions for possible next steps.
2.2 What constitutes readiness to adopt CSPA services?
CSPA is based on Service Oriented Architecture (SOA).
An agency poorly placed to implement and integrate SOA based solutions is likely to find it
significantly less feasible and/or cost effective to adopt CSPA compliant services and harness them
within their broader IT environment. An extreme example might be an agency that remained
committed (through choice or through budgetary or other constraints) to producing statistics using a
small number of “set in stone” massively multi-functional applications (monolithic architecture).
CSPA, however, is premised on more than “generic” SOA. It assumes an approach oriented toward
alignment and sharing on an international/”official statistics industry” basis. This is reflected in, for
example, standards based information interfaces that CSPA services are required to support. A team
seeking to develop localised services on a pragmatic basis might favour some form of a SOA
approach but still not favour CSPA nor anticipate a practical return on investment through adopting
and integrating CSPA compliant services.
When it comes to adopting and integrating CSPA compliant services, cost-benefit is likely to be
maximised when agencies/enterprises support application of individual standards and practices
associated with CSPA, together with application of CSPA itself, rather than local developers making
choices in the absence of broader integration support and guidance within the organisation.
Such support might include, for example,
• corporate practice of mapping production activities to the GSBPM and capabilities to the CSPA Capability Reference Model so it is easy to see which CSPA services are relevant to which business requirements within the agency (a minimal illustration is sketched below)
• corporate support for use of the relevant information standards through sharing expertise and through repositories or translation libraries which make it easy to input and output information in the required format
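To make the first point concrete, the sketch below (Python) shows one minimal, purely hypothetical way such a corporate mapping could be held and queried. The GSBPM sub-process labels are indicative, and the capability and service names are invented for illustration; they are not entries from the CSPA service catalogue.

```python
# Hypothetical sketch of a corporate mapping from GSBPM sub-processes to
# capabilities (per a Capability Reference Model) and on to candidate CSPA services.
# All capability and service names below are invented for illustration only.

GSBPM_TO_CAPABILITIES = {
    "5.3 Review and validate": ["Data Validation"],
    "5.4 Edit and impute": ["Data Editing", "Imputation"],
    "6.4 Apply disclosure control": ["Statistical Disclosure Control"],
}

CAPABILITY_TO_SERVICES = {
    "Data Validation": ["Validation Rule Service (hypothetical)"],
    "Imputation": ["Donor Imputation Service (hypothetical)"],
    "Statistical Disclosure Control": ["Cell Suppression Service (hypothetical)"],
}

def candidate_services(gsbpm_subprocess: str) -> list[str]:
    """List catalogued services relevant to a GSBPM sub-process via its capabilities."""
    services = []
    for capability in GSBPM_TO_CAPABILITIES.get(gsbpm_subprocess, []):
        services.extend(CAPABILITY_TO_SERVICES.get(capability, []))
    return services

print(candidate_services("5.4 Edit and impute"))
# ['Donor Imputation Service (hypothetical)']  (no catalogued service for Data Editing here)
```

In practice such a mapping would be maintained corporately (for example alongside the agency’s enterprise architecture repository) rather than in code; the point is simply that the traceability from business activity to capability to available service is explicit and queryable.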
2.3 Framework (maturity model) used for assessing readiness
MC P&M chose to base the assessment on the widely recognised and applied Open Group Service
Integration Maturity Model (OSIMM).
The base model associated with OSIMM is designed to be extended so organisations/industries can
focus on readiness criteria relevant to
• the strategic and operational context associated with that particular organisation/industry
• the particular service oriented framework the organisation/industry is seeking to adopt
Annex 1 of this report contains the one page questionnaire used to gather readiness assessments for
individual agencies, together with the instructions for completion and definitions provided to
respondents.
A1.5 and A1.6 document
• the dimensions used in this study,
• recommended assessment questions for each dimension, and
• how the framework for this study, in places, extends the OSIMM base model
As is evident from A1.5 and A1.6, the extensions used in this study address considerations of the
type outlined in Section 2.2. It was important the readiness assessment incorporated these
considerations, which arise from the fact that adoption and integration of CSPA compliant services
involves factors which are additional to adopting a “generic” approach to service orientation.
3. Method
3.1 Instrument design, target population and response population
The questionnaire documented in Annex 1 was sent to all members of MC P&M on 5 June 2015. At
that time MC P&M had 14 listed members, who represented 11 distinct agencies.
Five agencies had responded by the closing date of 17 July. These were the NSIs for
• Ireland (provided assessment of Current Maturity but not Target Maturity)
• Netherlands
• New Zealand
• Norway
• Australia
The full response from each of the 5 agencies is available via the Working Pages in the MC P&M wiki.
3.2 Analysis method
As per the results presented in Section 4, the responses for Current Maturity and Target Maturity
were tabulated based on two dimensions
• by agency
• by maturity dimension
A third table was then calculated based on the difference between Target Maturity and Current
Maturity. The third table represents a broad measure of how much further progress is expected to
be required in order to achieve the target maturity.
A mean and standard deviation were calculated for both dimensions in each of the three tables.
The standard deviation was calculated
• based on “the population of agencies which responded to the questionnaire”,
• NOT by treating respondents as a (very small and not necessarily representative) sample and seeking to calculate a standard deviation for a broader population of agencies.
The standard deviation was used only as a broad indicator of spread or consistency in the responses
received. A minimal sketch of this calculation follows.
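The sketch below (Python, with illustrative figures rather than the actual survey responses) shows the calculation described above: the gap is the element-wise difference between Target and Current Maturity, and the mean and standard deviation are computed over the population of responding agencies rather than estimated for a broader population.

```python
from statistics import mean, pstdev  # pstdev = population standard deviation

# Illustrative Current and Target Maturity ratings (not the actual survey responses),
# keyed by dimension, with one value per responding agency.
current = {"Application": [1.0, 4.0, 1.5, 2.0],
           "Design Practices": [1.0, 2.5, 2.5, 2.0]}
target = {"Application": [5.0, 5.0, 5.1, 5.0],
          "Design Practices": [4.0, 6.0, 6.0, 5.0]}

# Gap = Target Maturity minus Current Maturity, agency by agency.
gap = {dim: [t - c for t, c in zip(target[dim], current[dim])] for dim in current}

for dim, values in gap.items():
    # Respondents are treated as the whole population of interest, not as a sample
    # from a broader population of agencies, so pstdev (not stdev) is used.
    print(f"{dim}: mean gap {mean(values):.1f}, std dev {pstdev(values):.1f}")
```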
3.3 Coverage and other quality considerations
It is particularly important to bear in mind a number of points when considering the results and
analysis.
1. The very small number of respondents means patterns apparent in the results of this
study cannot be expected, statistically speaking, to hold true for other agencies which
are seeking to adopt and integrate CSPA related services.
2. The questionnaire seeks to assess how ready an agency is to adopt and integrate CSPA
conformant services.
People who filled out the questionnaire on behalf of an agency, however, come from a
particular discipline and business unit within that agency. Given the direct or indirect
association between these people and MC P&M, the people filling out the questionnaire
may tend, for example, to be more forward looking, more strategically focused and
more optimistic than average.
Respondents employed a number of approaches (eg determining responses in
consultation with multiple colleagues) to mitigate this factor. Nevertheless the factor
needs to be recognised.
A related possibility is that different states of readiness/maturity in different parts of an
organisation may make an “overall” assessment not particularly relevant. For example,
a particular “line of business” within the agency might be able to adopt and integrate
CSPA compliant services without the fact that other lines of business are unable to do
that posing too much of a constraint in practice. The response from Netherlands
explicitly noted that such a lack of homogeneity applies to their organisation.
3. While a broad description is provided for the dimensions and maturity levels, individual
interpretations of these descriptions may vary when determining a maturity level for a
particular dimension. In other words, interpretations of the rating scale itself can vary
(in addition to assessment of a particular agency against the scale being imprecise).
Nevertheless, the study identifies a number of interesting findings which can be discussed
further to see if they appear to hold true more generally. If they do appear to hold true
more generally then strategies and actions to support adoption of CSPA compliant services
should be updated accordingly.
4. Results: Maturity Levels
All of the results presented in this section should be interpreted with coverage and other quality
considerations (see Section 3.3) in mind.
4.1 Quantitative Results

Current Maturity

Dimension                   | Ireland | Netherlands | New Zealand | Norway | Australia | Mean | StdDev
Business Activity View      | 1.5     | 2           | 2.5         | 3.5    | 1         | 2.1  | 0.9
Business Capability View    | 1       | 3.5         | 2.1         | 1      | 2.5       | 2.0  | 0.9
Statistical Methodology     | 1       | 3           | 3.5         | 1      | 4         | 2.5  | 1.3
Information                 | 1.5     | 3.5         | 2.1         | 1      | 2.2       | 2.1  | 0.8
Application                 | 1       | 4           | 1.5         | 2      | 2.5       | 2.2  | 1.0
Infrastructure & Management | 1.5     | 6           | 1.5         | 3      | 3         | 3.0  | 1.6
Governance & Organisation   | 2       | 3           | 3.1         | 1      | 2         | 2.2  | 0.8
Design Practices            | 1       | 2.5         | 2.5         | 2      | 1         | 1.8  | 0.7
Mean                        | 1.3     | 3.4         | 2.4         | 1.8    | 2.3       | 2.2  |
StdDev                      | 0.3     | 1.1         | 0.7         | 0.9    | 0.9       |      |

Mean, all dimensions, all nations: 2.2
Target Maturity

(Ireland provided an assessment of Current Maturity but not Target Maturity, so no Target or Gap figures are shown for Ireland.)

Dimension                   | Netherlands | New Zealand | Norway | Australia | Mean | StdDev
Business Activity View      | 3           | 5.5         | 5      | 5         | 4.6  | 1.0
Business Capability View    | 4           | 5.5         | 5      | 5         | 4.9  | 0.5
Statistical Methodology     | 4           | 5           | 5      | 5         | 4.8  | 0.4
Information                 | 4           | 5           | 5      | 5         | 4.8  | 0.4
Application                 | 5           | 5.1         | 5      | 5         | 5.0  | 0.0
Infrastructure & Management | 7           | 4.5         | 5      | 5         | 5.4  | 1.0
Governance & Organisation   | 4           | 6           | 5      | 5         | 5.0  | 0.7
Design Practices            | 4           | 6           | 5      | 5         | 5.0  | 0.7
Mean                        | 4.4         | 5.3         | 5.0    | 5.0       | 4.9  |
StdDev                      | 1.1         | 0.5         | 0.0    | 0.0       |      |

Mean, all dimensions, all nations: 4.9
Gap (Target Maturity minus Current Maturity)

Dimension                   | Netherlands | New Zealand | Norway | Australia | Mean | StdDev
Business Activity View      | 1           | 3           | 1.5    | 4         | 2.4  | 1.2
Business Capability View    | 0.5         | 3.4         | 4      | 2.5       | 2.6  | 1.3
Statistical Methodology     | 1           | 1.5         | 4      | 1         | 1.9  | 1.2
Information                 | 0.5         | 2.9         | 4      | 2.8       | 2.6  | 1.3
Application                 | 1           | 3.6         | 3      | 2.5       | 2.5  | 1.0
Infrastructure & Management | 1           | 3           | 2      | 2         | 2.0  | 0.7
Governance & Organisation   | 1           | 2.9         | 4      | 3         | 2.7  | 1.1
Design Practices            | 1.5         | 3.5         | 3      | 4         | 3.0  | 0.9
Mean                        | 0.9         | 3.0         | 3.2    | 2.7       | 2.5  |
StdDev                      | 0.3         | 0.6         | 0.9    | 0.9       |      |

Mean, all dimensions, all nations: 2.5
4.2 Current Maturity
Compared with Target Maturity, Current Maturity exhibits considerably greater variability by
country and by dimension.
Three dimensions are discussed in detail below in regard to Current Maturity. The other five
dimensions are considered less “remarkable”, especially given quality considerations outlined in
Section 3.3.
Design Practices was at the lowest level on average in regard to Current Maturity. Compared with
other dimensions, there was relatively little variation in level of Current Maturity across offices.
Overall, Design Practices was seen as the dimension with the (equal) second highest Target
Maturity level.
This means Design Practices have, by a considerable margin, the “furthest to go” from Current
Maturity to Target Maturity.
Pursuing modernisation enabled through CSPA can be expected to require a significantly greater
emphasis on Design Practices than previous, more localised, stand-alone development within
agencies. CSPA focuses on services being designed based on the architectural framework, including
with reference to relevant standards. Application of GSBPM and the Capability Reference Model
implies agencies consider their enterprise, and individual business activities and capability
developments within it, from a design perspective. Much of the saving realisable through
automating “routine” operational tasks currently undertaken by staff during statistical production is
based on designing and configuring these business processes in advance rather than leaving it until
the Collect, Process, Analyse and Disseminate phases to identify decision points, decision criteria and
alternative workflows.
Current Maturity for Design Practices is assessed as comparatively high by New Zealand. The Target
Maturity sought by New Zealand is also comparatively high. There may be scope for sharing
experiences and practices.
In addition, it appears that in a number of nations there is currently an emphasis on Design Thinking
and/or User Centred Design being applied more broadly by government agencies.
It may be that
• maturing of Design Practices to support adoption and integration of CSPA compliant services could be facilitated in part through reference to concepts, skills and guidelines associated with these broader initiatives
• the strong alignment of CSPA with broader Design Thinking initiatives could be emphasised, with the aim of these broader initiatives referring to CSPA, and supporting adoption and integration of CSPA compliant services, as an exemplar of the broader direction.
At the other end of the scale, the strongest dimension is Infrastructure and Management with a
mean Current Maturity level of 3.0. This average is strongly influenced by Netherlands rating their
maturity at 6.0 (Virtualised) for this dimension. Even without that outlier, however, this is one of the
strongest dimensions.
With the notable exception of New Zealand, Infrastructure and Management is one of the highest
rated dimensions for every agency. While typically not yet at the optimal level for uptake of CSPA
compliant services, this dimension appears comparatively “healthy” in most cases.
The second strongest dimension on average was Statistical Methodology, with
• the highest maturity rating for two countries (Australia and New Zealand),
• the equal lowest maturity rating for two countries (Ireland and Norway), and
• Netherlands somewhere between the poles
Statistical Methodology can be seen as core to the leading-practice (from a statistical business
perspective) services being delivered through CSPA. Methodologists often assist subject matter
statisticians in
• determining which statistical methods will be used in the design of a particular statistical program, and
• identifying which IT software/functions can be used to apply a particular statistical method
Given the variability in reported current maturity levels, Statistical Methodology may warrant
further discussion, including
• whether, compared with other dimensions, Statistical Methodology should be a leading dimension (or whether it is OK for it to trail), and, if so,
• sharing more information on how leading levels of maturity in this dimension have been developed in some offices.
4.3 Target Maturity
A remarkable feature in regard to Target Maturity, and its relationship with Current Maturity, is
Netherlands as an outlier. Netherlands has
• the highest mean Current Maturity level (3.4 – one level above the next highest average of 2.4)
• the lowest mean Target Maturity level (4.4 – more than half a level below the next lowest average)
This means the average gap between Current Maturity and Target Maturity is remarkably low for
Netherlands (0.9), where the other three nations have average gaps of 3.0, 3.2 and 2.7.
In addition, Netherlands shows high variability of target levels across dimensions, from
• Business Activity View with a target level of 3, to
• Infrastructure & Management with a target level of 7
This represents a range of 4 levels, where the next highest range is 1½ levels, from New Zealand.
(New Zealand assigns Infrastructure & Management the lowest target level, 4.5.)
Norway and Australia assigned the same Target Maturity (5) across all dimensions (in other words,
their range was 0 levels).
This suggests scope for a very informative discussion among members of MC P&M on
• How ambitious is the Target Maturity level needed for being able to adopt and integrate CSPA compliant services in a manner which is cost effective and delivers strong practical benefits?
• Is the Target Maturity level relatively consistent across dimensions, or are particularly advanced levels of maturity required in some dimensions while only modest levels are needed in others?
Ireland, who considered themselves not yet in a position to confidently nominate Target Maturity
levels, noted they would be particularly interested in interactive discussion of Target Maturity levels
among members of MC P&M.
There was negligible variation across Target Maturity levels identified for the Application dimension.
This is, perhaps, not surprising because
• CSPA is SOA based, which leads to a particularly strong and direct relationship with application architecture, including maturity in service orientation
• many respondents have a strong knowledge of application architecture and are well placed to identify the target level
Other apparent variations, such as
• Business Activity View having the lowest average Target Maturity, and
• Infrastructure and Management having the highest
are simply artefacts of the variability in Target Maturity levels assigned by Netherlands. These would
be expected to be addressed, therefore, as part of the more general discussion recommended above
on whether Target Maturity levels vary strongly by dimension or not.
5. Results: Next Steps / Requirements
The questionnaire invited respondents to identify the next major steps their agency might need, or
choose, to take to progress toward Target Maturity for each dimension. Three agencies (Norway,
New Zealand and Australia) provided comments.
This section summarises key elements of these responses, particularly where similar requirements
were identified by multiple agencies. If the reader is interested in more information, a link to the full
source content is provided in Section 3.1 of this report.
5.1 Business Activity View
Both Norway and Australia noted the importance of staff responsible for statistical production
becoming familiar with, and applying, terminology from the Business Activity Model (eg based on
GSBPM/GAMSO). As Norway noted, in some cases staff describe the statistical activities they
undertake in terms of the (IT) tools they currently use to perform them rather than providing a
business based description of the activity which would better support
• understanding the nature of the current business activity
• tracing the activity to business capabilities which may allow that activity to be undertaken more efficiently and/or effectively
• considering opportunities for re-engineering or transforming the current business process.
Organisational responsibility and resourcing for building familiarity of statistical production staff with
the relevant framework(s) was unclear although a number of means of increasing practical
familiarity were raised.
5.2 Business Capability View
Both Norway and New Zealand considered the Business Capability View was relatively little
understood across the organisation (eg compared to the Business Activity View). An early step was
to have the Business Capability View better understood, and its value more widely recognised,
across the organisation. New Zealand noted that pockets of the organisation were already very
familiar with applying this view.
A key challenge for Australia was relating Business Capabilities to catalogued IT Services (from broad,
reusable assemblies to fine grained individual services) which can be used to support the Capability.
This would support full navigation from Business Activities to Capabilities to solutions/services
available to deliver capability.
5.3 Statistical Methodology
New Zealand and Australia rated Statistical Methodology as the most mature current dimension.
Australia provided no short term recommendations.
Norway and New Zealand noted that successfully enabling, and sharing, Statistical Methods based
on SOA may require some additional learning and analysis by methodologists.
New Zealand posed a particular question for resolution over the next six months: identify what will
be more useful, CSPA enabling an existing methodology or CSPA enabling R ‘packages’.
The latter option might entail support for statistical methods remaining relatively functionally
oriented from the perspective of methodologists, while making it possible to expose the relevant
functions via services within the applications architecture.
5.4 Information
Progressing from GSIM to an information architecture and data architecture which, in practice,
enables consistent and efficient flow of statistical information from one application to another was a
challenge common to all three agencies.
In addition to existing collaboration in regard to CSPA Information Architecture (including definition
of the Logical Information Model) this finding suggests scope for sharing of ideas, learnings and
frameworks for defining and implementing information and data architecture within individual
agencies.
5.5 Application
All three agencies identified integration as a key challenge in moving to SOA, including integration
with COTS products in some cases.
While SOA was a well-established direction within elements of all three organisations, all three
identified a need for broadening the message and ensuring a consistent approach.
New Zealand identified the need to identify which services are candidates for developing and
sharing via CSPA, recognising the feasibility, priority and value add for this approach is unlikely to be
the same for every service that is required as part of each agency’s application architecture.
5.6 Infrastructure & Management
Both Norway and Australia highlighted the need to be able to more clearly identify and assure IT
Service Levels so it becomes possible to ensure these are consistent with the required Service Level
from a business perspective. Configuration and management of IT infrastructure was also a
consideration for both agencies, including more automated/virtualised provisioning of infrastructure
in Australia.
New Zealand specifically identified moving to XaaS for key elements of infrastructure.
5.7 Governance & Organisation
New Zealand and Australia both highlighted the need to translate notional support for SOA
approaches, including CSPA, into consistent action and decision making at a practical level.
Norway highlighted the importance of ensuring senior decision makers are exposed to, and
convinced of, the practical value of designing, sharing and adopting CSPA conformant services rather
than this being considered a worthy ideal and vision but not a practical consideration in current
decision making about capability investments and solution development strategies.
5.8 Design Practices
Norway and Australia identified the need to strengthen design as a recognised and valued discipline,
including extending and supporting the community of designers within their organisation.
Australia focused on improved traceability in design, including being able to trace solution designs
back to the business capabilities, needs and strategies they are supporting. This requires the ability
to reference models related to Business Capabilities, Business Activities and Statistical Information
as part of applied design activities.
6. Conclusion
Possible topics
• Is there any interest/recommendation to have additional agencies undertake this assessment process or a similar one?
  o If “similar”, then what are the key changes in methods that should be made?
• Is MC P&M keen to look further at Design Practices (see 4.2) as the dimension with the lowest Current Maturity and furthest to go to reach Target Maturity?
• Given Statistical Methodology is a particularly variable dimension in regard to Current Maturity (4.2), and with two agencies raising questions about what SOA means for designing and sharing Statistical Methodology (5.3), is this another dimension which warrants follow up consideration?
• Target Maturity (4.3) appears to warrant further discussion, and Ireland explicitly expressed interest in this.
  o What level of maturity for each dimension should agencies target if they hope to adopt and integrate CSPA services efficiently and effectively? Defining this may help
    - agencies determine whether an ability to harness CSPA compliant services is “close at hand” for them or a long way over the horizon
    - prioritise “next steps” at an agency level and “readiness” support and advice provided through CSPA.
  o Is Target Maturity more or less similar across the dimensions (as per Norway and Australia) or does it vary markedly by dimension (as per Netherlands)?
    - If it does vary, are there key dependencies such that it is hard to achieve a target level of N on Dimension X unless a target level of M (or higher) has already been achieved on Dimension Y?
    - Alternatively, are maturity levels on different dimensions more or less independent of each other?
• Recommendations could be made for any of the qualitative results to be explored further.
  o For example, ensuring the Business Capability View is better understood, and valued, across organisations appeared to be a common interest. The Capability Reference Model is a relative latecomer at the HLG level compared with GSBPM/GAMSO and GSIM and appears to have less online support and training aids.
Annex 1: Questionnaire
CSPA Maturity Assessment
A1.1 Introduction
The evaluation template on the next page is based on OSIMM (Open Group Service Integration
Maturity Model). Part of the assessment process documented in OSIMM is to extend the “base
model” to tailor it appropriately for the specific assessment and the outcomes sought through the
assessment.
ABS took this approach in July 2014 when using OSIMM as the basis for an Enterprise Solutions
Readiness Assessment. The assessment reviewed the readiness of ABS - across various architectural,
and supporting, domains – to undertake enterprise level re-engineering. The re-engineering is to be
based on foundations such as SOA, Metadata Driven Processes, sharing and re-use of solutions
(across the organisation and more broadly within CSPA) and ability to integrate COTS.
The Modernisation Committee agreed the same extensions should be applied in the CSPA Maturity
Assessment. Some questions have been updated, however, to ensure they focus on CSPA rather
than ABS Enterprise Architecture.
The evaluation template contains links to the definition of each Dimension and Measure.
If you follow a link, then pressing <Alt><Left Arrow> will return you to the evaluation template once
again.
Once you’ve completed the one page evaluation template the survey is complete – nothing else
needs to be filled in. The last nine pages are reference documentation for the Measures and
Dimensions.
A1.2 Evaluation Template
Dimension                   | Current Maturity | Target Maturity | Key Steps/Requirements
Business Activity View      |                  |                 |
Business Capability View    |                  |                 |
Statistical Methodology     |                  |                 |
Information                 |                  |                 |
Application                 |                  |                 |
Infrastructure & Management |                  |                 |
Governance & Organisation   |                  |                 |
Design Practices            |                  |                 |

Any additional comments:
A1.3 Measures
A1.3.1 Current Maturity Level
Identify the Current Maturity Level for each dimension.
The Description of Maturity Levels should be used for reference.
The Characterisation of Maturity Level by Dimension should also be used for reference.
The level will be a number between 1 and 7 inclusive.
If a whole number is entered it will be interpreted as indicating the maturity level has been achieved
more or less completely.
It is also possible to indicate that, while concerted progress is being made, a particular maturity level
has not yet been achieved completely. In such cases the next lowest level number should be
selected and a single decimal place added to indicate the extent of progress toward the next level.
Example:

Assigned Level | Meaning
2   | The organisation has achieved at least the “Integrated” level of maturity for this dimension consistently across all (or very nearly all) statistical production activities. (There may be specific examples at a higher level of maturity.)
1.1 | The organisation has started progressing toward achieving “Integrated” consistently. A number of key enablers are in place (eg senior management commitment, strategies, plans, policies, infrastructure) to ensure progress continues.
1.5 | The organisation is around half way to consistently achieving “Integrated” maturity and has key enablers in place to ensure progress continues.
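As a small illustrative sketch only (Python; not part of the questionnaire itself), the convention above can be read back programmatically: the integer part is the level achieved more or less completely, and the decimal place indicates progress toward the next level.

```python
import math

def interpret_rating(assigned: float) -> str:
    """Read back a maturity rating under the convention above (illustrative only)."""
    achieved = math.floor(assigned)
    progress = round(assigned - achieved, 1)
    if progress == 0:
        return f"Level {achieved} achieved more or less completely"
    return (f"Level {achieved} achieved; roughly {int(round(progress * 10))}/10 of the way "
            f"toward consistently achieving level {achieved + 1}")

print(interpret_rating(2))    # Level 2 achieved more or less completely
print(interpret_rating(1.5))  # Level 1 achieved; roughly 5/10 of the way toward level 2
```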
A1.3.2 Target Maturity Level
Using the same definitional reference sources as Current Maturity Level, this represents the level of
maturity sought in regard to the dimension in 5 years’ time.
ABS set a target of “5” across all dimensions by July 2019, with an intermediate target of 4 across all
dimensions by July 2015. There is no requirement, however, to set the same target for every
dimension. It was observed in discussions at the ABS, however, that if some dimensions became
advanced while others remained much less advanced then overall benefits realised for the ABS
would remain significantly constrained by the less advanced dimensions.
ABS considered that consistent attainment of “4” across all dimensions would allow the organisation
to make use of CSPA services, although the organisation would realise more benefit, and greater
economy of implementation, once “5” was achieved.
A1.3.3 Key steps / Requirements
List the first 1 to 3 major steps you expect your organisation would take to move maturity toward
the target level. These could be steps the organisation would undertake on its own or they might be
steps based on enablers from HLG/CSPA.
A1.4 Maturity Levels
A1.4.1 Description of Maturity Levels
Level 1: Silo
Individual parts of the organisation are developing their own capabilities and activities independently, with no integration of processes, data, standards or technologies.

Level 2: Integrated
Processes and technologies have been put in place to communicate between the silos and to integrate the data and interconnections. However, integration does not extend to common standards in data or business processes. Connections may require bespoke code/adapters, leading to a difficult to manage and complex environment. It is not easy to develop or automate new business processes. The organisation is aware of what methods are being used, but there is little common use of methods across the organisation.

Level 3: Componentised
Silos have been analysed and broken down into component parts, with a framework in which they can be developed into new processes and systems. Components interface through defined interfaces, but are not loosely coupled. Business and IT components are discrete and re-usable, but are often replicated and redundant. Common methods are used, but each statistical program (statistical survey, statistical domain, statistical line of business) implements them in their own processes (no central use, not controlled).

Level 4: Services
Capabilities are exposed via loosely coupled services. Business capability has been analysed in detail and broken down into business services residing within a business architecture that ensures that services will inter-operate at the business level. Technical services use open standards and are independent of technology. At this stage composition of services and processes is still defined by writing bespoke code. Methods are singular purpose with clear separation of concerns and are expressed at a business level using canonical descriptions.

Level 5: Composite
Services are highly standardised in terms of information model, application architecture and infrastructure use. It is now possible to construct a business process utilising services not by bespoke development, but by the use of a business process modelling language. This can be done at design-time with the support of BAs (business analysts), designers and developers. Methods are able to be tuned to client requirements.

Level 6: Virtualised
At this stage business and IT services are now provided virtually - ie. the physical location of the service is hidden away and ceases to be of concern. This allows design-time Assemble to Order utilising services across organisations.

Level 7: Dynamically Re-configurable
Prior to this level, the business process assembly, although agile, is performed at design time using suitable tooling. Now this assembly may be performed at runtime - ie. Plug n Play. This includes design and selection of methods.
The above descriptions are based on the description of Maturity Levels in OSIMM (Open Group
Service Integration Maturity Model) but have been shortened, simplified and generalised in places.
A1.4.2 Characterisation of Maturity Level by Dimension
This table is based on the OSIMM Maturity Matrix. The above table sought to simplify the
terminology used in OSIMM. In places terminology more familiar and meaningful for the official
statistics industry was added. Some wording was updated to help broaden some aspects beyond
simply application architecture.
In addition, the above table adds wording for dimensions which are additional to, or have variations
compared with, those in OSIMM.
Neither the OSIMM Maturity Matrix nor the above table is associated with more detailed
explanations for each cell.
Where a dimension is consistent with OSIMM, and if the entry in the OSIMM Maturity Matrix is
more meaningful for the reviewer, this can be used as the basis for the assessment.
A1.5 Primary Dimensions
A1.5.1 Business Activity View
This dimension focuses on the business activity domain ie. the organisation's core business activity
practices and policies; how business processes are designed, structured, implemented and executed.
It also considers how the business activity strategy is connected to capability architecture, as well as
IT strategy and costs.
This corresponds closely with the Business Dimension in OSIMM but with a particular focus on
business activity (for which GSBPM and/or GAMSO can be used as a reference model) as an aspect
of business architecture.
Questions include
• Has the organisation adopted a Business Activity Model that is either GSBPM or mappable to GSBPM?
• Do statistical production, and other, staff within the organisation understand the Business Activity Model?
• Do staff refer to the Business Activity Model as a point of reference in discussions, general documentation, process descriptions and requirement specifications?
• Do staff describe their business activities using terminology which is specific to their local area or based on the tools used currently (eg “And then we run <System Name>” rather than “And then we apply disclosure control”)?
The 19 questions for the Business Dimension in OSIMM are also broadly appropriate.
A1.5.2 Business Capability View
This dimension focuses on the capability domain ie. the organisation's core business capability
practices and policies; how business capabilities are designed, structured, implemented and
executed. It also considers the business services which make these capabilities available to
consumers. It considers how all dimensions of a capability are brought together to deliver business
services.
Capability is defined by Open Group as “an ability that an organization, person, or system
possesses”.
Successfully performing a particular business activity may require many capabilities (eg related to
trained staff, appropriate statistical methods, a range of IT capabilities that allow the business
activity to be performed.) Similarly a single capability may be used in several activities (eg an
automated data validation capability might be used
• as providers are entering responses in a survey form (an activity in the Collect phase)
• as part of a Validation activity during the Process phase
• as part of a Validate Outputs activity during the Analyse phase)
If solutions are designed and built based only on specific business activities
• a common business capability may be “reinvented” different ways for different activities
• it becomes hard to harness a new, improved, service for a capability because support for a particular business activity will have been built as a “block” rather than an assembly of capability based services
• opportunities may be lost for achieving efficiencies of scale from providing functionally specialised business services in a manner that supports the common needs of a range of different business activities
Questions include
• Is the concept of Business Capability (separate from, but related to, Business Activity) recognised within the organisation’s enterprise architecture?
• Does the organisation use a reference model for Business Capability?
• Do statistical production, and other, staff within the organisation understand the concept of Business Capability?
• Do staff think and speak in terms of Business Capability as separate from, but related to, Business Activity?
• Do staff understand and apply the organisation’s reference model for business capability (if one exists)?
• Does the organisation have, and apply, a general mapping between Business Activities and the Business Capabilities required for each activity?
• Does the organisation have a mapping between Business Capabilities and IT services that exist – or are required by the organisation – to deliver the capability?
• Has the organisation assessed, and planned for, other factors required to deliver each Business Capability (eg staff training, business processes, statistical methods, relevant standards and frameworks)?
A1.5.3 Statistical Methodology
This dimension focuses on the management of methods ie. how methods are designed, structured,
implemented and executed. It also considers how these methods will be integrated with business
capabilities, as well as applications.
While OSIMM includes a dimension for Methods, the dimension in OSIMM primarily relates to IT and
related methods (eg software development lifecycle, project management). Most of these
considerations are now addressed under Design Practices.
In the case of the Official Statistics industry, however, Statistical Methodology is an essential
dimension to consider.
Questions include
• Are methods designed and managed as reusable organisational assets?
• Do methodologists view methods as one of a number of interrelated enablers for Business Capability (as opposed to thinking of methodology more or less independently of other capability considerations)?
• Are methodologists familiar with, and supportive of, service oriented approaches to putting methods into effect (as opposed to, eg, authoring bespoke code in languages such as SAS and R, and relying on customised data and metadata structures, in order to apply methods)?
• Is “Not Invented Here” a barrier when
  o evaluating, and potentially implementing, new methods
  o considering adoption of CSPA services that support statistical methods which differ from current practice without being manifestly inferior to current practice?
• Do methodologists tend to be thought leaders and champions within the organisation when it comes to achieving business architecture and applications architecture consistent with CSPA and promoting modernisation of statistical production and services more generally?
A1.5.4 Information
This dimension focuses on how information is structured, how information is modelled, the method
of access to enterprise data, abstraction of the data access from the functional aspects, data
characteristics, data transformation capabilities, service and process definitions, handling of
identifiers and the business information model.
The definition is largely as per the Information Dimension in OSIMM, but with a specific focus on
structured information (eg data and metadata) used and produced by services and reduced focus on
knowledge management. The 13 questions in OSIMM for the Information dimension can be used.
The following questions, based on OSIMM, are considered particularly relevant:
• Is there difficulty in moving data from one application to another? For all applications? For only some applications?
• Does your organization have a common data model (or mappings between multiple data models)?
• Are the data models in the form of Business Object Models, understandable to, and owned by, the business, or IT object models, understandable only to, and owned by, the IT teams?
• If there are mapping rules across different models, are these understandable to and maintained by the business or by IT staff? Are such mapping rules performed by the infrastructure?
• If your organisation has a common data model, is it consistent with GSIM as a conceptual model?
• Does your organization have, or are you developing, a Business Information Model to standardize data and message formats and concepts across the enterprise?
• If so, do you expect it will be consistent/interoperable with CSPA conceptual, logical and physical models (summarised in Figure 1 in the linked resource)?
A1.5.5 Application
This dimension focuses on the application domain, which includes physical structures, integration
techniques, standards and policies, web services adoption level, experience in SOA implementation,
SOA compliance criteria and typical artefacts produced.
The definition of this dimension essentially spans the Application Dimension and the Architecture
Dimension in OSIMM. The 11 questions in OSIMM for each of these dimensions can be used.
The following questions, based on OSIMM, are considered particularly relevant:
• What is your current application development style?
• What types of re-use do you engage in and how is re-usability measured?
• Are SOA-enabling technologies, such as ESB, shared data environment, or registry, being used?
• How is integration achieved in your architecture?
• How mature are your services implementations?
• How are architectural decisions made in your organization?
• Does your organization use reference architectures?
A1.5.6 Infrastructure & Management
This dimension focuses on the organisation’s infrastructure capability, service management, IT
operations, IT management and IT administration; how SLAs are met; how monitoring is performed;
and what types of integration platforms are provided.
This dimension corresponds to the Infrastructure & Management Dimension in OSIMM. The 12
questions for the OSIMM dimension can be used.
The following questions, based on OSIMM, are considered particularly relevant:
• How are your IT SLAs derived from the business SLAs?
• What platforms are currently in use for integration?
• Which assets are placed under version control?
• What tools are used for configuration management?
• What are considered as your organization's IT assets (excluding human resource)? How are these assets managed?
• How does your operational architecture support the non-functional requirements for applications and services?
A1.6 Supporting Dimensions
A1.6.1 Governance & Organisation
This dimension focuses on the structure and design of the organisation itself and the measures of
organisational effectiveness. Organisation considers structure, relationship, roles and the
empowerment to make decisions. It includes the skills, training and education available. Governance
is associated with formal management processes to keep activities, capabilities and solutions aligned
with the needs of the business. It also guides any aspects of the other maturity dimensions, including
how management is structured and costs are allocated.
The dimension is similar to the Organization and Governance Dimension in OSIMM. The latter
dimension, however, appears specifically focused on IT and SOA governance where the former
dimension is intended to span organisational governance (and culture) more generally.
The following, broader, questions are recommended:
• Is the importance of modernisation, and the value of a collaborative, service oriented approach, broadly agreed across the organisation or only advocated by certain parts?
• How well do different lines of business, methodologists and IT staff work together to define shared goals and expectations for each modernisation project / innovation and then work together to establish a solution that includes both IT and non IT aspects?
• To what extent are senior management committed to establishing solutions that align with enterprise (and industry) architecture standards, and are “adequate for purpose”, rather than committed to solutions tailored to closely fit preferences of individual parts of the organisation?
• Does the organisation have an approach to governing and managing projects, and programs of work, that would facilitate due consideration of
  o adopting CSPA services rather than investing in local solutions
  o developing services that were CSPA compliant and sharable?
• Is “Not Invented Here” a barrier when evaluating and potentially implementing solutions (new processes, statistical methods, IT capabilities) from other parts of the organisation or from outside the organisation?
A1.6.2 Design Practices
The dimension is focused on the practices and processes employed by the organisation for its
business and IT transformation. This includes items such as requirements management, design
methodologies/techniques and tools for designing solutions.
This dimension is closely aligned with the Method Dimension in OSIMM. The dimension was
renamed to avoid confusion between “IT” methods and statistical methods. The 10 questions for
the Method Dimension in OSIMM can be used.
The following questions, based on OSIMM, are considered particularly relevant:
• What design methodologies and best practices are you currently adopting?
• Do you practice any SOA design techniques?
• What design tools are in practice today?
• What is the current practice for service development and management?
• Do you have an active community that works to evolve your SOA methods and practices?
• Has your organization developed a repository for best practices and asset re-use?