
Improving IT Process and Quality
Herb Krasner
Principal Investigator, IV&V Services
Director, Industry Outreach
Sr. Lecturer in Software Engineering
hkrasner@ece.utexas.edu
Advanced Research in
Software Engineering
http://arise.utexas.edu/
Copyright © 2014 – Herb Krasner – all rights reserved
The Importance of IT Quality
l Software is blamed for more major business
problems than any other man-made product.
l Poor software quality has become one of the most
expensive topics in human history:
t  > $150 billion per year in U.S.
t  > $500 billion per year world wide
l For U.S. software:
t  Average is about 5 defects per function point, with
about 85% of these being removed prior to delivery.
t  The best results have defects below 2 per function
point combined with 99.6% removal efficiency.
t  Projects often fail at levels of 7 and greater.
l Message #1 – focus on Quality
Capers Jones, 2009
function point ~= a 100 SLOC program module
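To see what these figures imply at system scale, here is a rough back-of-the-envelope sketch in Python. The 1,000-function-point system size is an assumption chosen purely for illustration; the defect densities and removal efficiencies are the ones cited above.

# Rough arithmetic using the defect figures cited above (Capers Jones).
# The 1,000-FP system size is an assumed, illustrative number.
def delivered_defects(function_points, defects_per_fp, removal_efficiency):
    injected = function_points * defects_per_fp
    return injected * (1.0 - removal_efficiency)

fp = 1_000
average = delivered_defects(fp, defects_per_fp=5.0, removal_efficiency=0.85)
best    = delivered_defects(fp, defects_per_fp=2.0, removal_efficiency=0.996)
print(f"Average US project: ~{average:.0f} defects delivered")   # ~750
print(f"Best-in-class:      ~{best:.0f} defects delivered")      # ~8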
What is IT System Quality?
l Conformance to requirements
t  The requirements are clearly stated and the product must conform to them
t  Any deviation from the requirements is regarded as a defect
t  A good quality product contains fewer defects
l Fitness for use/purpose
t  Fit to user expectations: meets the user's needs
t  A good quality product provides better user satisfaction
l Meets standards
t  In many industries and organizations certain external and internal standards must be complied with
t  A good quality product conforms to required standards of quality/process
[Diagram: three overlapping views of quality – what is specified, what standards must be met, and what users need]
Underlying aspects:
• Structural quality (e.g. complexity)
• Aesthetic quality (e.g. ease of use)
How Are We Doing in Software?
•  Industrial data shows that even experienced software engineers inject about 100 defects per 1000 lines of code (10%)
•  Typically, about 25 of these defects find their way into integration and systems test, at a cost of 10 to 40 programmer hours each
•  Each of these 25 defects will cost (at least) several thousand dollars to find, repair, and verify
•  Allowing a defect to make its way into a fielded system/product will cause the repair and fix costs to rise quickly
- Watts Humphrey, 1990s, collected during his PSP studies of individuals
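To put these per-KLOC figures in rough dollar terms, here is a back-of-the-envelope sketch; the 50 KLOC project size and the $100/hour loaded labor rate are illustrative assumptions, not figures from the slide.

# Back-of-the-envelope cost of defects escaping into integration/system test,
# using the per-KLOC figures above. Project size and hourly rate are assumptions.
KLOC = 50                     # assumed project size
escape_to_test_per_kloc = 25  # from the figures above
hours_per_defect = (10, 40)   # programmer hours each, from the figures above
hourly_rate = 100             # assumed loaded $/hour

escaped = KLOC * escape_to_test_per_kloc
low  = escaped * hours_per_defect[0] * hourly_rate
high = escaped * hours_per_defect[1] * hourly_rate
print(f"{escaped} defects reach test; cost roughly ${low:,.0f} to ${high:,.0f}")
# -> 1250 defects; roughly $1,250,000 to $5,000,000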
“Even with all of the emphasis on writing software with security in mind, most software applications remain riddled with security holes...” - K. Jackson Higgins, Dark Reading, 3/1/2010
Opportunity Cost of IT Quality
[Chart: relative cost to fix a serious defect rises roughly 1X -> 10X -> 100X:
  1X    the defect is prevented
  10X   the company finds & fixes it prior to release
  100X  the customer finds it in the fielded system, where it causes harm in operation
Reported ratio of 1:13:92 for IBM eServer SW (Kan, 2007); similar data from JPL, 1990s.]
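The chart's multipliers can be turned into rough dollar figures. A minimal sketch, assuming a $500 cost to prevent one serious defect (an illustrative figure; only the 1x/10x/100x rule of thumb and the 1:13:92 IBM ratio come from the chart):

# Relative cost to deal with a serious defect, per the chart above.
# The $500 baseline is an assumed, illustrative figure.
baseline = 500  # assumed cost of preventing one serious defect
rule_of_thumb = {"prevented": 1,
                 "found & fixed prior to release": 10,
                 "found by customer in the field": 100}
ibm_eserver = {"prevented": 1,
               "found & fixed prior to release": 13,
               "found by customer in the field": 92}   # Kan, 2007

for stage, mult in rule_of_thumb.items():
    print(f"{stage:32s} ~${baseline * mult:,}")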
AT&T Software Bug - 1/15/90
l New version of telephone exchange software (#4ESS) installed
l Switches started crashing in NY
l 9 hrs of complete long distance network disruption for 60K people
l $1 billion loss reported
l 1 line of misplaced code in 3 million lines of software

switch( MessageType )
{
case INCOMING_MESSAGE:
    if ( RemoteSwitch == NOT_IN_SERVICE )
    {
        if ( LocalBuffer == EMPTY )
            SendInServiceMsg(3B);
        else
            break;   /* Bad News! In C this break exits the switch, not the if */
    }
    ProcessIncomingMessage();   /* skipped whenever the break above is taken */
    break;
...
}
DoOptionalDatabaseWork();
...

A small bug can have a huge effect!
State of Practice in TX1 State Gov.
l A major IR project is one identified in a state agency's Biennial Operating Plan whose development costs exceed $1.0 million and that:
t  Requires one year or longer to reach operations status,
t  Involves more than one state agency, or
t  Substantially alters the work methods of state agency personnel or the delivery of services to clients.
¬  Or any other IR technology project designated by the legislature in the General Appropriations Act as a major IR project.
l From December 2012 to November 2013, 77 projects representing $1.8 billion in major IR projects were in the technology portfolio.
t  Twenty of the 77 projects were approved and began after Sept. 1, 2013.
l Of the other 57 projects in the technology portfolio that began before September 2013:
t  Thirteen projects were reported to be complete; some were successful.
t  39 were late or projected to be late by an average of 24 months. In addition, 28 of the projects exceeded or are expected to exceed their initial budgets by an average of $8.9 million.
t  Some projects that used a commercial off-the-shelf solution as a beginning point for their development had better budgetary and delivery outcomes than projects that did not use a similar approach.
l Subtle message was "don't do large IR projects".
1. Annual Report: OVERVIEW OF MAJOR INFORMATION RESOURCES PROJECTS REPORTED TO THE QUALITY ASSURANCE TEAM, December 2012 to November 2013
State of Practice in Commercial Orgs.2
l Local and TX commercial shops were not directly surveyed, so only anecdotal information is available
l If we assume that TX is similar to nationally reported trends, then:
t  Chaos 2013 results – for 50,000 projects completed since 2003:
¬  39% succeeded (delivered on time, on budget, with required features and functions);
¬  43% were challenged (late, over budget, and/or with less than the required features and functions); and
¬  18% failed (cancelled prior to completion, or delivered and never used).
¬  For large projects – 10% succeeded, 52% were challenged, 38% failed
¬  For small projects – 76% succeeded, 20% were challenged, 4% failed
   (Small = < $1M, Large = > $10M)
¬  Large projects have twice the chance of being late, over budget, and/or missing critical features compared to their smaller counterparts.
¬  A large project is more than 10 times more likely to fail outright.
[2] Chaos report for 2013, Standish Group
•  The CHAOS research database encompasses 10 years of data on why projects succeed or fail.
•  For each reporting period, about 60% of the projects are U.S.-based, 25% are European, and the remaining 15% represent the rest of the world.
•  A little more than half of the companies are considered Fortune 1000-type companies; another 30% would be considered midrange; and 20% are in the small-range category.
•  They span a diverse number of vertical industries and organizations. Participants are made up of a variety of CIOs, VPs, directors, and PMO project managers.
IT Success Factors Cited
l Factors of Success (relative points)
t  Executive management support – 20
t  User involvement – 15
t  Optimization of size/complexity – 15
t  Skilled resources – 13
t  Project management expertise – 12
t  Disciplined/Agile/Iterative process – 10
t  Clear business objectives – 6
t  Emotional maturity/professionalism – 5
t  Execution – to a plan – 3
t  Tools and infrastructure – 1
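Since the points above sum to 100, one simple way to use them is as a weighted checklist: rate how fully each factor is present on a project and sum the weighted ratings. The sketch below is only an illustrative use of the weights, not an official CHAOS scoring method, and the example ratings are made up.

# Weighted checklist built from the success-factor points above (they sum to 100).
# The per-project ratings in 'example' are made-up, illustrative values.
FACTOR_WEIGHTS = {
    "Executive management support": 20,
    "User involvement": 15,
    "Optimization of size/complexity": 15,
    "Skilled resources": 13,
    "Project management expertise": 12,
    "Disciplined/Agile/Iterative process": 10,
    "Clear business objectives": 6,
    "Emotional maturity/professionalism": 5,
    "Execution - to a plan": 3,
    "Tools and infrastructure": 1,
}

def project_score(ratings):
    """ratings: factor -> 0.0 (absent) .. 1.0 (fully present)."""
    return sum(FACTOR_WEIGHTS[f] * ratings.get(f, 0.0) for f in FACTOR_WEIGHTS)

example = {"Executive management support": 1.0, "User involvement": 0.5,
           "Skilled resources": 0.8, "Clear business objectives": 1.0}
print(f"Illustrative score: {project_score(example):.1f} / 100")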
Best Vs. the Worst IT Performers
l IBM sponsored benchmarking survey of 363 European software organizations
l Covered 15 countries (Germany and UK represent over half the sample) and over a dozen industries (banking, insurance, manufacturing, IT and distribution most heavily represented)

Performance Factor                                  Best (= top 10%)             Worst (= bottom 10%)
Productivity (fcn pts./mo.)                         25                           5
Delivered quality (% defects removed)               >95%                         <50%
Cost/Schedule Performance                           <= 10%                       >40% over
Post delivery maintenance costs (within 1st yr.)    <1% (of total dev. effort)   >10%

Goodhew, 1996, Achieving real improvements in performance from SPI initiatives, the European SEPG Conference, June 26, 1996
Best Practices-Performance Correlation
[Diagram (Goodhew, 1996): practices plotted by their correlation with Quality Performance and with Delivery Performance (schedule and cost). Practices shown: problem solving abilities, defect prevention, employee morale, TQM mgt. culture, estimation methods, HLLs usage, project mgt. discipline, quality vision, staff skills, testing effort, CASE tools, customer satisfaction focus, and a well defined, adaptable development process.]
Focal Points for Improvement
[Diagram: focal points for improvement – governance, people, process, and technology – all with a quality focus, aimed at quality products.]
The Importance of the IT Process
l A bad process leads to either a bad product or the need
for heroic effort to save it
l There is a wide variation in performance at all process
levels (e.g. 1:26, 1:10, 1:5)
l Most failed IT projects are found to either have no
process or the wrong one
l Behind every successful IT product or system there is an
organization, team and/or set of individuals that have
effective processes
t  The choice of an appropriate development method for the
situation is crucial but very difficult to do. Good methods exist
l There is no silver bullet process that you can acquire
l Through the process we can make excellence in IT
development an organizational imperative
l Message #2 – Focus on a quality process
A Potential Role of Process
Process—the set of activities, methods, practices, and transformations that integrate managers and engineers in using technology to define, develop and maintain IT systems.
[Diagram: process at the hub, linking management, staff, and technology.]
The IT System Lifecycle Process
Decide on Investment -> Define Concept & Reqts -> Develop System -> Deliver System -> Support, Evolution & Maintenance -> Decline & Retirement
(a lifecycle that often spans 25+ years)
A well defined, quality development process is crucial
Impact of Disciplined Process Improvement*

# of orgs.   Improvement              Change(1) reported
reporting    Category                 Median    Highest    Lowest
29           Cost                     34%       87%        3%
22           Schedule                 50%       95%        2%
20           Productivity             61%       329%       11%
34           Quality                  48%       132%       2%
7            Customer Satisfaction    14%       55%        -4%
22           Return on investment     4:1       28:1       1.7:1

* Survey of 35 CMM-based process improvement leaders with change-over-time results - from CMU/SEI-2006-TR-004
(1) Improvements are reported relative to a previous performance baseline
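As a quick illustration of what the median figures can mean in dollars (the baseline spend and improvement program cost below are assumptions; only the 34% median cost improvement and the 4:1 median ROI come from the table):

# Illustrative arithmetic only - baseline spend and program cost are assumptions;
# the 34% median cost improvement comes from the CMU/SEI-2006-TR-004 table above.
baseline_annual_dev_spend = 10_000_000   # assumed
improvement_program_cost  = 750_000      # assumed
median_cost_improvement   = 0.34         # from the table (median)

annual_savings = baseline_annual_dev_spend * median_cost_improvement
roi = annual_savings / improvement_program_cost
print(f"Annual savings: ${annual_savings:,.0f}; ROI ~ {roi:.1f}:1")
# -> $3,400,000; ROI ~ 4.5:1 (in the neighborhood of the 4:1 median)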
Process and Quality: Potent Dual Foci
l Quality process yields quality product – all else being
equal
l Early focus on a quality product implies certain best
practices as part of the process (e.g. DPP)
l Don’t assume other management and infrastructure
areas are under control (e.g. PM, HR, etc.)
l Effective IV&V framework synthesizes all these issues
together under ongoing “grey beard” scrutiny for high
integrity level & mission critical systems
t  Validation – are you building the right system? (quality)
t  Verification - are you building it the right way? (process)
t  Independent – from the development organization:
managerially, financially and technically.
¬  Provides an unbiased, expert view of what to expect.
The ROI of Development Quality
[Chart (adapted from Capers Jones, 2008): COST vs. development cycle/time (requirements, design, coding, testing, delivery) for a non-quality-focused project vs. a quality-focused one, with an "e.g. Technical Debt" annotation on the cost gap.]
l Poor quality is cheaper until the end of the coding phase. After that, high quality is cheaper.
l Poor quality costs 1.5-2X more than good quality in development.
l Quality focus is related to: QA effectiveness, process maturity level, # of error-prone modules, bad fix rates, design/code structure, testability, etc.
Two ROI Scenarios Over 5 Years

                           Poor Quality                    Good Quality
                           (poor structure/defective)      (good structure/low defects)
Initial development cost   1,200,000                       800,000
Year 1                     192,000                         120,000
Year 2                     204,000                         112,000
Year 3                     216,000                         104,000
Year 4                     240,000                         96,000
Year 5                     264,000                         80,000
Total Maintenance          1,116,000                       512,000
TCoO                       2,316,000                       1,300,000

100 KSLOC system assumed - table entries in $$
Adapted from Capers Jones, 2009
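The totals in the table are easy to recompute; a small sketch of the comparison, using only the figures above (note that the slide rounds the good-quality TCoO to 1,300,000):

# Five-year total cost of ownership from the table above (100 KSLOC system, $).
poor = {"initial": 1_200_000,
        "maintenance": [192_000, 204_000, 216_000, 240_000, 264_000]}
good = {"initial": 800_000,
        "maintenance": [120_000, 112_000, 104_000, 96_000, 80_000]}

def tcoo(scenario):
    return scenario["initial"] + sum(scenario["maintenance"])

print(f"Poor quality TCoO: ${tcoo(poor):,}")    # $2,316,000
print(f"Good quality TCoO: ${tcoo(good):,}")    # $1,312,000 (rounded to $1,300,000 on the slide)
print(f"Ratio: {tcoo(poor) / tcoo(good):.2f}x") # ~1.77x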
Lifecycle ROI of Good Quality
[Chart: Total Cost of Ownership vs. system lifetime in years (0 to 25) for two scenarios (1 and 2), illustrating the lifecycle ROI of quality improvement; one scenario reaches a premature demise point.]
CoSQ -- State of Practice Summary

CoSQ as % of total COO*
SW size     Quality level:  LOW    MEDIUM    HIGH
Small                       20     15        10
Average                     26     20        15
Large                       42     33        24

CoSQ => defect prevention, discovery, and removal before and after release
* – includes development cost + 5 years of operation/maintenance costs
C. Jones, 2010
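Combining the percentages above with a total cost of ownership gives CoSQ in dollars. A minimal sketch; the $10M TCO is an assumed figure for illustration:

# CoSQ as a % of total cost of ownership, from the table above (C. Jones, 2010).
COSQ_PCT = {   # sw size -> quality level -> % of TCO
    "small":   {"low": 20, "medium": 15, "high": 10},
    "average": {"low": 26, "medium": 20, "high": 15},
    "large":   {"low": 42, "medium": 33, "high": 24},
}

def cosq_dollars(tco, size, quality):
    return tco * COSQ_PCT[size][quality] / 100.0

tco = 10_000_000   # assumed TCO (development + 5 years of operation/maintenance)
for q in ("low", "medium", "high"):
    print(f"Large system, {q:6s} quality: CoSQ ~ ${cosq_dollars(tco, 'large', q):,.0f}")
# low -> $4,200,000, medium -> $3,300,000, high -> $2,400,000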
X___x Office Products ROI - Organizational Cost of Quality Improvement
[Chart: Cost of Quality projection, 2007-2012, showing % of cost split among Feature Development, CoSQ: Prevention, CoSQ: Inspection, CoSQ: Failure - Internal, and CoSQ: Failure - External.]
Quality Improvement Strategy
l Employing best practices to shift the peak of the Rayleigh curve to the left while lowering its peak value (see the sketch below)
• People
• Process
• Tools
Test early/often
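The shift can be visualized with a Rayleigh defect-arrival curve, one common parameterization being f(t) = K * (t / t_m^2) * exp(-t^2 / (2 * t_m^2)), where K is the total defect volume and t_m the time of the peak. The sketch below uses assumed parameter values to show how prevention (smaller K) plus earlier testing (smaller t_m) moves the peak left and lowers it.

# Illustrative Rayleigh defect-arrival curves; the K and t_m values are assumptions.
import numpy as np
import matplotlib.pyplot as plt

def rayleigh(t, total_defects, t_peak):
    """Defect arrival rate with 'total_defects' overall and a peak at t_peak."""
    return total_defects * (t / t_peak**2) * np.exp(-t**2 / (2 * t_peak**2))

t = np.linspace(0, 24, 200)                               # project months
baseline = rayleigh(t, total_defects=5000, t_peak=12)     # late discovery, little prevention
improved = rayleigh(t, total_defects=2500, t_peak=8)      # prevention + test early/often

plt.plot(t, baseline, label="baseline")
plt.plot(t, improved, label="improved (peak shifted left and lowered)")
plt.xlabel("time (months)")
plt.ylabel("defects found per month")
plt.legend()
plt.show()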
Some Best Practices to Consider
l 85% certified reuse
l Inspections, static analysis and early testing
l Quality measurement (e.g. DRE)
l Proactive QA
l Requirements modeling and JAD
l Enhanced IV&V
IV&V Continuous Improvement Cycle
[Diagram: a repeating cycle involving the IV&V team leads, the program/project leads, and senior management – preparation & planning, onsite review, results report, review & revise, absorption of review results, action plan implementation, and checkpoint review.]
IV&V Areas of Interest – Ex.
Relevance and Strength Indicators
Synthesis of Best Practice Standards
l IEEE Standard 1012-2012 – System and
Software V&V
l PMBOK – 4th Edition
l CMMI-Development, V1.3
l People CMM, V2.0, 2nd Edition
l IEEE Computer Society Software Engineering
Standards
l IT Quality metrics
l Others as needed
Staff Personnel/Team Capabilities Development
From P-CMM Level 2
Explore and evaluate instillation of basic discipline into workforce activities
HR-1  Evaluate how well the program has established and maintained physical working conditions and provided the resources that allow individuals and workgroups to perform their tasks efficiently without unnecessary distractions.
HR-2  Evaluate how well the program has established timely communication throughout the organization and ensured that the workforce has the skills to share information and coordinate activities efficiently.
HR-3  Evaluate how well the program has established a formal process by which committed work is matched to unit resources and qualified individuals are recruited, selected, and transitioned into assignments.
HR-4  Evaluate how well the program has established objectives related to committed work against which unit and individual performance can be measured, discussed performance against these objectives, and continuously enhanced performance.
HR-5  Evaluate how well the program has ensured that all individuals have the knowledge and skills required to perform their assignments and were provided with relevant development opportunities.
HR-6  Evaluate how well the program has provided all individuals with recognition and rewards based on their contribution and value to the organization.
Higher Level Process Maturity
Explore and evaluate the organization's higher level process maturity features
MAT-1  Evaluate if there is an effective measurement capability, and if it is being effectively used to support management information needs.
MAT-2  Evaluate how well the program analyzes possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
MAT-3  Evaluate how well the organization has established and maintained a quantitative understanding of the performance of selected processes in the organization's set of standard processes in support of achieving quality and process performance objectives, and to provide process performance data, baselines, and models to quantitatively manage the organization's projects.
MAT-4  Evaluate how well the organization has sought to quantitatively manage the project to achieve the project's established quality and process performance objectives.
From CMMI-Dev, Level 4
Basic Questions:
l  Is the program within established schedule and scope performance parameters?
l  Is the program on track with respect to its goals and objectives? What are the major risks to achieving success?
l  Do program deliverables meet the stated acceptance criteria and adhere to functional requirements?
l  Do the program project deliverables provide a roadmap for success?
l  What are the current processes, procedures, practices and technology?
l  What is good about the current processes, procedures, practices and technology?
l  What aspects of the current processes, procedures, practices, and technology need improvement?
l  Are development staff (and contractors) following industry best practice or comparable practice for development?
l  Are the program documentation and artifacts accurate and up-to-date?
l  What standards is the project following internally? And are they appropriate for the situation?
l  Are best practices and metrics being employed to identify issues, progress, performance, risk, etc.?
l  Is there recognizable progress since the last IV&V Review?
IT Domain: Empirical Entity Model
[Diagram: empirical entity model – V&V activities applied to products/versions, repeated for each next phase/cycle.]
The Critical Role of Measurement
l Manage size and complexity – how big is it?
t  # of staff assigned and multitasking level
t  # of business process flows
t  # of specified requirements & volatility
t  # of tasks in the WBS
t  # of design components - better suited to COTS-based and SOA systems; not equal in value/effort
t  # of COTS products and modules
t  # of pages of process documentation
t  # of function points
t  # of SLOC - old fashioned way; language levels and mixture are problematic; makes little sense with COTS
l Message #3 – focus on measurement (especially of process and product Quality)
IV&V Lessons Learned
l Early focal points
t  PPM discipline
t  QM focus
t  Requirements quality and volatility
t  Development process suitability & quality
t  Skill set gaps
t  Stable organization and realistic work plan
t  System architecture definition and visualization
l Later focal points
t  Design quality
t  Code quality
t  Testing: integration, system, UAT
t  Performance, security, and “illities”
t  Transition from old system to new system
t  Support and maintenance process
Across all categories:
measurement against
goals and questions
Strategic Metrics to Consider
l Total Cost of Ownership (TCO) - a financial estimate intended to
help owners determine the direct and indirect costs of a system
over its intended life cycle
t  when incorporated in any financial cost/benefit analysis, TCO provides a
basis for determining the total economic value of an investment (e.g. ROI)
t  TCO tries to quantify the financial impact of developing, deploying and
maintaining/supporting an IT system over its entire life cycle
l Cost of Quality (CoQ) – the cost of NOT creating a quality product
or service. Rework due to defects is usually the largest component
t  represents the difference between the actual cost of a system or service and
what the reduced cost would be if there were no possibility of substandard
work, system failures or defects in their development
l Technical Debt - a recent metaphor referring to the eventual
consequences of poor system analysis, architecture, design and/or
development within a given codebase
t  The debt can be thought of as work that needs to be done before a particular
job can be considered complete. If the debt is not repaid or serviced, then it
keeps on accumulating interest, making it harder to implement changes later
t  Unaddressed technical debt increases software entropy (disorder)
l Defect Removal Effectiveness (or efficiency):
DRE = (Defects removed during a development phase/
Defects latent in the product at that phase) x 100%
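A small worked example of the DRE formula (the defect counts are assumed, illustrative numbers):

# Defect Removal Effectiveness per the formula above; the counts are assumed.
def dre(defects_removed_in_phase, defects_latent_at_phase):
    return 100.0 * defects_removed_in_phase / defects_latent_at_phase

# Example: 450 defects were present when system test began; 400 were removed there.
print(f"System-test DRE: {dre(400, 450):.1f}%")   # ~88.9%

# Overall (project-level) DRE is often computed the same way, with defects found
# in early field use standing in for what was left latent at delivery:
pre_release_removed = 950
found_in_first_year = 50
print(f"Overall DRE: {dre(pre_release_removed, pre_release_removed + found_in_first_year):.1f}%")  # 95.0%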
Conclusions
l For success on large projects, focus on Quality, process
quality, measurement, enhanced IV&V and best practices
l IV&V plus quality measurement is a best practice for very
large projects
l  Focus on measuring Quality and process relative to explicit
goals and questions
t  Size and complexity are key indicators
t  Forces the cost and schedule to come under control
l When you can’t break up a big project into a set of
smaller projects, then you must proactively manage the
complexity of the combination
l IT Improvement ROI can be uniquely determined for each
organization and is based on these assertions:
1.  Quality must be measurable (e.g. translated into $$)
2.  A cause-and-effect relationship must exist between
quality and financial results (e.g. costs, revenue, etc.)
l Quality excellence has an expected ROI > $15 for each
$1 spent
The Benefits of Improving Quality Through Process and Metrics
[Diagram: implementing certain quality practices drives changes in development cost (pre-release), rework, maintenance cost (post-release), quality, development productivity, time-to-market (schedule), total cost of ownership, and business success factors.]
Q&A