Value Stream Analysis of IPT's and the
Test/Certification Phase of Product Development
By
J. Philip Perschbacher
B. S. Engineering (Aerospace) University of Michigan, 1974
M. S. Engineering Rensselaer Polytechnic Institute, 1977
SUBMITTED TO THE SYSTEM DESIGN AND MANAGEMENT PROGRAM IN
PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE IN ENGINEERING AND MANAGEMENT
AT THE
MASSACHUSETTS INSTITUTE OF TECHNOLOGY
FEBRUARY 2000
© 2000 J. Philip Perschbacher. All rights reserved.
The author hereby grants to MIT permission to reproduce and to distribute publicly paper and electronic
copies of this thesis document in whole or in part.
Signature of Author:
System Design and Management Program
December 29, 1999
Certified by:
Joyce M. Warmkessel
Senior Lecturer, Aeronautics and Astronautics
Thesis Supervisor
Accepted by
Paul A. Lagace
Co-director, Systems Design and Management/Leadership for Manufacturing Programs
Professor of Aeronautics & Astronautics and Engineering Systems
Accepted by
Thomas A. Kochan
Co-director, Systems Design and Management/Leadership for Manufacturing Programs
George M. Bunker Professor of Management
Value Stream Analysis of IPT's and the Test/Certification Phase of
Product Development
By
J. Philip Perschbacher
Submitted to the System Design and Management Program
On January 4, 2000 in Partial Fulfillment of the Requirements for the Degree of
Master of Science in Engineering and Management
ABSTRACT
This case is drawn from the nonrecurring portion of the test/certification of a helicopter.
This structural test effort had mixed success in terms of cost and schedule. This thesis
evaluates that effort through the lens of value stream and IPT evaluation techniques.
Better ways to measure the performance of the test process are suggested. Changes to the
test process to improve its value to the customer are proposed. This analysis provides an
in-depth view of the nonrecurring side of the process including separate material and
information flows. It examines resultant schedule variations under the influence of
rework at several stages of the system process. A novel standard of measurement that is
largely independent of the project estimator and that measures the progress toward
perfection is proposed.
Analysis of the IPTs revealed a general agreement with the benchmarks of the Lean
Aerospace Initiative (LAI) as espoused by Browning. There were potential problems with
the early involvement of Test Engineering in the IPT. Pressure to start the planning
activities ahead of detail drawing completion led to longer and more costly test processes
as early planning needed to be revised to reflect later changes in design. An evaluation of
single test conduct, akin to single product flow, showed no detrimental effect for two
simultaneous tests.
Thesis Supervisor: Joyce M. Warmkessel
Title: Senior Lecturer, Aeronautics and Astronautics
Keywords
Value stream, lean, test, certification, helicopter, rotorcraft, fatigue, aerospace, muda,
product development, estimation, schedule.
EXECUTIVE SUMMARY
Value stream analysis has typically been applied to manufacturing processes, and sometimes to
product design. It seeks to lean out a process by eliminating muda or waste, and retaining only
those activities that provide value to the customer. Integrated product teams (IPTs) and
concurrent engineering have been hailed as the alpha and omega of product development by
bringing focus to all phases of the process throughout its life. This case analyzes the
nonrecurring test/certification of a helicopter. This effort had mixed success in terms of cost and
schedule. This thesis evaluates that effort through the lens of value stream and IPT evaluation
techniques. It suggests better ways to measure the performance of the test process. It proposes
changes to the process to improve its value to the customer.
The application of value stream analysis to test processes was found in only a single reference by
Weigel. That paper's application was limited. This analysis provides an in-depth view of the
nonrecurring side of the process including separate material and information flows. It examines
resultant schedule variations under the influence of rework at several stages of the system
process. It proposes a novel standard of measurement that is largely independent of the project
estimator and measures the progress toward perfection.
This analysis defined value as the timely and accurate verification, identification, or
quantification of proposed or unknown component performance parameters for an appropriate
price.
A value stream of the helicopter structural test process was created. It identified a wide difference
between the value added cycle time, 197 days, and the scheduled time, 409 days. Rework
variations associated with planning, instrumentation, installation and facility design showed that
the basic process has so much muda that these rework variations had minimal effect.
[Value stream map of the structural test process (see Figure 9), with separate data and material flows through steps such as establish requirements, establish test requirements, prepare plan, design facility, order specimen, instrument, calibrate, install specimen, conduct test, analyze data, and report results. C/T total: 197 days; S/T total: 409 days.]
Analysis of the IPTs revealed a general agreement with the benchmarks of the Lean Aerospace
Initiative (LAI) as espoused by Browning. There were potential problems with the early
involvement of Test Engineering in the IPT. Pressure to start the planning activities ahead of
detail drawing completion led to longer and more costly processes as early planning needed to be
revised to reflect later changes in design.
The basis for measuring success of a project is too often a function of the conservatism of the
estimator. If one is to truly strive toward a lean process, the goal must have a rational basis. The
equation below represents the best fit of the test data: the structural test cost (hours) is equal to
560 + 93 * (severity factor), where

Severity factor = difficulty * #setups * (0.5 * #static loads + 2 * #dynamic loads + #components)
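As a rough illustration of how such a relationship could be used, the sketch below evaluates the severity factor and the fitted cost for a hypothetical test; the input counts are invented for the example, and only the coefficients (560 and 93) come from the regression quoted above.

```python
def severity_factor(difficulty, setups, static_loads, dynamic_loads, components):
    """Severity factor as defined above."""
    return difficulty * setups * (0.5 * static_loads + 2 * dynamic_loads + components)

def estimated_hours(severity):
    """Best-fit structural test cost in hours: 560 + 93 * severity factor."""
    return 560 + 93 * severity

# Hypothetical fatigue test: one setup, two static loads, three dynamic loads,
# one component of interest, and a moderate difficulty factor.
sev = severity_factor(difficulty=1.5, setups=1, static_loads=2, dynamic_loads=3, components=1)
print(f"severity factor: {sev:.1f}  estimated cost: {estimated_hours(sev):.0f} hours")
```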
[Chart: Severity vs Actual Hours, with actual structural test hours plotted against the severity factor (with component factor).]
An evaluation of single test conduct, akin to single product flow, showed no detrimental effect for
two simultaneous tests.
Biography
The author graduated summa cum laude with a bachelor of engineering degree in
aerospace from the University of Michigan (Ann Arbor) in 1974. His senior design
team's project was a satellite system that could demonstrate the potential for transmitting
solar power to earth by microwaves. He received the Undergraduate Distinguished
Achievement Award from the Department of Aerospace Engineering.
He graduated with a master of engineering degree from Rensselaer Polytechnic Institute
(Hartford) in 1977. His master's project involved Kalman filters in control systems.
He has spent his entire career (over 25 years) with United Technologies Corporation. He
has worked on jet engine fuel controls and helicopters. His work includes both composite
and metal component testing, rotor and blade performance, impact dynamics, and
pneumodynamics. He has worked on both civilian and military models. His travels for
work have included consultations in East Asia and Europe.
He shared the 1999 Robert L. Pinckney Award of the AHS.
He is a member of the American Helicopter Society (AHS). He has presented papers at:
• AHS National Specialists' Meeting - Helicopter Testing Technology - March 1988, "ACAP Airframe Crashworthiness Demonstration"
• 43rd MFPG Meeting, Oct. 1988, "The Application of Acoustic Emission Techniques to Composite Airframe Structural Testing"
His published articles include:
• "Correlation of Acoustics Emission and Strain/Deflection Measurements During Composite Airframe Static Tests," AHS Journal, July 1989
• "Nondestructive Testing Handbook," ASNT, 1987, Contributing Author, Acoustic Emission
Acknowledgements
The author wishes to thank his employer for giving him the opportunity to continue his
education at this prestigious institution. He wishes to thank Lee Teft, William Groth,
Donald Gover, and Kenneth Rosen for their foresight in supporting this fine program for
design and management education.
He wishes to thank his thesis advisor Joyce Warmkessel for the guidance necessary to
turn this raw data into a readable form. He also wishes to thank his other M.I.T.
professors who taught the additional tools that allowed the completion of this work.
He seeks forgiveness from his wife Pam and his two daughters for the amount of his
attention that this degree program took from them.
Table of Contents

Value Stream Analysis of IPT's and the Test/Certification Phase of Product Development
ABSTRACT
Keywords
EXECUTIVE SUMMARY
Biography
Acknowledgements
Table of Tables
Table of Figures
Table of Equations
The Issue
Background
The Industry
Data Description
Data limitations
Integrated Product Teams (IPT's) and the Test/Certification Process
Test Certification History
Test Sequence
Value Stream Analysis
Test Purposes
Standard Test Value Stream
Value Streams with Rework
Test Phase Budget Analysis
Spending profile analysis
Value Based Tracking
Cost Performance
Multitasking and Performance
Schedule Performance
Quality Performance
Aligning Metrics with Value
Conclusions
Future Work
APPENDIX

Table of Tables

Table 1 Product Development Timetable
Table 2 Test Participation in IPT's
Table 3 Program Integrative Mechanism Scorecard
Table 4 Stakeholder Value Summary
Table 5 Value Stream Statistics
Table 6 Test Phase Budget Performance
Table 7 Summarized Test Data
Table 8 Difficulty Factor Summary
Table 9 Testing Alternatives
Table 10 Case Data
Table 11 Case Data
Table 12 Case Data
Table 13 Case Data

Table of Figures

Figure 1 Product Development Stages
Figure 2 Development Phase Nonrecurring Cost
Figure 3 Test Phase Nonrecurring Cost
Figure 4 Ground Test Phase Nonrecurring Cost
Figure 5 Structural Test Nonrecurring Cost
Figure 6 Company Matrix Organization
Figure 7 Program Organization Chart
Figure 8 Component Test #3 Gantt Chart
Figure 9 Structural Test Value Stream
Figure 10 Test Value Stream with Facility Rework
Figure 11 Test Value Stream with Installation Rework
Figure 12 Test Value Stream with Instrumentation Rework
Figure 13 Test Value Stream with Late Detail Design Rework
Figure 14 Resource Summary for a Large Component Test (#3) (Plan & Actuals)
Figure 15 Effect of Test Performance on Start Date
Figure 16 Simulated Flight Loads Test #9
Figure 17 Test Performance and Rating Factor
Figure 18 Test Performance and Rational Severity Factor
Figure 19 Test Performance and Regressed Severity Factor
Figure 20 Engineer D's Tests' Resource Plots
Figure 21 Engineer D's Multitasking

Table of Equations

Equation 1 Estimation Rating Factor
Equation 2 Estimation Severity Factor
Equation 3 Regressed Estimation Severity Factor
The Issue
Aerospace, and in particular, helicopter manufacturing has many qualities that would
place it in the lean, or leaning, category. It rarely builds on speculation, it employs
multifunctional integrated product teams (IPTs), it organizes and manages around products, it
assiduously services its products in the field, and its final product is produced on a single-piece
basis with no inventory between steps.
It also has qualities that are anathema to lean. It takes a long time to develop and market a
product (excluding derivatives of existing designs). It reworks parts rather than fixing the
process. It is made up of a large number of itinerant workers. Its supply chain is not well
integrated. It spends a large amount of time in the test/certification process. This paper
will examine the way in which value stream analysis can provide recommendations for
improvements to the test/certification phase of the product design and development
process, and to IPTs.
Background
The Machine That Changed the World 1 and Lean Thinking 2, from Womack, Jones, and
Roos, and Womack and Jones, respectively, set out a method to achieve high value, rapid response, and
low cost in the production of goods for consumers. Value is defined as a capability
provided to a customer at the right time, at an appropriate price, as defined by the
customer. Their method strives to achieve savings by focusing on five important
principles:
1. Specifying value
2. Identifying the value stream
3. Making the value steps flow
4. Letting the customer pull value (products)
5. Pursuing perfection
1 Womack, J. P., Jones, D. T. and Roos, D. (1990), The Machine That Changed The World, New York: Rawson Associates
2 Womack, J. P. and Jones, D. T. (1996), Lean Thinking, New York: Simon and Schuster
Embedded in this method is the concept of waste, muda in Japanese. It embodies a
"human activity which absorbs resources but creates no value." 3 A Toyota executive 4 has
enumerated seven different kinds of muda, and Womack and Jones 5 have added an
eighth:
1. Defects in products
2. Overproduction of goods
3. Excess inventories
4. Overprocessing of parts
5. Unnecessary movement of people
6. Unnecessary transport of goods
7. Waiting for an upstream activity
8. Goods and services that don't meet customer needs
Ohno also divides these muda into two varieties:
1. Those that create no value but are necessary in the current production process
2. Those that create no value and can be eliminated immediately.
These lean principles are most commonly applied to companies whose manufacturing
processes produce a tangible product (automobiles, copy machines, etc.), at high rates.
More recently (1993) lean thinking has been applied to aerospace manufacturing, which
is characterized by a lower production rate. The Lean Aerospace Initiative (LAI), a
partnership of government, labor, business, and academia, at the Massachusetts Institute
of Technology (MIT) is a principal advocate of this effort. However, these manufacturing
applications are only a fraction of the potential, for these principles can be more broadly
applied to any activity.
3 ibid
4 Ohno, Taiichi, (1988). The Toyota Production System: Beyond Large Scale Production. Oregon: Productivity Press.
5 Womack and Jones, ibid
Efforts at LAI are being made to apply the principles to information flow as they have
already been applied to material flow. Weigel and Warmkessel 6 have also extended the field by
applying lean principles to aerospace testing. Here the product, customer confidence, is
intangible. This required a new definition of value:
Verification, through accepted methods, that the (product) performs as expected, in
a timely manner and at the agreed cost.
This study focused on acceptance testing which is a recurring test performed on every
product as a requirement before delivery.
This paper will expand the scope of testing to include structural certification activities
that are performed once for a specific product design. It will also analyze the experience
of integrated product teams (IPTs) in the development of a helicopter.
Browning 7 analyzed the use of people-based design structure matrices (DSMs), design
for integration (DFI), and integrative mechanisms (IMs) in achieving a lean enterprise.
He relates the importance of managing organizational interfaces by stressing the
'integration of teams'. He describes several IMs including:
1. Interface optimization (through DSMs)
2. Shared information resources
3. Team building
4. Co-location
5. Town meetings
6. Manager mediation
7. Participant mediation specialists
8. Interface management groups
9. Interface contracts and scorecards
6 Weigel, A. L. and Warmkessel, J. (1999), "Formulation of the Spacecraft Testing Value Stream", ICSE
7 Browning, T. R. (1997), "Summary Report: Systematic IPT Integration in Lean Development Programs,"
LAI Report Series #RP97-01-19.
He also suggests several desirable information transfer interface characteristics:
1. Published definitions
2. Directed assignments
3. Permeability
4. Mutability
5. Unencumbered flow
6. Full documentation
7. Measurable performance
8. Scalability
The experiences of the IPTs on this helicopter program will be analyzed against these
characteristics.
The Industry
The helicopter manufacturing industry is over 60 years old. Its roots lie in the brave
inventors of the turn of the century. These were experimenters of the grandest sort. They
had to create, analyze, develop, build, test, and fly their products themselves in this
fledgling industry. No design framework or best principles existed. As Igor Sikorsky, the
founder of the industry, observed, the necessity of the designer to be the flight test pilot
had the consequence of rapidly weeding out poor designs.
The product design and development model, Figure 1 from Warmkessel,8 highlights
several stages of product development. This paper will focus on the Test phase.
8 Warmkessel, J. , (1998) Unpublished work from MIT Class 15.967, System and Project Management
Figure 1 Product Development Stages
[Diagram of the stages of the product development process.]
The industry remains little changed to this day. Design mistakes remain critical, and the
design/analysis function is still not far enough advanced to eliminate the need for significant
ground-based testing of the product before flight. Nor does it eliminate significant development
through flight testing before production, or significant verification of function by ground and
flight test before each delivery.
The helicopter design philosophy is platform based. With a long (4-8 year) development
process, it has become more commonplace to design to a particular weight class rather
than to a specific customer requirement. From this design point, a variety of attributes can
be tailored to provide for a modicum of customization. Among these attributes are
avionics, engine capacity, cabin interiors and seating, external lifting (cargo hooks,
rescue hoists), fuel capacity, and armaments. The system designs of airframe, controls,
and powertrain (exclusive of the engines) are fixed. A secondary advantage to this
philosophy is that it speeds certification of the derivatives since these systems constitute
the major expense in a certification program.
Data Description
The data for this paper come from a recent helicopter test/certification effort. The data are
disguised. The method of alteration permits accurate proportional comparisons of the
individual tests.
The data represent a portion of the total development effort for the helicopter. Figures 2-5
show the breakdown for the costs of this project. The Test costs (which include the
nonsaleable flight test aircraft) represent 20% of the program development cost. Material
costs include the flight test aircraft that can be reconfigured and sold after certification.
Figure 2 Development Phase Nonrecurring Cost
[Pie chart: Design 27%, Tooling 23%, Materials 21%, Test 20%, Management 9%]
This 20% is further split among three test groups: flight, electrical, and ground as shown
in Figure 3. The flight test effort includes all system tests involving the complete
helicopter, including a full vehicle restrained to the ground, but particularly focuses on
those activities that are concerned with flight. The electrical test activities are split off
from the other two because the Design Branch supervises this activity. It encompasses the
electrical and avionics development including the software demonstrations.
Figure 3 Test Phase Nonrecurring Cost
[Pie chart: Flight 42%, Ground 40%, Electrical 18%]
The ground test activities are divided up as shown in Figure 4. These include several
types: component qualification, systems, gearboxes, customer support, research, and
component structural. Qualification tests involve functional subsystems (usually vendor
produced) servos, fuel systems, vibration control, and flight instruments. They include
proof of function, environmental (including vibration), and endurance. System tests
include the aircraft preflight acceptance tests performed on the flight test vehicles
(controls, electrical, fuel, landing gear), as well as other specialized, one time, major
system tests (controls static, rotor system performance, airframe modal, lightning strike,
EMI). Gearbox or drivetrain tests involve the lifting and directional transmissions and
drive shafts, and include lubrication, vibration, endurance, and fatigue. Customer support
includes periodic reports and presentations to the customer, as well as the deliverable
report. Research encompasses all early test activities associated with non-production
component designs. Component structural tests are described below.
Figure 4 Ground Test Phase Nonrecurring Cost
[Pie chart: Component Structural 54%, Systems 16%, Component Qualification 15%, Gearboxes 8%, Research 5%, Customer Support 2%]
Component structural tests include static and fatigue tests as shown in Figure 5. Static
tests determine single application limit, non-yielding, strength levels and/or ultimate,
failing, strength levels. These levels help define the operating envelope for the helicopter.
Single specimens are tested, but increasingly analysis can replace these tests as the stress
analysis tools mature. Fatigue tests determine component performance under repetitive
load cycles, and determine the safe operating period of a component on the helicopter.
Several specimens of a component design are tested to evaluate the statistics of scatter in
strength.
Figure 5 Structural Test Nonrecurring Cost
[Pie chart: Fatigue 95%, Static 5%]
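Chaining the breakdowns of Figures 2 through 5 puts the fatigue test effort in perspective against the whole development program; a minimal sketch of that arithmetic follows, using the percentages read from the figures.

```python
# Share of total development nonrecurring cost represented by fatigue testing,
# obtained by chaining the breakdowns of Figures 2 through 5.
test_share       = 0.20  # Test share of development cost (Figure 2)
ground_share     = 0.40  # Ground test share of test cost (Figure 3)
structural_share = 0.54  # Component structural share of ground test cost (Figure 4)
fatigue_share    = 0.95  # Fatigue share of structural test cost (Figure 5)

fatigue_of_development = test_share * ground_share * structural_share * fatigue_share
print(f"Fatigue testing is roughly {fatigue_of_development:.1%} of development cost")  # ~4.1%
```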
Data limitations
The data used in this paper represent only the fatigue test costs. Moreover, they represent
only the preflight tests. This aggregate does give the breadth of the program, but does not
evaluate the full depth, since it involves only one or two specimens of a multi-specimen
program. These subsequent specimen test efforts benefit from the prior specimens' efforts
(learning curve) and thus are conducted more efficiently. The evaluation of this learning
curve effect is beyond the scope of this paper. Also not included in the data are the formal
reports. These documents are prepared after completion of all test specimens.
Integrated Product Teams (IPT's) and the Test/Certification
Process
The data for this analysis represent the preflight testing of a helicopter. The vision for this
aircraft was a medium sized vehicle that had an affordable price, with one of the lowest
maintenance costs per flight hour of the industry. The aircraft needed to be adaptable for
both civilian and foreign military needs. Its customers could use it for cargo and
passenger transportation. Since this helicopter was designed for civilian use its
certification required FAA approval.
This analysis will focus on the test demonstration phase of the Product Design
Development (PDD) process. However, in order to more fully understand the
development process, a short description of the development team follows.
The company has a matrix organization structure. This is shown schematically in Figure
6. This style of organization has existed here for over 25 years. This is a weak program
manager structure since the manager has financial control of only the program, and no
ability to hire or promote the employees. There are parallel structures for Programs
(centered on aircraft models) and Core Competencies (Purchasing, Product Integrity,
Manufacturing, Engineering). Compensation is controlled by the Core Competency
giving it the upper hand. Each (salary) employee has two bosses, one (Program) for job
assignments and the other (Core Competency) for mentoring, evaluations, standard
practices, and (sometimes) assignments for small programs.
The Engineering Test Branch's relationship to the program is shown in Figure 6. A recent
reorganization has increased the emphasis on the Program side. This has been partially
successful. Parallel career paths exist in both areas; recently the program career path has
been the stronger. The laboratory technicians report to a different branch
manager. This requires regular coordination with the test group on specific task priorities.
Figure 6 Company Matrix Organization
[Organization chart showing the Program structure (Program Vice President, Program Manager, and the Program A ground test engineering assignments) in parallel with the Core Competency structure (Vice President R&D, Test Branch Manager, Design Branch Manager, and group leaders for Ground Test, Instrumentation, Facility Design, Laboratories, and Administration, with their engineers and technicians). Solid lines indicate reporting responsibilities for performance evaluation and compensation; dotted lines indicate day-to-day assignment where different from the reporting structure.]
A timetable for the product development process is shown in Table 1 below:
Table 1 Product Development Timetable

| Year | Organization | Major activities | Milestones | Test Activity |
|------|--------------|------------------|------------|---------------|
| 1 (92) | Small, multifaceted group | Early design studies, program cost and schedule estimation, drawing and document control | Market announcement | Risk reduction testing, preliminary standards estimates |
| 2 (93) | Expanded team structure | Formal design trades, initial vendor contact, performance evaluated, loads modeling simulations started | FAA discussions held on certification | |
| 3 (94) | Formal IPT's created | Make/buy decisions, forgings ordered | | |
| 4 (95) | Management expanded | Vendors under contract, detailed design started | Aircraft specification released | Test plans started |
| 5 (96) | Program peaks at 1000 people | First machining occurs | Benchmarking complete | Detailed estimates |
| 6 (97) | | First completed components, fuselage joining | First component issues | Tests begun |
| 7 (98) | IPT's lose focus and members | Engineering begins to scale back, major rework effort required | First Flight | Majority of testing, Test Engineers peak at 10 |
The program organizational structure is shown in Figure 7. The IPT's were typically
composed of representatives from Engineering (Design, Test, and Analysis),
Manufacturing (Engineering, Production), Purchasing, Customer Support, and
Management. Most of the team members were assigned full time to this program. Not all
of the IPT's had Test members.
Figure 7 Program Organization Chart
[Organization chart: the Program Vice President over Business Management, Customer Support, Business Development, Customer Interface, partner liaisons (Partners 1 through 5), Procurement, Technical Interface, Pilots, the Change Board, Engineering functions (Design, Engineering Test, Flight Test, Avionics Test, Instrumentation, Facility, Manufacturing Engineering, Final Assembly), and the Integrated Product Teams (Cockpit, Cabin, Cabin Interior, Control System, Automatic Flight System, Mission Equipment Package, System Integration, Transition, Cowling, Sponsons, Fuel System, Electrical Systems, Structural, Dynamic Systems, Empennage, Active Noise/Vibration, Propulsion, Drive, Rotors, Hydraulic Controls).]
The IPT's, listed in Table 2, represent both major design elements as well as Engineering
disciplines.
Table 2 Test Participation in IPT's

| IPT | Test Participation |
|-----|--------------------|
| Cockpit | Oversight of the birdstrike test and acceptance test support |
| Cabin interior | |
| Airframe transition section | Oversight of the partner ramp testing |
| Cowlings | |
| Propulsion | Engine qualification testing oversight |
| Cabin | Acceptance testing |
| Hydraulic controls | Vendor qualification of all the purchased actuators |
| Aircraft systems / Sponsons and fuel systems | Partner qualifications, including the demonstration of crashworthiness |
| Empennage / Horizontal Stabilizer | Partner static test oversight; partner static and fatigue test oversight |
| Rotor systems | Primary source of in-house test requirements. This was composed of separate blade and rotor component teams. The rotor component IPT disbanded after detailed design when the Purchasing and shop representatives failed to attend |
| Transmission and drive systems | Gearbox testing was greatly aided by close integration with this IPT |
| Automatic Flight Control Systems | Electrical test responsibility |
| Mission Equipment Package | Electrical test responsibility |
| Electrical | Electrical test responsibility |
| Final Assembly | Acceptance testing presence and its integration into the aircraft build schedule and factory operations sheets |
| System Integration / System Engineering / Oversight | Specification authoring; this group was composed of the leaders of the other teams, including one test representative |
Test Certification History
This product development activity was marked by significant changes in the way the test
branch participated. Normally the test engineers would be some of the last to 'join' the
Program. However, for this helicopter model IPT's were a primary process tool. This
required an early Test presence. Initially this team presence could be met with one or two
test engineers spreading their time among the teams. Their primary activities were overall
test planning and estimation. Estimates were prepared in the first, and third years of the
Program. Additional engineers were required (part time) for the risk reduction testing that
occurred during this early preliminary design phase. Significant spending was also
consumed in early discussions with the FAA on the scope of fatigue testing. The FAA's
regulations had changed since the company's last helicopter certification, so the basis for
estimation was less certain.
During the detailed design phase in years 4-6 it was necessary to increase the Test team
to more adequately monitor and respond to the team issues. Significant test planning and
long lead-time facility designs were prime activities. Test participation at this stage was
not predicated on the completion of component detail drawings (as was normally done),
but was in response to the Program's request for staffing to meet the imminent release of
these drawings. Test plan writing began in year 5 in anticipation of lengthy FAA
approval sequences.
The first test occurred during year 6. This testing (#s 6,7,9) was of components involved
in earlier risk reduction efforts. Those efforts helped spur the early completion of the
drawings and tooling. The increase in headcount necessary to staff these tests led to a
premature scale-up of the test effort.
During year 7 the test conduct activities peaked, requiring 10 (equivalent) engineers. First
flight occurred at the end of this year.
Drawing completions were typically 3 months behind schedule. Specimen deliveries
were 6-8 months behind schedule, and first flight was 5 months behind schedule.
Test Sequence
A typical test sequence begins with the estimation of the schedule and cost in response to
a program request.
Estimates are created from a variety of sources. They may reflect recent actual
expenditures from similar programs, or scaled sums from these efforts. They may be
newly created to reflect novel requirements. They may be modified sums that encompass
past actuals and anticipated savings from new approaches.
Upon the completion of a component's detailed drawing the test engineer begins the
quantitative planning. Loads and facility needs are researched. At this point the loads are
either scaled from similar historical sources, or are the product of simulated flights.
Aircraft components for the particular test are ordered. When sufficient conceptual
information is developed, a request for a test facility is initiated to provide the means to
support the specimen and apply the loads. The calendar intensity of the planning and
facility design (and fabrication) is scaled to provide a completed test facility and an
approved test plan coincident with the delivery of the test specimen.
The test plan includes sufficient detail to reflect the purpose of the test, the criterion for
success, and the applicable statutory requirements. A sketch showing the loads and a
detailed instrumentation diagram is included. The instrumentation requirement is
composed of Design requests for analytical correlation and for comparison with flight test
data. The internal approval requires at least two reviewers. The external, FAA review
normally takes from 2 weeks to 3 months.
The test facility design is frequently derived from previous similar component tests. It
takes advantage of multipurpose frames. Servo controlled hydraulic cylinders are used to
apply the loads. Sufficient adjoining components are included at all interfaces to ensure
the accurate introduction and reaction of loads into the component of interest. Where
possible, existing loading apparatus is reused. Standard control components are combined
to meet the load application requirements. Design work is done in-house, but fabrication
may occur either in- or out-of-house.
The test specimen is fabricated from the same drawings, materials, and processes as the
production components. However, if defects are present in noncritical areas, they may be
accepted without rework for the test program.
After the test specimen is received, it goes to the instrumentation lab where the strain
measuring gages are installed. If load or moment calibration is required of the gages, it is
done before delivering the specimen to the test laboratory. Typically the FAA requires
inspection of the installation and calibration of the strain gages.
In the test laboratory, the specimen is installed in the test facility and the measurement
devices are attached to a console for display and recording. The FAA requires an
inspection at this point also. After initial surveys, the load condition is established and the
component is repetitively loaded to fracture. Design, program management, and
government representatives are offered an opportunity to witness the testing.
Upon fracture or significant strength demonstration, the testing is concluded. Multiple
specimens are required to determine the scatter in the strength. The number of full scale
test specimens varies from 4 for civil designs, to 6 or more for military designs. The
component strength is compared with the expected usage and the flight load environment.
An expected safe operational period is calculated. The test activities are documented in a
report that requires the same signoffs as the plan.
A traditional Gantt chart for test #3 is shown in Figure 8. The amount of slack time is
designed to ensure a high likelihood of meeting schedule. Managers are rated based on
meeting cost and schedule estimates, no matter how conservative those estimates are.
Figure 8 Component Test #3 Gantt Chart
[Gantt chart for Test 3, total duration approximately 641 days: complete detail design; plan test (20 days); design and fabricate the facility (approximately 273 days, covering mechanical design, controls design, and procure/fabricate/assemble); receive test specimen (109 days); then, for each of four specimens, instrument (25 days) and conduct the test (approximately 65 days); and report on the test (45 days).]
An analysis in Table 3 (following Browning) of the Program organization and the IPTs
indicates compliance with many of the characteristics. The interface optimization was
informal. It was discussed, but no Design Structure Matrices (DSMs) were drawn. Team
building exercises included a picnic, program vision statement signing, commemorative
items (hats, pins, and mugs), a group viewing of the televised first flight, a post-flight cocktail
party, and an annual party. Team cohesion has lessened since many of the founding
personnel have been transferred off the program. Town meetings with all participants
were held when the program was small; they evolved into program briefings by the
Program Manager when the group reached its maximum size. Now, even with a smaller
population, they are no longer held. No formal contracts were written on interfaces, and
this has led to some organizational jockeying for power and budget.
Table 3 Program Integrative Mechanism Scorecard

| Integrative Mechanisms | Program Employment |
|------------------------|--------------------|
| Interface optimization (through DSMs) | No, somewhat but not formally |
| Shared information resources | Yes |
| Team building | Yes, several exercises early, but now only once a year; initial participants dispersed |
| Co-location | Yes |
| Town meetings | Only in the beginning |
| Manager mediation | Yes, more like intervention |
| Participant mediation specialists | Not formally |
| Interface management groups | Yes |
| Interface contracts and scorecards | No |
Value Stream Analysis
An evaluation of the test process should begin with a value stream analysis. The needs of
the stakeholders will form the values. These values will be traced through the process to
determine the degree to which the process contributes to value creation. Test types will
be evaluated to consider if value varies with type. The performance of a test will be
evaluated against the estimate to determine the degree to which effort was wasted, and
possible causes. Lastly, the metrics used to evaluate performance will be examined in
light of lean processes.
A value stream analysis must begin with a definition of value. Value here is defined as
the timely and accurate verification, identification, or quantification of proposed or
unknown component performance parameters for an appropriate price.
Test Purposes
Test purposes (and the returned information) fall into several categories: error
identification, operational, and quantification of performance.
Error identification is often called trouble shooting. Problems in the operation of a system
are investigated by test. Value is indirectly derived from the identification of the cause of
the errant behavior, which will lead to the correction of the problem. This will reduce
cost and lead to value increase.
Operational testing includes endurance and acceptance tests. Endurance tests are nonrecurring,
while acceptance tests are performed on every delivered system. The normal operation or
range of a system is either simulated or replicated to verify the success of the design. The
result is typically pass or fail. Value comes from the confirmation that the system meets
the specified requirements. This value statement is weak, for if the design is successful,
the test is type one muda. As analytical tools become more reliable this test type will
provide less and less value.
Quantifying performance is the last type of test. This includes static and fatigue testing,
and envelope expansion as design/analytical predictions are correlated. The result is a
numeric value. There is still a threshold requirement as in the operational testing, but the
test now has sufficient focus that a scaled comparison with analysis can be returned.
Value here is proportional to the exceedance of the requirements.
This paper will focus on the quantification type of testing. In the present case, as in many
others, there are a variety of customers/stakeholders. Value varies with each group. Table
4 provides a summary of the most recognizable stakeholders and their requirements. An
obvious conclusion is that value, while being manifest in the physicality of the delivered
helicopter, takes varying forms when narrowly linked to the output of the
Test/Certification process. The most common physical form is that of the test report.
The stakeholders are divided into external and internal groups. Internal refers to those
delivering the value, employed by the manufacturer. With this interpretation,
subcontractors and partners are considered internal since they function as employees in
delivering value. External stakeholders are those that receive the value in return for
payment, or bystanders who are innocent participants within the influence of the
delivered product.
Table 4 Stakeholder Value Summary

Civilian program:

| Source | Group | Units | Needs | Related Test Output Value |
|--------|-------|-------|-------|---------------------------|
| External | Civilian Owners | Helicopter owners | Low purchase price, low cost maintainability, long life, versatility, availability, safety | Component lives, test cost, timeliness |
| External | Users | Pilots | Controllability, visibility, crashworthiness, low vibration, low noise, comfort (seats, ventilation, heat), safety | Vibration response, component lives, stability, crash response |
| External | Users | Maintenance | Low effort maintainability, long life components, low hazard workplace, clear manuals | Component lives, ease of repair |
| External | Users | Passengers | Low cost transport, point to point speed, low vibration, low noise, comfort, safety | Vibration response, trust/reputation |
| External | Users | Contractors | Low cost of operation, availability, versatility (center of gravity range, gross weight), large fuel capacity | Test cost, efficiency |
| External | Public | Neighbors | Low noise, low hazard, safety | Strength margins, thoroughness |
| External | Public | Officials | Above, plus long term economic benefit | Strength margins, thoroughness |
| External | Public | Society | Usefulness, minimal hazards | |
| External | Certifying Agency | | Safety, regulation obedience, lessons learned | Test plan and execution, accurate analysis, test report |
| Internal | Management | Shareholders | Investment risk, ROI | Cost, lawsuit shield |
| Internal | Management | Program | Hurdle completion, risk reduction, minimal cost, minimal schedule | Test cost, duration, accuracy |
| Internal | Management | Functional | Good practice, safety, lessons learned | Strength margin |
| Internal | Engineering | Analysis | Tool correlation, failure modes and effects, finite element | Strength, stiffness, failure mode |
| Internal | Engineering | Design | Design verification: strength, fit, function, life | Failure mode |
| Internal | Engineering | Ground Test | Reputation | Cost, accuracy, time |
| Internal | Engineering | Flight Test | Safety, flight envelope limits, critical measurement parameters | Fatigue strength, linearity of measurements |
| Internal | Engineering | Test Lab | Job satisfaction, long term employment | Clear directions and expectations |
| Internal | Engineering | Instrumentation | Job satisfaction, job efficiency, calibration techniques | Measurement location |
| Internal | Manufacturing | | Flaw tolerance, assembly techniques | Failure modes |
| Internal | Product Integrity | | Inspection techniques, critical areas | Failure modes; early photos, component lives |
| Internal | Customer Support | Overhaul & repair group | Repair strength, techniques | Repair demonstrations |
| Internal | Customer Support | Manuals | | Assembly photos, component lives |

Military program:

| Source | Group | Units | Needs | Related Test Output Value |
|--------|-------|-------|-------|---------------------------|
| External | Procuring Agency | Army | Cost, survivability, detectability, upgradeability, technical superiority | Test cost, -ility accuracy |
| External | Procuring Agency | Defense Department | Low purchase price, low cost maintainability, long life, safety | Strength margin, test cost |
| External | Users | Commanders | Availability, versatility, speed, gross weight, time on station | Endurance, efficiency |
| External | Users | Pilots | Controllability, visibility, crashworthiness, low vibration, low noise, comfort (seats, a/c, heat), safety, survivability | Controls response, stability, audio response, crash response, landing gear performance |
| External | Users | Maintenance | Low effort maintainability, long life components, quick replacement with common tools, clear manuals | Component life, ease of repair/assembly |
| External | Users | Troops | Point to point speed, low vibration, low noise, comfort, safety, survivability | Vibration response, crash response, ballistic tolerance |
| External | Public | Taxpayers | Cost, life, state of origin | Test cost |
| External | Public | Officials | Above, plus long term economic benefit | |
| External | Public | Society | Usefulness, minimal hazards | Trust, reputation |
Standard Test Value Stream
Value, as shown in the table, is delivered only by information creation, while the Test
process makes use of both material flow and information creation (transfer and
transformation). Here the only physical manifestation of value is in the conveyance of
information by report or presentation. These material flows and information creation are
shown schematically in Figure 9 for a typical test process. Cycle time (C/T), or task
time, represents the value-added time (in business days). This is the time required to
complete the task as fast as possible. Schedule time (S/T) represents the calendar time (in
business days). The total times are simply sums over each task, and represent not the critical
path but the total effort. Many of the steps can proceed in parallel, as shown in the
traditional Gantt chart of Figure 8.
Lightning bolts represent the information flow, while the material flows are shown with
traditional arrows. The steps in the information flow are identified by the letter i: 1i, 2i,
etc. There are two material flows:
Component - This flow involves the actual test specimen. The specimen flow steps
are identified by the letter s: 1s, 2s, etc.
Facility - This flow addresses the construction of the facility that will be used to
test the specimen. It joins the component flow with the installation of the
specimen. The facility flow steps are identified by the letter f: 1f, 2f, etc.
The choke point is known to be the calibration task. This step has the most inventory
stacked in front of it, as described in Critical Chain 9. Another common delay point is the
contention between similar, but competing, test specimens at the multi-use, capital test facilities.
This value stream represents only a portion of the product development process. There
are many groups and activities (Design and Manufacturing Engineering, Shop,
Inspection, Suppliers) that fall outside the scope of this paper. The intent here is to
address those actions that form the bulk of the Test Group's charter. These are the
activities that are estimated and conducted by personnel that report to a common
manager. Other activities, represented in gray in Figure 9, are in the value stream, but are
controlled by a different line manager. Negotiations between managers are required to
achieve mutually acceptable schedules.
Value Streams with Rework
Figures 10 to 13 show the more realistic value streams with rework cycles at the
planning, instrumentation, installation, and facility tasks. As shown in the schedule totals,
the most critical time for rework is during the planning and facility stages. As noted in
the Table 5 below, the effect of the rework is to a great extent buried by the conservative
scheduling of the test. The cycle time is less than 50% of the schedule time.
Table 5 Value Stream Statistics

| Figure | Value Stream | Cycle Time Summary (days) | Schedule Time Summary (days) |
|--------|--------------|---------------------------|------------------------------|
| 9 | Baseline, no rework | 197 | 409 |
| 10 | Facility rework | 224 | 435 |
| 11 | Installation rework | 199 | 413 |
| 12 | Instrumentation rework | 202 | 420 |
| 13 | Late Detail Design | 215 | 437 |

9 Goldratt, E., (1997), Critical Chain, North River Press
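As a quick check of the gap noted above, a minimal sketch follows that computes the value-added fraction (C/T divided by S/T) for each row of Table 5; the numbers are the table's own totals.

```python
# Value-added fraction (cycle time / schedule time) for each value stream of Table 5.
value_streams = {
    "Baseline, no rework":       (197, 409),
    "Facility rework":           (224, 435),
    "Installation rework":       (199, 413),
    "Instrumentation rework":    (202, 420),
    "Late detail design rework": (215, 437),
}

for name, (cycle_days, schedule_days) in value_streams.items():
    fraction = cycle_days / schedule_days
    print(f"{name:28s} C/T {cycle_days:3d} d  S/T {schedule_days:3d} d  value-added {fraction:.0%}")
```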
Figure 9 Structural Test Value Stream
[Value stream map with information steps (establish requirements, establish test requirements, design facility, prepare plan, order specimen, analyze data, report results), specimen steps (instrument, calibrate, install specimen, conduct test), and facility steps (shop facility build, assemble), each annotated with its C/T and S/T. C/T total: 197 days; S/T total: 409 days.]
Figure 10 Test Value Stream with Facility Rework
[Value stream map with a rework loop at the facility design and build steps. C/T total: 224 days; S/T total: 435 days.]
Figure 11 Test Value Stream with Installation Rework
[Value stream map with a rework loop at the specimen installation step. C/T total: 199 days; S/T total: 413 days.]
Figure 12 Test Value Stream with Instrumentation Rework
[Value stream map with a rework loop at the instrumentation and calibration steps. C/T total: 202 days; S/T total: 420 days.]
Figure 13 Test Value Stream with Late Detail Design Rework
[Value stream map with rework of the plan and facility design driven by late detail design changes. C/T total: 215 days; S/T total: 437 days.]
Test Phase Budget Analysis
An evaluation of the cost performance of the various test phases is summarized in Table
6. Each phase is evaluated relative to its estimated cost. The result is either
on budget (within 10%), or
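As an illustration, a minimal sketch of this ±10% classification follows; the function name, the treatment of a deviation of exactly 10%, and the worked example are assumptions for illustration, not taken from the case data.

```python
# Minimal sketch of the +/-10% budget classification described above.
# The function name and the handling of a deviation of exactly 10% are
# assumptions for illustration; they are not drawn from the case data.

def classify_budget(actual_hours: float, planned_hours: float) -> str:
    """Classify one test phase as 'on target', 'overrun', or 'under'."""
    deviation = (actual_hours - planned_hours) / planned_hours
    if abs(deviation) <= 0.10:      # within 10% of the estimate
        return "on target"
    return "overrun" if deviation > 0 else "under"

# Example: a planning phase estimated at 105 hours that consumed 123 hours
# (about +17%) falls outside the band and is classified as an overrun.
print(classify_budget(actual_hours=123, planned_hours=105))   # -> overrun
```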
Table 6 Test Phase Budget Performance
[Flattened table residue: drawing lead and budget performance (overrun / on target / under) of the Plan, Facility, Instrumentation, and Conduct phases and the Overall result, for each of the 24 tests led by engineers A through H, plus totals.]
Several things are apparent from this summary.
• The planning phases were regularly (19 of 24) over budget. This occurred even when the drawing was released ahead of the test plan start, which indicates a fundamental underestimate for this phase.
• When the drawing did not lead the test plan start by at least 2 weeks (15 tests), the tests were regularly over budget (10 of 15).
• Except for engineer H, no engineer had total cost overruns of more than 50%.
• Instrumentation was underestimated in 13 of 24 tests. This also indicates a fundamental problem.
Spending profile analysis
The scheduling reality of a test often highlights the muda; it reveals the periods of inactivity. Figure 14 compares the actual results of a large test program (#3), where the stretch-out of the performance is readily apparent, with the plan. Reduced calendar time is always a sign of a more efficiently run test, while spikes in manpower indicate inefficient use of resources. A rectangular block of the shortest calendar duration would be the ideal spending profile. One could envision a metric resembling the energy efficiency calculation of landing gear load-versus-stroke curves; the derivation of that metric is beyond the scope of this paper.
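One possible form for such a metric, offered only as a sketch by analogy with the load-versus-stroke efficiency idea above, is the ratio of the labor hours actually expended to the area of the bounding rectangle (peak weekly staffing times calendar duration); the function and the sample profiles below are hypothetical.

```python
# Sketch of one possible spending-profile efficiency metric (not derived in
# the thesis): hours expended divided by the bounding rectangle of peak
# weekly staffing x calendar duration. A compact, rectangular profile
# scores near 1.0; a spiky, stretched-out profile scores low.

def profile_efficiency(weekly_hours: list[float]) -> float:
    """weekly_hours: labor hours charged in each calendar week of the test."""
    if not any(h > 0 for h in weekly_hours):
        return 0.0
    duration_weeks = len(weekly_hours)            # calendar span, idle weeks included
    rectangle = max(weekly_hours) * duration_weeks
    return sum(weekly_hours) / rectangle

# A spiky, stretched-out program (idle weeks, a manpower spike) scores low:
print(profile_efficiency([40, 0, 0, 10, 200, 15, 0, 40]))   # ~0.19
# The same total effort in a compact, nearly rectangular block scores high:
print(profile_efficiency([75, 75, 75, 80]))                  # ~0.95
```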
One common characteristic of the overruns during this test program is that the early portion, planning, consumed much more time than estimated. This is partially due to the pressure exerted by program management, through the IPT, to begin test planning and facility design activities ahead of the (belated) release of the component drawings. Management hoped to recover schedule slippage by early completion of the test plan. More often, this led to plan and facility design rework, since both had to be revised to reflect the drawing changes.
Figure 14 Resource Summary for a Large Component Test (#3) (Plan & Actuals)
[Chart residue: two weekly labor-hour profiles, 'Large Component Test Actuals' and 'Large Component Test Plan', stacked by discipline (Test Engr, Fac Dsgnr, A/C Mgr, Instr Engr, Lab Tech, Instr Tech, Superv, Other), with milestones for drawing delivery, specimen in instrumentation, facility complete, test start, and test end.]
Figure 15 examines a metric that reflects the amount of time the initiation of planning lags (+) or precedes (-) the release of the component drawing. This metric is plotted versus the % overrun of the test. As shown, the early start does (weakly) correlate with the overrun; clearly, other issues also affect the overrun.
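A small sketch of how this drawing-lead metric could be computed and compared against overrun; the paired values below are hypothetical placeholders, not the program's data.

```python
# Sketch: correlate drawing lead (weeks; positive = planning started after
# drawing release, negative = before) with the fractional cost overrun.
# The data pairs are illustrative placeholders only.

from statistics import correlation   # Python 3.10+

tests = [(-8, 0.62), (-5, -0.56), (-1, -0.70), (2, 0.33), (-10, -0.24),
         (-7, -0.17), (-4, 0.34), (-3, -0.70), (-8, 1.32), (-6, 1.18)]

leads = [lead for lead, _ in tests]
overruns = [ovr for _, ovr in tests]
r = correlation(leads, overruns)
print(f"correlation between drawing lead and overrun: {r:+.2f}")
```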
[Scatter chart residue: planning start relative to drawing release versus percent budget overrun.]
Figure 15 Effect of Test Performance on Start Date
Value Based Tracking
A tool needed by lean enterprises is performance tracking, both to establish current efficiency and to form a baseline against which to evaluate future improvements. One cannot move toward perfection if one cannot determine one's direction or know how fast one is going. This section examines measures of performance and, by inference, of estimation.
Test performance is currently measured in three ways: cost, schedule, and (less frequently) quality.
For the test/certification effort, cost includes labor and material charges, including facility, instrumentation, and component costs. Costs are accumulated from the earliest requirements-gathering phases to the approval of reports, including any subsequent retest and revisions. This can include Design support if it is related solely to the accomplishment of the value statement/test objectives and is not just a by-product of being associated with the program at that time.
Schedule is relative: a specific test's clock starts when the requirements are firm enough for it to exist as a separate activity apart from the general test planning. Normally this will be after the component drawing is completed, or defined sufficiently to permit planning. While this forms the overall date for collecting time, it is unfair to ascribe all of this time solely to the test, since the fabrication of the test component can follow a schedule totally divorced from the test. Relations with suppliers of forgings and subcomponents can be lengthy processes that relate to design-to-cost, not certification. At this point, the test process is hostage to early production performance.
The last measurement is quality. This is the most ephemeral. Poor quality can easily be noticed in the need for revision of plans and retesting of components; it contributes to rework cost and schedule increases. It can have a hidden side as well. The failure to adequately measure parameters or to ascribe failure modes may require poorly directed acceptance testing (which does not address root causes), or may reduce the replacement interval/life of rotor components. This raises the cost to the customer.
Cost Performance
The typical measure of a program is how it performed relative to the plan. This is a flawed basis for comparison: if the planner was overly generous, then even a poor execution can be seen as successful. It would be better to establish a more rational basis for the initial estimate. This would allow a truer yardstick for measuring performance across several years and programs; the imprint of the estimator needs to be minimized.
The source10 for this rational estimation model is the theory that the major cost driver is the flight loading scheme. It influences the intricacy of the plan, the complexity of the facility, the setup time, the instrumentation costs, and ultimately the analysis and reporting. Figure 16 highlights the simulated loads of a test setup.
10 D. O. Adams, unpublished work on Test Metrics, Sikorsky Aircraft
Figure 16 Simulated Flight Loads Test #9
All phases scale with the loads. Planning the simulated flight loading condition requires understanding and researching each load and thus scales with their number. The facility design must model and represent these loads; its cost is related to their number, for each requires a load creation device and controls. The most costly part of the machining/fabrication of the facility is the attachment points for the loads. The setup time for the test is composed of the installation of the test specimen but, more significantly, of the installation and setup of each load creation device. The number of measurements is not as straightforward: there are a limited number of parameters that measure the applied loads, but many more that track these loads as they diffuse into the structure, and these scale with the number of applied loads. More loads mean a longer survey period and more time to phase and manage the loads: a single test may run at speeds from 5-20 hertz, but as more loads are added, the cycle rate drops to one hertz or less. The reporting of the test will also vary with the number of loads, since flight data must be analyzed for each one and then applied to the test results.
The rational estimation, Figure 17, starts with the number of loads, setups, and relative difficulty. The number of setups and the difficulty act as multipliers, since they replicate the load efforts. The estimation distinguishes between the importance of the static loads and the dynamic loads: the static loads are easily controlled, unchanged for the test duration, while the dynamic loads require significant attention to the methods for creating, maintaining, and recording the magnitude and frequency of the variation. For this reason, the static loads are weighted at only one-quarter the effort of the dynamic loads. A simple combination, the rating, is plotted versus the labor hours expended. The correlation is fair, 75%.
Equation 1 Estimation Rating Factor
Rating = difficulty * #setups * (0.5*#static loads + 2*#dynamic loads)
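For concreteness, Equation 1 can be transcribed directly; the sketch below is illustrative, and the worked example is not a case from the data set.

```python
# Direct transcription of Equation 1's rating factor (sketch).
# Static loads are weighted at one-quarter the effort of dynamic loads
# (0.5 vs 2.0); the setup count and difficulty act as multipliers.

def rating_factor(difficulty: int, n_setups: int,
                  n_static: int, n_dynamic: int) -> float:
    """Equation 1: Rating = difficulty * #setups * (0.5*#static + 2*#dynamic)."""
    return difficulty * n_setups * (0.5 * n_static + 2.0 * n_dynamic)

# Example: a difficulty-2 test with one setup, 2 static and 4 dynamic loads.
print(rating_factor(difficulty=2, n_setups=1, n_static=2, n_dynamic=4))  # 18.0
```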
A better, more severe estimation, Figure 18, also sums the loads. Added to these, however, is a consideration of the number of components under test in the setup: the assembly time for the tested components varies with their number, as does the amount of instrumentation. This estimation has a better correlation, 91%.
Equation 2 Estimation Severity Factor
Severity = difficulty * #setups * (0.5*#static loads + 2*#dynamic loads + #components)
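The severity factor of Equation 2 extends the same sketch with the component count; again, the example values are illustrative.

```python
# Equation 2's severity factor (sketch): the component count is added to the
# weighted load terms inside the multiplier.

def severity_factor(difficulty: int, n_setups: int, n_static: int,
                    n_dynamic: int, n_components: int) -> float:
    """Equation 2: Severity = difficulty*#setups*(0.5*#static + 2*#dynamic + #components)."""
    return difficulty * n_setups * (0.5 * n_static + 2.0 * n_dynamic + n_components)

# The same hypothetical test as above, with 3 components in the setup.
print(severity_factor(2, 1, 2, 4, 3))   # 24.0
```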
The difficulty factor is relatively simple: 1, 2, or 3. One represents the simplest test: well understood, a proven facility (or at least concept), no surprises in strength, no risk. Three is for a very complicated test, with many loads, components whose strengths are unknown, components of many different kinds of materials, a novel facility, high risk, and a lot of instrumentation. Two is in between: a few loads, a common design, moderate instrumentation. Table 8 identifies the contributors to the difficulty factor and includes a description of the problems encountered in the testing.
The last estimation, Figure 19, is a linear regression of the variables. The coefficients are 2 for difficulty, 1 for static loads, 2.5 for dynamic loads, 3 for components, and 4.5 for setups. This estimation does not do better; it has a correlation coefficient of 77%.
Equation 3 Regressed Estimation Severity Factor
Regressed severity = 2*difficulty + 4.5*#setups + #static loads + 2.5*#dynamic loads + 3*#components
The result is a mathematical (linear) relationship between the severity factor and the hours expended that is not dependent on the estimator. Judgments can now be made without consideration for the people involved. Those tests (Figure 18) that fall to the right of the curve are more efficient than those on the left side. The slope of the line is an indication of the company's performance, its 'perfection': the lesser the slope, the closer the test process is to perfection.
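A sketch of how this 'perfection' slope could be extracted from such data: a zero-intercept least-squares fit of actual hours against severity. The severity/hours pairs below are placeholders, not the thesis data set.

```python
# Zero-intercept least-squares fit of actual hours against severity
# (hours ~= k * severity). A lower fitted slope k indicates a test process
# closer to perfection; tests falling below (to the right of) the line are
# the more efficient ones. Data values are hypothetical placeholders.

severities = [11.8, 14.2, 16.9, 22.1, 25.9, 29.1, 42.0, 53.1]   # per-test severity
actual_hrs = [ 935, 1240, 1645, 1855, 3440, 3053, 5574, 7442]   # hours expended

# Zero-intercept least squares: k = sum(x*y) / sum(x*x)
k = sum(s * h for s, h in zip(severities, actual_hrs)) / sum(s * s for s in severities)

for s, h in zip(severities, actual_hrs):
    side = "efficient" if h < k * s else "inefficient"
    print(f"severity {s:5.1f}: actual {h:5d} h vs fitted {k * s:7.0f} h -> {side}")
print(f"fitted slope k = {k:.1f} hours per severity unit")
```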
Certainly, this evaluation has a limited scope; its form applies only to fatigue testing. Another limitation is that it submerges within it the performance of the documenting process. (The number of pages is often the choice for that metric; however, it has the unintended consequence of favoring long-winded treatises, which is inimical to the precepts of Strunk & White.11) This analysis could be applied to other forms of testing: static, vibration, and endurance, but the factors, and the coefficients, would likely be different.
11 Strunk Jr., W., White, E. B., (1953), The Elements of Style, The Macmillan Company, New York
Rating vs hours: [scatter chart residue; actual hours (0-7000) versus Rating = difficulty*#setups*(0.5*#static loads + 2*#dynamic loads), rating scale 0-60, fitted line y = 112.52x]
Figure 17 Test Performance and Rating Factor
Severity vs Actual Hours: [scatter chart residue; actual hours (0-7000) versus Severity = difficulty*#setups*(0.5*#static loads + 2*#dynamic loads + #component), severity scale 0-80]
Figure 18 Test Performance and Rational Severity Factor
Regression summary: [scatter chart residue; actual hours (0-8000) versus Regressed severity = 2.1*difficulty + #static loads + 2.4*#dynamic loads + 3*#components + 4.3*#setups, scale 0-60]
Figure 19 Test Performance and Regressed Severity Factor
Table 7 Summarized Test Data
[Flattened table residue: for each of the 24 tests (engineers A-H), the estimation factors (difficulty, number of static loads, number of vibratory loads, number of components, number of setups), the computed rating and severity, the actual and planned lengths (weeks), schedule stretch, drawing lead (weeks), overrun (hours and percent), planned and actual hours, and actual hours per severity unit.]
Table 8 Difficulty Factor Summary
[Flattened table residue: for each test, the difficulty factor (1-3) and its contributors: instrumentation level (light, moderate, or heavy), facility design (existing, modified, or new, in some cases complex or novel), loading (understood or novel), and other influential factors such as vendor-made parts, material problems, marginal part strength, multiple failure modes, and poorly understood loading.]
Multitasking and Performance
One concern is the potential that engineers are so overloaded that delays are due not to inefficient attention but to no attention at all. The engineer might instead stagger his tests so that no two (or more) are conducted at the same time.
This analysis examines the data to evaluate that argument. The data of Table 7 are grouped by engineer. One can graphically represent the workload of engineer D as shown in Figures 21 and 22. Figure 21 shows all of the test disciplines by test; Figure 22 shows only the test engineer's hours, but for all of his tests, including all activities from planning to conduct. (The amount over 40 hours per week represents overtime.)
Figure 22 indicates that there were only three periods when two tests were active (in the conduct phase) at one time: first tests 1 and 42, then 24 and 28, and lastly 24 and 25. During these periods there was no obvious stretch-out or slippage of the tests. The data also show that budgets were not automatically overrun for tests conducted in parallel: of the five tests, only tests 1 and 25 overran.
Plotted in Figure 21 are the time history profiles of each of engineer D's tests. If the concurrence of the tests were inhibiting, there would be a shift or delay in the conduct phase. However, the data show that once conduct is begun, it is pursued vigorously until the test is completed.
Engineer D is not a superman. He has tests on both sides of the severity line of Figure 18, nor are all his tests easy; they vary from a severity of 3 to 40.
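The overlap check described above can be sketched as a simple interval comparison; the conduct windows below are illustrative dates, not engineer D's actual schedule.

```python
# Sketch: given each test's conduct-phase window, report the periods when
# two or more of one engineer's tests were active at once.
# Dates are illustrative placeholders, not the case data.

from datetime import date
from itertools import combinations

conduct = {                      # test -> (conduct start, conduct end)
    "Test 1":  (date(1997, 10, 1), date(1998, 1, 15)),
    "Test 42": (date(1997, 11, 1), date(1998, 3, 1)),
    "Test 24": (date(1998, 5, 1),  date(1998, 8, 15)),
    "Test 28": (date(1998, 4, 15), date(1998, 5, 20)),
    "Test 25": (date(1998, 8, 1),  date(1998, 9, 30)),
}

for (name_a, (a0, a1)), (name_b, (b0, b1)) in combinations(conduct.items(), 2):
    start, end = max(a0, b0), min(a1, b1)
    if start <= end:             # the two conduct windows overlap
        print(f"{name_a} and {name_b} overlap from {start} to {end}")
```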
[Chart residue: weekly labor-hour time histories, January 1996 through May 1999, for each of engineer D's tests, broken out by discipline.]
Figure 21 Engineer D's Multitasking
[Chart residue: engineer D's stacked weekly labor hours (0-100) across Tests 1, 3, 6, 24, 25, 26, 28, and 42, together with a Gantt panel, 'Engr D's Test Conduct Phases', spanning 1997-1998.]
Schedule Performance
Schedule performance is typically measured as ahead of or behind plan. This is again a relative measure whose use does not move one toward perfection. A better choice would be to create, as with cost, a metric that defines a rational basis for judging the length of a test. With this independent tracking parameter, the estimator would be removed from the metric and a fairer comparison could be made.
While it is beyond the scope of this paper to derive an equation for this parameter, it would no doubt contain parameters similar to those of the cost equation. In addition, it should have a term that addresses the relative workload of the department. The goal of the metric would be to reduce the calendar time required for the test.
Quality Performance
As indicated earlier, this metric is the most difficult to quantify. Rework measures the frequency with which errors are detected, but it also depends on the ease with which the problem can be corrected. Consider a test plan written before word processing: the frequency with which the document was revised would be very low, since revision was a painful process. Now changes can be made within seconds, so the most efficient method may include several revision cycles. Instrumentation rework also varies with the test: if a component requires internal strain gaging that is difficult to access after assembly, then the importance of the rework is much greater than for surface-mounted gaging.
Again, it is beyond the scope of this paper to derive an equation, but the number of rework cycles should be included, along with some qualitative severity measure.
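One possible form for such a measure, sketched here with assumed categories and weights; none of these values come from the case data.

```python
# Sketch of a weighted rework index: count rework cycles, but weight each by
# a qualitative severity so that, e.g., reworking an internal strain-gage
# installation counts for more than a quick word-processed plan revision.
# Categories and weights are assumptions for illustration.

REWORK_WEIGHTS = {
    "plan revision": 1,          # cheap to correct with word processing
    "surface gage rework": 2,
    "internal gage rework": 5,   # hard to access after assembly
    "retest": 8,
}

def rework_index(events: list[str]) -> int:
    """Weighted count of rework cycles for one test."""
    return sum(REWORK_WEIGHTS[e] for e in events)

print(rework_index(["plan revision", "plan revision", "internal gage rework"]))  # 7
```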
Aligning Metrics with Value
The sole purpose of metrics should be to provide a way to measure the degree of progress toward the company's objectives. The cost, schedule, and quality metrics align with few of the stakeholder values/objectives of a test listed previously in Table 4.
Schedule aligns with value if one keeps the final deliverable in mind: the report, or the transferal of information. Minimizing schedule brings higher value. Before launching into the test-planning phase, one needs to review the requirements and all possible alternatives. These alternatives can provide a faster response; one of them is not to conduct the test at all. Table 9 below lists several reasons for choosing non-test paths.
Table 9 Testing Alternatives

Analysis
  Description: Analytical calculation using predicted loads, part geometry, and material allowables.
  Pros: Cheaper, faster, already exists.
  Cons: Necessary simplifying assumptions may not resolve all issues; not all areas are considered equally; real-world flaws may exceed the designer's imagination.

Simulation
  Description: Loads are created by representative flight combinations.
  Pros: Cheaper for system tests.
  Cons: More expensive for simple tests; not all flaws may be represented.

Smaller scope of test
  Description: Change loading from dynamic to static.
  Pros: Cheaper, faster.
  Cons: Secondary loads may be important to the answer.

Similarity
  Description: Use of existing test data/experience.
  Pros: Cheaper, faster.
  Cons: Design must be very close to current to qualify.

Flight test
  Description: Proof of performance in actual use.
  Pros: Faster.
  Cons: Risky for all but failsafe components.
Conclusions
Value stream analysis has a place in planning and conducting test/certifications. It highlighted the degree to which excessive slack time was built into the test estimates, and it can promote faster response and lower costs. The primary focus should be on meeting the customer's needs in the quickest way possible.
An analysis of a recent helicopter test/certification effort revealed the need for an estimator-blind cost metric. An equation is postulated to provide this estimate using four common parameters and a sliding difficulty factor.
IPTs provide early opportunities for Design-Test integration. However, without serious adherence to the gate requirement of detail drawing completion, they can encourage premature test starts.
Multitasking was found not to increase the likelihood of test conduct slippage or overrun. The more compact the test schedule, the lower the test cost.
Future Work
Metrics for schedule and quality are also needed, and are sketched out here, but their completion is beyond the scope of this paper.
Managers' compensation is currently tied to meeting or under-running the schedules and budgets that they create. A better method needs to be developed that will reward honest, not inflated, estimates.
A schedule analysis of the data is needed to investigate the degree to which overruns are related to the length of the activity.
APPENDIX
Table 10 Case Data
[Flattened table residue: case data for tests 4, 5, 23, 29, 31, 9, 17, and 1: planned and actual drawing dates, test plan start and finish, actual and planned length (weeks), schedule stretch, drawing lead, overrun (hours and percent), planned and actual specimen dates, planned and total hours and dollars, and plan/actual/overrun breakdowns for the planning, facility design, facility fabrication (including material, ODC, and fabrication dollars), instrumentation, and test conduct phases, plus second instrumentation and conduct setups where applicable.]
Table 11 Case Data
[Flattened table residue: the same fields as Table 10 for tests 6, 24, 25, 26, 28, 42, 3, and 32.]
Table 12 Case Data
[Flattened table residue: the same fields as Table 10 for tests 13, 30, 7, 27, 8, 14, and 21 (two setups).]
Table 13 Case Data (Totals)
Total hours: planned 58,640; actual 60,194 (3% overrun)
Total dollars: planned $6,315,800; actual $6,404,020
Plan phase hours: planned 3,872; actual 5,649 (46% overrun; actual/plan 1.46)
Facility design hours: planned 16,627; actual 15,820 (actual/plan 0.95)
Facility fabrication hours: planned 16,021; actual 12,147 (actual/plan 0.76)
Facility dollars: material $1,532,200; actual $717,900; ODC $859,800; fabrication $2,413,355; sum actual $1,527,885 (-37%)
Instrumentation hours: planned 2,252; actual 3,612 (60% overrun; actual/plan 1.60)
Test conduct hours: planned 13,932; actual 20,166 (45% overrun; actual/plan 1.45)
Second instrumentation hours: planned 317; actual 345 (actual/plan 1.09)
Second conduct hours: planned 1,691; actual 933 (actual/plan 0.55)