Copy of my defense presentation

Efficient, Accurate, and Non-Gaussian
Statistical Error Propagation Through
Nonlinear System Models
Travis V. Anderson
July 26, 2011
Graduate Committee:
Christopher A. Mattson
David T. Fullwood
Kenneth W. Chase
Presentation Outline
Section 1: Introduction & Motivation
Section 2: Uncertainty Analysis Methods
Section 3: Propagation of Variance
Section 4: Propagation of Skewness & Kurtosis
Section 5: Conclusion & Future Work
Section 1:
Introduction & Motivation
Engineering Disasters
Tacoma Narrows Bridge
Space Shuttle Challenger
Hindenburg
Chernobyl
F-35 Joint Strike Fighter
Research Motivation
• Allow the system designer to quantify system model
accuracy more quickly and accurately
• Allow the system designer to verify design decisions
at the time they are made
• Prevent unnecessary design iterations and system
failures by creating better system designs
Section 2:
Uncertainty Analysis Methods
Uncertainty Analysis Methods
• Error Propagation via Taylor Series Expansion
• Brute Force Non-Deterministic Analysis
(Monte Carlo, Latin Hypercube, etc.)
• Deterministic Model Composition
• Error Budgets
• Univariate Dimension Reduction
• Interval Analysis
• Bayesian Inference
• Response Surface Methodologies
• Anti-Optimizations
Brute Force Non-Deterministic Analysis
• Fully-described, non-Gaussian output distribution
can be obtained
• Simulation must be executed again each time any
input changes
• Computationally expensive
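A minimal sketch of what the brute-force approach looks like in practice (assuming NumPy, an arbitrary input mean, and the 1000·sin(x) example used later in this presentation; none of this code appears in the slides):

    import numpy as np

    def model(x):
        # Hypothetical stand-in for an expensive nonlinear system model.
        return 1000.0 * np.sin(x)

    # Sample the input distribution and push every sample through the model.
    # Mean of 1.0 is assumed; variance 0.2 matches the later accuracy example.
    rng = np.random.default_rng(0)
    x_samples = rng.normal(loc=1.0, scale=np.sqrt(0.2), size=100_000)
    y_samples = model(x_samples)

    # The full (non-Gaussian) output distribution is now available...
    print(y_samples.mean(), y_samples.var())
    # ...but any change to an input distribution means re-running all
    # 100,000 model evaluations, which is what makes this expensive.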
Deterministic Model Composition
• A compositional system model is created
• Each component’s error is included in an error-augmented system model
• Component error values are varied as the model is
executed repeatedly to determine max/min error
bounds
Error Budgets
• One component’s error is perturbed at a time
• Each perturbation’s effect on model output is
observed
• Either errors must be independent or a separate
model of error interactions is required
Univariate Dimension Reduction
• Data is transformed from a high-dimensional space
to a lower-dimensional space
• In some situations, analysis in reduced space may be
more accurate than in the original space
Interval Analysis
• Measurement and rounding errors are bounded
• Arithmetic can be performed using intervals instead of a
single nominal value
• Many software languages, libraries, compilers, data types,
and extensions support interval arithmetic
• XSC, Profil/BIAS, Boost, Gaol, Frink, MATLAB (Intlab)
• IEEE Interval Standard (P1788)
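A minimal sketch of the underlying idea in plain Python (not using any of the libraries listed above):

    # Each uncertain quantity is carried as a (lower, upper) bound.
    def interval_add(a, b):
        return (a[0] + b[0], a[1] + b[1])

    def interval_mul(a, b):
        p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
        return (min(p), max(p))

    x = (1.9, 2.1)    # a measurement known only to within +/- 0.1
    y = (-0.5, 0.5)   # another bounded quantity
    print(interval_add(x, y))   # approximately (1.4, 2.6)
    print(interval_mul(x, y))   # approximately (-1.05, 1.05)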
Bayesian Inference
• Combines common-sense knowledge with
observational evidence
• Meaningful relationships are declared, all others are
ignored
• Attempts to eliminate needless model complexity
Response Surface Methodologies
• Typically uses experimental data and design of
experiments techniques
• An n-dimensional response surface shows the output
relationship between n input variables
Anti-Optimizations
• Two-tiered optimization problem
• Uncertainty is anti-optimized on a lower level to
find the worst-case scenario
• The overall design is then optimized on a higher level to find the best design
Section 3:
Propagation of Variance
Central Moments
• 0th Central Moment is 1
• 1st Central Moment is 0
• 2nd Central Moment is variance
• 3rd Central Moment is used to calculate skewness
• 4th Central Moment is used to calculate kurtosis
First-Order Taylor Series
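For reference, the first-order Taylor series of a model y = f(x₁, …, xₙ) expanded about the input means μᵢ (presumably the expansion shown on this slide) is:

\[
y \;\approx\; f(\mu_1,\dots,\mu_n) + \sum_{i=1}^{n}\frac{\partial f}{\partial x_i}\bigg|_{\mu}\,(x_i-\mu_i)
\]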
First-Order Formula Derivation
Square and take the expectation of both sides; the result contains a covariance term.
Assumption:
• Inputs are independent (so the covariance term vanishes)
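For reference, the standard first-order result for y = f(x₁, …, xₙ), which this derivation presumably produces, is:

\[
\sigma_y^2 \;\approx\; \sum_{i=1}^{n}\left(\frac{\partial f}{\partial x_i}\right)^{2}\sigma_{x_i}^{2}
\;+\; 2\sum_{i=1}^{n-1}\sum_{j=i+1}^{n}\frac{\partial f}{\partial x_i}\,\frac{\partial f}{\partial x_j}\,\operatorname{Cov}(x_i,x_j)
\]

With independent inputs the covariance terms vanish, leaving only the sum of squared sensitivities.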
First-Order Error Propagation
• The formula for error propagation most often cited in the literature
• Frequently used “blindly” without an appreciation
of its underlying assumptions and limitations
Assumptions and Limitations
1. The approximation is generally more accurate for linear models → This Section
2. Only variance is propagated and higher-order statistics are neglected → Section 4
3. All inputs are assumed to be Gaussian → Section 4
4. System outputs and output derivatives can be obtained
5. Taking the Taylor series expansion about a single point causes the approximation to be of local validity only
6. The input means and standard deviations must be known
7. All inputs are assumed to be independent
First-Order Accuracy
Function: y = 1000 sin(x)
Input Variance: 0.2
100% error. Unacceptable!
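To see why the first-order estimate can fail this badly, suppose (purely for illustration; the slide does not state the input mean) that the expansion point is x̄ = π/2. The first derivative vanishes there, so the first-order formula predicts zero output variance:

\[
\sigma_y^2 \;\approx\; \left(1000\cos\bar{x}\right)^{2}\sigma_x^{2}
= \left(1000\cos\tfrac{\pi}{2}\right)^{2}(0.2) = 0,
\]

even though the output of 1000 sin(x) clearly varies, hence up to 100% relative error.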
Second-Order Error Propagation
Just as before:
1. Subtract the expectation of a second-order Taylor series from a second-order Taylor series
2. Square both sides and take the expectation
Assumption:
• Inputs are Gaussian → odd central moments are zero
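The resulting second-order expression for independent Gaussian inputs, in its standard form (assumed here to match the slide's equation), is:

\[
\sigma_y^2 \;\approx\; \sum_{i=1}^{n}\left(\frac{\partial f}{\partial x_i}\right)^{2}\sigma_i^{2}
\;+\; \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\left(\frac{\partial^2 f}{\partial x_i\,\partial x_j}\right)^{2}\sigma_i^{2}\sigma_j^{2}
\]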
Second-Order Error Propagation
• The second-order formula for error propagation most often cited in the literature
• Like the first-order approximation, the second-order
approximation is also frequently used “blindly”
without an appreciation of its underlying
assumptions and limitations
Second-Order Accuracy
Function: y = 1000 sin(x)
Input Variance: 0.2
Higher-Order Accuracy
Function: y = 1000 sin(x)
Input Variance: 0.2
Computational Cost
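One way to frame the cost (my framing, not necessarily the slide's): the number of distinct partial derivatives that must be evaluated grows rapidly with the expansion order. For an n-input model:

\[
\text{1st order: } n, \qquad
\text{2nd order: } \tfrac{n(n+1)}{2}, \qquad
\text{3rd order: } \tbinom{n+2}{3}, \;\dots
\]

so each additional order multiplies the number of model evaluations or finite-difference runs required.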
Predicting Truncation Error
• How can we achieve higher-order accuracy with
lower-order cost?
Predicting Truncation Error
• Can Truncation Error Be Predicted?
Adding A Correction Factor
Trigonometric (2nd Order): y = sin(x) or y = cos(x)
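As context for where such a factor can come from (an aside; not necessarily the derivation used in this work): for a Gaussian input x ~ N(μ, σ²), the variance of y = sin(x) has an exact closed form, so a truncated second-order estimate can be compared against it directly:

\[
\operatorname{Var}[\sin x] \;=\; \tfrac{1}{2}\left(1 - e^{-2\sigma^{2}}\cos 2\mu\right) - e^{-\sigma^{2}}\sin^{2}\mu
\]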
Trigonometric Correction Factor
Correction Factors
Natural Log (1st Order): y = ln(x)
Exponential (1st Order): y = exp(x)
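For the exponential case the exact target is likewise available in closed form (the standard lognormal variance, offered here for context): if x ~ N(μ, σ²) and y = eˣ, then

\[
\operatorname{Var}\!\left[e^{x}\right] = e^{2\mu+\sigma^{2}}\left(e^{\sigma^{2}}-1\right),
\]

whereas the first-order estimate is e^{2μ} σ², so their ratio indicates how large a correction the first-order result requires.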
Correction Factors
Exponential (1st Order): y = b^x
where:
So What Does All This Mean?
• We can achieve higher-order accuracy with lower-order computational cost
(Plot: average error vs. computational cost)
Kinematic Motion of Flapping Wing
Accuracy of Variance Propagation
Order   RMS Rel. Err.
2nd     40.97%
3rd     11.18%
4th     1.32%
CF      1.96%
Computational Cost
Execution time was reduced from ~70 minutes to ~4 minutes
→ roughly a 17.5× reduction in computational cost
Fourth-order accuracy was obtained with
only second-order computational cost
Section 4:
Propagation of Skewness &
Kurtosis
Non-Gaussian Error Propagation
(Plots: predicted Gaussian output vs. actual system output, and predicted non-Gaussian output vs. actual system output)
Skewness
• Measure of a distribution’s asymmetry
• A symmetric distribution has zero skewness
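In terms of the central moments listed earlier, skewness is the standardized third central moment:

\[
\gamma_1 = \frac{\mu_3}{\sigma^{3}} = \frac{E\!\left[(y-\mu_y)^{3}\right]}{\sigma_y^{3}}
\]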
Propagation of Skewness
• Based on a second-order Taylor series
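The propagation expression in this work is based on a second-order expansion; as a simpler point of reference (my addition, not the slide's formula), the first-order approximation of the output's third central moment for independent inputs is:

\[
\mu_{3,y} \;\approx\; \sum_{i=1}^{n}\left(\frac{\partial f}{\partial x_i}\right)^{3}\mu_{3,x_i}
\]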
Kurtosis & Excess Kurtosis
• Measure of a distribution’s “peakedness” or the thickness of its tails
Kurtosis
Excess Kurtosis
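With the same central-moment notation, kurtosis and excess kurtosis are:

\[
\beta_2 = \frac{\mu_4}{\sigma^{4}}, \qquad \gamma_2 = \frac{\mu_4}{\sigma^{4}} - 3
\]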
Propagation of Kurtosis
• Based on a second-order Taylor series
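As with skewness, the formula used here is second order; for reference (again my addition, not the slide's equation), the first-order approximation of the output's fourth central moment for independent inputs is:

\[
\mu_{4,y} \;\approx\; \sum_{i=1}^{n}\left(\frac{\partial f}{\partial x_i}\right)^{4}\mu_{4,x_i}
\;+\; 6\sum_{i<j}\left(\frac{\partial f}{\partial x_i}\right)^{2}\left(\frac{\partial f}{\partial x_j}\right)^{2}\sigma_i^{2}\sigma_j^{2}
\]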
Flat Rolling Metalworking Process
Output: maximum change in material thickness achieved in a single pass
Inputs: roller radius, coefficient of friction
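The governing relation is presumably the standard maximum-draft formula for flat rolling, which connects the output to the two inputs above:

\[
\Delta h_{\max} = \mu^{2} R,
\]

where μ is the coefficient of friction and R is the roller radius.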
Input Distribution
Gaussian Error Propagation
• Probability overlap: 53%
(Plot: predicted Gaussian output vs. actual system output)
Non-Gaussian Error Propagation
• Probability overlap: 93%
(Plot: predicted non-Gaussian output vs. actual system output)
Benefits of Higher-Order Statistics
            Gaussian    Non-Gaussian
Accuracy:   53%         93%
Max ΔH:     3.0 cm      7.9 cm (99.5% success rate)

That’s a 2.63× reduction in the number of passes!
Section 5:
Conclusion & Future Work
Conclusion
• Fourth-order accuracy in variance propagation can
be achieved with only first- or second-order
computational cost
• Designers do not need to assume a Gaussian output
→ A fully described output distribution can be obtained without significant additional cost
Future Work
• Develop predictable correction factors for other
types of nonlinear functions and models
(differential equations, state-space models, etc.)
• Apply correction factors to open-form models
• Can correction factors be obtained for skewness and
kurtosis propagation?
Questions?
Variance Example: Whirlybird
Variance Example: Whirlybird
System Model (Pitch)
Compositional Model
Higher-Order Stats Example: Thrust
Thrust Output
Higher-Order Stats Example: Thrust
Input distribution shown; predicted outputs compared against the actual output:
• Gaussian output overlap: 65%
• Non-Gaussian output overlap: 79%
Non-Gaussian Proof
Propagation of Skewness
Even Gaussian Inputs Produce Skewed
Outputs If 2nd Derivatives Are Non-Zero
(Nonlinear Systems)
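A minimal concrete instance of this claim (my example, not one from the slides): take x ~ N(0, σ²) and y = x², whose second derivative is nonzero. Then

\[
E[y] = \sigma^{2}, \quad \operatorname{Var}[y] = 2\sigma^{4}, \quad
E\!\left[(y-\sigma^{2})^{3}\right] = 8\sigma^{6}, \quad
\gamma_1 = \frac{8\sigma^{6}}{(2\sigma^{4})^{3/2}} = 2\sqrt{2} \neq 0,
\]

so a perfectly Gaussian input produces a strongly right-skewed output.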