SkyBot Final Presentation

Presented By:
TEAM SKYBOT
- Bradley Wilson
- Harry Ulrich
- Kumaraswamy MS
- MalarVizhi Velappan
- Shivani Pandey
TEAM HOMEPAGE:
http://www.andrew.cmu.edu/org/skybot/
1
Agenda
• Race Overview
• Stakeholders
• Requirements Analysis
• Trade Study
• Functional Analysis
• Reliability Analysis
• Systems Integration Plan
• Unit, Systems and Integration Test
• Technical Performance Metrics
• Lessons Learned
• Q&A
2
Race Overview
When – September 23rd, 2006
Where – Pikes Peak Highway, Colorado
Why – To further the science of robotic vehicles
What – Ascent of the twisting 12.4-mile course
OUR MISSION: TO WIN THE RACE
3
Stakeholders (1/2)
[Stakeholder context diagram. Stakeholders: Spectators, Race Administration, Sponsors, Park Services, Decision Makers, Investors, Media, Suppliers/Maintenance. Exchanges with the team include exposure, funding, entertainment, land/rules, safety/facilities, feedback, performance evaluation, participation, marketing, payment, vetted technology/money/feedback, a story, material/support, technology/intellectual property, and brand building.]
4
Stakeholders (2/2)
[Stakeholder interaction diagram: Race Administration defines rules for the Race Vehicle Subsystem; Sponsors provide funds and receive brand recognition; the Audience receives entertainment; Suppliers provide the engine and other parts.]
5
Race Vehicle Requirements¹
• Safety – Requirement 2.1
• Performance – Requirement 2.2
• Schedule – Requirement 2.3
• Cost – Requirement 2.4
• Reliability – Requirement 2.5
1 “Race Vehicle Requirements”, SkyBot Requirements Analysis, http://www.andrew.cmu.edu/org/skybot/documents/SkyBot_RaceVehicle_Subsystem_Requirements_V1.4.doc
6
Race Vehicle Requirements
Considerations
• Race Administration Rules
• Qualification Information
• Finances
• General Considerations
7
Trade Study for Engine Selection
OBJECTIVE
To recommend engine type for SkyBot in the Pikes Peak Hill Climb
CANDIDATE ENGINE TYPES
• Gasoline
• Electric
• Gasoline-electric
• Ethanol
• Solar
• Diesel
8
Trade Study for Engine Selection
QUALITY ATTRIBUTES
• Safety
  - Fuel safety
  - Control
  - Environmental impact
• Power
• Reliability
  - Availability in market
  - Reliability
  - Maintainability
• Cost
9
Trade Study for Engine Selection
Trade Variable   Sub Variable                        Scale    Score Range   Weight (%)   Total Weight
Safety           Fuel safety                         Linear   1-10          10%          30%
                 Control                             Linear   1-10          15%
                 Environmental impact                Linear   1-10          5%
Power            -                                   Linear   1-10          25%          25%
Reliability      Availability of engine in market    Linear   1-10          15%          35%
                 Reliability of engine               Linear   1-10          15%
                 Maintainability of engine           Linear   1-10          5%
Cost             -                                   Linear   1-10          10%          10%
10
Trade Study for Engine Selection
Candidate: Gasoline engine
• Fuel safety (Safety, weight 0.10): raw score 6, weighted score 0.60. Flammable nature of the fuel.
• Control (Safety, weight 0.15): raw score 6, weighted score 0.90. Ability of the vehicle to stop or pause within a few seconds of activating the control.
• Environmental impact (Safety, weight 0.05): raw score 3, weighted score 0.15. Approval by the United States Department of Transportation.
• Power (weight 0.25): raw score 10, weighted score 2.50. Sufficient power to propel the vehicle up the peak at 30 mph.
• Availability (Reliability, weight 0.15): raw score 10, weighted score 1.50. Available at least six weeks before the race.
• Reliability (Reliability, weight 0.15): raw score 9, weighted score 1.35. 99.9% reliable to operate 2 hours at 30 mph during adverse weather conditions.
• Maintainability (Reliability, weight 0.05): raw score 10, weighted score 0.50. Easy to maintain like an everyday car engine.
• Cost (weight 0.10): raw score 9, weighted score 0.90. Purchase cost and maintenance shall not exceed USD 15K.

Total Weighted Score: 8.4
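The total above is simply the sum of weight × raw score over all criteria. A minimal sketch of that arithmetic in Python, using the gasoline weights and raw scores from this slide (the variable names are illustrative, not taken from the SkyBot documents):

```python
# Weighted-sum trade study scoring: total = sum(weight * raw_score).
# Weights and raw scores are taken from the gasoline column above.
criteria = {
    # criterion: (weight, raw score on a 1-10 scale)
    "fuel_safety":          (0.10, 6),
    "control":              (0.15, 6),
    "environmental_impact": (0.05, 3),
    "power":                (0.25, 10),
    "availability":         (0.15, 10),
    "reliability":          (0.15, 9),
    "maintainability":      (0.05, 10),
    "cost":                 (0.10, 9),
}

total = sum(weight * score for weight, score in criteria.values())
print(f"Total weighted score: {total:.1f}")   # 8.4 for the gasoline engine
```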
11
Trade Study for Engine Selection
GASOLINE ENGINE
• High power
• High reliability
• Low cost
• Lower safety
12
Race Vehicle Sub System (1/2)
[Level 1 functional flow (functions 1.0–8.0): Requirements Analysis and Validation; Design the race vehicle subsystem; Procure or Develop the subsystem components; Perform integration and subsystem acceptance test; Package and Ship the race vehicle; Operate the vehicle in Pre-Race environment; 7.0 Operate the race vehicle in actual race environment; Post-Race Vehicle Maintenance and Operations]
Level 1 - Functional Block Diagram
13
Race Vehicle Sub System (2/2)
7.0 (Ref) Operate the race vehicle in actual race environment
7.1 Prepare the race vehicle in inertial state
7.2 Move the race vehicle to the start line
7.3 Start the race vehicle subsystem at the start line
7.4 Move the race vehicle from the start line to the finish line
7.5 Perform the race vehicle subsystem shutdown
7.6 Manually control / maneuver the race vehicle
Level 2 - Functional Block Diagram
14
Functions of Race Vehicle
[Functional decomposition of 7.4 (sub-functions 7.4.1–7.4.7): Sense, Perceive, Plan, Navigate, Record, Ensure Safety. Input: start the vehicle at the start line (from 7.3); goal: reach the finish line safely.]
15
Functions (1/3)
Sense
- Interpret the environment using sensors such as GPS, RADAR, LIDAR, and contact sensors
- Pose Sensing
- Obstacle Detection

Perceive
- Geometry characterization
16
Functions (2/3)
Plan
- Speed
- Path

Navigate
- Steering
- Road Finding
- Braking
- Speed Control
- Route Following
17
Functions (3/3)
Record
- Capture path

Ensure Safety
- Make Sound
- Emit Light
- Suppress Fire
- Continuous Monitoring
18
Reliability Analysis
Based on Requirement 2.5.1:
• The reliability of the race vehicle subsystem, R = 0.999
• The total race time, t = 0.4 hrs
• The MTBF of the race vehicle = 400 hrs

Reliability   MTBF (hours)   Failure Rate (per hour)
0.999         400            0.0025
0.98          20             0.05
0.95          8              0.125
0.90          4              0.25
0.85          2.5            0.4
0.82          2              0.5
0.67          1              1
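Each row of the table follows from the exponential reliability model R = e^(-t/MTBF), so MTBF = -t / ln(R) and the failure rate is 1/MTBF. A small Python sketch that reproduces the table for the 0.4-hour race time:

```python
import math

t = 0.4  # race duration in hours

for R in (0.999, 0.98, 0.95, 0.90, 0.85, 0.82, 0.67):
    mtbf = -t / math.log(R)       # R = exp(-t/MTBF)  =>  MTBF = -t / ln(R)
    failure_rate = 1.0 / mtbf     # constant failure rate (per hour)
    print(f"R={R:<5}  MTBF={mtbf:6.1f} h  failure rate={failure_rate:.4f}/h")
```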
19
Reliability Block Diagram
[Series reliability block diagram: Input → Sensor → Perceiving → Planning → Navigation Control → Safety Control → Output]

R_RaceVehicle = R_Sensor × R_Perceiving × R_Planning × R_Navigation × R_SafetyControl = (R_subsystem)^5

R_subsystem = (R_RaceVehicle)^(1/5) = (0.95)^(1/5) ≈ 0.989

R = e^(-t/MTBF)  =>  MTBF = -t / ln(R) = -0.4 / ln(0.989) ≈ 36 h
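Because the five subsystems sit in series, the per-subsystem reliability is allocated as the fifth root of the vehicle-level value. A brief sketch of that allocation, using the 0.95 figure from this slide:

```python
import math

R_vehicle_target = 0.95   # vehicle-level reliability used on this slide
n_subsystems = 5          # sensor, perceiving, planning, navigation, safety control
t = 0.4                   # race duration in hours

R_subsystem = R_vehicle_target ** (1.0 / n_subsystems)    # ~0.989
mtbf_subsystem = -t / math.log(R_subsystem)                # ~36 h
print(f"Allocated subsystem reliability: {R_subsystem:.3f}, MTBF: {mtbf_subsystem:.0f} h")
```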
20
Reliability of Individual Subsystems
Sensor Subsystem
[Parallel reliability block diagram: GPS, RADAR, LIDAR, and Contact Sensor in parallel between Input and Output]

R_Sensor = 1 - (1 - R_GPS)(1 - R_Radar)(1 - R_Lidar)(1 - R_ContactSensor)
R_Sensor = 1 - (1 - R_Component)^4 = 0.99
MTBF = -t / ln(R) = -0.4 / ln(0.99) ≈ 40 h
21
Reliability of Individual Subsystems (contd.)
Perceiving Subsystem

[Series reliability block diagram: Input → Image Processing Software → DBMS → Output]

R_Perceiving = R_ImageProcessingSW × R_DBMS
R_Perceiving = 0.995
MTBF = -t / ln(R) = -0.4 / ln(0.995) ≈ 80 h
22
Reliability of Individual Subsystems (contd.)
Planning Subsystem

R_Planning = 0.989
MTBF = -t / ln(R) = -0.4 / ln(0.989) ≈ 36 h

Navigation Control Subsystem

[Series reliability block diagram: Input → Procured Vehicle → Actuator → Output]

R_Navigation = R_ProcuredVehicle × R_Actuator
R_Navigation = 0.99
MTBF = -t / ln(R) = -0.4 / ln(0.99) ≈ 40 h
23
Reliability of Individual Subsystems (contd.)
Safety Control Subsystem

[Parallel reliability block diagram: Safety Control Button, E-Stop Transmitter, Safety Monitor, and Klaxon in parallel between Input and Output]

R_Safety = 1 - (1 - R_SCB)(1 - R_EStop)(1 - R_SafetyMonitor)(1 - R_Klaxon)
R_Safety = 1 - (1 - R_Component)^4 = 0.995
MTBF = -t / ln(R) = -0.4 / ln(0.995) ≈ 80 h
24
Reliability of Individual Subsystems (contd.)
Media Control Subsystem

R_MCS = 0.90
MTBF = -t / ln(R) = -0.4 / ln(0.90) ≈ 4 h
25
Reliability of the Race Vehicle
R_RaceVehicle = R_Sensor × R_Perceiving × R_Planning × R_Navigation × R_SafetyControl
= 0.99 × 0.995 × 0.989 × 0.99 × 0.995
≈ 0.96

The race vehicle has a probability of approximately 96% that it will cross the finish line.
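The 0.96 result is the product of the five subsystem reliabilities quoted on the preceding slides (the redundant sensor and safety-control subsystems having already been rolled up with the parallel formula 1 - (1 - R_component)^4). A minimal sketch of the roll-up:

```python
import math

# Subsystem reliabilities as stated on the preceding slides.
subsystems = {
    "sensor":         0.990,   # GPS / RADAR / LIDAR / contact sensor in parallel
    "perceiving":     0.995,   # image-processing software and DBMS in series
    "planning":       0.989,
    "navigation":     0.990,   # procured vehicle and actuator in series
    "safety_control": 0.995,   # button / E-stop / monitor / klaxon in parallel
}
# Redundant subsystems use the parallel formula R = 1 - (1 - R_component)**n
# before entering this series product.

R_vehicle = math.prod(subsystems.values())   # series combination
print(f"Race vehicle reliability: {R_vehicle:.3f}")   # ~0.96
```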
26
Failure Mode, Effects and Criticality Analysis [FMECA]
[Ishikawa cause-and-effect diagram for the effect "Fail to Accomplish Mission". Cause branches: Sensing Failure (GPS, RADAR, LIDAR, Contact Sensor failures), Perceiving Failure (Image Processing Software failure, Database crash), Planning Failure (Software crash, unrecognized or invalid input), Navigation Failure (Procured Vehicle failure: brake, accelerator, steering; Actuator failure), and Safety Control Failure (Safety Control Buttons, E-Stop, Safety Monitor, Klaxon failures).]
Ishikawa – Cause and Effect Diagram
27
Failure Mode, Effects and Criticality Analysis [FMECA]
Failure analyzed: Safety Control failure (Reference Number = 7.4.5)

• Safety Control Button Failure – Effect: the race vehicle does not stop. Detection: during testing and by tracking the movement in GPS. Severity 8, Frequency 7, Detection 2, RPN 112.
• E-Stop Failure – Effect: the race vehicle does not respond to output signals. Detection: during testing and by tracking the movement in GPS. Severity 9, Frequency 2, Detection 7, RPN 126.
• Safety Monitor Failure – Effect: the Safety Monitor does not exhibit its mode of operation. Detection: during testing and by evaluating the data on the Safety Monitor. Severity 4, Frequency 8, Detection 7, RPN 224.
• Klaxon Failure – Effect: no response to the signal. Detection: during testing and by verifying the sound. Severity 3, Frequency 5, Detection 3, RPN 45.

RPN = Potential Severity × Potential Frequency × Potential Detection
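Each RPN above is simply the product of the three ratings. A small sketch that computes and ranks the RPNs from this table (the mapping of the rating columns to severity, frequency, and detection follows the usual FMECA convention and is an assumption, not something stated on the slide):

```python
# FMECA Risk Priority Number: RPN = severity * frequency * detection.
# Ratings are the values from the table above.
failure_modes = [
    ("Safety Control Button Failure", 8, 7, 2),
    ("E-Stop Failure",                9, 2, 7),
    ("Safety Monitor Failure",        4, 8, 7),
    ("Klaxon Failure",                3, 5, 3),
]

ranked = sorted(
    ((name, sev * freq * det) for name, sev, freq, det in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"{name}: RPN = {rpn}")   # Safety Monitor Failure ranks highest (224)
```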
28
Race Vehicle
Subsystems
• Sensor – GPS, Radar, Lidar
• Perceiving – Object recognition, map of environment
• Planning – Where to go? How to get there?
• Navigation – Go
• Media Control – Record progress
• Safety Control – Monitor all systems, handle failures, emergency stop
29
Integration Strategy
• Integrate most crucial subsystems first
• Begin low-level integration
• Continue on, achieve more sophisticated functionality
• Iterative process: Develop -> Integrate -> Test
30
Integration Plan
• Navigation Control – Stop, Start, Steer the vehicle
• Safety Control – Emergency shutdown, System monitor
• Sensing to Perceiving – Raw data from Sensing transformed into Perceived Truth
• Perceiving to Planning – Plan a route based on Perception
• Planning to Navigation – Follow route, achieve goals
Unit, Integration and Systems Test
OBJECTIVE
To verify and validate the Race vehicle subsystem based on the Requirements
Analysis for the Pikes Peak Hill Climb Race
FUNCTIONAL ELEMENTS
• Sense – Sensor Subsystem
• Perceive – Perceiving Subsystem
• Plan – Planning Subsystem
• Navigate – Navigation Control Subsystem
• Record – Media Control Subsystem
• Monitor Safety – Safety Control Subsystem
32
Unit, Integration and Systems Test
SCOPE
Analytical Evaluation – Design relationships and potential issues
Type 1 Testing – Performance models and design characteristics
Type 2 Testing – Initial qualification of the race vehicle for the race
Type 3 Testing – Operation of the race vehicle after integration
Type 4 Testing – True capability and operational effectiveness
SEQUENCE
Static Testing
Operational Testing
33
Unit, Integration and Systems Test
TYPE 2 TESTING
Performance test
Environmental qualification
Structural test
Technical data verification
Software verification
Reliability qualification
Maintainability demonstration
TYPE 3 TESTING
Compatibility between the prime equipment and the software
Compatibility between the prime equipment and the support equipment
Operational activities of the Race vehicle subsystem
34
Unit, Integration and Systems Test
PREPARATION FOR TEST AND EVALUATION
Test and evaluation procedures
Test site selection
Test personnel
Test facilities and resources
Test supply support
TEST
35
Unit, Integration and Systems Test
[Test hierarchy diagram: unit test for each of the components in the 6 subsystems → subsystem test → system integration test → system test, with reviews and quality checks at each stage]
36
Unit, Integration and Systems Test

[Test build-up diagram (annotated "Reliability"): unit tests (Sense_unit, Perceive_unit, Plan_unit, Navigate_unit, Record_unit, Be safe_unit) lead to subsystem tests (Sense, Perceive, Plan, Navigate, Record, Be safe), which are progressively combined in subsystem integration tests (Navigate safe; Navigate safe + Sense; Navigate safe + Sense + Perceive + Plan; Record) and culminate in the system tests Run and Win. Legend: Unit test, Subsystem test, Subsystem Integration test, System test.]
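As a rough illustration of the lowest two layers of this build-up, the sketch below shows a unit test and a subsystem integration test written with Python's unittest; the Sensor and Perceiver classes are hypothetical stand-ins, not code from the SkyBot project:

```python
import unittest

# Hypothetical stand-ins for the Sense and Perceive components.
class Sensor:
    def read(self):
        return {"range_m": 12.0, "bearing_deg": 3.5}   # stubbed raw reading

class Perceiver:
    def detect_obstacle(self, reading):
        return reading["range_m"] < 20.0               # obstacle if closer than 20 m

class SenseUnitTest(unittest.TestCase):
    """Unit test: exercise one component in isolation."""
    def test_read_returns_range_and_bearing(self):
        reading = Sensor().read()
        self.assertIn("range_m", reading)
        self.assertIn("bearing_deg", reading)

class SensePerceiveIntegrationTest(unittest.TestCase):
    """Subsystem integration test: feed sensor output into the perceiver."""
    def test_obstacle_detected_from_sensor_reading(self):
        reading = Sensor().read()
        self.assertTrue(Perceiver().detect_obstacle(reading))

if __name__ == "__main__":
    unittest.main()
```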
37
Unit, Integration and Systems Test
POST TEST AND EVALUATION
Test plan review
Reliability test
Regression test
Acceptance by the Race Administration
KEY NOTES
A few tests might not completely succeed
Mission-critical tests should pass
38
The SkyBot Team Uses Technical Performance Measures
(TPMs) To Improve Quality
• Influence the system design process to incorporate the right attributes to produce a system that will ultimately meet customer requirements effectively. (Blanchard and Fabrycky, 2006)
• Technical performance measures are used to mitigate risk during design and manufacturing. Measures (or metrics) are used to help manage a company's processes. (Moody et al., 1997)
• Measurement is the key. If you cannot measure it, you cannot control it. If you cannot control it, you cannot improve it. (Dean and Bahill, Sandia National Labs, 2006)
39
SkyBot TPMs Bear In Mind Three Keys To Quality
• Three Key SkyBot Performance Parameters Selected
• Only five TPMs were selected
• Development Cost is a TPM
• Schedule is a TPM
• Simulation Testbed
40
The Five TPMs Are Derived From Requirements
Specifications
1. Remote Shutdown – Requirement 2.1.2.4
2. Schedule – Requirement 2.3.1
3. Velocity – Requirement 2.2.1
4. Endurance – Requirement 2.5.1
5. Development Cost – Requirement 2.4.1

Key Performance Parameters: Remote Shutdown, Velocity, Endurance
Business Metrics: Schedule, Development Cost
41
TPM Monitoring Is Closely Integrated With System
Testing
[TPM tracking diagram: Remote Shutdown, Velocity, and Endurance are measured at Milestones 1–4, aligned with Unit Tests, Subsystem Tests, Subsystem Integration Tests, and System Tests]
42
The Highest Priority TPM Was Remote Shutdown
• The metric is the time (in seconds) it takes for the race vehicle to come to a complete stop
• Isolation of problem areas such as software processing time or actuation errors will be important
• The vehicle must come to a controlled stop; extremely low times could be misleading
43
Constrained Development Time Makes Schedule An
Important Measure
• Requirement 2.3.1 stipulates that integration should take place at least four weeks before the race
• This leaves less than a month for the development phase
• Testing milestones will be weekly, but the schedule will be monitored daily to look for areas that can be accelerated
44
Race Vehicle Velocity Will Be Critical In SkyBot’s
Success
• A marginal positive trend from the first two milestones is expected, but Subsystem Integration Testing will prove most valuable
• The measure will be the maximum speed reached during testing
45
The Race Vehicle’s Endurance Will Help Win The
Race
• The endurance measure is the Mean Time Between Failure for two-hour runs
• Failure could be caused by any subsystem; examples include poor garbage collection and poor startup routines
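Since the endurance TPM is a Mean Time Between Failure over two-hour runs, it can be estimated during testing as total operating hours divided by the number of failures observed. A small sketch with placeholder run data (the numbers are illustrative, not SkyBot test results):

```python
# Estimate MTBF from two-hour endurance runs: MTBF ≈ total hours / failures observed.
run_hours = [2.0, 2.0, 2.0, 2.0, 2.0]   # five two-hour test runs (illustrative)
failures  = [0,   1,   0,   0,   1]     # failures observed in each run (illustrative)

total_hours = sum(run_hours)
total_failures = sum(failures)
mtbf = total_hours / total_failures if total_failures else float("inf")
print(f"Estimated MTBF: {mtbf:.1f} h over {total_hours:.0f} test hours")
```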
46
The Cost of Development Must Be Monitored
• The project management plan provides a budget, but ultimately, spending will vary

[Chart: Estimated Spending Per Milestone – Total Spending in dollars (thousands, 0–120) tracked across Milestones 1–4]
47
Milestones Will Be Closely Coordinated With The Testing
Schedule
[Calendar chart, August 2006: Milestones 1–4 fall in successive weeks of the month and are aligned with the testing phases (Unit Testing, System Testing, Subsystem Integration Testing, System Testing)]
48
Simulation Benefits And Utilization
• A difficulty with performance metrics is getting good measurements during development
• Through the use of simulation we can get informed data about the race vehicle during its development, instead of waiting for all subsystems to be available
• We intend to do some exploratory work to help define “where we should be” early on; a sketch of the idea follows
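A minimal sketch of that idea, assuming a toy hill-climb model: it only shows how a simulation testbed could produce early velocity and finish-rate estimates before all subsystems exist. Every parameter below is invented for illustration and is not from the SkyBot simulation:

```python
import random

# Toy hill-climb simulation: estimate average speed over the 12.4-mile course
# before real subsystems are available. All parameters are illustrative.
COURSE_MILES = 12.4
random.seed(1)

def simulate_run():
    """Return (average mph, finished) for one simulated ascent."""
    miles_done, minutes = 0.0, 0.0
    while miles_done < COURSE_MILES:
        speed = random.uniform(18.0, 30.0)   # mph on the current segment
        miles_done += speed / 60.0           # advance one simulated minute
        minutes += 1.0
        if random.random() < 0.01:           # simulated subsystem fault
            return miles_done / (minutes / 60.0), False
    return COURSE_MILES / (minutes / 60.0), True

runs = [simulate_run() for _ in range(100)]
avg_speed = sum(speed for speed, _ in runs) / len(runs)
finish_rate = sum(finished for _, finished in runs) / len(runs)
print(f"Simulated average speed: {avg_speed:.1f} mph, finish rate: {finish_rate:.0%}")
```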
49
Lessons Learned
1. Change of perspective from an object-oriented to a systems approach
2. Learned decision-making tools and techniques of systems engineering
3. Developed a better understanding of stakeholders' perspectives
4. Understood the importance of version control for documents
5. Realized the importance of good communication within the team
50
Q&A
???
51