LADEE Flight Software:
Models to Flight
Karen Gundy-Burlet, Ph.D.
LADEE FSW Lead
NASA-Ames Research Center
Moffett Field, CA
Nathan Benz
Peter Berg
Howard Cannon
Pat Castle
Scott Christa
Eleanor Crane
Doug Forman
Greg Limes
Mike Logan
Masoud Mansouri-Samani
Craig Pires
Fritz Renema
Larry Shackelford (GSFC)
Danilo Viazzo
Mission Overview
• The Lunar Atmosphere and Dust Environment Explorer (LADEE) is a NASA mission that will orbit the Moon; its main objective is to characterize the lunar atmosphere and dust environment.
– Low-cost, minimal-complexity, rapidly prototyped “common bus” design
– Model-Based Software Development
• Specific objectives are:
– Determine the global density, composition, and time variability of the lunar atmosphere
– Confirm the Apollo astronaut sightings of dust jumps and diffuse emission
– Laser Communications Demonstration: record 622 Mbps download rate from the Moon!
[Figure: Clementine spacecraft image of moon dust corona]
[Figure: Gene Cernan’s drawings of the lunar sunrise]
Outline
Review of LADEE Development Process & Software Architecture
Some Lessons Learned
•Model-Based Development
•NPRs/CMMI
•“Just In Time” organizations
•Interface Control Documentation
•Emergent Behavior
Current Status
Flight Software Overview
• Scope
– Onboard Flight Software (Class B)
– Support Software and Simulators (Class C)
– Integration of FSW with avionics
• Guiding Documents
– NPR 7150.2 Software Engineering Requirements
– CMMI Level 2
– NASA-STD-8739.8 NASA Software Assurance Standard
• Development Approach
– Model-Based Development paradigm (process prototyped using a “Hover Test Vehicle”)
– 5 incremental software builds, 2 major releases, 3 final sub-releases:
• 5.1: defects found by I&T and 3DOF
• 5.2: defects found by Mission Operations Testing
• 5.3: final RTS set for the Golden Load
• Leverage Heritage Software
– GOTS: GSFC OSAL, cFE, cFS, ITOS
– MOTS: Broad Reach Drivers
– COTS: VxWorks, Mathworks Matlab/Simulink & associated toolboxes
Model Based Development
Iterate Early and Often
[Diagram: iterative loop among Requirements, Design/Algorithm Development, Flight Software Modeling, Vehicle & Environment Modeling, Heritage Models, Heritage Software, Hand-Developed Apps, Code Generation, Analysis, Unit Tests, Integrated Tests, Automated Reporting, Workstation Simulations (e.g. Simulink), Processor-in-the-Loop, Hardware-in-the-Loop, and Verification]
• Develop models of FSW, vehicle, and environment
• Automatically generate high-level control software
• Integrate with hand-written and heritage software
• Iterate while increasing fidelity of tests – Workstation Sim (WSIM), Processor-In-the-Loop (PIL), Hardware-in-the-Loop (HIL)
• Automated self-documenting tests providing traceability to requirements
FSW Architecture
[Diagram: OFSW architecture. Simulink tasks, behind a Simulink Interface Layer: Cmd & Mode Processor, Actuator Manager, State Estimator, Safe Mode Controller, Attitude Control System, Thermal Control System, Power Control System, Battery Charge System, Memory Scrub, Hardware I/O. cFS tasks: Scheduler, Stored Commands, Health & Safety, Memory Manager, File Manager, Memory Dwell, Limit Checker, CCSDS File Delivery, Checksum, Housekeeping, Data Storage, Telemetry Output, Command Ingest. Hand-written tasks sit alongside, all running on System Support and O/S Services. External interfaces: telemetry, ground commands, hardware commands, sensor data.]
Foundation: GSFC OSAL, cFE, cFS, ITOS (GOTS); Broad Reach Drivers (MOTS); Simulink/Matlab, VxWorks (COTS)
The Basic Question
Model-Based Development with significant software re-use:
- Hype or Help?
Advantages:
•High level control software in the “Native Language” for GN&C developers
•Allowed development to be highly parallelized
– Modular design for all applications, Simulink or hand-developed
– Software developers could prototype autocode/integration process independent
of Simulink module development
– Early requirements definition let us formalize the test infrastructure early in the
development cycle.
•Easy to communicate design/algorithms/data flow with stakeholders and other
subsystems.
•Simulink Report Generator is an extremely powerful tool for driving the verification
system.
•Generated code was generally clean and efficient.
– Static analysis tool found some defects, which were eliminated through changes
in modeling practices.
The Basic Question, cont.
Disadvantages:
• Have to be very careful when patching Simulink applications: the associated
parameter table must be uploaded with the patch, and interfaces/buses cannot be
changed in flight.
• Bus changes during development induced significant rework of test harnesses.
Personal Experience from a Simulink Newbie: LADEE Propulsion Model
• Surprisingly fast to model. Amenable to modular development.
• Think carefully about flow and propagation of signals. Painful to update
models/test harnesses.
• Absolutely needed advice/example models from Simulink expert because of
the complexity, range of modeling choices/parameters and their effects on
the autocode.
Lesson: Model-Based Development is effective when used in a highly
disciplined manner. Sloppy development practices lead to bad outcomes, no
matter what the language.
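One disciplined practice implied by the patching caveat above is keeping a patched application and its parameter table in lockstep. A minimal sketch of that idea (illustrative only, not flight code; all names hypothetical):

```python
# Reject a parameter-table upload whose checksum does not match what the
# patched application expects, so a patch can never fly with a stale table.
import zlib

def table_crc(table_bytes):
    """CRC-32 of a serialized parameter table."""
    return zlib.crc32(table_bytes) & 0xFFFFFFFF

def check_patch_compatibility(app_expected_crc, uploaded_table):
    """True only if the uploaded table is the one the patch was built against."""
    return table_crc(uploaded_table) == app_expected_crc

params_v2 = b"gain=0.8;limit=2.5"
expected = table_crc(params_v2)
assert check_patch_compatibility(expected, params_v2)        # matching pair: accept
assert not check_patch_compatibility(expected, b"gain=0.9")  # stale table: reject
```

The design point is that compatibility is checked mechanically, not remembered by an operator under time pressure.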
NASA Procedural Requirements (NPRs)
& Capability Maturity Model Integration (CMMI)
My Perspective:
• A daunting set of rules/regulations/practices that have arisen from lessons
learned and bad outcomes on other projects.
• They are an effective roadmap/checklist that helps guide how to plan and
document our solutions to prevent those problems.
• Strategy: comply as simply and effectively as possible.
Lesson Learned:
• Safety and Mission Assurance/Independent Reviewers can be a real ally
– Providing templates and advice about effective practices.
– Extra experienced eyes during the planning phase leads to fewer process
issues and software defects later.
– Early understanding by SMA/IR of project practices leads to smoother and
more effective reviews.
• When you get to the inevitable time crunch at a critical milestone, it is
essential to have practiced, documented processes.
– Multiple examples in LADEE of schedule being brought in by well-practiced test
program
“Just In Time” Organizations
Our “final” 2 build cycles (4 & 5) were major deliveries to the spacecraft.
•We smugly delivered them “Just In Time” to meet I&T schedule needs.
– I&T picked up both builds and tested extensively with them
– I&T and the 3DOF testbed identified ICD related defects
MOS “Just In Time” schedule not tied to these releases.
•From a FSW perspective, this delayed discovery of:
– More defects in EDICD/Fault management logic.
– Hidden Requirements on the spacecraft simulator model.
– Our concept of “Test Like You Fly” was not the way MOS actually flew.
•More importantly, the schedule was not tied to “Golden Load” deadline
– Milestones for development and certification of MOS RTSs were scheduled for after
the need date for inclusion in the “golden load”.
– Schedule Catch 22: They needed the “golden load” to certify the RTSs for inclusion
in the golden load.
– This put FSW back on the critical path for the spacecraft launch date!
Lessons Learned:
•“Just in Time” philosophy leaves no room for missed schedule dependencies
•We missed valuable test time on the end-to-end operation of the spacecraft,
delaying the identification of defects. Some were simply too late to be
incorporated in the “golden load”.
Interface Control Documentation
The primary cause of defect escape into Build 5.x was misunderstood or
ambiguous ICDs.
Examples:
• Fuel Tanks (A,B) = FSW (1,2); Ox Tanks (A,B) = FSW (2,1)
• Star tracker: we interpreted 8 Hz operation to mean heads alternately operating at
4 Hz. Instead, they operated synchronously.
• Reaction Wheels: the “Current Limit Flag” was a warning flag, not an error flag.
Mitigation:
• EDICD in computer-readable format, read into a database, so we could quickly
reconfigure
• Ability to upload parameter tables & software patches
Best prevention for ICD problems was our “Travelling Roadshow”
• EDU in a mobile chassis loaded with the FSW
• Flown to payload sites and integrated with instruments
Lessons Learned:
• Early integration with payloads/instruments is essential to clarifying ICDs.
• Significant rework is possible when one “saves money” by not purchasing
instrument engineering development units.
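The tank example above is exactly the kind of mapping that should live in one explicit, reviewed table rather than as an assumption scattered through the code. A sketch of that defensive pattern (illustrative only, not LADEE code):

```python
# Encode the tank ICD mapping as an explicit table. Note the asymmetry from
# the slide: fuel tanks (A,B) map to FSW indices (1,2), while ox tanks (A,B)
# map to (2,1) -- precisely the detail an ambiguous ICD lets two teams read
# differently.
FUEL_TANK_INDEX = {"A": 1, "B": 2}
OX_TANK_INDEX = {"A": 2, "B": 1}  # reversed relative to the fuel tanks

def fsw_index(tank_type, designation):
    """Translate a spacecraft tank designation (A/B) to an FSW index (1/2)."""
    table = {"fuel": FUEL_TANK_INDEX, "ox": OX_TANK_INDEX}[tank_type]
    return table[designation]

print(fsw_index("fuel", "A"))  # -> 1
print(fsw_index("ox", "A"))    # -> 2
```

With the mapping in one place, a database-driven EDICD (as the mitigation describes) can regenerate the table rather than hand-editing it in several modules.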
Emergent Behavior
Prior to one of the orbital phasing maneuvers, the spacecraft went into “safemode”.
• Fault Management took action because it detected an excessive spacecraft
rate from the state estimation system.
• It was determined that the star tracker caused a jump in the state estimator
signal. Two primary factors:
• The as-built alignment of the star tracker was slightly different than as-designed.
• Fixed by a table upload. Reduced rate errors but did not eliminate
further safemode transitions.
• The star tracker was exhibiting delays in providing the spacecraft
orientation and position when one of the cameras pointed at a "Big Bright
Object" (BBO) such as the Sun, Moon or Earth. The behavior continued to
worsen the closer we got to the moon.
• Reworked state estimator model, reran scenarios, re-verified EST Unit
Tests, GN&C L4 Requirements & uploaded patch to the spacecraft.
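The fault-management check described above can be sketched as an excessive-rate monitor with a persistence count, so that a single star-tracker-induced jump in the estimate does not immediately trip safe mode (illustrative only, not LADEE code; thresholds are hypothetical):

```python
# Declare a rate fault only when the estimated body rate exceeds the limit
# for several consecutive cycles, filtering out one-sample estimator jumps.

RATE_LIMIT_DPS = 2.0      # deg/s, hypothetical limit
PERSISTENCE_CYCLES = 3    # consecutive violations required to declare a fault

def detect_rate_fault(rate_history):
    """True if the last PERSISTENCE_CYCLES samples all exceed the limit."""
    recent = rate_history[-PERSISTENCE_CYCLES:]
    return len(recent) == PERSISTENCE_CYCLES and all(
        r > RATE_LIMIT_DPS for r in recent
    )

assert not detect_rate_fault([0.1, 3.0, 0.2, 2.5])  # transient jump: no fault
assert detect_rate_fault([0.1, 2.5, 2.6, 2.7])      # persistent: go to safemode
```

Tuning the limit and persistence count is itself a parameter-table change, which is why the reworked estimator could be re-verified and uploaded as a patch plus table rather than a full build.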
Lesson Learned:
• The defects we had to correct under schedule pressure and duress in I&T and
ORTs were excellent repeated practice for polishing our maintenance
processes.
Current Status
LADEE is currently in orbit around the moon performing science operations.
• Successful Laser Communications demonstration: 622 Mbps downlink rate.
Very useful to be able to download an SDRAM partition in less than 2 minutes.
• Over 2000 hours of flight.
• Lowest altitude above the moon’s surface so far is 12.1 km.
• Expected to operate until mid-April when eclipses will occur.
LADEE Flight Software Status
• Team recertified for CMMI level 2 in May 2013
• Some minor defects found that we’ve worked around.
• Table uploads performed (ATS, RTS, FM, Thermal updates & defect reduction)
• 2 software patches to account for star tracker behavior
• 1 reboot (still under investigation)
LADEE FSW: Enabling a successful mission!