Alternative Response Technology
API Study - Progressing Learnings
Michael J. Cortez
BP America - Oil Spill Technology Manager
RRT-3 Presentation - November 6, 2013
API Alternative Response Technology
Working Group Report
• Commissioned by API as part of its Sept 2010 JITF Study
– Capture learnings from the Macondo Incident
– Propose ARTES enhancements based on those learnings
• Team composition: ARTES team members from Macondo (USCG, OSPR, O'Brien's, various OSR consultants, BP) along with NOAA, EPA, and other industry entities
• Study kicked off in Dec 2011; completed and endorsed in July 2013
• RRT & NRT presentations proposed for late 2013 to review the conclusions & recommendations
• Request: RRT-3 endorsement of the proposed changes to ICS
Pre-Macondo ARTES
• ARTES within the Environmental Unit
• For non-conventional response tools
• Evaluate one idea/tool at a time
• Used in concert with the NCP Product Schedule
• Limited external public engagement
• Used sporadically, therefore never truly embedded in the ICS structure
Macondo Experience
• 120,000+ total submissions
• Multiple technical reviews required
• Conventional & non-conventional ideas submitted
• Submissions via phone, fax, e-mail, internet, and walk-up
• From 100 countries in 88 languages
• Multiple submission channels (PIERS, EPA, IATAP, LABOEC)
• Multiple Incident Command Posts and a Unified Area Command
• Sought out operational needs
• Field tested 100 new technologies; 45 were proven and implemented
• Limited exposure within the Planning cycle increased field-testing logistics difficulties
Post Macondo Study Recommendations
• Scalable Response Technology Evaluation
(RTE) Unit within the Planning Section.
• RTE Unit collects, evaluates, and tests all new
or improved technology ideas (conventional
and alternative).
• RTE Unit is the single ICS channel for public or
vendor submissions (see the intake sketch after this list).
• Use of multi-stage, progressive technical
assessment for idea evaluation.
• The National Response Team should solicit the
development of a database system.
• Use FEMA Responder Knowledge Base as the
“clearinghouse” to hold and disseminate RTE
assessment/test results.
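The single-channel recommendation lends itself to a small illustration. Below is a minimal Python sketch, not from the report, of how one RTE Unit intake point might normalize submissions arriving by phone, fax, e-mail, internet, or walk-up into a single evaluation backlog; all names (SubmissionSource, Submission, rte_intake) are hypothetical, invented for this example.

```python
"""Illustrative sketch (not the working group's design): a single RTE Unit
intake channel normalizing submissions from multiple sources."""
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class SubmissionSource(Enum):
    PHONE = "phone"
    FAX = "fax"
    EMAIL = "e-mail"
    INTERNET = "internet"
    WALK_UP = "walk-up"


class IdeaType(Enum):
    CONVENTIONAL = "conventional"
    ALTERNATIVE = "alternative"


@dataclass
class Submission:
    submitter: str
    summary: str
    source: SubmissionSource
    idea_type: IdeaType
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# Single queue: every channel feeds the same RTE Unit backlog, so
# conventional and alternative ideas follow one consistent process.
RTE_QUEUE: list[Submission] = []


def rte_intake(submission: Submission) -> int:
    """Register a submission with the RTE Unit and return a tracking id."""
    RTE_QUEUE.append(submission)
    return len(RTE_QUEUE)  # naive tracking id, for illustration only


if __name__ == "__main__":
    tracking_id = rte_intake(Submission(
        submitter="Example Vendor",
        summary="Improved skimmer head design",
        source=SubmissionSource.INTERNET,
        idea_type=IdeaType.ALTERNATIVE,
    ))
    print(f"Submission registered as #{tracking_id}")
```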
Technical Work Group Members
James Fletcher - USCG
Kurt Hansen - USCG
Yvonne Addassi - OSPR
Ellen Faurot-Daniels - OSPR
Ed Levine - NOAA
Eric Koglin - EPA
Jim O'Brien - O'Brien's Response
Steve Lehmann - NOAA
Ken Lukins - Lukins & Associates
Tommy Tomblin - ExxonMobil
Erik Demicco - ExxonMobil
Rene Bernier - Chevron
Hunter Rowe - BP
Mark Moran - BP
Mike Cortez - BP (Working Group lead)
Backup Slides
Current & Proposed ICS Structure
RTE Scalability Examples
• RTE Unit positions added as needed; written Incident Action Plans; multiple operational periods (smaller team)
• Expanded RTE Unit structure within a single ICP; multiple operational periods; high submissions; field testing required
RTE Scalability Examples
RTE Unit Structure with multiple Incident Command Posts and when
an Area Command is established for a Tier 1 or SONS Event
Stage-Gate Review Process
DWH ARTs Process (diagram)
• Four-stage review and testing process (similar to the diagram above)
• Automated communication informing submitters of status
• Scoring system suitable for ranking and prioritization (see the sketch below)
• Review and scoring by technical specialists in the RTE Team (separated from testing roles)
• Test protocols written by technical specialists in the RTE Team
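To make the gate mechanics concrete, here is a hedged Python sketch of a four-stage, score-gated progression with automated status notification after each decision. The stage names, 1-5 scoring scale, pass threshold, and notify() stub are assumptions for illustration, not the working group's actual protocol.

```python
"""Hypothetical stage-gate walk-through: an idea advances only if its
average score at the current stage clears a threshold, and the submitter
is notified automatically after every gate decision."""

STAGES = ["initial screen", "technical review", "detailed evaluation", "field test"]
PASS_THRESHOLD = 3.0  # assumed minimum average score (1-5 scale) to pass a gate


def notify(submitter: str, stage: str, decision: str) -> None:
    # Stand-in for the automated e-mail described on the slide.
    print(f"[auto-mail to {submitter}] stage '{stage}': {decision}")


def run_stage_gate(submitter: str, stage_scores: dict[str, list[float]]) -> str:
    """Walk the gates in order; stop at the first failed gate."""
    for stage in STAGES:
        scores = stage_scores.get(stage, [])
        avg = sum(scores) / len(scores) if scores else 0.0
        if avg >= PASS_THRESHOLD:
            notify(submitter, stage, f"passed (avg score {avg:.1f})")
        else:
            notify(submitter, stage, f"not advanced (avg score {avg:.1f})")
            return f"stopped at {stage}"
    return "completed all stages"


if __name__ == "__main__":
    # Scores come from RTE technical specialists, kept separate from testers.
    outcome = run_stage_gate("Example Vendor", {
        "initial screen": [4.0, 3.5],
        "technical review": [3.2, 3.8, 3.0],
        "detailed evaluation": [2.5, 2.0],  # fails at this gate
    })
    print(outcome)
```

Per-stage averages also give the ranking and prioritization the slide calls for: ideas can be sorted by their latest gate score.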
Database Requirements
• Maintain a record of submittals, including timestamps for all entries, changes, or updates
• Secure, but accessible by multiple people at multiple locations
• Tracks evaluation stages and retains testing records
• Simple, web-based entry format
– Minimum/maximum information requirements for submissions
– Disclaimers for intellectual property (IP) issues, responsible-party (RP) issues, required confidentiality clauses, and contractual relationships
• Automated e-mail to submitters upon review-stage decisions, to maintain communication and establish transparency
• Technology evaluation score archiving
• Retention and linking of field-testing results to the submission record (schema sketch below)
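As one way to picture these requirements, the following minimal sketch (assuming SQLite, with invented table and column names) shows a schema with timestamped entries, an append-only event log covering stage tracking and score archiving, and field-test results linked back to the submission record.

```python
"""Minimal schema sketch for the requirements above; not the NRT system."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE submission (
    id            INTEGER PRIMARY KEY,
    submitter     TEXT NOT NULL,
    summary       TEXT NOT NULL,
    created_at    TEXT DEFAULT CURRENT_TIMESTAMP   -- timestamp on entry
);

-- Append-only log: every change or update gets its own timestamped row,
-- which tracks evaluation stages and archives each evaluation score.
CREATE TABLE submission_event (
    id            INTEGER PRIMARY KEY,
    submission_id INTEGER NOT NULL REFERENCES submission(id),
    stage         TEXT NOT NULL,        -- which evaluation stage this concerns
    decision      TEXT,                 -- e.g. 'passed', 'not advanced'
    score         REAL,                 -- archived evaluation score, if any
    occurred_at   TEXT DEFAULT CURRENT_TIMESTAMP
);

-- Field-test results retained and linked back to the submission record.
CREATE TABLE field_test_result (
    id            INTEGER PRIMARY KEY,
    submission_id INTEGER NOT NULL REFERENCES submission(id),
    protocol      TEXT NOT NULL,
    result        TEXT NOT NULL,
    tested_at     TEXT DEFAULT CURRENT_TIMESTAMP
);
""")

# Example round trip: register a submission and log a gate decision.
cur = conn.execute(
    "INSERT INTO submission (submitter, summary) VALUES (?, ?)",
    ("Example Vendor", "Improved skimmer head design"),
)
conn.execute(
    "INSERT INTO submission_event (submission_id, stage, decision, score) "
    "VALUES (?, ?, ?, ?)",
    (cur.lastrowid, "initial screen", "passed", 4.0),
)
print(conn.execute("SELECT COUNT(*) FROM submission_event").fetchone()[0])
```

An event row per decision is also a natural trigger point for the automated status e-mails the slide requires.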