SEMS Auditing: An I3P Auditor’s Perspective
Summary/Analysis of Findings To Date
Kevin Graham
Philip Emanuel, Lead Auditor
Michelle Duncan, Lead Auditor
September 11, 2013
About M&H
M&H
• Established in 1978
• More than 250 engineering and technical professionals
• Offices in Houston and Lafayette
• First I3P to conduct SEMS audit (Directed)
• More than 2 dozen SEMS audits completed or in process
Our Services
• Regulatory Compliance
• Project Management
• Design Engineering
• Information Technology
• Maintenance Management
• Technical Documentation
Our Approach to SEMS Auditing
Three Types of Audits
1. Commissioned Contractor Audits
2. Gap Analysis
   • No Audit Plan, Final Report, or CAP submitted to BSEE
3. Formal I3P Audit
Based on COS SEMS Audit Protocol:
• Some items reorganized into separate questions
• Placeholder for Operator-specific requirements not required by Subpart S or API RP75
Three Phases (Gap Analysis/I3P)
Phase I:
• Corporate-level documentation; program documents/procedures
• Conducted at M&H offices
Phase II:
• Program implementation
• Conducted at Operator’s corporate office/shorebase location
Phase III:
• Program implementation (facility-specific)
• Site visit to each facility identified in Audit Plan
Number of Protocol Requirements by Element/Phase

No of Requirements (Full Protocol)

Element | Phase I | Phase II | Phase III | Total | Percent
1 - General | 33 | 16 | 0 | 49 | 11.92%
2 - Safety and Environmental | 9 | 21 | 0 | 30 | 7.30%
3 - Hazard Analysis | 5 | 25 | 6 | 36 | 8.76%
4 - Management of Change | 17 | 5 | 0 | 22 | 5.35%
5 - Operating Procedures | 23 | 3 | 2 | 28 | 6.81%
6 - Safe Work Practices | 21 | 16 | 7 | 44 | 10.71%
7 - Training | 12 | 18 | 1 | 31 | 7.54%
8 - Assurance of Quality and Mechanical Integrity | 34 | 4 | 6 | 44 | 10.71%
9 - Pre-Startup Review | 8 | 0 | 0 | 8 | 1.95%
10 - Emergency Response and Control | 4 | 15 | 0 | 19 | 4.62%
11 - Investigation of Incidents | 3 | 10 | 0 | 13 | 3.16%
12 - Audit of SEMS Program Elements | 27 | 40 | 0 | 67 | 16.30%
13 - Records and Documentation | 2 | 15 | 1 | 18 | 4.38%
NA - Operator-Specific | 1 | 0 | 1 | 2 | 0.49%
Total | 199 | 188 | 24 | 411 | 100.00%
Percent | 48.42% | 45.74% | 5.84% | 100.00% |
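If it helps in reading the table, the Percent column is consistent with each element’s Total taken as a share of the 411 protocol requirements; for example, for Element 1 - General:

\[
  \text{Percent}_{\text{General}} = \frac{49}{411} \approx 11.92\%
\]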
Number of Protocol Requirements by Element

[Bar chart: “Audit Protocol: Number of Unique Requirements (by SEMS Element)” – per-element values match the Total column of the table above]
Number of Protocol Requirements by Element/Phase

[Bar chart: “Audit Protocol: Number of Phase I, II, and III Requirements (by SEMS Element)” – values match the Phase I, II, and III columns of the table above]
Our Sample (Data Set)
16 Separate Audits
• Includes Gap Analysis and/or Formal I3P Audits
• Gap Analysis Audits: 4
  • 2 of these subsequently accepted by BSEE as meeting formal audit requirements
• I3P Audits: 12
  • 3 of these were BSEE-Directed
  • 1 of these utilized an Operator-defined protocol (not the COS Protocol)
  • 1 of these included only Phases I and II
• Included both production and drilling (where appropriate)
No of Findings (Deficiencies) by Element/Phase

No of Requirements where Findings Occurred

Element | Phase I | Phase II | Phase III | Total | Percent
1 - General | 17 | 5 | 0 | 22 | 11.89%
2 - Safety and Environmental | 6 | 2 | 0 | 8 | 4.32%
3 - Hazard Analysis | 2 | 12 | 2 | 16 | 8.65%
4 - Management of Change | 13 | 4 | 0 | 17 | 9.19%
5 - Operating Procedures | 21 | 1 | 1 | 23 | 12.43%
6 - Safe Work Practices | 15 | 1 | 6 | 22 | 11.89%
7 - Training | 10 | 8 | 0 | 18 | 9.73%
8 - Assurance of Quality and Mechanical Integrity | 13 | 2 | 6 | 21 | 11.35%
9 - Pre-Startup Review | 2 | 0 | 0 | 2 | 1.08%
10 - Emergency Response and Control | 1 | 4 | 0 | 5 | 2.70%
11 - Investigation of Incidents | 3 | 1 | 0 | 4 | 2.16%
12 - Audit of SEMS Program Elements | 16 | 5 | 0 | 21 | 11.35%
13 - Records and Documentation | 2 | 1 | 1 | 4 | 2.16%
NA - Operator-Specific | 1 | 0 | 1 | 2 | 1.08%
Total | 122 | 46 | 17 | 185 | 100.00%
Percent | 65.95% | 24.86% | 9.19% | 100.00% |
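Read together with the requirements table earlier, these counts give a finding rate per element and phase; the rates on the difficulty-ranking slide below appear to be the number of requirements with findings divided by the number of protocol requirements for that element and phase. For example, for Element 5 - Operating Procedures in Phase I:

\[
  \text{Rate}_{\text{Operating Procedures, Phase I}} = \frac{21}{23} \approx 91.30\%
\]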
No of Findings (Deficiencies) by Element/Phase

[Bar chart: “Occurrence of Phase I Findings (by SEMS Element)” – values match the Phase I column of the findings table above]
No of Findings (Deficiencies) by Element/Phase

[Bar chart: “Occurrence of Phase II Findings (by SEMS Element)” – values match the Phase II column of the findings table above]
No of Findings (Deficiencies) by Element/Phase

[Bar chart: “Occurrence of Phase III Findings (by SEMS Element)” – values match the Phase III column of the findings table above]
Occurrence of Findings by Phase/Element – Difficulty Ranking

Rate at Which Findings Occurred (Phase I, Phase II, Phase III, Total) and Element/Phase Difficulty (Findings) Ranking (PH I, PH II, PH III, Overall)

Element | Phase I Rate | Phase II Rate | Phase III Rate | Total Rate | PH I Rank | PH II Rank | PH III Rank | Overall Rank
1 - General | 51.52% | 31.25% | NA | 44.90% | 9 | 6 | NA | 6
2 - Safety and Environmental | 66.67% | 9.52% | NA | 26.67% | 7 | 10 | NA | 10
3 - Hazard Analysis | 40.00% | 48.00% | 33.33% | 44.44% | 10 | 3 | 5 | 7
4 - Management of Change | 76.47% | 80.00% | NA | 77.27% | 5 | 1 | NA | 2
5 - Operating Procedures | 91.30% | 33.33% | 50.00% | 82.14% | 3 | 5 | 4 | 1
6 - Safe Work Practices | 71.43% | 6.25% | 85.71% | 50.00% | 6 | 12 | 3 | 4
7 - Training | 83.33% | 44.44% | 0.00% | 58.06% | 4 | 4 | 6 | 3
8 - Assurance of Quality and Mechanical Integrity | 38.24% | 50.00% | 100.00% | 47.73% | 11 | 2 | 1 | 5
9 - Pre-Startup Review | 25.00% | NA | NA | 25.00% | 12 | 13 | NA | 12
10 - Emergency Response and Control | 25.00% | 26.67% | NA | 26.32% | 12 | 7 | NA | 11
11 - Investigation of Incidents | 100.00% | 10.00% | NA | 30.77% | 1 | 9 | NA | 9
12 - Audit of SEMS Program Elements | 59.26% | 12.50% | NA | 31.34% | 8 | 8 | NA | 8
13 - Records and Documentation | 100.00% | 6.67% | 100.00% | 22.22% | 1 | 11 | 1 | 13
NA - Operator-Specific | 100.00% | NA | 100.00% | 100.00% | | | |
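A minimal sketch of how these rates and ranks can be reproduced from the two count tables above, assuming rate = requirements with findings ÷ total requirements, ranks assigned in descending rate order with ties sharing a rank, and the Operator-Specific row left out of the ranking; the data structure and function names are illustrative, not part of the COS protocol or M&H’s tooling. Run as-is, it reproduces the Phase I Rate and PH I Rank columns shown above.

# Illustrative sketch: recompute the Phase I finding rates and difficulty ranks
# from the requirement/finding counts in the two tables above.
# Assumptions (not stated explicitly in the slides): rate = requirements with
# findings / total requirements; ranks are assigned in descending rate order,
# with tied rates sharing a rank; the Operator-Specific row is not ranked.

# (SEMS element, Phase I protocol requirements, Phase I requirements with findings)
PHASE_I = [
    ("1 - General", 33, 17),
    ("2 - Safety and Environmental", 9, 6),
    ("3 - Hazard Analysis", 5, 2),
    ("4 - Management of Change", 17, 13),
    ("5 - Operating Procedures", 23, 21),
    ("6 - Safe Work Practices", 21, 15),
    ("7 - Training", 12, 10),
    ("8 - Assurance of Quality and Mechanical Integrity", 34, 13),
    ("9 - Pre-Startup Review", 8, 2),
    ("10 - Emergency Response and Control", 4, 1),
    ("11 - Investigation of Incidents", 3, 3),
    ("12 - Audit of SEMS Program Elements", 27, 16),
    ("13 - Records and Documentation", 2, 2),
]

def finding_rates(rows):
    """Finding rate per element; elements with no requirements are skipped (NA)."""
    return {name: found / total for name, total, found in rows if total > 0}

def difficulty_ranks(rates):
    """Competition-style ranks: 1 = highest rate; tied rates share a rank."""
    ordered = sorted(set(rates.values()), reverse=True)
    rank_of_value, next_rank = {}, 1
    for value in ordered:
        rank_of_value[value] = next_rank
        next_rank += sum(1 for v in rates.values() if v == value)
    return {name: rank_of_value[value] for name, value in rates.items()}

if __name__ == "__main__":
    rates = finding_rates(PHASE_I)
    ranks = difficulty_ranks(rates)
    for name in sorted(rates, key=rates.get, reverse=True):
        print(f"{name}: {rates[name]:.2%} (rank {ranks[name]})")
    # e.g. "5 - Operating Procedures: 91.30% (rank 3)"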
Element 5 - Operating Procedures:
Most Common Deficiencies
1. Failure to adequately address:
a. “Simultaneous Operations”
b. “Temporary Operations”
c. “Any lease or concession stipulations established by recognized
governmental authority”
d. “Continuous and periodic discharge of hydrocarbon materials,
contaminants, or undesired by-products into the environment is
restricted by governmental limitations”
e. “Raw materials used in operations and the quality control
procedures used in purchasing these raw materials”
2. Failure to consider human factors associated with format, content, and intended use of operating procedures
3. Failure to include job title and reporting relationship of the person(s)
responsible for each of the facility’s operating areas
4. Failure to consistently implement operating procedures
Element 4 - Management of Change:
Most Common Deficiencies
1. MOC procedures fail to address:
a. Duration of the change, if temporary
b. Necessary time period to implement changes
c. Communication of the proposed change and the
consequences of that change to appropriate personnel
2. Employees not adequately informed of the change prior to startup of the process or affected part of the operation
3. Employees not trained in the change prior to startup of the process or affected part of the operation
4. Where MOC results in a change in operating procedures, such
changes not documented/dated
Element 7 - Training:
Most Common Deficiencies
1. Refresher training not provided to maintain
understanding of and adherence to current operating
procedures
2. Failure to develop procedures for evaluating whether
personnel possess the required knowledge and skills
to carry out their duties and responsibilities, including
startup and shutdown
3. Failure to develop and implement qualification criteria
for operating and maintenance personnel
4. Failure of training program to address operating
procedures
Element 6 - Safe Work Practices:
Most Common Deficiencies
1. Failure to consider human factors in development of
safe work practices
2. No work authorization or “permit-to-work” system
implemented for tasks involving lockout and tagout of
electrical and mechanical energy sources
3. Failure to include provisions in permit-to-work system
for adequate communication of work activities to shift
changes and replacement personnel
4. Failure to include contractors in permit-to-work
communications
5. Failure to develop and implement safe work practices
to control the presence, entrance, and exit of contract
employees in operation areas.
Element 8 - Assurance of Quality/Mechanical Integrity:
Most Common Deficiencies
1. Failure to document for each inspection/test performed of equipment/systems:
a. Name, position, and signature of the person who performed the
inspection/test
b. Date of inspection/test
2. Failure of Mechanical Integrity procedures to address modification of existing equipment and systems, and failure to ensure that procedures are modified for the application for which they will be used
3. Failure of testing, inspection, calibration, and monitoring programs for critical
equipment to include:
a. Documentation of completed testing and inspection
i. Pressure vessel testing/inspection documentation not maintained for
life of equipment
ii. All other documentation not retained for a minimum of 2 years or as
needed
b. Appropriate auditing procedures to ensure compliance with the program
General Notes and Other Observations
• Most common findings in “other” elements
a. Failure to analyze/critique emergency response drills
for the purpose of identifying and correcting
weaknesses (Element 10 – Emergency Response)
b. Failure of the supervisor of the person in charge of
the task to approve the JSA prior to commencement
of the work (Element 3 – Hazards Analysis)
• Management commitment/involvement led to fewer
findings
• Key variable in determining the time/cost required to conduct an audit has been the quality of the Operator’s “Document/Records Management System” – not the size of the Operator company or the number of facilities
General Notes and Other Observations
• The more robust the MOC program/process,
the fewer program deficiencies
• Level of BSEE involvement/participation in
audits has varied
• Competing mindsets: Is this about
compliance? Or is this about management
systems?
• No clear definition or approach yet for verifying knowledge/skills
• Greater focus on the “E” in SEMS is coming