Study on the Causal Relationships between Context and Human Error in Digital-based Control Systems of NPPs

Peng-cheng Li 1, Li Zhang 1,2, Li-cao Dai 1, Yan-hua Zou 2
1 Human Factor Institute, University of South China, Hengyang, China, 421001
2 Hunan Institute of Technology, Hengyang, China, 421001
(lipengcheng0615@163.com)

Abstract - In order to identify the influencing relationships between performance shaping factors (PSFs) and human error in the digital control system of nuclear power plants, an organization-oriented causation conceptual model of human error is first built. It is composed of four modules/levels, namely the levels of organizational factors, situational factors, error-triggering individual factors and human error. A model-based human error classification system is then developed on the basis of this model. Finally, the influencing relationships between contextual factors and human errors are identified on the basis of incident reports, expert opinions and the literature. The results show that the impacts of contextual factors on human errors are very complex: different contextual factors may exert different types of influence on the same human error, and the same contextual factor may exert different types of influence on different human errors.

Keywords - Nuclear power plant, digital control system, human errors, performance shaping factors, causation model of human error

I. INTRODUCTION

With the rapid development of computer, control and information technology, the instrumentation and control (I&C) systems of nuclear power plants (NPPs) have been transformed from traditional analog-based control to digital-based control, and the man-machine interface (MMI) in the control room has been transformed from traditional hard-wired panels to computer-based workstations. The operating environment in an advanced main control room (MCR) is therefore very different from that in a traditional control room, which changes the performance shaping factors (PSFs) affecting human reliability, such as the technology system, the human-system interface (HSI), the procedure system, the alarm system, the analysis and decision support system, and the size, structure and communication paths of the team [1,2]. Operators' cognitive and action modes have also changed, so new human error mechanisms emerge, including new human error modes, new PSFs and new influencing relationships among them. Therefore, it is necessary to develop a new causation model of human error that describes the human error mechanism, and to identify the influencing relationships between context (or PSFs) and human error, in order to provide guidance for human error identification.

II. THE ORGANIZATION-ORIENTED CAUSATION MODEL OF HUMAN ERROR

Some well-known catastrophic accidents in high-hazard industries, such as Three Mile Island, Chernobyl, Piper Alpha, Zeebrugge and Challenger, have shown that organizational factors are among the main causes contributing to human errors. The classical probabilistic risk analysis (PRA) technique focuses on the technical system and human reliability and considers the effects of organizational factors on human errors, but the dependencies between technical systems and organizational factors, and the relationships between situational factors and human errors, are not clearly stated. Mosleh et al. [3] argued that an appropriate model for assessing the influence of an organization on its product (or on metrics of its performance) should consider both structural and behavioral aspects.
The interaction between the organization and systems or components is carried out through the activities of "front-line" staff (such as operators and maintainers). These staff work in a particular contextual environment, and their behaviors and states are influenced by a variety of organizational and situational factors. Each organization is made up of different sub-organizations (or departments), teams, units and personnel, including decision-makers, managers, safety officials, work planners and other staff, and each has its own structure and function. Their activities provide the work conditions for "front-line" operators and manage their activities by motivating, designing the human-computer interface, educating, guiding, managing and constraining their behavior, so as to increase the safety of their performance [4]. Therefore, the organization-oriented "structure-behavior" model has been developed as a guiding framework for incorporating organizational factors into human reliability analysis (HRA), based on field research and system theory [5]. In this paper it is simplified to the form shown in Fig. 1, namely a conceptual causal model of human error, in order to facilitate human error and human reliability analysis. The conceptual causal model is similar to Reason's "Swiss cheese" model [6]. It is composed of four modules/levels, namely the levels of organizational factors, situational factors, error-triggering individual factors and human errors.

III. THE MODEL-BASED HUMAN ERROR CLASSIFICATION SYSTEM

A. Human error classification

With the improvement of the level of automation, the role of the operator has been transformed from operator to monitor, decision-maker and manager in complicated socio-technical systems such as NPPs. MCR operations may be regarded as being performed on the basis of four primary cognitive activities for NPP operations [7,8], namely: (1) monitoring/detection, (2) situation assessment, (3) response planning, and (4) response implementation.

Fig. 1. The conceptual causal model of human error (levels of organizational factors, situational factors, error-triggering individual factors, and human error/human reliability, with the cognitive stages monitor/detect, situation assessment, response planning and response implementation).

Monitoring and detection refer to the activities involved in extracting information from the complex, dynamic work environment [8]. In general, in the monitoring and detection stage, the operator's task is mainly to gather information, either single pieces of information or multiple pieces of information. For individual pieces of information, the operator's cognitive activities are monitoring/detection, recognition and verification, supported by activities such as information filtering and screening. Multiple pieces of information are handled by a combined cognitive activity, namely multiple information collection. When operators detect abnormal events in a plant, they identify and assess the situation to form a reasonable, logical explanation of the plant condition. This process is referred to as situation assessment.
The operator's activities in this stage mainly include comparison, explanation, projection and cause identification [9-11]. Response planning refers to deciding upon a course of action to address an event, given a particular situation assessment. In general, response planning involves identifying goals, generating one or more alternative response plans, evaluating the response plans, and selecting the response plan that best meets the identified goals [12]. Response implementation refers to taking the specific control actions required to perform a task. Five operation teams comprising twenty people were interviewed and investigated (with a semi-structured questionnaire organized by activity process) to identify the specific classification of human errors, as shown in Table I.

TABLE I. THE CLASSIFICATION OF HUMAN ERRORS

Monitoring/detection:
- C1: Monitoring/Detection - E1: monitoring/detection error - none, late, wrong, loss
- C2: Recognition - E2: recognition error - none, late, wrong
- C3: Verifying - E3: verifying error - none, late, wrong
- C4: Multiple information collection - E4: multiple information collection error - omission, irrelevant, insufficient, redundant

Situation assessment:
- C5: Comparison - E5: comparison error - none, late, wrong
- C6: Diagnosis/explanation - E6: explanation error - none, late, wrong, loss
- C7: Projection - E7: projection error - none, wrong
- C8: Cause identification - E8: cause identification error - none, late, wrong

Response planning:
- C9: Goal identification - E9: goal identification error - none, late, wrong
- C10: Construction - E10: plan construction error - none, late, wrong
- C11: Evaluation - E11: plan evaluation error - none, late, wrong
- C12: Selection - E12: plan selection error - none, late, wrong
- C13: Following - E13: plan following error - none, late, wrong

Response implementation (cognitive activities C14: Timing, C15: Positioning (space), C16: Selection, C17: Implementation, C18: Communication):
- E14: operation omission - omission
- E15: not timely operation (time) - too late, too early
- E16: operating object error (space) - right operation on wrong object, wrong operation on wrong object
- E17: inadequate operation - too long/short, too much/little, incomplete, regular speed too fast/slow
- E18: wrong operation - wrong operation on right object, operation in wrong direction, wrong sequence, wrong input, wrong record
- E19: information communication error - none, unclear, incorrect

B. The classification of PSFs or context

According to Fig. 1, human reliability is influenced not only by situational factors but also by other factors, such as individual and organizational factors. The organizational factors fall into seven categories: goals and strategies, structure, resources, training, planning/design, organizational management and organizational culture. The situational factors include the man-machine interface, work environment, task and technology system factors. The error-triggering individual factors are composed of four groups in this paper: physiological characteristics, psychological states, memorized information, and human quality and capacity factors. Each group can be further divided into particular sub-categories. The detailed classifications of the PSFs are given in reference [13].
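To illustrate, the taxonomy of Table I and the PSF categories described above can be held in simple data structures for further analysis. The following Python sketch is only one possible encoding; the class and container names (ErrorMode, CognitiveProcess, PSF_CATEGORIES) are introduced here for illustration and are not part of the classification itself.

```python
# Illustrative encoding of part of Table I and the PSF categories above.
# The data-structure layout is an assumption of this sketch, not the paper's.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ErrorMode:
    activity: str        # cognitive activity, e.g. "C1: Monitoring/Detection"
    mode: str            # human error mode, e.g. "E1: monitoring/detection error"
    keywords: List[str]  # specific error keywords from Table I


@dataclass
class CognitiveProcess:
    name: str
    error_modes: List[ErrorMode] = field(default_factory=list)


# One of the four cognitive processes, reproduced from Table I.
MONITORING_DETECTION = CognitiveProcess("Monitoring/detection", [
    ErrorMode("C1: Monitoring/Detection", "E1: monitoring/detection error",
              ["none", "late", "wrong", "loss"]),
    ErrorMode("C4: Multiple information collection",
              "E4: multiple information collection error",
              ["omission", "irrelevant", "insufficient", "redundant"]),
])

# PSF categories summarized from the classification above and Fig. 1.
PSF_CATEGORIES = {
    "organizational": ["goals and strategies", "structure", "resources",
                       "training", "planning/design", "management", "culture"],
    "situational": ["technology system", "man-machine interface", "procedure",
                    "task", "team factors", "work environment"],
    "individual": ["physiological characteristics", "psychological states",
                   "memorized information", "quality and capacity"],
}

if __name__ == "__main__":
    for em in MONITORING_DETECTION.error_modes:
        print(f"{em.activity} -> {em.mode}: {', '.join(em.keywords)}")
```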
IV. THE INFLUENCING RELATIONSHIPS BETWEEN CONTEXT AND HUMAN ERROR

Some HRA methods, such as CREAM [14] and SPAR-H [15], address the influence of contextual factors. However, little of the literature describes the causal relationships between contextual factors and human error/human reliability; only Chang et al. [11] assess the effects of the performance influencing factors (PIFs) on operators' problem-solving responses. The types of effect of a PIF on a given operator are classified into three types: (1) individually dominant, (2) collectively dominant, and (3) adjusting. Individually dominant influence (denoted as "I") is that of a single PIF having a pronounced effect on a specific behavior. For example, as shown in Table II, bias can have a direct and determining influence on information collection: a biased mind may reject certain types of incoming information, so that useful information is ignored. Collectively dominant influence (denoted as "C") is the case where a group of PIFs acting together have the same kind of influence as type "I". For example, as shown in Table II, time stress and recalled perceptual information together affect the handling of a certain chunk of information; as a result, these two PIFs together have a coordinative influence on information collection. Adjustment influence (denoted as "A") is that of a PIF having a certain influence on behavior which is not, however, as significant as in types "I" or "C". For example, as shown in Table II, time stress can affect the "verify" activity to a certain degree but does not completely disable it. In this paper, these three types of influencing relationship, together with no effect (denoted as "N" or "-"), are adopted to analyze the influencing relationships between contextual factors and human errors. The influencing relationships between individual factors and human errors (Table II), and between organizational and situational factors and individual factors (Table III), are identified on the basis of the literature, the analysis of incident reports and expert judgment.
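As one possible machine-readable representation of these influence types, the sketch below encodes them as an enumeration and stores the content of Table II as a sparse mapping from (individual factor, error mode) pairs to an influence type. The two entries shown merely restate the worked examples in the text (bias on information collection, time stress on verifying); the mapping layout and the default-to-"no effect" rule are assumptions of this sketch.

```python
# A minimal sketch of how the influence types used in Tables II and III could
# be represented. Only the two worked examples from the text are included; the
# dictionary layout and the lookup default are assumptions of this sketch.
from enum import Enum


class Influence(Enum):
    I = "individually dominant"
    C = "collectively dominant"
    A = "adjustment"
    N = "no effect"


# Sparse map: (individual factor, human error mode) -> influence type.
INDIVIDUAL_TO_ERROR = {
    # Bias has a direct, determining influence on information collection.
    ("bias", "E4: multiple information collection error"): Influence.I,
    # Time stress affects the "verify" activity to a degree without
    # completely disabling it.
    ("time stress", "E3: verifying error"): Influence.A,
}


def influence_of(factor: str, error_mode: str) -> Influence:
    """Look up the influence type; unlisted pairs default to no effect."""
    return INDIVIDUAL_TO_ERROR.get((factor, error_mode), Influence.N)


if __name__ == "__main__":
    print(influence_of("bias", "E4: multiple information collection error").value)
    print(influence_of("morale", "E1: monitoring/detection error").value)
```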
TABLE II. THE INFLUENCING RELATIONSHIPS BETWEEN INDIVIDUAL FACTORS AND HUMAN ERRORS

Rows (individual factors):
- Mental state - stress: time stress; task load stress; information load stress
- Mental state - emotion: frustration; conflict; pressure; uncertainty
- Mental state - cognitive mode: alertness; attention; bias
- Mental state - intrinsic characteristics: attitude; self-confidence; problem-solving style; motivation; morale
- Physiological state: fatigue; physical limitations
- Memorized information: recalled perceptual information; previous actions; current action; prospective memory; stored information
- Qualities and abilities: knowledge and experience; skill; moral level

Columns (human errors): DE, RE, VE, CO (monitoring/detection); CE, EX, PR, CA (situation assessment); GO, CT, EV, SE, FO (response planning); OM, NO, OB, IN, WR, CR (response implementation). Each cell gives the type of influence: I (individually dominant), C (collectively dominant), A (adjustment) or "-" (no effect).

Note: DE = detection; RE = recognition; VE = verify; CO = collection; CE = compare; EX = explain; PR = project; CA = cause identification; GO = goal identification; CT = construct; EV = evaluation; SE = select; FO = follow; OM = omission; NO = not timely; OB = object error; IN = inadequate operation; WR = wrong operation; CR = communication error.
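Given machine-readable versions of both tables, the influencing maps mentioned in the conclusions can be traced by chaining Table III (contextual factor to individual factor) with Table II (individual factor to human error). The sketch below shows such a traversal; every entry in it is a hypothetical placeholder chosen only to illustrate the chaining, not an actual cell value of Tables II or III.

```python
# Tracing context -> individual factor -> human error paths by chaining the
# two influence tables. Every entry below is a hypothetical placeholder; the
# real influence types are those identified in Tables II and III.

# Table III style: contextual factor -> {individual factor: influence type}.
CONTEXT_TO_INDIVIDUAL = {
    "alarm system features": {"stress": "A", "attention": "I"},   # hypothetical
    "procedures": {"knowledge and experience": "C"},              # hypothetical
}

# Table II style: individual factor -> {human error mode: influence type}.
INDIVIDUAL_TO_ERROR_MODE = {
    "stress": {"E3: verifying error": "A"},                       # hypothetical
    "attention": {"E1: monitoring/detection error": "I"},         # hypothetical
    "knowledge and experience": {"E6: explanation error": "C"},   # hypothetical
}


def trace_paths(contextual_factor):
    """Yield (individual factor, error mode, (influence1, influence2)) paths."""
    for indiv, inf1 in CONTEXT_TO_INDIVIDUAL.get(contextual_factor, {}).items():
        for error, inf2 in INDIVIDUAL_TO_ERROR_MODE.get(indiv, {}).items():
            yield indiv, error, (inf1, inf2)


if __name__ == "__main__":
    for indiv, error, influences in trace_paths("alarm system features"):
        print(f"alarm system features -[{influences[0]}]-> {indiv} "
              f"-[{influences[1]}]-> {error}")
```

Such a traversal only enumerates candidate paths; quantifying their combined effect would require a treatment such as the fuzzy Bayesian network approach of reference [5].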
TABLE III. THE INFLUENCES OF CONTEXTUAL FACTORS ON INDIVIDUAL FACTORS

Rows (contextual factors):
Situational factors
- Task: task novelty; task complexity/load; task importance; the number of simultaneous tasks (dynamic)
- Human-computer interface: information display characteristics; soft control characteristics; alarm system features; human-computer interaction characteristics (interface management)
- Technology system: dynamic state of the plant (e.g., the rate of change of the current value); available time; reliability; the level of automation; complexity; system response speed/delay; the compatibility of hardware and software
- Work environment: the dangers of the work environment; the comfort of the work environment
- Procedure: procedures
- Team factors: communication; cooperation and coordination of work
Organizational factors
- Organizational strategy: organizational goals; policy/system; the formulation of long-term plans; allocation of resources; the priorities of management; strategic approach and measures
- Organizational structure: centralization of organizational decision-making; the determination of organizational structure; the level of organizational structure; roles and responsibilities; the paths of communication and authorization
- Organizational management: human resource management; supervision and control; system/interface design; education and training; work design/organization/arrangement; quality audit/assurance; organizational learning
- Organizational culture: safety standards and norms; safety awareness/attitude; safety practice and measures

Columns (individual factors): ST, EM, CO, IN (mental state); FA, PH (physiological state); ME (memory); KN, SK, MO (qualities and abilities). Each cell gives the type of influence: I (individually dominant), C (collectively dominant), A (adjustment) or "-" (no effect).

Note: ST = stress; EM = emotion; CO = cognitive mode; IN = intrinsic characteristics; FA = fatigue; PH = physical characteristics; ME = memorized information; KN = knowledge and experience; SK = skill; MO = moral level.

V. CONCLUSIONS

The development of technology changes the contextual environment, which brings about new human error modes, a new distribution of human errors, and new influencing relationships between PSFs and human errors. This places new demands on human error prevention and HRA. In this paper, a conceptual causal model of human error is built and classifications of human errors and PSFs are constructed. The influencing relationships between PSFs and human errors are then identified to construct the corresponding influencing maps, in order to provide guidance for human error identification. However, the robustness and reliability of these influencing relationships should be further improved on the basis of a larger body of data from event reports of digital NPPs and statistical methods such as correlation analysis. This requires further work in the future.
ACKNOWLEDGMENT

The financial support by the Natural Science Foundation of China (Nos. 70873040 and 71071051), the National Social Science Foundation of China (No. 11BGL086), the Research Project of LingDong Nuclear Power Co., Ltd. (No. KR70543), the Humanities and Social Science Project of the Ministry of Education of China (No. 11YJC630207), the Research Project of the Education Bureau of Hunan Province, China (No. 10C1139) and the Research Projects of the University of South China (No. 2010XSJ12 and No. 2011XYY11) is gratefully acknowledged.

REFERENCES

[1] Committee on Application of Digital Instrumentation and Control Systems to Nuclear Power Plant Operations and Safety, National Research Council. Digital instrumentation and control systems in nuclear power plants: safety and reliability issues. Washington, DC: The National Academies Press, 1997, pp. 59-70.
[2] J.M. O'Hara, W.S. Brown, P.M. Lewis, J.J. Persensky. The effects of interface management tasks on crew performance and safety in complex, computer-based systems: detailed analysis. NUREG/CR-6690, Vol. 2, Washington, DC: U.S. NRC, 2002.
[3] A. Mosleh, E. Goldfeiz, S. Shen. The ω-factor approach for modeling the influence of organizational factors in probabilistic safety assessment, in IEEE Sixth Annual Human Factors Meeting, Orlando, Florida, 1997, pp. 9-18.
[4] J. Rasmussen. Risk management in a dynamic society: a modelling problem. Safety Science, vol. 27, no. 2-3, pp. 183-213, 1997.
[5] P.C. Li, G.H. Chen, L.C. Dai, L. Zhang. A fuzzy Bayesian network approach to improve the quantification of organizational influences in HRA frameworks. Safety Science, vol. 50, no. 7, pp. 1569-1583, 2012.
[6] J. Reason. Human error. New York: Cambridge University Press, 1990, ch. 7, pp. 173-216.
[7] C.M. Thompson, S.E. Cooper, D.C. Bley, J.A. Forester, J. Wreathall. The application of ATHEANA: a technique for human error analysis, in IEEE Sixth Annual Human Factors Meeting, Orlando, Florida, 1997, pp. 13-17.
[8] S.J. Lee, M.C. Kim, P.H. Seong. An analytical approach to quantitative effect estimation of operation advisory system based on human cognitive process using the Bayesian belief network. Reliability Engineering and System Safety, vol. 93, no. 4, pp. 567-577, 2008.
[9] T. Kontogiannis. A framework for the analysis of cognitive reliability in complex systems: a recovery centred approach. Reliability Engineering and System Safety, vol. 58, no. 3, pp. 233-248, 1997.
[10] M.R. Endsley. Toward a theory of situation awareness in dynamic systems. Human Factors, vol. 37, no. 1, pp. 32-64, 1995.
[11] Y.H.J. Chang, A. Mosleh. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents. Part 4: IDAC causal model of operator problem-solving response. Reliability Engineering and System Safety, vol. 92, no. 8, pp. 1061-1075, 2007.
[12] K.J. Vicente, R.J. Mumaw, E.M. Roth. Operator monitoring in a complex dynamic work environment: a qualitative cognitive model based on field observations. Theoretical Issues in Ergonomics Science, vol. 5, no. 5, pp. 359-384, 2004.
[13] P.C. Li, G.H. Chen, L. Zhang, L.C. Dai, M. Zhao. Study on the classification of performance shaping factors in digital control system of nuclear power plants, in 1st International Symposium on Behavior-based Safety and Safety Management, Beijing, China, 2011. (in Chinese)
[14] E. Hollnagel. Cognitive reliability and error analysis method. Oxford, UK: Elsevier Science Ltd., 1998, ch. 4, pp. 83-119.
[15] D. Gertman, H. Blackman, J. Marble, J. Byers, C. Smith. The SPAR-H human reliability analysis method. NUREG/CR-6883, INL/EXT-05-00509. Washington, DC: U.S. Nuclear Regulatory Commission, 2005.