Human Factors in Supporting System Reliability and Recovery


This presentation examines the role of human factors in maintaining reliability and recovery in the bulk power system. It covers identifying the systematic strengths and weaknesses of people and technology, recognizing gut feelings in operations, managing cognitive biases, and handling stress, attention, memory, and mistake prevention in emergencies.

  • Human factors
  • System reliability
  • Recovery
  • Stress management
  • Cognitive biases


Presentation Transcript


  1. You: the most important piece of the bulk power system. Human factors in supporting reliability and recovery from physical and cyber events. Mike Legatt, Ph.D., Principal Human Factors Engineer, Electric Reliability Council of Texas, Inc. Michael.Legatt@ercot.com

  2. Introduction. This exercise is intended to prepare you for things that don't seem quite right, and for how you can communicate and collaborate to identify events and reduce their impacts. It is also intended to serve as a brief primer on maintaining human performance by tracking stress and accuracy and by keeping cognitive biases in check.

  3. Objectives. You will: identify the systematic strengths and weaknesses of people, technology, and their interactions; recognize the role of your gut feelings in operations; and identify when and how to share those feelings within and outside your organization while guarding against biases.

  4. Definitions: running estimate; common operational picture (COP); cognitive bias; situation awareness; selective attention; ego depletion; hyperstress / hypostress.

  5. PATTERN RECOGNITION: The core human activity
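
Pattern recognition can be given a simple quantitative handle via the detection theory text the deck cites (Macmillan & Creelman, 1991): sensitivity, d′, measures how well an observer separates real events ("signal") from normal conditions ("noise"). The sketch below computes d′ from a hit rate and a false-alarm rate; the numbers are made-up illustration values, not data from the presentation.

```python
# Minimal signal-detection sketch (after Macmillan & Creelman, 1991).
# d' = z(hit rate) - z(false-alarm rate), using the inverse normal CDF.
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity: how separable 'something is wrong' is from normal noise."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Illustrative values: an operator who flags 90% of real anomalies but
# also flags 20% of normal conditions has moderate sensitivity.
print(round(d_prime(0.90, 0.20), 2))  # ~2.12
```

In detection-theory terms, anything that makes real anomalies stand out more cleanly from routine telemetry raises d′, while fatigue and display clutter lower it.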

  6. Running Estimate Process

  7. Attention, Memory, and Mistakes

  8. Selective Attention

  9. Question 2: In which state of stress are you most likely to make a mistake in an emergency? a) Hypostress b) Hyperstress c) Hypostress before the emergency, then hyperstress when it happens d) Being in the zone of maximum adaptation

  10. Human Performance Under Stress. Stress and performance, from Hancock (2008).
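
Hancock's stress-performance relationship is commonly drawn as an inverted U: performance peaks in a moderate zone of maximum adaptation and degrades under both hypostress and hyperstress. The toy curve below illustrates that shape only; the Gaussian form, optimum, and width are assumptions for illustration, not Hancock's actual model or data.

```python
# Toy inverted-U curve: performance vs. stress (illustration only).
import math

def performance(stress: float, optimum: float = 0.5, width: float = 0.2) -> float:
    """Peak performance near a moderate stress level; tails off at extremes."""
    return math.exp(-((stress - optimum) ** 2) / (2 * width ** 2))

for s in (0.05, 0.5, 0.95):  # hypostress, zone of maximum adaptation, hyperstress
    print(f"stress={s:.2f} -> performance={performance(s):.2f}")
```

The point of the shape is that mistakes cluster at both tails: very low and very high stress are both failure regions.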

  11. Question 2 (revisited): In which state of stress are you most likely to make a mistake in an emergency? a) Hypostress b) Hyperstress c) Hypostress before the emergency, then hyperstress when it happens d) Being in the zone of maximum adaptation

  12. Question 3: If you're operating a substation remotely and flip the wrong breaker because you were on autopilot (very familiar with this substation), what was the likely cause? a) Inattention b) Misinterpretation of a rule c) Inaccurate mental model d) Organizational bias

  13. How We Make Mistakes. From: NERC Cause Analysis Methods.
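
The NERC Cause Analysis Methods draw on the widely used skill/rule/knowledge error taxonomy (after Reason), and the answer options in question 3 map onto it. The lookup table below is a hedged sketch of that mapping; the example wording is illustrative, not quoted from the NERC document.

```python
# Hedged sketch: the skill/rule/knowledge error taxonomy underlying
# industry cause analysis. Example wording is illustrative only.
ERROR_MODES = {
    "skill-based slip": "inattention while on autopilot in a highly familiar "
                        "task (e.g., flipping the wrong breaker at a "
                        "well-known substation)",
    "rule-based mistake": "misinterpreting or misapplying a known procedure",
    "knowledge-based mistake": "acting on an inaccurate mental model in a "
                               "novel or poorly understood situation",
}

for mode, example in ERROR_MODES.items():
    print(f"{mode}: {example}")
```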

  14. Question 3 (revisited): If you're operating a substation remotely and flip the wrong breaker because you were on autopilot (very familiar with this substation), what was the likely cause? a) Inattention b) Misinterpretation of a rule c) Inaccurate mental model d) Organizational bias

  15. Question 5: In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for? a) Ego depletion b) Semmelweis reflex c) Outgroup homogeneity d) Hindsight bias

  16. Ego Depletion. Self-control is a limited resource; like a muscle, it tires out.

  17. Situation Awareness

  18. COGNITIVE BIASES

  19. Cognitive Biases (a sampling). Apparently, when you publish your Social Security number prominently on your website and billboards, people take it as an invitation to steal your identity. Zetter, K., "LifeLock CEO's Identity Stolen 13 Times," Wired.com, April 2010.

  20. Question 1: There's an emergency, and you have an idea of how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs? a) Zero-risk bias b) IKEA effect c) Organizational bias d) Confirmation bias

  21. Question 4: You're looking at a one-line, thinking about keeping flows under thermal limits. Why are you more likely to notice a base case violation then? a) Cognitive dissonance avoidance b) Google effect c) IKEA effect d) Attentional bias

  22. Cognitive Biases (a sampling). Anchoring: something you've seen before seems like the benchmark (e.g., the first price you paid for gas). Attentional bias: you're more likely to see something if you're thinking about it. Cognitive dissonance: it is uncomfortable to hold two conflicting thoughts. Confirmation bias: you pay attention to things that support your belief.

  23. Cognitive Biases (a sampling). Diffusion of responsibility: assuming someone else will take care of it. Google effect: it is easy to forget things that are easily available electronically. Groupthink: people are less likely to contradict ideas in a large group. Hindsight bias: the past seems perfectly obvious.

  24. Cognitive Biases (a sampling). IKEA effect: things you've built seem more valuable to you than things others have built. Illusion of transparency: expecting others to understand your thoughts and feelings more than they can. Loss aversion: you're more strongly motivated to avoid a loss than to achieve an equivalent gain.

  25. Cognitive Biases (a sampling). Organizational bias: you're likely to think ideas from within your organization are better. Outgroup homogeneity: you're likely to think that people in another group all think the same. Semmelweis reflex: rejecting new ideas that conflict with older, established ones. Zero-risk bias: choosing solutions that are worse overall because they seem less risky.

  26. Question 1 (revisited): There's an emergency, and you have an idea of how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs? a) Zero-risk bias b) IKEA effect c) Organizational bias d) Confirmation bias

  27. Question 4 (revisited): You're looking at a one-line, thinking about keeping flows under thermal limits. Why are you more likely to notice a base case violation then? a) Cognitive dissonance avoidance b) Google effect c) IKEA effect d) Attentional bias

  28. Question 5 (revisited): In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for? a) Ego depletion b) Semmelweis reflex c) Outgroup homogeneity d) Hindsight bias

  29. SCENARIOS

  30. Group exercise, Scenario 1: Substation X. A camera malfunction; a low oil level alarm on a transformer; a troubleshooter is dispatched; bullet holes are found in the camera and the transformer. A random act of vandalism, a ploy, or a directed threat?

  31. Group exercise, Scenario 2: Substation Y. Communications vaults for two providers (AT&T and Level3) damaged. More than 100 shots fired at transformers; oil leaks in several transformers (more than 51,000 gallons spilled). Only energized transformers were shot. The attackers never entered the substation. Initial assumption: vandalism? A dress rehearsal for future attacks? It happened: April 16, 2013, Metcalf Substation.

  32. Group exercise, Scenario 3: Utility control room. Telemetry doesn't look quite right, though you're not sure why. You see significant flow into a substation without a load; then it goes away. An RTU failure, manipulated data, or a cyberattack?
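
The gut feeling in Scenario 3 corresponds to a simple physical check: at a substation with no load, the MW flows on its connected lines should net to roughly zero (power in equals power out, less losses), so a persistent mismatch is consistent with an RTU failure, manipulated data, or a cyberattack. Below is a minimal sketch of that check; the function name, argument layout, and 5 MW tolerance are illustrative assumptions, not part of any real EMS API.

```python
# Hedged sketch: flag telemetry where net MW injection at a substation
# does not balance its load. Sign convention: flows into the substation
# are positive, flows out are negative.

def telemetry_suspicious(line_flows_mw: list[float], load_mw: float,
                         tolerance_mw: float = 5.0) -> bool:
    """True when net inflow minus load exceeds the tolerance."""
    mismatch = sum(line_flows_mw) - load_mw
    return abs(mismatch) > tolerance_mw

# 120 MW in and 118 MW out at an unloaded bus nets to 2 MW: plausible losses.
print(telemetry_suspicious([120.0, -118.0], load_mw=0.0))  # False
# 80 MW apparently flowing into an unloaded bus with no other outlet: flag it.
print(telemetry_suspicious([80.0, 0.0], load_mw=0.0))      # True
```

A real state estimator runs this kind of residual analysis across the whole network, which echoes the deck's summary point: your data may be just one part of something larger.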

  33. Group exercise, Scenario 4: ISO control room. A news report of civil unrest in an area; a call from a utility about a substation transformer; a call from a utility about telemetry issues; several other below-the-line calls. With whom do you share this information?

  34. Summary. The value of communication and collaboration when things are not quite right. The reporting structure for handling incidents. Remember: your data may be just one part of something larger.

  35. References. NERC CAP Annex D, Phase 0 (draft). NERC CIPC Report to Texas RE MRC. NERC Cause Analysis Methods. Macmillan, N.A., & Creelman, C.D. (1991). Detection theory: A user's guide. New York: Cambridge University Press. Hancock, P.A., & Szalma, J.L. (Eds.). (2008). Performance under stress. Chichester, England: Ashgate.

  36. Questions

  37. Question 1: There's an emergency, and you have an idea of how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs? a) Zero-risk bias b) IKEA effect c) Organizational bias d) Confirmation bias

  38. Question 2: In which state of stress are you most likely to make a mistake in an emergency? a) Hypostress b) Hyperstress c) Hypostress before the emergency, then hyperstress when it happens d) Being in the zone of maximum adaptation

  39. Question 3: If you're operating a substation remotely and flip the wrong breaker because you were on autopilot (very familiar with this substation), what was the likely cause? a) Inattention b) Misinterpretation of a rule c) Inaccurate mental model d) Organizational bias

  40. Question 4: You're looking at a one-line, thinking about keeping flows under thermal limits. Why are you more likely to notice a base case violation then? a) Cognitive dissonance avoidance b) Google effect c) IKEA effect d) Attentional bias

  41. Question 5: In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for? a) Ego depletion b) Semmelweis reflex c) Outgroup homogeneity d) Hindsight bias
