
Week 7

Decide in a Complex World


MNO2705C Leadership and Decision Making Under Uncertainty
Ningxin Wang
Today’s Agenda
o Understand complex systems
o Error management
o Mount Everest case discussion

The Oscar goes to…

What happened?
How were the results stored and delivered in the award show?
o Two sets of results in envelopes (24 in each set), carried in two briefcases by two PwC auditors
o The auditors handed the envelopes to the presenters right before they went up the stage
o The wrong envelope was handed to the Best Picture presenters – it wasn’t the envelope for Best Picture. It was the copy of the Best Actress award.
[Images on slide: the copy of the Best Actress envelope and the Best Picture award card]
What went wrong?
There was a human error.
But the system also had several weaknesses:
o Category names on envelopes were hard to read (subtle gold lettering on a red background). On the card, the category name was printed at the very bottom in a tiny font.
o There were many distractions for the two auditors in charge of handing out the results (e.g., temptations to tweet celebrity pictures)
o The two-briefcase system, intended as a safety feature, added complexity to the system.
Characteristics of complex systems

Complex interaction: Different elements of a system interact in ways that are unexpected and difficult to perceive or comprehend.

Tight coupling: Subcomponents of a process are interlinked with little room for error or time for adjustment. There is very little slack or buffer among different components of the system.

Source: Perrow, C. (1984). Normal accidents: Living with high-risk technologies. New York, NY: Basic Books.
Mapping different systems
[Figure: systems mapped along the two dimensions above; the “Danger Zone” is where complex interaction and tight coupling are both high.]
In the Oscar fiasco

Complex interaction: Unexpected interactions of different elements in the system (e.g., two copies of results carried by two auditors, chaotic backstage, poorly designed award cards, confused award presenters).

Tight coupling: Live broadcasting to high-profile audiences; the winners were kept secret until a moment before the announcement. There was little slack in the process (handing over the envelope → opening it → presenting the award).
Normal accidents
• A normal accident happens when the unexpected interaction of two or more errors (because of interactive complexity) causes a cascade of failures (because of tight coupling).

• Normal ≠ frequent/common
• Normal means rare but inevitable; the system’s characteristics make it inherently vulnerable to such accidents
• Individual decision errors × complex system = failure/accident

Source: Perrow, C. (1984). Normal accidents: Living with high-risk technologies. New York, NY: Basic Books.
What is an “error”?
• Unintentional deviation from truth or accuracy
o Intentional deviation = violation
• Bound to occur in complex systems
• Errors are ubiquitous and inevitable, but they do NOT necessarily cause an accident or disaster/system meltdown

Source: Keith, N., & Frese, M. (2011). Enhancing firm performance and innovativeness through error management culture. In N. Ashkanasy, C. Wilderom & M. Peterson (Eds.), Handbook of Organizational Culture and Climate (pp. 137-157).
Error management

[Diagram: Action/Decision → Errors → Error consequences. Error prevention targets the first link; error management targets the second.]

• Error management aims to reduce the negative consequences of errors as well as the future occurrence of similar errors
• View errors as valuable feedback for learning and improving
• Thoroughly analyze an error after it has occurred

Source: Keith, N., & Frese, M. (2011). Enhancing firm performance and innovativeness through error management culture. In N. Ashkanasy, C. Wilderom & M. Peterson (Eds.), Handbook of Organizational Culture and Climate (pp. 137-157).
BEST PRACTICES OF ERROR MANAGEMENT
High reliability organizations (HROs)
• FSORE framework for HROs (Weick & Sutcliffe, 2015)
o Preoccupation with failure
o Reluctance to simplify
o Sensitivity to operations
o Commitment to resilience
o Deference to expertise

Source: Weick, K. E., & Sutcliffe, K. M. (2015). Managing the unexpected: Sustained performance in a complex world. Hoboken, NJ: John Wiley & Sons.
High reliability organizations (HROs)
• FSORE framework for HROs (Weick & Sutcliffe, 2015)
o Preoccupation with failure: pay attention to and learn from errors, including seemingly trivial ones
o Reluctance to simplify: analyze errors from a systems perspective; be aware of complex interactions
o Sensitivity to operations: pay close attention to everyday front-line events; people are empowered to report errors on the ground; interpret near-misses as danger rather than safety

Source: Weick, K. E., & Sutcliffe, K. M. (2015). Managing the unexpected: Sustained performance in a complex world. Hoboken, NJ: John Wiley & Sons.
High reliability organizations (HROs)
• FSORE framework for HROs (Weick & Sutcliffe, 2015)
o Commitment to resilience: develop the ability to bounce back from inevitable errors (“the hallmark of an HRO is not that it is error-free, but that errors don’t disable it”)
o Deference to expertise: respect and utilize the insights of those who are best equipped to advise, even if the experts are lower-ranking employees on the ground

Source: Weick, K. E., & Sutcliffe, K. M. (2015). Managing the unexpected: Sustained performance in a complex world. Hoboken, NJ: John Wiley & Sons.
Psychological safety
• “Shared belief that the team is safe for interpersonal risk-taking” (Edmondson, 1999)
o Interpersonal risk-taking behaviors: admitting an error, expressing a different point of view, asking for help
• Psychological safety is a function of interpersonal trust and mutual respect. In teams with high psychological safety, members believe that the group will not rebuke, marginalize, or penalize them for speaking up or challenging the prevailing opinion.

Source: Edmondson, A. (1999). Psychological safety and learning behavior in work teams. Administrative Science Quarterly, 44(2), 350-383.
A case for best practices of error management
The Blue Angels

Questions
• Can you identify examples of FSORE in the Blue Angels case?
• How was psychological safety built within the group?
Summary so far
• Complex, tightly coupled systems are prone to normal accidents, but errors do not necessarily materialize into accidents
• Error management can be used to learn from errors and prevent errors from escalating into major accidents
• Best practices of error management:
o High-reliability organizations (HROs)
o Build psychological safety in groups
THE EVEREST CASE

Discussion
• Why did this tragedy happen? What is the root cause of this disaster?
• Are tragedies such as this simply inevitable in a place like Everest?
• What is your evaluation of Rob Hall and Scott Fischer as leaders? Did they make some poor decisions? Which decisions, and why?
Three levels of analysis
[Diagram: the Mt Everest tragedy examined at three levels – Individual Level (cognitive biases), Group Level (psychological safety), and Organization/System Level (system complexity).]
Individual level
Cognitive biases
• Decision biases impaired the judgment and choices that individuals made
• Evidence of at least three biases in this case:
o Overconfidence bias: tendency to be overconfident with regard to one’s judgments and choices
o Sunk cost effect: tendency to escalate commitment to a course of action in which one has made a substantial prior investment of resources
o Recency effect: tendency to over-emphasize information from recent events when making decisions and judgments
Individual cognitive biases in the Everest case
Evidence from the case for each bias:

Overconfidence
o Hall: “It’s worked 39 times so far, pal, and a few of the blokes who summitted with me were nearly as pathetic as you.”
o Fischer: “I believe 100% that I’m coming back. My wife believes 100% that I’m coming back. She isn’t concerned about me at all when I’m guiding because I’m going to make all the right choices.” “It’s not the altitude, it’s your attitude.”
o Krakauer: many of his fellow clients were “clinically delusional”

Sunk cost trap
o Hansen: “I’ve put too much of myself into this mountain to quit now, without giving it everything I’ve got.”
o Krakauer: “[the clients] had each spent as much as $70,000 and endured weeks of agony to be granted this one shot at the summit.”

The recency effect
o Breashears: “Several seasons of good weather have led people to think of Everest as benevolent, but in the mid-eighties – before many of the guides had been on Everest – there were three consecutive seasons when no one climbed the mountain because of the ferocious wind.”
Group level
Conditions affecting psychological safety
[Diagram: Member Status Differences, Team Leader Supportiveness, and Level of Familiarity → Team Psychological Safety → Team Decision-Making Effectiveness.]
Group level
Conditions affecting psychological safety
[Same diagram, with Team Psychological Safety defined: “Shared belief that the team is safe for interpersonal risk-taking.”]
Group level
Conditions affecting psychological safety
Member status differences – evidence from the case:
o Krakauer: “…he [Harris] had been cast in the role of invincible guide, there to look after me and the other client.”
o Beidleman “was quite conscious of his place in the expedition pecking order”
Group level
Conditions affecting psychological safety
Team leader supportiveness – evidence from the case:
o Hall: “I will tolerate no dissension up there. My word will be absolute law, beyond appeal. If you don’t like a particular decision I make, I’d be happy to discuss it with you afterward, not while we’re up on the hill.”
Group level
Conditions affecting psychological safety
Level of familiarity – evidence from the case:
o Boukreev was very uncomfortable because he didn’t know many of the other climbers.
o Krakauer: “I’d never climbed as a member of such a large group – a group of complete strangers, no less.” “Each client was in it for himself or herself, pretty much.”
Group level
Conditions affecting psychological safety
[Diagram with the case mapped on: Member Status Differences (high), Team Leader Supportiveness (low), and Level of Familiarity (low) → Team Psychological Safety (low) → Team Decision-Making Effectiveness (low).]
System level
Complex systems

Complex interactions:
• Logistical problems burdened Fischer and disrupted his acclimatization routine
• Fischer’s health interacted with events that took place later in unexpected ways
• A Sherpa assisting Sandy Pittman + Hall’s safety procedures + Fischer’s deteriorating physical condition → unfixed rope → bottleneck → delayed summit → went past the appropriate return time

Tight coupling:
• The expedition had a very narrow window for the summit push (time dependence)
• Very little slack (18 hours’ worth of oxygen)
• Rigid sequence of activities
• One dominant path to the goal (South Col route)
System level
Complex systems (cont.)
• Breashears: “I think of an expedition as a very complex organism… Rob Hall had designed a complicated system that was very rigid.”
• Krakauer: “Four of my teammates died not so much because Rob Hall’s systems were faulty – indeed nobody’s were better – but because on Everest, it is the nature of systems to break down with a vengeance.”
The Everest case
Lessons for decision making
• Use a multi-level framework for examining an organizational decision-making failure
o There is rarely one single factor that causes a catastrophe
o The effects of multiple errors in complex systems can compound and interact in unexpected ways to cause tragic events
• A climate of psychological safety is essential for open and collaborative decision making; a lack thereof inhibits constructive dissent
• Leaders should set the tone for collaborative decision making; leaders should learn from errors and be aware of their own tendency to err (don’t be overconfident!)
Three levels of analysis
[Diagram recap: the Mt Everest tragedy examined at three levels – Individual Level (cognitive biases), Group Level (psychological safety), and Organization/System Level (system complexity).]
Let’s recap
• Complex systems: interactive complexity & tight coupling
• What an error is, and the ubiquity of errors
• Error management best practices:
o High reliability organizations (HROs)
o Psychological safety
• Everest case: three levels of analysis

Next Week
Design Thinking

Complete assigned readings on Canvas
