Forthcoming, Journal of Management Inquiry
Making Sense in Pitch Darkness:
An Exploration of the Sociomateriality of Sensemaking in Crises
Olivier Berthod
Freie Universität Berlin
Gordon Müller-Seitz
Technische Universität Kaiserslautern
Abstract: A brief failure of one item on the display of the information system (IS) on flight
AF 447 wrought havoc in the coordination between the pilots and the aircraft, leading to the
loss of all 228 lives on board. In this essay, we ask the following question: How can the very
instruments supposed to ensure our safety and make organizations more reliable lead a team
to destruction? We propose that the imbrication of material and human agencies in highly
automated systems drives an attitude of ‘mindful indifference’ (i.e. the capacity for
experienced operators to reckon with which problems could turn into critical ones, and which
can be tolerated on account of the overall system reliability). An abrupt change in this
imbrication provoked emotional distress and focused the pilots’ attention towards the
machine, instead of triggering an organizational process of sensemaking. We highlight the
role of leadership in such situations.
Acknowledgements: We thank, in alphabetical order, Timo Braun, Bridget Hutter, Sally Lloyd-
Bostock, Manuel Nicklich, Simone Osterman, Charles Perrow, Markus Reihlen, Elke Schüßler, Georg
Schreyögg, Jörg Sydow, Paul t’Hart, Robert Wagner, Lauri Wessel, Karl Weick, and the participants
in the 37th WK-Org workshop for their feedback on earlier versions of this essay. Finally, we are
grateful to Saku Mantere, handling editor of the paper, and the anonymous reviewers for their
guidance, ideas and encouragement throughout the revision process. This work was supported by a
grant from the Peter Pribilla Foundation.
“Never let an aircraft take you somewhere your brain didn't get to five minutes earlier.”
(Pilot proverb)
About this Essay
On Monday 1 June 2009, between 2:10 and 2:14 AM (UTC), a brief failure of one item on
the display of the information system (IS) on flight AF 447 wrought havoc in the coordination
between the pilots and the aircraft, leading to the loss of all 228 lives on board. In this paper,
we seek to understand why the pilots on flight AF 447 failed to reconnect with their aircraft,
identify the source of their problem, and respond accordingly. Specifically, the high degree of
automation and highly invasive influence of IS on board such airplanes prompt us to ask the
following theoretical question: How can the very instruments supposed to ensure our safety
and make organizations more reliable lead a team to destruction? In the following, we
reconstruct the final moments of Flight AF 447 and repeatedly pause in this process in order
to reflect upon and develop a theoretical argument that addresses this question. To assist the
reader, Appendix 1 offers an overview of the data and approach we used; Appendix 2 displays
the empirical storyline and accompanying theoretical ramifications that underlie this essay.
Flying as Usual: the Case of Flight AF 447
Flight AF 447, an Airbus A330-200 operated by Air France, took off from Rio de Janeiro
at 10:29 PM (UTC) on Sunday 31 May 2009, heading for Paris, and flew over the Brazilian
coast without any noticeable problems. The younger of the two copilots soon took charge and
became the ‘pilot flying’, the title given to the pilot in charge of operations, regardless of rank
or age, which placed him in the leadership position. The other copilot was the first to take a rest. At
10:33 PM, shortly after takeoff, the copilot flying engaged the autopilot and let the
information system (IS)i fly the aircraft up to flight level 350 (i.e. an altitude of 35,000 feet),
which it reached at 11:00 PM.
The pilots, from then on, were kept busy monitoring the IS while it flew the aircraft. In
flight, the avionics keep the aircraft on its route, automatically maintaining, among other
functions, the required altitude, trajectory, wing angle and engine power. In addition, the IS
on board computes information on the aircraft’s situation for the
pilots’ use. The computing of this set of information is based on empirical data delivered by
sensors and probes and converted into electronic signals fed into the IS. In turn, the pilots
coordinate this information, verify data on the weather situation ahead, and communicate with
staff at the control points along the route. As the Bureau d’Enquêtes et d’Analyses pour la
Sécurité de l’Aviation Civile (BEA for short; the French agency responsible for analyzing
aviation incidents) stresses, pilots’ work is centered more on strategic aspects of navigation,
such as fuel management for example, than on flying as such (BEA, 2012: 167). Should the
pilots wish to alter the trajectory, e.g. so as to avoid turbulence, they enter this command into
the IS via inputs on a control stick. This input is converted into electronic signals, computed
by the IS, which then determines the extent to which it must act upon the motors and
hydraulics that control, say, the wings. The new position of the aircraft serves as a basis for
the IS to compute new information on the LCD screens in the cockpit.
During the flight, when the aircraft reaches a control point, the crew must notify the next
radar station that it has done so. The crew then receives a new radio frequency to
communicate with staff at the next control point. This takes place every ten to twenty minutes
during a flight. As an example, the last contact between AF 447 and staff at a control point
was as follows:
“The crew informed the ATLANTICO controller [i.e. the airspace controllers in Brazil] that
they had passed the INTOL point, then announced the following estimated times: SALPU [i.e.
the next control point after INTOL] at 1 h 48 then ORARO at 2 h 00. […]
At 1 h 35 min 46, the controller asked the crew to maintain FL350 [i.e. the altitude of the plane,
35,000 feet] and to give their estimated time at TASIL [another control point on the route of the
plane] […]
Between 1 h 35 min 53 and 1 h 36 min 14, the controller asked again three times for the
estimated time at TASIL with no response from the crew.
There was no more contact between the crew and ATLANTICO.” (BEA, 2011, p. 9).
This communication failed because the captain misheard the frequency that the controller
had given him. This mistake had no bearing on what happened next, but Otelli (2011), a
former pilot, the author of numerous publications on aviation failures, and himself an analyst
of flight AF 447, stresses that it illustrates a fairly casual
attitude in the cockpit and a rather flat hierarchy between the captain and his copilots. Soon
after the failed communication with control point ATLANTICO, the copilot flying asked the
captain about selecting a new alternative airfield. This is a safety measure used for long-haul
flights. The aircraft is required to have a scheduled alternative airfield where it can land in an
emergency. The alternative airport that was originally selected was closed at night but was
expected to open in an emergency. The copilot was concerned about this and suggested asking
for a second alternative airport, which the captain did not do:
“Captain: You’re not worrying, are you?
Copilot: It’s a pity you didn't ask him [i.e. the controller they just communicated with] for his
opinion (…)
Captain: It will be fine.” (Otelli, 2011, pp. 215-216).
Technology and the Literature on Sensemaking
When the captain says “it will be fine” to his younger colleague, he discursively enacts his
many years of experience, his leadership and his hierarchical role. The captain attempts to
shape his colleague’s interpretation of what he just did, or failed to do. It is in this ‘sense’ that
Weick (1979; 1995) introduced the perspective, or lens (Sonenshein, 2009), of sensemaking to
organization theory. Sensemaking is the process by which individuals, and, ultimately, groups
and organizations, interpret their environment to provide shared meanings for a novel and
ambiguous experience (i.e., retrospective sensemaking, see Weick, 1995; Weick, 1988). Since
then sensemaking has also come to include attempts to lend structure to a desirable future
state (i.e., proactive or prospective sensemaking, see Barton, Sutcliffe, Vogus & DeWitt,
2015, Gioia & Mehra, 1996, and Stigliani & Ravasi, 2010), and even attempts by leaders and
external stakeholders to engage in purposeful sensegiving in organizations (Gioia &
Chittipeddi, 1991, Maitlis, 2005).
From an organizational perspective, sensemaking, predominantly seen as a function of
language and communication, is the process through which “situations, organizations, and
environments are talked into existence” (Weick, Sutcliffe & Obstfeld, 2005, p. 409). During
the process of sensemaking, members of organizations notice and bracket events, produce
connections, labels and meaning, retain a consensual story about what is going on, and
determine organizational actions and coherent series of measures that need to be applied
(Maitlis, 2005; Weick, 1979; Weick et al., 2005; Whiteman & Cooper, 2011). From this
perspective, organizing emerges through sensemaking (Weick et al., 2005), be it in the form
of action development or in the institutionalizing of shared meanings, routines, and the
collective adaptation thereof. More precisely, organizational sensemaking is said to take four
forms, depending on the intensity of sensegiving performed by stakeholders and leadership:
minimal (when leaders and stakeholders do not engage in sensemaking proactively but wait
for an instance to trigger it), fragmented (different stakeholders raise suggestions, whereby the
leader does not manage the process of sensemaking), restricted (stakeholders accept the
overall ideas put forward by leaders), or guided (i.e. leaders and stakeholders actively engage
in sensemaking) – the latter being the richest instance, with much internal control and external
interventions, thereby yielding fruitful series of consistent actions (Maitlis, 2005).
With its interest in novel and ambiguous situations, the sensemaking approach to
organizations has contributed to a rich body of studies on the role of organizational factors,
environmental conditions and material artifacts in crisis situations such as the one we will be
taking you through in this paper (see e.g. Weick, 1988; Snook, 2000; Dunbar & Garud, 2009;
Maitlis & Sonenshein, 2010; Whiteman & Cooper, 2011). For example, in one of the most
seminal pieces on sensemaking, Weick (1993) shows that during the Mann Gulch fire, the role
structure among the smokejumpers collapsed, and with it organizational sensemaking,
prompting them to experience sheer panic and disorganized action. He calls such situations,
when people suddenly feel that the universe is no longer a rational, orderly system,
‘cosmology episodes’. In a comparatively more settled context, Cornelissen, Mantere, and
Vaara (2014) report on how police officers committed themselves to the identification of an
alleged suspect, leading to the shooting of an innocent civilian. In their study, emotions,
material cues, and communication were found to contribute to erroneous sensemaking of the
situation at hand.
Looking back to the pilots and their work in the cockpit of AF 447, we do see
organizational factors influencing the process of sensemaking as well. One instance is the
filtering, low-key leadership performed by the captain in the process of sensegiving when he
tells his copilot “it will be fine”. There is also the apparent casualness in the cockpit and its
hierarchy (Otelli, 2011). We also see environmental factors influencing the situation in the
cockpit. For example, flying in pitch darkness over the ocean, or having to deal with the IS-
mediated information about the altitude and speed. In addition, material artifacts are involved,
e.g. uniforms, the aircraft, or physical forces, and we can easily imagine the many cues the
pilots need to take into account while monitoring their machine from within the cockpit.
What is strikingly absent in the literature, though, is the amount of material agency (in the
sense of Leonardi, 2011) performed by the aircraft IS, as depicted above, together with the
relation the pilots entertain with the technology and tools they use (Dunbar & Garud, 2009;
Holt & Cornelissen, 2013; Maitlis & Christianson, 2014). Weick once suggested that “objects
are inconsequential until they are acted upon and then incorporated retrospectively into
events, situations, and explanations” (1988, p. 307). His analysis of the smokejumpers,
however, highlights the pervasive role of the relation they nurtured with their tools and the
sense of their profession. He later offered a refined interpretation: “when I first posed the
question, why don’t firefighters drop their tools, I assumed a separation between firefighters
and tools that may not be their circumstance at all. Instead, their circumstance may be one of
equipment and action in a context where there is no separation between subject and object”
(Weick, 2007, p. 8). In point of fact, separating pilots from the aircraft they work in, in an
analysis of the pilots’ sensemaking, is not an appropriate rendition of what it means to fly an
aircraft. This holds even more true, emotionally speaking, for pilots flying an aircraft that
does not seem to respond or behave according to their experience and knowledge of flying.
With our focus on IS we depart from the otherwise vague notion of technology (Pinch,
2008) and wish to come closer to what Orlikowski and Robey (1991, p. 144) call “any
computers (that is, hardware and software) deployed within organizations to mediate work
tasks (...) capable of modification through systems design and programming.” The IS in the
aircraft does more than produce data. It takes action and flies the airplane on its own, based
on the computed data it has generated. Hence, if we want to understand what happened to AF
447 and the pilots’ organizational sensemaking, we need to understand how human and
material agency are imbricated in the social practices of the pilots’ work (Ciborra, 2006;
Leonardi & Barley, 2008; Leonardi, 2011). Specifically, we need to pay closer attention to the
role the IS played in the cockpit, not only as a contingency to interpret and incorporate into
sensemaking (Pirolli & Russell, 2011), but as a resource that carries a structuring force in its
own right.
Looking at the imbrication of human and material agency often means an interest in how
people use and interact with the material agency of technology, and how shifts in this
imbrication provoke changes in practices and professional identities (Leonardi & Barley,
2008). This idea, unmistakably, echoes the current debate on the sociomateriality of
organizations (Orlikowski, 2007; Whiteman & Cooper, 2011). Research in the realm of this
ontology proposes the inseparability of technology, work, and organizations (Leonardi, 2011;
Orlikowski & Scott, 2008). The relations between human and material agencies are, from this
perspective, best observed in practice and are constantly subject to change (Orlikowski &
Scott, 2008). Illustrating this line of argument, the AF 447 case shows us what happens to
organizational sensemaking when the equilibrium between human and material agency,
enacted via specific practices, changes too abruptly.
Mindful Indifference in the Cockpit
At around 01:30 AM, flight AF 447 approached the inter-tropical convergence zone along
the equator, an area known for its bad weather conditions. The young copilot, monitoring the
information delivered by the IS, most likely made sense of this situation as one of potential
problems, for he decided to modify the scale of the weather radar to obtain a broader picture
of the weather situation (BEA, 2011). He noticed turbulence ahead at 1:35 AM. At that point,
the aircraft had left the Brazilian coast and was now flying over the ocean in the middle of the
night. The plane was still heavy with fuel and the external temperature was too high for the
aircraft to climb further. This is noteworthy, as the lift of an airplane is strongly dependent on
the air temperature. Under the conditions at this point, climbing further would have caused the
airplane to ‘stall’, a condition in which the wings lose the lift that holds the aircraft in the air. So the
pilots saw no choice but to have the aircraft fly through the area of turbulence (Otelli, 2011).
As described above, at this point in time, the aircraft IS, rather than the pilots, was acting
upon the hydraulics and equipment on board; a normal situation for any flight so far. And
while the IS does so, the pilots monitor the IS and its actions on the aircraft. They attend to
the required tasks, interact with the system and discuss various private matters, sometimes
with flight attendants who enter the cockpit temporarily; people at work with a relaxed,
ordinary state of affairs (Otelli, 2011). Modern airplanes ‘encapsulate’ the captain and
copilots from their environment. During a night flight, the cockpit windows resemble black
screens. The pilots are completely reliant on the instruments in front of them. For example, to
make sense of the meteorological situation, as the pilots did above, they rely on weather cells
displayed on a screen. Dangerous cells are shown in red and pilots must ‘slalom’ around them
via inputs on their joystick. This operation highlights how material and human agency both
enable and constrain each other in the process of flying. These pilot-inputs are computed and
slightly modified by the IS to reach the same result with the most efficient combination of
actions on wings, engines and other airplane elements. In turn, the system shows the new
position of the aircraft to the pilots via computed indicators, which serves as a basis for the
pilots’ next inputs. Body sensations and perceptions of physical forces, in this situation, are
not entirely misleading, but they are particularly unreliable, as the pilots are encapsulated in the
cockpit with no outside visibility. Instead, the pilots rely on the embodied practice of flying, which includes
appreciation of the machine and its qualities, and an understanding of its own actions and
reactions to specific commands or situations (Gärtner, 2013).
The aircraft then experienced turbulence and Saint Elmo’s fire: light flashes, similar to
flames or small bolts of lightning, that occur when flying close to thunderstorms and are
caused by the ionization of air particles by the electric field surrounding the aircraft.
The flames surprised the young copilot but not the captain, who explained what they had just
seen. The young copilot’s surprise, however, did not last long and at 01:52 AM the turbulence
stopped.
A couple of minutes later the captain decided to take his break, woke the other copilot and
gave him his seat. The second copilot was more experienced than his younger colleague, and
three times more experienced than the captain himself in this particular geographical area, but
the controls, and with them the formal leadership, remained in the younger pilot’s hands. The
pilot at the commands briefed his colleague. The captain, sticking to his hierarchical role,
remained in the cockpit until the end of the briefing. In a typical instance of organizational
sensemaking (Weick, 1995; Weick et al., 2005), the team of three worked on a consensus
about their situation and the meaning they should attach to specific event fluxes:
“Well, the little bit of turbulence that you just saw, we should find the same ahead. We’re in the
cloud layer; unfortunately we can’t climb much for the moment because the temperature is
falling more slowly than forecast” (BEA, 2011, p. 87).
In this one sentence, pronounced under the hierarchical supervision of the captain, we see,
in order of appearance: a rendering of what is happening; the copilot’s sensegiving in the form
of a scenario for what is about to come next; a sense of continuity as an organization; and the
constitution of a scope for action to face the situation as a team, “we can’t climb much for the
moment”. The copilot, now in charge of the aircraft, gives sense to the situation, thereby
following a standardized procedure and role enactment. This approach is exemplary of what
Maitlis (2005, p. 32) calls “restricted sensemaking”: little animation (in the sense of intensity)
and fairly high control (in the sense of standardization and guidance) yielding unitary
accounts and a one-time, planned set of consistent actions. However, contrary to Maitlis’
(2005) approach (initially based on the influence of sensegiving strategies of leaders and
stakeholders), the copilots and the captain are alone in the cockpit. The influence and many
instances of “sensegiving” they are subject to are the actions performed and information
computed and displayed by the aircraft and its IS.
Against this background, the next quote tells us plenty about the pilots’ relation to their
tools (Weick, 2007). The aircraft is flying through the inter-tropical convergence zone and the
temperature in the cloud layer keeps on rising. At this point, nonetheless, the pilots are
relaxed and confident. This confidence is a by-product of the reliability of the system they
work with and is a result of their many hours spent flying without noteworthy incident. The
BEA (2011), at this time, states that the copilot in the right seat said to the flight attendants
“thanks, I’ll call you when we’re out of it” and notes that “laughs” are audible in between.
This data supports the impression of a relaxed atmosphere in the cockpit (BEA, 2011; see also
Kelly & Barsade, 2001 on positive emotions and moods in interpersonal work settings). Upon
reading the instruments, the pilot in charge told his colleague:
“Standard +13 °! [i.e. 13 degrees above standard temperature] Shit, we’re lucky to be flying a
330 [Airbus plane], aren’t we? It wouldn’t look good with a fully loaded 340 [Airbus plane]!”
(Otelli, 2011, p. 233).
The two copilots broke into laughter as they agreed on the subject: their plane was smaller,
lighter, and could maneuver more easily (Otelli, 2011). This instance of sensemaking, similar
to the one above, shows a rendering of the situation, albeit one that blends both the computed
information transmitted by the machine and professional jargon. It also highlights the sticking
of the label “lucky” onto the situation they have bracketed using the temperature and the many
IS-based cues they draw on. “Lucky”, in this case, shows us that in the pilots’ minds the
practice of flying an aircraft cannot be seen and described without considering the active role,
or material agency, played by the technology it involves. Not only in terms of feedback
displayed on LCD screens, but also, as hinted above, in terms of technical, emotional and
cognitive affordances and constraints for the practice of flying as such (Leonardi & Barley,
2008; Seidel, Recker & vom Brocke, 2013). The pilots, in this very instance of sensemaking,
not only make sense of what the IS computes, stores and displays; instead, their sensemaking
additionally reflects what the pilots and the technology that characterizes their job can “do
together on the trails” (Stein et al., 2014, p. 158).
The pilots’ words highlight how fragile the imbrication of human and material agency can
be (Leonardi, 2011; Orlikowski, 2007), for in their case neither human nor material agency is
possible without the other, and their imbrication is constantly flying over the edge of dramatic
consequences. This imbrication of human and material agency, and the pilots’ appreciation of
its fragile, yet dependable reliability, is internalized into their work practice, and embodied to
such a point that they seem to nurture an attitude of mindful indifference towards the system.
We introduce this term to denote a reflexive lack of concern over potential problems.
Neglected problems and deviances are often considered a source of disasters (Vaughan, 1999;
Weick & Sutcliffe, 2015). However, in line with the conception of mindfulness developed by
Levinthal and Rerup (2006) and Weick and Sutcliffe (2006), the pilots’ casual relation to their
tools does not underline sloppiness; instead, it shows their capacity to reckon with which
particular problems might turn into serious ones, and which ones can be ignored, a point
vividly depicted by the captain’s casual refusal to ask again for an alternative airfield. This
attitude is supported by their experience with the reliability of their aircraft and its systems,
the positive emotions that this relation provokes, and the rarity of critical situations and
related major stress levels in their daily work (Kaiser, Müller-Seitz & Creusen, 2008). Hence,
this indifference to potential problems is mindful in the sense that it relies on the operators’
attention, experience, and embodied knowing in practice, and does not necessarily undermine
their capacity to react properly when needed (Gherardi, 2000; Gherardi, 2006; Leonardi &
Barley, 2008).
For example, the weather conditions at this very moment were not good, but not critical as
such. The BEA highlighted that many other flights were on the same path prior to and after
the incident. Mindful indifference occurs here insofar as the weather could be a critical
problem. The pilots know that, but, considering the robustness of their aircraft, just not yet
(“Shit, we’re lucky to be flying a 330”). Were the pilots in a state of complete indifference,
losing attention? Absolutely not, for at 2:06 AM, the pilot in charge told the cabin crew that
they ought to enter an area “where it will start moving about a bit more than now” and they
“will have to watch out there” (BEA, 2012, p. 22). At 2:08 AM, a couple of minutes before
the IS failure, the older copilot suggested that his colleague correct the course to the left to
avoid dangerous layers and changed the radar level to an even finer grain, thereby switching
from mindful indifference towards more mindful attention (Levinthal & Rerup, 2006).
Copilot, left seat: “You can possibly go a bit to the left, I agree that we’re not in manual eh? […]
What I call manual, er, no we’re in computed … It’s … It’s me who just changed to max eh…”
Copilot, right seat: “Ah […] you did something to the A/C [air conditioning] … no but to the
A/C”
Copilot, left seat: “I didn’t touch it”
Copilot, right seat: “What’s that smell, now?” (BEA, 2011, pp. 87-88).
An attitude of mindful indifference can be a source of creativity and improvisation,
especially under pressure (Ortmann, 2010). But flight AF 447 teaches us that the switch from
mindful indifference towards more mindful attention needs to be carefully managed and
reflected upon if a team of operators is to remain resilient in the face of such complex systems. In
the cockpit, however, the switch was not addressed openly, e.g. via an instance of sensemaking
spoken out loud. Instead, we observe an increase in the pilots’ stress levels, apparent in
their lack of precision in wording and their hesitations in speaking. The process of sensemaking
is almost played in reverse: action, then clarification of the situation. Nonetheless, the pilots
still mindfully interrelate with their machine: they know that an increasing temperature could
cause ice to form around the aircraft. The BEA does not rule out that they may have heard
the sound of ice crystals hitting the windshield. The younger copilot (still formally the leader
of the aircraft at this point in time) turned on engine de-icing.
And just as the pilots did so, an alarm rang.
The autopilot had just disconnected and the flight controls had switched to alternate law: a
mode in which the aircraft IS ceases to fly the aircraft, gives the commands back to the pilots, but continues to monitor
actions, warns about the aircraft’s movements, and corrects certain maneuvers to avoid
dangerous positions. The alarm, in this case, sounds like a cavalry charge and is accompanied
by the illumination of the master warning light in the cockpit.
“I have the controls”, shouted the pilot flying, sitting on the right of the cockpit (BEA,
2011, p. 88). The IS had just lost the speed indicators. Due to inconsistencies between the
measured airspeeds, the system had to disconnect the autopilot. The most likely cause was
that the Pitot probes had become obstructed by ice crystals. Pitot probes are small tubes that
emerge from the sides of the aircraft nose and capture the velocity of the air relative to the
plane. If the Pitot tubes become obstructed, the system cannot calculate the speed of travel.
The speed information for flight AF 447 was completely unavailable for 29 seconds, after
which the probes on the left-hand side recovered, and it took a total of 54 seconds until all the
probes started functioning properly again.
What we have here is not so much a technical problem as a problem of practice. The
practice of flying, routinely, relies on autopilots, phenomenal computing capacities, highly
reliable and redundant IT, which, we have argued, can lead experienced pilots to adopt a state
of mindful indifference. But then, for a reason not yet clear to the pilots, the imbrication of
material and human agency prompts them to rely on hydraulics only. The loss of speed
measurements has wrought havoc with the fragile imbrication described above and put the pilots
and their tools up against a challenge: they need to achieve this imbrication anew.
By training, pilots are prepared for similar situations and should be able to cope without
any problems (BEA, 2012). However, real situations at high altitude, like the one the AF 447
crew faced, were outside the scope of typical exercises, which often take place in simulators.
In addition, the pilots were handling the aircraft in turbulence. What is more, such exercises
typically simulate a loss of information. In the cockpit, by contrast, the accumulation of technical
failures and human reactions led to a flow of new information, generated by the IS, which
confused the pilots (BEA, 2012). Their surprise, therefore, was understandable, as the
investigation report illustrates:
“It was the autopilot disconnection that made the crew aware that there was a problem. The
crew, at this time, did not know why the AP [autopilot] had disconnected and the new situation
that had suddenly arisen clearly surprised the pilots – a normal reaction for any crew” (BEA,
2012, p. 172).
“In the minute that followed the autopilot disconnection, the failure of the attempts to
understand the situation and the de-structuring of crew cooperation fed on each other until the
total loss of cognitive control of the situation” (BEA, 2012, p. 199).
From then on, the well-orchestrated imbrication of human and material agency, which had
been operating smoothly for several hours, reached its limits and started unraveling.
The Lurking Threat of Performativity
Making sense of one’s work in a cockpit involves more than just interpreting the IS. The
radical automation of the flight deck was the major innovation offered by Airbus (Favre,
1994; Reason, 2008), an evolution sometimes criticized by pilots themselves. Cockpits are now
standardized to such a degree that a pilot experienced in flying a small Airbus can fly the
largest models as well:
“The 21st century flagship A380 leverages Airbus’ 30-plus years of expertise in aircraft design,
sharing a large degree of commonality in systems, flight deck, procedures, and maintainability
with the fly-by-wire A320 and A330/A340 and the A350 XWB families. Benefits of this unique
approach include unmatched flexibility, highly efficient operations and reduced costs. Pilots do
not need extensive amounts of training to transfer from one aircraft type to another, increasing
their productivity and reducing training costs that come with the Airbus fly-by-wire families’
Cross Crew Qualification and Mixed Fleet Flying concepts.” (Airbus, 2015)ii
In such contexts, abrupt changes in the way human and material agency are welded
together in practice are most likely to affect processes of sensemaking. Obviously, heroic and
inspired reactions do happen. But it is more likely that the more surprising and life-
threatening the change, the more negative this effect will be, at least momentarily, as has been
documented for negative emotions causing narrowed action tendencies (Frijda, 1986). Such
changes challenge the knowledge on which operators rely to make sense of what is going on
and what they can do about it. Sensemaking is a process that leaves room for both ignorance
and knowledge to test old frameworks and develop new ones (Weick et al., 2005). But we
also know that practices and their enactment create this knowledge and, in return, a sense of
professional and individual identity (Brown & Duguid, 2001; Gherardi, 2006; Orlikowski,
Operators, in such situations, might well have limits to how much ignorance they can bear
without questioning their sense of who they are and what they can do.
Abrupt breaches in the imbrication of human and material agency hold the potential to
yield powerful emotional reactions, similar to the ones one might feel during cosmology
episodes (Weick, 1993). Such emotions, during life-threatening situations, often result in an
arousal of the autonomic nervous system (Maitlis & Sonenshein, 2010; Weick, 1995) and
provoke simplified, almost subconscious reactions (Barthol & Ku, 1959; Kernberg, 1978;
Weick, 1990, 1993). As we will see, similar reactions were enacted in the cockpit of AF 447.
Considering the crucial role of leadership in guided organizational sensemaking (Maitlis,
2005), such negative emotions become the source of organizational problems. The ability to
sense and manage one’s own emotions is inhibited (Gardner, 1983), let alone any ability to
influence the emotions of the other crew members intentionally (Wolff, Pescosolido, &
Druskat, 2002). Therefore, negative emotions are particularly likely – intentionally or not – to
spread across team members (Barsade, 2002; Hatfield, Cacioppo, & Rapson, 1994). This
effect is even stronger when negative emotions and the failure to lead due to confusion stem
from the team leader (e.g. Lewis, 2000).
This pattern of emotional reactions, we suggest, is particularly problematic when teams
operate highly automated systems. The increasing automation of flight decks has contributed
to a radical decrease in the influence of human and organizational factors in accidents, thus
increasing reliability tremendously (Hee, Pickrell, Bea, Roberts, & Williamson, 1999;
Reason, 2008). But increasing levels of automation have also changed the role of the pilots
and their interactions with the aircraft and environment (Bainbridge, 1983; in line with
Leonardi & Barley, 2008). As automation increases, pilots rely more and more on a depiction
of ‘reality’ that is produced by the instruments they are supposed to control (Kallinikos, 2009;
MacKenzie, 2006).
From a sociomaterial perspective, this phenomenon echoes the concept of performativity
(Orlikowski, 2005). ‘Performative’ refers to any artificial object that not only
describes a situation, but also contributes to the structuration and evolution of this same
situation (Austin, 1970; Callon, 1998; MacKenzie, 2006; Schultze & Orlikowski, 2010). The
technology involved in the aircraft does more than just depict the flight trajectory or pitch
attitude; it also acts upon the aircraft, thereby producing action that generates the data it
computes and the information it displays, which in turn produce further actions, and so on.
Consequently, as the autopilot went off, the relations between pilots and technology prompted
radical changes towards the enactment of a different practice of flying. Human and material
agencies remained tightly imbricated throughout the accident. But “ultimately, people decide
how they will respond to a technology” (Leonardi, 2011, p. 151).
IS, Performativity and Sensemaking on Flight AF 447
With the loss of reliable airspeed measures, the IS turned off the autopilot and auto-thrust
and radically changed its contribution to the practice of flying, from that of main operator to
that of a referee, monitoring the pilots’ actions and providing warnings when applicable.
Standard procedure in this case is to grab the controls, maintain the flight on its path and, with
minimal interference with the system, collectively make retrospective sense of what just happened.
The pilot in charge leads this sensemaking process. Instead, the pilot in the right seat took
control of the joystick and pulled it three quarters of the way back to make the airplane climb
again, pitching the nose of the airplane upwards – a dangerous input. An aircraft flies because
of the lift produced by its wings as they penetrate the air, as a result of the thrust produced by
the engines. An envelope of airflow develops around the wings and holds the aircraft in the air
for as long as it maintains sufficient speed. Managing the inclination of the wings relative to
the airflow is an essential part of flying the aircraft; inclining them at too high an angle causes
the aircraft to stall.
Instinct, here, could be an explanation for the pilot’s reaction. Numerous pilots recreated
the scene. Warned about what happened in AF 447, all of them managed to fly manually
through the bad weather without pulling the joystick. However, the BEA (2012, p. 173)
provides more interpretations: the attraction of clear sky above the cloud layer; saturation of
mental resources induced by stress; turbulence altering the perception of movements; and the
crew’s attention being focused on roll and speed. All these factors interacted and explain the
persistence of this input. And yet, the last point made by the BEA relates to the one we just
made about performativity and the structuring power of IS. The influence of the missing
speed indicators on the altitude readings was stressed in the report, which argues that the
IS displayed a sudden loss of altitude (about 300 ft). What is more, “no explicit indication that
could allow a rapid and accurate diagnosis was presented to the crew” (BEA, 2012, p. 174).
So, interpreting this data as being the right rendition of what was happening to the airplane,
the young pilot could also have thought that he was raising the aircraft back to its expected
cruising altitude.
The pilot’s input via the joystick made the aircraft climb. The movements recorded are
harsh and far from a reaction in cold blood (BEA, 2012). In terms of sociomaterial
imbrication, the IS was in the position to correct dangerous inputs, but it was not programmed
to correct such a climb. So the pilot was free to maintain this angle, with the nose pointing
upwards, until the IS sounded a new alarm. This alarm was a synthetic male voice,
announcing “Stall! Stall!” The aircraft, in the climbing position, had assumed an angle at
which its wings broke the airflow, meaning that it was about to lose its lift altogether. The
aircraft was now flying at 34,900 feet. The pilots had enough scope to recover from the
situation and regain stable conditions by dropping the nose first, recreating lift at a lower
altitude and climbing back at a gentler angle. This procedure, however, meant making sense
of the confusing stall warning first (Otelli, 2011).
Here, the necessity to rethink our relation to technology in sensemaking is particularly
salient. The pilots made proper sense of the situation as the autopilot went off (“I have the
control!”) and returned to manual piloting. However, the influence of the technology in the
structuring of the situation needs to be acknowledged, too. From the pilots’ point of view, the
dismantling of the fragile sociomaterial imbrication that characterizes the practice of flying as
they know it must have been daunting. We infer this from the transcript documenting their
broken syntax (“No it’s… er …”), high-pitched, emotionally-laden outbursts (“Wait we’re
losing…wing anti-ice…Watch your speed! Watch your speed!”) and erratic inputs on the
joystick as documented by the BEA (2012) in their report. The pilots did return to manual
piloting, but they did so while making sense of the broader problem at the same time,
thereby increasing their emotional distress tremendously. All three pilots were experienced
and all three had been trained for flight with unreliable speed indicators and for stall exercises
in flight under degraded-law procedures (BEA, 2011, pp. 13-16). Even so, the chain of events was
already becoming too complex to make sense of.
As the stall alarm sounded, the copilot in the left-hand seat wondered “What’s that?”
(BEA, 2011, p. 89), to which the IS answered again: “Stall, Stall”. The copilot in the right-
hand seat (still the pilot flying) commented correctly: “We haven’t got a good… we haven’t
got a good display… of speed”, and the other copilot replied “we’ve lost the, the… the speeds
so…” (BEA, 2011, p. 89). The pilots were evidently increasingly distressed about the
apparent loss of speed. The BEA report states:
“The impression of an accumulation of failures created as a result probably did not incite the crew
to link the anomaly with a particular procedure […] The symptoms perceived may therefore have
been considered by the crew as anomalies to add to the anomaly of the airspeed indication, and thus
indicative of a much more complex overall problem than simply the loss of airspeed information.”
(BEA, 2012, p. 176)
When Leaders Fail to Lead, and Followers to Follow
What does it take for a crew to face such a situation and remain resilient? Leading the
process of organizational sensemaking is what is required. The value of a sociomaterial
approach to sensemaking lies in its taking into account both the performativity of the
technology (Schultze & Orlikowski, 2010) and the role tools and technologies play in the
structuring of social life beyond their operators’ actual intentions and expectations
(Orlikowski & Scott, 2008). Similar to Leonardi (2011), we suggest that the imbrication of
human and material agency is constantly (re-)created as we engage with our daily work, not
least to accommodate critical situations, and contributes to the structuring of what we do, who
we are, and why the world makes sense at all (Weick, 1993; Orlikowski & Scott, 2008;
Whiteman & Cooper, 2011).
On our way towards unpacking this accident, we have argued that research on sensemaking
in crisis situations needs to reconsider the structuring role of material agency in its own right.
Sharpening that argument, we have highlighted the (potentially) performative nature of IT
(Orlikowski, 2005; Schultze & Orlikowski, 2010), not least in a cockpit, where it does more
than just inform the pilots about what is going on. The practice of flying an aircraft relies
heavily on the performance of the aircraft IS, which contributes to constraining and enabling
the pilots' scope of action. This imbrication characterizes the practice of flying, which, upon
each enactment, creates expertise, on which pilots rely to work. A corollary, we argued, is the
potential emergence of an attitude of mindful indifference, which we related to the masterful
capacity to judge which mistakes and problems might become critical and which ones can be
tolerated by the system as a whole. Against this background, abrupt breaches in the usual
imbrication of human and material agency must yield powerful emotional reactions. Mindful
indifference emerges because operators, drawing on their experience, do not expect certain
things to go from bad to worse. Hence, emotional reactions during abrupt changes might cause
instinctive reactions based on whatever is left of the “old” imbrication, instead of making
sense of the situation in such a way that the crew can imbricate material and human agency
anew towards an alternative, temporarily stable arrangement.
What happened, then? The emotional reaction of all three pilots is unequivocal.
Technology in an aircraft represents more to operators than a mere source of information. A
breach in the expected flow of material agency led to breaches in their work practices and
emotional stability altogether. In the face of such a situation, unprepared leaders fail to lead,
and followers to follow.iii The two pilots’ individual attention, it seems, went to the displays
and indicators and remained there until the end. In point of fact, we see almost exclusively attempts
at individual sensemaking in this crisis situation, and hardly a trace of organizing a series of
coherent actions toward a common goal.
To better define what we mean by organizational sensemaking, let us go back again to
Maitlis’ (2005, p. 32) four forms of organizational sensemaking: minimal, restricted,
fragmented and guided. In her work, she distinguishes between these forms based on the
amount (high or low) of sensegiving performed by leaders and/or stakeholders. In the highly
automated system we have at hand, the aircraft IS performs a form of sensegiving in its own
right, i.e. it attempts “to influence the sensemaking and meaning construction of others toward
a preferred redefinition of organizational reality” (Gioia & Chittipeddi, 1991, p. 442). The IS
is programmed in such a way that it must correct and alter pilots’ inputs, warn them if
appropriate, and display a wide array of information on the aircraft and its situation to assist
and funnel the pilots’ sensemaking towards maximal reliability. For example, while the pilots
struggle with the aircraft, the IS literally tells them it’s stalling, the light “Master Warning”
illuminates, and the central screen fills with failure reports. Faced with unexpected, non-
routine, high levels of material sensegiving, a team becomes highly dependent on its leader's
capacity to guide – or failure to do so – the process of organizational sensemaking. When
leadership controls this process, similar to Maitlis’ (2005) idea of guided sensemaking, we
predict a richer form of organizational sensemaking, characterized by encompassing and
unitary accounts of the crisis and the development of series of coherent actions. The other
way around, when leadership fails to control the process, the outcomes are multiple, narrow
accounts and series of inconsistent actions. Indeed, this accident shows that in life-threatening
situations, disruptions in the sociomaterial imbrication underlying highly automated systems
enjoin operators not only to hold onto their tools (Weick, 1993), but to focus their individual
attention on them. This argument is consistent with the observations of the BEA report:
“In the absence of a constructed action plan, the dynamic management of a situation becomes
reactive or even random, with no anticipation. The increase in the level of emotion, which reduces
the ability to recall information, leads to a return to the simple and basic rules in executing tasks in
an unexpected situation.” (BEA, 2012, p. 175)
How the Absence of Leadership Drove the Sociomaterial Enactment of a
Cosmology Episode
“Copilot, left seat: Watch your speed, watch your speed.
Copilot, right seat: Okay, okay, okay, I’m going back down.
Copilot, left seat: Stabilize.
Copilot, right seat: Yeah.
Copilot, left seat: Go back down. According to the three [i.e. instruments] you’re going up so
you go back down.” (BEA, 2011, p. 90).
The BEA experts highlight two consequences of the interactions between the IS and the
pilots. First, the reliance on observations and comparisons between multiple indicators
“indicates the beginning of a loss of confidence in the instrument readings” (BEA, 2012, p.
176), documented by open remarks and rhetorical questions like that of the copilot, right seat:
“We’re in… yeah we’re in climb…”. Second, the dialogue above shows that the pilot in the
left seat is trying to support the pilot in charge, i.e. the formal leader in the cockpit, which is
the right enactment of his role in that very team-constellation. But, the experts note, he does
that without giving a firm objective “e.g. maintain altitude or adopt a specific pitch attitude”
(ibid.).
At the same time, the pilot in charge was manipulating his instruments, an indication “that
he was searching for information. The pilot flying may therefore have been overloaded by the
combination of his immediate and natural attempts to understand the situation that was added
to the already demanding task of handling the aeroplane” (BEA, 2012, p. 176). The speed
indicators started to function again at this point, but the copilots kept struggling with the many
failure reports at their disposal. Panic set in. But contrary to Weick’s description of the Mann
Gulch disaster, the role structure in the cockpit does not seem to collapse altogether. The
copilot in the left-hand seat (not at the controls) called for the captain. He said “Fuck, where is
he?” (BEA, 2011, p. 91). Two seconds after that the IS again announced “Stall. Stall”, and
two seconds later “Stall. Stall” again. The Airbus was climbing continuously and had reached
37,512 feet. The situation, from a detached stance, still offered enough scope for corrective
action. But the situation in the cockpit at this time was full of confusion and high-pitched
emotions, with the copilot in the left-hand seat shouting “Fuck!” again (Otelli, 2011, p. 243).
The copilot in the right-hand seat, officially the team leader, maintained his position by
remaining silent, but did not trigger attempts at organizing their thoughts or at kicking around
ideas and propositions. Similarly, his colleague neither urged him to do so, nor did he trigger the
process of sensemaking himself.
Two seconds later the IS again announced “Stall. Stall”. At this point, the indicators were
coherent and were providing accurate measurements of the dangerous state of the aircraft
again (Otelli, 2011). However, coordination between the two pilots was unraveling, with
neither copilot knowing exactly what the other was doing:
Copilot, left seat: “Above all, try to touch the lateral controls as little as possible, eh.” (BEA,
2011, p. 92).
The IS again announced “Stall. Stall”. The copilots waited for the captain to return to the
cockpit, exclaiming “Fuck, is he coming or what?” The IS, monitoring the situation, kept
warning the pilots by sounding the stall alarm. Meanwhile, the aircraft had reached over
37,900 feet. The copilots, despite a wide scope of action at hand, were left puzzled by the
situation, experiencing a dramatic loss of understanding about what was going on. Their
words indicate they are experiencing a cosmology episode in line with Weick’s definition
(1993):
Copilot, right seat: “But we’ve got the engines! What’s happening?” (BEA, 2011, p. 92).
The aircraft’s angle of attack had increased to 29.9° (BEA, 2011, p. 93). This is
unimaginable for an aircraft of this size according to Otelli (2011). The copilot in the left-
hand seat said “I don’t understand what’s going on” (Otelli, 2011, p. 250). And meanwhile the
IS kept announcing “Stall. Stall”. The pilots were constantly focusing on making individual
sense of the conduct of the aircraft, but they had still not debated collectively the stall alarm
being issued by the IS. The copilot at the controls of the aircraft burst out:
“Fuck, I’ve no control of the airplane anymore now. I don’t have control of the airplane at all!”
(BEA, 2011, p. 93; Otelli, 2011, p. 252).
At this moment, the copilot in the left-hand seat asked for the controls. But as soon as he
had seized his joystick, he, too, failed to guide the collective process of sensemaking to find
the appropriate response to the problem they were facing. Instead, he too pulled the aircraft
upwards, just as his colleague had been doing. The situation remained unchanged. He
shouted, highly distressed and unaware of the situation unfolding in front of them: “Fuck,
where are we? What is that?” (Otelli, 2011, p. 253).
The IS subsequently started having difficulties calculating the correct values for certain
indicators and instruments due to the position of the aircraft. The pilots had brought the
airplane into a position that was not included in the programming of the avionics. At that
moment, it appears that the copilot in the right-hand seat began reducing his information
processing to the most subjective level, the sensations of his own body, declaring: “I have the
impression we have speed.” (BEA, 2011, p. 93).
Just then, the captain entered the cockpit. By then, according to the BEA, the alarm and
noises in the cockpit must have been unbearable. His delayed entry and his passivity during
the subsequent sequence of events are difficult to explain. Otelli (2011) suggests that the
captain must have experienced trouble recovering from a disrupted circadian rhythm (a
feeling comparable with the deep exhaustion experienced when waking up with jetlag). The
captain addressed the two copilots, wondering what was happening.
At that point, we see, again, how the team maintains its role structure. The two copilots had
not once debated the situation collectively. But prompted by their captain to say something,
they obeyed and told him they had reached the consensus that the situation had spiraled out of
control. Meanwhile, the IS kept informing the crew of what was actually
happening: the aircraft had stalled. This situation is vividly documented by the sequence of
statements below:
“Captain: Er, what are you doing?
Copilot, left seat: I don’t know, I don’t know what’s happening.” (BEA, 2011, p. 94).
“Computer: Stall!
Copilot, right seat: We’re losing control of the airplane!
Copilot, left seat: We've lost all control of the airplane! We don’t understand! We’ve tried
everything.” (Otelli, 2011, p. 254).
The aircraft’s lift had long been lost and now the aircraft had begun its fall. The angle of
attack was so steep that the computer considered it invalid. The computer announced NCD status
(“non-computed data”, for situations that are physically impossible according to the IS
parameters) two seconds after the captain had entered the cockpit. The copilot in the right-
hand seat noticed this defect at this point and seemed to decide again to rely on his own
sensations:
“Copilot, right seat: I have a problem. It’s that I don’t have vertical speed indication.
Captain: Alright.
Copilot, right seat: I have no more displays.
Copilot, left seat: We have no more valid displays.
Copilot, right seat: I have the impression that we have some crazy speed, what do you think?”
(BEA 2011, p. 94).
The aircraft, because of its position, was falling at an extraordinary speed. Misled by his
sensations, the copilot in the right-hand seat opened the airbrakes. The copilot in the left-hand
seat shouted out “No! Above all, don’t!” The IS then momentarily measured data that were
valid according to its parameters, and activated itself again. The synthetic voice again rang
out in the cockpit – “Stall. Stall.” – increasing the pilots’ confusion. Their processing of the
situation shows the enactment of their individual role structures: the copilots asked the
captain for advice, and the captain tried to make sense of what was going on, but on no occasion
did they try to build an interpretation of the situation collectively. Instead, each of them kept
on dropping comments and adding more cues:
“Copilot, left seat, to the captain: What do you think about it, what do you think that we need to
do?
Captain: There. I don’t know. There. [probably pointing at indicators] It’s going down.” (BEA,
2011, p. 95).
The investigation reports document that the copilots switched again and the copilot in the
right-hand seat assumed control (BEA, 2011). Nothing changed. The joystick remained
pulled. The three pilots read aloud the indicators displayed by the IS. They were repeatedly
accompanied by the synthetic voice of the IS announcing “Stall. Stall”. And at this point, all
of a sudden, a collective instance of sensemaking emerges, starting with the pilot in the left
seat and then guided by the captain. But even though the captain has the right hunch, his
attitude demonstrates his own fixation with the indicators, thereby undermining his potential
for leadership. His idea remains only vaguely articulated and therefore does not turn into a coherent
narrative about what is going on for the other two to follow:
“Copilot, right seat: What do we have in alti? [i.e. altitude]
Captain: Fuck, it’s impossible.
Copilot, left seat: What do you mean on altitude?
Copilot, right seat: Yeah yeah yeah, I’m going down, aren't I?
Copilot, left seat: You’re going down, yes.
Captain: Hey you’re in … get the wings horizontal. Get the wings horizontal.
Copilot, right seat: That’s what I’m trying to do.
Captain: Get the wings horizontal.” (BEA, 2011, p. 96).
The copilot in the left-hand seat then asked for dual input (a status in which both pilots, left
and right, can give input through their joysticks, and in which the IS computes and sends an
average input to the mechanical systems). Soon afterwards the copilot in the left-hand seat
asked for priority on the controls again, and as soon as he resumed the controls he pushed
slightly on the joystick, thereby correcting the excessive angle of attack somewhat. The data became
valid again and the IS, still in NCD mode, warned “Stall. Stall”. The aircraft was now at about
4,000 feet and correcting its trajectory would have been impossible. The copilot in the right-
hand seat shouted:
“Fuck! We’re going to crash! This can’t be true! What’s going on?” (BEA, 2011, p. 96)
Concluding Reflections
In this essay, we sought to understand why the pilots in flight AF 447 failed to reconnect
with their aircraft, identify the source of their problem, and respond accordingly. In particular,
the high degree of automation and highly invasive influence of IS on board such airplanes
prompted us to ask the following theoretical question: How can the very instruments supposed
to ensure our safety and make organizations more reliable lead a team to destruction?
We suggested that material agency and the relation operators entertain with the technology
and tools they use represent two phenomena that are strikingly absent from the literature on
sensemaking. The practice of flying, we argued, involves deep intertwining of human and
material agency. It is an extreme case of a more general sociomaterial claim because here both
the machines and the humans take action in relation to the flight.
In the practice of flying, IS material agency is given a premium over human embodied
materiality, such that visual input is limited to controls when flying in pitch darkness, and
proprioceptive input in relation to controls is non-existent; for example, there is no resistance
or force feedback on the joysticks of Airbus aircraft. This creates unique conditions and constraints for the
interweaving of human and material agency towards more reliability. This reliability, we
argued, is the result of work practices that rely on specific, stable and, from the pilots’ point
of view, very familiar sociomaterial imbrication (Leonardi & Barley, 2008). As operators get
accustomed to their work and gain experience, they develop an attitude of mindful
indifference, i.e. the capacity to reckon with which particular challenges in their work
practices might turn into critical problems and which ones can be absorbed by the overall
imbrication of human and material agency. This attitude is what helps them cope with the
complexity of their work and yet allows them to switch to a mindful level of attention in case
of serious problems.
However, abrupt changes in the sociomaterial imbrication of work practices can yield
considerable emotional upheavals, especially in dangerous situations. We argued that these
emotional reactions were not properly managed or led by the formal leader in the cockpit.
Because leadership failed to set in, the pilots tended to focus their attention on their tools and
instruments instead of triggering a process of organizational sensemaking that would have
helped them create a series of coherent actions and regain control of their instruments. In
highly-automated systems, this problem is particularly salient because technology not only
provides a picture of the situation but also acts upon the situation, thereby enhancing the
potential complexity of sensemaking processes tremendously. Beyond clear role structures,
during these episodes, a leader’s capacity for invoking organizational sensemaking is a critical
factor in determining whether the accumulation of errors will escalate into a massive
failure or be corrected and become a near miss.
These theoretical developments generated from the case of flight AF 447 have three
significant implications for the sensemaking literature. First, this essay addresses the nascent
body of papers that examine materiality in processes of sensemaking (for a recent example,
see Cornelissen et al., 2014). Most research, however, tends to keep materiality at a distance,
focusing on the influence of materiality on individuals' cognitive processes instead. This
essay shows the relevance of linking processes of sensemaking with the sociomateriality of
work practices to provide a sharper view of the resources and constraints with which
individuals relate when they engage with sensemaking, or fail to do so. Building on that idea,
we proposed that an attitude of mindful indifference emerges at the interplay of knowing and
practicing (Gherardi, 2000; Gherardi, 2006; Leonardi & Barley, 2008). This attitude is fueled
by our understanding and appreciation of how many mistakes the imbrication of human and
material agency can absorb and correct. This concept accurately describes the way in which
operators of complex technologies develop mastery in coordinating material and human
agency in their work practices. Indifference can be the source of a sane and expert distance
from rules and procedures, thereby allowing for improvisations and improvements (Ortmann,
2010). It remains mindful because operators relate these breaches and deviances, including
their own mistakes, to their potential consequences, or lack thereof, for the well-orchestrated
imbrication of human and material agency. The challenge, then, is to come up with the right
structures in crisis situations to support the transition from mindful indifference to rich,
organizational sensemaking, so that such teams can coordinate “interconnected and mutually
dependent instances of local knowings and practices” (Nicolini, 2011, p. 617; see also Beane
& Orlikowski, 2015; Gherardi, 2000; Gherardi, 2006; Klein, Ziegert, Knight & Xiao, 2006;
Majchrzak, Jarvenpaa & Hollingshead, 2007). Most research on the interaction between
human and material agency, so far, has looked at practices of knowledge transfer in rather
stable circumstances (e.g. Carlile, 2004; Gherardi, 2006; Klein et al., 2006). Researching the
mobilization of fragmented knowledge during crises must move beyond a merely
organizational dimension to include the interplay of artifacts, bodies, and emotions, thereby
embracing an understanding of “knowing in practice as the knowledgeability that is
continually enacted through ongoing action” (Feldman & Orlikowski, 2011, p. 1243). Very little is known of the
sociomateriality of sensemaking (Maitlis & Christianson, 2014). Against this background, our
sociomaterial reading of sensemaking in flight AF 447 reveals that abrupt changes in material
agency, albeit provoked by minor failures, hold the potential to wipe out the very facilitators
that enable processes of sensemaking (Maitlis & Lawrence, 2007).
Second, this essay highlights the relevance of bodies and emotions in sensemaking
research. Data on embodiment was scarce in the material underlying this essay. Nevertheless,
we noted that the pilots, upon reaching a sense of complete incomprehension, began relying
on their own bodily sensations, trying to sense the speed of the aircraft. The aforementioned
mastery and attitude of mindful indifference also relate to bodily sensations and emotions, a
physical feeling for what the machine (an aircraft, but also a car, or any tool really) is going
through, what it is doing, how it is supposed to react, when and why, and the resulting
impression of safety that operators learn to appreciate while training and gaining experience
in their trade. Finally, we can also further imagine how emotions such as fear, stress and sheer
panic must have been embodied and viscerally experienced by the pilots. Relatedly, we wish
to call for a more gradual understanding of emotions in connection with sensemaking, ranging
from negative to positive emotions with the possibility to experience both positive and
negative emotions at the same time, as suggested by Maitlis and colleagues (2013, see also
George & Zhou, 2007). While a mild degree of positive or negative emotions might be
considered beneficial (Stein, 2004), a high degree of positive or negative emotions is likely to
be detrimental (Müller-Seitz, 2008). In crisis situations, panic can be as destructive as
overconfidence. In an attempt to deal organizationally with this issue, we see a parallel
between the pilots’ holding onto their joystick and Weick’s well-known dictum “drop your
tools” (2007). The longing for control led the pilots to make erratic, uncoordinated decisions,
and the smokejumpers in Mann Gulch (Weick, 1993) would have been better off without their
gear. Against the background of highly automated systems, however, the pilots should have
dropped their tools – and grabbed them back. Dropping the tools, in this case, means pausing
a second to make sense of the information one has without interfering with the system (Weick
& Roberts, 1993). Thereby, operators can prevent instinctive, emotion-driven reactions that
could worsen the situation and increase its complexity. When reflecting on the pilots’
situation in the cockpit, we cannot help but think about what would have happened, had they
paused for a second before rushing to the joystick. Would they have considered the computed
ramifications of their machine, and the causes and effects of whatever information went
missing, before interacting with it? “Drop and grab”, as a new dictum, could be developed into
heuristics or other mechanisms to prevent emotional contagion (Barsade, 2002; Hatfield et al.,
1994) and groupthink (Janis, 1972) and trigger collective sensemaking processes that could
help accompany the switch from a state of mindful indifference to one of structured,
collective, mindful attention.
Finally, this essay highlights the relevance of leadership for sensemaking beyond role
structures. While we subscribe to the point of view that sensemaking never ceases
(Maitlis & Christianson, 2014), our analysis contributes to showing why events can also
fail to trigger organizational sensemaking when clear leadership is
absent. Weick et al. (2005) already pointed out the difficulty of bringing together
different individuals with different pieces of information to assemble new meanings. These
questions have long been neglected in sensemaking research (Maitlis & Christianson, 2014).
Leadership, we proposed, is what can make a difference and help such teams create
meaning beyond the frames they were trained in (Holt & Cornelissen, 2013). Research has
long predicted different types of leadership for different situations and tasks, including crises
(Fiedler, 1967; Hannah et al., 2009). But problems seldom start as a discernible crisis; quite
often, insignificant events end up producing a crisis instead. Against the background of the
AF 447 case, we predict that more resilience might emerge from teams that manage to keep
leadership firmly distributed along lines other than mere organizational structures and
roles. To achieve this distribution and promote resilience, our study highlights the relevance
of recent arguments that locate the potential for leadership in the actions taken by single
individuals, and in their capacity to federate support based on these actions, rather than in a
relational focus only (Kort, 2008). Bypassing hierarchies and role structures is often put to the
fore as a critical feature of highly reliable organizations (Weick & Sutcliffe, 2015). Initiating
processes of sensemaking is the kind of action that can be casually introduced and
preventively repeated and standardized even in non-critical situations, to assert the
leadership of the operators in charge, but also that of the other members of the crew. From our
point of view, repeated, guiding instances of collective sensemaking should begin right at the
moment when operators depart from an attitude of mindful indifference towards more mindful
attention in order to animate and control the construction of meaning and production of
corrections (Maitlis, 2005). If the situation escalates, operators need to produce
‘progressive approximations’ (Weick et al., 2005) of the situation instead of trying to catch up
with it all at once.
Was it, then, a technological or a human mistake? From a sociomaterial perspective,
culprits are indistinguishable, for the problems loom where human and material agencies are
imbricated. Can we, nonetheless, design technology towards even more reliability? A
potential route might be to reconsider LCD screens and IT-mediated interfaces, which often
offer such a bridge between material and human agencies. Designing these interfaces so that a
wider array of organizational dynamics – such as emotions, the situatedness of knowledge in
action, or embodiment – is taken into account represents a final, fruitful suggestion for
research and practical improvements. For example, in highly automated systems,
sociomaterial performativity tends to remove the ability to use our embodied sense of space to
increase safety. But in crisis situations this prevents people from drawing on the most
fundamental interpretive vehicle for understanding the world. For example, in the context
of aviation, Boeing decided to recreate bodily sensations of resistance in the controls when
pilots pull up the nose of their airplanes, to give them a better feeling for the physical forces
they are dealing with and thereby reduce the encapsulation effect of the cockpit. From a
sociomaterial perspective, the usage of a specific technology will always depend on the
individual and team-based enactment of whatever practices it is embedded into. Nevertheless,
sometimes the most “successful automated systems with rare need for manual intervention
[are those] which may need the greatest investment in operator training” (Bainbridge, 1983, p.
777). There is a pressing need to pay more attention to the emotional relation operators develop
to the tools of their trade (Beunza & Stark, 2004) and reflect on ways to help operators meet
the challenge of sensemaking in situations when material agency takes a turn that goes
beyond their expectations. Instead of differentiating technical from organizational factors, we
might want to pay more attention to how these factors enable and constrain each other
(Giddens, 1984; Barley 1986, 1990). Design issues regarding the IS could include the
reactions of operators and their ability to reflect on and manage their emotions (Picard, 1997),
e.g. by changing lights in the cockpit, addressing other senses – apart from the already
existing audible signals such as “Stall” warnings (Gardner, 1983; Hatfield et al., 1994; Wolff
et al., 2002) – or even suggesting actions when the operators’ reactions do not seem to
address the most urgent problem. Rethinking the dynamics of imbricating IS and users in
highly automated settings might represent a constructive way to make productive use of such
workplace disruptions (Jett & George, 2003) and, hopefully, decrease the probability of
similar disasters happening again.
References
Austin, J. (1970). Philosophical papers. Oxford, UK: Oxford University Press.
Bainbridge, L. (1983). Ironies of automation. Automatica, 19, 775-779.
Barley, S. R. (1986). Technology as an occasion for structuring: Evidence from observations
of CT scanners and the social order of radiology departments. Administrative Science
Quarterly 31, 78-108.
Barley, S. R. (1990). The alignment of technology and structure through roles and networks.
Administrative Science Quarterly 35, 61-103.
Barsade, S. G. (2002). The Ripple Effect: Emotional Contagion and Its Influence on Group
Behavior. Administrative Science Quarterly, 47, 644-675.
Barthol, R., & Ku, N. (1959). Regression under stress to first learned behavior. Journal of
Abnormal and Social Psychology, 59, 134-136.
Barton, M.A., Sutcliffe, K.M., Vogus, T.J. & DeWitt, T. (2015). Performing Under
Uncertainty: Contextualized Engagement in Wildland Firefighting. Journal of
Contingencies and Crisis Management, 23, 74-83.
BEA (Bureau d’Enquêtes et d’Analyses pour la Sécurité de l’Aviation Civile). (2011). Interim
Report n°3 on the Accident on 1st June 2009 to the Airbus A330-203 Registered F-
GZCP Operated by Air France Flight AF 447 Rio de Janeiro–Paris. Paris: Ministère de
l’Ecologie, du Développement Durable, des Transports et du Logement.
BEA (2012). Final Report on the Accident on 1st June 2009 to the Airbus A330-203
Registered F-GZCP Operated by Air France Flight AF 447 Rio de Janeiro–Paris. Paris:
Ministère de l’Ecologie, du Développement Durable, des Transports et du Logement.
Beane, M., & Orlikowski, W. (2015). What Difference Does a Robot Make? The Material
Enactment of Distributed Coordination. Organization Science, 26, 1553-1573.
Beunza, D., & Stark, D. (2004). Tools of the trade: The socio-technology of arbitrage in a
Wall Street trading room. Industrial and Corporate Change, 13, 369-400.
Brown, J., & Duguid, P. (2001). Knowledge and organization: A social-practice perspective.
Organization Science, 12, 198-213.
Callon, M. (ed. 1998). The laws of the markets. Oxford, UK: Blackwell.
Carlile, P. R. (2004). Transferring, translating, and transforming: An integrative framework
for managing knowledge across boundaries. Organization Science, 15, 555-568.
Ciborra, C. (2006). Imbrications of representations: Risk and digital technologies. Journal of
Management Studies, 43, 1339-1356.
Cornelissen, J., Mantere, S., & Vaara, E. (2014). The contraction of meaning: The combined
effect of communication, emotions, and materiality on sensemaking in the Stockwell
shooting. Journal of Management Studies, 51, 699-736.
Dunbar, R.L.M., & Garud, R. (2009). Distributed knowledge and indeterminate meaning: The
case of the Columbia Shuttle flight. Organization Studies,
Favre, C. (1994). Fly-by-wire for commercial aircraft: The Airbus experience. International
Journal of Control, 59, 139-157.
Feldman, M., & Orlikowski, W. (2011). Theorizing practice and practicing theory.
Organization Science, 22, 1240-1253.
Fiedler, F.E. (1967). A Theory of Leadership Effectiveness. New York: McGraw-Hill.
Frijda, N. H. (1986). The Emotions. Cambridge: Cambridge University Press.
Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York: Basic
Books.
Gärtner, C. (2013). Cognition, knowing and learning in the flesh: Six views on embodied
knowing in organization studies. Scandinavian Journal of Management, 29, 338-352.
George, J. M, & Zhou, J. (2007). Dual tuning in a supportive context: Joint contributions of
positive mood, negative mood, and supervisory behaviors to employee creativity.
Academy of Management Journal, 50(3), 605-622.
Gherardi, S. (2000). Practice-based Theorizing on Learning and Knowing in the Organization.
Organization 7, 211-223.
Gherardi, S. (2006). Organizational Knowledge: The Texture of Workplace Learning. Oxford,
UK: Blackwell Publishing.
Giddens, A. (1984). The Constitution of Society – Outline of the Theory of Structuration.
Berkeley, University of California Press.
Gioia, D. A., & Chittipeddi, K. (1991). Sensemaking and Sensegiving in Strategic Change
Initiation. Strategic Management Journal, 12, 433-448.
Gioia, D. A., & Mehra, A. (1996). Review of sensemaking in organizations. Academy of
Management Review, 21, 1226-1230.
Hannah, S.T., Uhl-Bien, M., Avolio, B.J., & Cavaretta, F.L. (2009). A framework for examining
leadership in extreme contexts. Leadership Quarterly, 20, 897-919.
Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1994). Emotional Contagion. Cambridge:
Cambridge University Press.
Hee, D., Pickrell, B., Bea, R., Roberts, K., & Williamson, R. (1999). Safety management
assessment system (SMAS): A process for identifying and evaluating human and
organization factors in marine system operations with field test results. Reliability
Engineering & Safety System, 65, 125-140.
Holt, R., & Cornelissen, J. (2013). Sensemaking revisited. Management Learning, 45, 525-
539.
Janis, I. L. (1972). Victims of Groupthink. Boston, MA: Houghton Mifflin.
Jett, Q.R., & George, J.M. (2003). Work Interrupted: A Closer Look at the Role of
Interruptions in Organizational Life. Academy of Management Review 28, 494-507.
Kaiser, S., Müller-Seitz, G., & Creusen, U. (2008). Passion wanted! Socialization of positive
emotions in consulting firms. International Journal of Work, Organization and Emotion,
2, 305-320.
Kallinikos, J. (2009). On the computational rendition of reality: Artefacts and human agency.
Organization, 16, 183-202.
Kelly, J.R., & Barsade, S.G. (2001). Mood and Emotions in Small Groups and Work Teams.
Organizational Behavior and Human Decision Processes 86, 99-130.
Kernberg, O. (1978). Leadership and organizational functioning: Organizational regression.
International Journal of Group Psychotherapy, 28, 3-25.
Klein, K.J., Ziegert, J.C., Knight, A.P., & Xiao, Y. (2006). Dynamic Delegation: Shared,
Hierarchical, and Deindividualized Leadership in Extreme Action Teams.
Administrative Science Quarterly, 51, 590-621.
Kort, E.D. (2008). What, after all, is leadership? ‘Leadership’ and plural action. Leadership
Quarterly, 19, 409-425.
Lampel, J., Shamsie, J., & Shapira, Z. (2009). Experiencing the Improbable: Rare Events and
Organizational Learning. Organization Science, 20, 835-845.
Leonardi, P., & Barley, S. (2008). Materiality and change: Challenges to building better
theory about technology and organizing. Information and Organization, 18, 159-176.
Leonardi, P. (2011). When flexible routines meet flexible technologies: Affordance,
constraint, and the imbrication of human and material agencies. MIS Quarterly, 35, 147-
167.
Levinthal, D., & Rerup, C. (2006). Crossing an apparent chasm: Bridging mindful and less-
mindful perspectives on organizational learning. Organization Science, 17, 502-513.
MacKenzie, D. (2006). An engine not a camera: How financial models shape markets.
Cambridge, MA: MIT Press.
Maitlis, S. (2005). The social processes of organizational sensemaking. Academy of
Management Journal, 48, 21-49.
Maitlis, S., & Christianson, M. (2014). Sensemaking in organizations: Taking stock and
moving forwards. Academy of Management Annals, 8, 57-125.
Maitlis, S., & Lawrence, T.B. (2007). Triggers and enablers of sensegiving in organizations.
Academy of Management Journal, 50, 57-84.
Maitlis, S., & Sonnenshein, S. (2010). Sensemaking in crisis and change: Inspiration and
insights from Weick (1988). Journal of Management Studies, 47, 551-580.
Maitlis, S., Vogus, T. J., & Lawrence, T. B. (2013). Sensemaking and emotion in
organizations. Organizational Psychology Review, 3(3), 222-247.
Majchrzak, A., Jarvenpaa, S.L., & Hollingshead, A.B. (2007). Coordinating Expertise Among
Emergent Groups Responding to Disasters. Organization Science, 18, 147-161.
Nicolini, D. (2011). Practice as the site of knowing: Insights from the field of telemedicine.
Organization Science, 22, 602-620.
Orlikowski, W.J. (2002). Knowing in Practice: Enacting a Collective Capability in Distributed
Organizing. Organization Science, 13, 249-273.
Orlikowski, W. (2005). Material works: Exploring the situated entanglement of technological
performativity and human agency. Scandinavian Journal of Information Systems, 17,
183-186.
Orlikowski, W. (2007). Sociomaterial practices: Exploring technology at work. Organization
Studies, 28, 1435-1448.
Orlikowski, W., & Robey, D. (1991). Information technology and the structuring of
organizations. Information Systems Research, 2, 143-169.
Orlikowski, W., & Scott, S. (2008). Sociomateriality: Challenging the separation of
technology, work and organization. Academy of Management Annals, 2, 433-474.
Ortmann, G. (2010). On drifting rules and standards. Scandinavian Journal of Management,
26, 204-214.
Otelli, J.-P. (2011). Erreurs de pilotage – Tome 5. Levallois-Perret, FR: Actipress.
Picard, R.W. (1997). Affective Computing. Boston/MA: MIT Press.
Pinch, T. (2008). Technology and institutions: Living in a material world. Theory and Society,
37, 461-483.
Pirolli, P., & Russell, D. (2011). Introduction to the special issue on sensemaking. Human-
Computer Interaction, 26, 1-8.
Quinn, R.W., & Worline, M. (2008). Enabling Courageous Collective Action: Conversations
from United Airlines Flight 93. Organization Science, 19, 497-516.
Reason, J. (1990). Human error. Cambridge: Cambridge University Press.
Reason, J. (2008). The human contribution: Unsafe acts, accidents and heroic recoveries.
Farnham, UK: Ashgate.
Schultze, U., & Orlikowski, W. (2010). Virtual worlds: A performative perspective on
globally distributed, immersive work. Information Systems Research, 21, 810-821.
Seidel, S., Recker, J., & vom Brocke, J. (2013). Sensemaking and sustainable practicing:
Functional affordances of information systems in green transformation. MIS Quarterly,
37, 1275-1299.
Snook, S. (2000). Friendly fire – The accidental shootdown of U.S. Black Hawks over
northern Iraq. Princeton, NJ: Princeton University Press.
Sonenshein, S. (2009). Emergence of ethical issues during strategic change implementation.
Organization Science, 20(1), 223-239.
Stein, M. (2004). The critical period of disasters: Insights from sense-making and
psychoanalytic theory. Human Relations, 57(10), 1243-1261.
Stein, M.-K., Newell, S., Wagner, E., & Galliers, R. (2014). Felt quality of sociomaterial
relations: Introducing emotions into sociomaterial theorizing. Information and
Organization, 24, 156-175.
Stigliani, I., & Ravasi, D. (2010). Organizing thoughts and connecting brains: Material
practices and the transition from individual to group-level prospective sensemaking.
Academy of Management Journal, 55, 1232-1259.
Tsoukas, H., & Chia, R. (2002). Organizational becoming: Rethinking organizational change.
Organization Science, 13, 567-582.
Vaughan, D. (1999). The dark side of organizations: Mistake, misconduct and disaster.
Annual Review of Sociology, 25, 271-305.
Weick, K. E. (1979). Sensemaking in organizations: Small structures with large
consequences. Englewood Cliffs, NJ: Prentice Hall.
Weick, K. (1988). Enacted sensemaking in crisis situations. Journal of Management Studies,
25, 305-317.
Weick, K. E. (1990). The vulnerable system: An analysis of the Tenerife air disaster. Journal
of Management, 16, 571-593.
Weick, K. E. (1993). The collapse of sensemaking in organizations: The Mann Gulch disaster.
Administrative Science Quarterly, 38, 628-652.
Weick, K. E. (1995). Sensemaking in organizations. Thousand Oaks: Sage.
Weick, K. (2007). Drop your tools: On reconfiguring management education. Journal of
Management Education, 31, 5-16.
Weick, K. E., & Roberts, K. (1993). Collective mind in organizations: Heedful interrelating
on flight decks. Administrative Science Quarterly, 38, 357-381.
Weick, K. E., & Sutcliffe, K. M. (2006). Mindfulness and the quality of organizational
attention. Organization Science, 17, 514-524.
Weick, K. E., & Sutcliffe, K. M. (2015). Managing the Unexpected: Sustained Performance in
a Complex World. New York: Wiley.
Weick, K. E., Sutcliffe, K. M. & Obstfeld, D. (2005). Organizing and the process of
sensemaking. Organization Science, 16, 409-421.
Whiteman, G. (2010). Management studies that break your heart. Journal of Management
Inquiry, 19, 328-337.
Whiteman, G., & Cooper, W. (2011). Ecological sensemaking. Academy of Management
Journal, 54, 889-911.
Wolff, S. B., Pescosolido, A. T., & Druskat, V. U. (2002). Emotional intelligence as the basis
of leadership emergence in self-managing teams. The Leadership Quarterly, 13, 505-
522.
Appendix 1. On data.
The data that we use come predominantly from two reports (an interim and the final report) prepared
by the Bureau d’Enquêtes et d’Analyses pour la Sécurité de l’Aviation Civile (BEA, 2011, 2012), the
French government agency in charge of investigating aviation accidents. Two preliminary reports were
published in 2009. The search operation identified the wreck and recovered the flight data recorder
and cockpit voice recorder on 1 and 2 May 2011, respectively. The final BEA reports and
investigations were largely factual and offer little room for interpretation. In particular, the interim
report (BEA, 2011) contained a selective transcript of the cockpit conversations before and during the
incident; we focus on this for the purposes of our essay. The interim report did not include the whole
of the conversations. ‘Side’ discussions were frequently missing and the text was restricted to some
minutes immediately before and during the tragedy. Therefore, to increase the trustworthiness of the
data, we relied on a book written by Otelli (2011), a former pilot and the author of numerous
publications on aviation failures and accidents. In his book, Otelli transcribed the pilots’ conversations
over a much wider timeframe and did not exclude hesitations and side conversations. His transcription
resembles that produced by the BEA (i.e. providing a mixture of what the pilots said, what they did,
and what the IS displayed at the time), but more often than not Otelli interrupts the flow of the
conversation in his transcript to explain how basic piloting practice related to the actions of the captain
and copilots. In this essay, we rely exclusively on what the pilots said and did unless indicated
otherwise. We drew on the literature on airplane IS for secondary data and background information to
better contextualize the pilots’ experiences and the current debate about the role of technology in
flying.
The AF 447 case clearly belongs to what Whiteman (2010) called “management studies that
break your heart”. The consequences of the AF 447 disaster clearly make this statement appropriate,
but so too do the data on which we rely. It became increasingly difficult for us to maintain the
distanced and stoic viewpoint expected in organizational analysis, as we delved into the conversations
between the captain and copilots, as we unpacked their anxieties during the incident, as we read aloud
the outbursts in the cockpit, and especially as we reached the laconic end of the transcript of the
conversations. According to Whiteman, such emotional reactions (and the connection a researcher has
with his or her field of study) are powerful analytical signals that urge us to criticize, expand, and
refresh the theories on which we rely, and to engage compassionately with the field and the theory
involved. Retrospectively, we can safely say that it was our engaging with the emotions provoked by
the AF 447 case that enjoined us to explore the sensemaking processes involved in the tragedy. To
make sure, however, that these emotions would not interfere with our understanding of the case, we
contrasted our interpretations with those of 12 pilots and took their comments into consideration
when refining specific aspects of our descriptions, e.g. by increasing the focus on natural forces and
the filtered perception of a copilot sitting in a cockpit at night. Unless otherwise indicated, the specific
descriptions of and information on the practice of flying presented in this paper are based on
discussions with these pilots, to whom we are particularly indebted.
i In this paper, we write “aircraft IS” as a generic name to bundle the many functions of IT in the airplane, from
ii http://www.airbus.com/aircraftfamilies/passengeraircraft/a380family/commonality/
Last accessed September 24, 2016.
iii We gratefully acknowledge the help of one of the anonymous reviewers in suggesting this formulation.
Appendix 2. Empirical and theoretical timeline.
Empirical case

Timeline: prior to 02:07:00; 02:07:00; 02:10; 02:11; 02:12; 02:13; 02:14.

Airplane: Steady position, ordinary flight mode. Bad weather conditions ahead. The angle of
the airplane is geared towards 40 degrees and more; the airplane reaches 38,000 feet. Thrust
is reduced and the engines start idling. The airplane continuously drops, being intermittently
at a height of about 3,000 feet. The aircraft crashes.

Material (i.e. IS) agency: The IS onboard, among other tasks, manages wing angle, trajectory,
fuel consumption, etc., and sends commands to the aircraft accordingly. Ibid., plus: upon a
command of the pilots, it projects new weather data on the radar. Most likely: the Pitot
probes freeze due to weather conditions. Without speed data, the IS computes a loss of
altitude. Alarm: the autopilot switches off. The IS only monitors the pilots’ actions with
minimal corrections. It warns the pilots: Stall! The IS warns: “Stall! Stall!” and displays an
accumulation of failures on the monitors. Because of slight corrections of the angle by the
pilots, the IS can compute again and warns: Stall! Stall!

Human (i.e. crew) agency: The pilots monitor the system, discuss clues and reports, and
communicate with control points. They adjust the radar, see red cells, and decide to avoid
the returns detected by the radar. At 02:02, the captain leaves the cockpit; the pilot in the
right seat becomes leader. Pilot, right seat: grabs the controls and pulls the stick, potentially
as a response to the displayed loss in altitude; pulls up while disorderly reading the
instruments; keeps pulling. The two pilots repeatedly change positions, but both pull up.
They call for the captain. Already, the pilots begin showing a loss of confidence in the
instrument readings. Pilot, left seat: asks the pilot flying to stop the climb. Captain: follows
the conclusions of the pilots and comments on the indicators: the plane is falling down; tries
to make sense of what is going on; gives some advice, e.g. on the wing angle, without firm
command. Pilots: keep pulling, (alternately) pulling the nose of the airplane up again.

Theoretical implications: Flying as a practice relies on specific, stable and, from the pilots’
point of view, very familiar sociomaterial imbrications. The pilots’ attitude is characterized
by mindful indifference, i.e. their capacity to reckon with which mistakes or deviances can
be absorbed by the system and which ones can turn into problems. The pilots gradually shift
from mindful indifference to an attitude of mindful attention; stress levels increase,
characterized by the absence of a clear process of sensemaking. Abrupt changes in the
imbrication of human and material agency result in an arousal of the autonomic nervous
system; the pilots’ attention focuses on the instruments instead of triggering a process of
sensemaking. Sensemaking takes a strongly individual turn; in panic, the pilot in charge
begins relying on his bodily sensations instead of making sense of the instruments;
leadership in the cockpit is not completely absent. The performativity of the system makes
an accurate process of sensemaking very unlikely; the complex chain of imbricated human
and material agency and the accumulation of data and failure reports are overwhelming.
... These algorithms are designed for solving different problems and do not always provide coherent output, so that it is the task of the responsible persons to use diverse information and unravel the complexity of the relation between different algorithmic calculations to make a decision that can be enforced within the organisation. Several studies (Berthod & Müller-Seitz, 2018;Daipha, 2015) show how difficult and risky this is. In an emergency, for example, different signals in the cockpit may indicate different technical malfunctions, and it remains the task of the pilots to develop a meaningful (and possibly life-saving) interpretation from a mix of information (Berthod & Müller-Seitz, 2018). ...
... Several studies (Berthod & Müller-Seitz, 2018;Daipha, 2015) show how difficult and risky this is. In an emergency, for example, different signals in the cockpit may indicate different technical malfunctions, and it remains the task of the pilots to develop a meaningful (and possibly life-saving) interpretation from a mix of information (Berthod & Müller-Seitz, 2018). ...
Article
Full-text available
Social science research has been concerned for several years with the issue of shifting responsibilities in organisations due to the increased use of data-intensive algorithms. Much of the research to date has focused on the question of who should be held accountable when 'algorithmic decisions' turn out to be discriminatory, erroneous or unfair. From a sociological perspective, it is striking that these debates do not make a clear distinction between responsibility and accountability. In our paper, we draw on this distinction as proposed by the German social systems theorist Niklas Luhmann. We use it to analyse the changes and continuities in organisations related to the use of data-intensive algorithms. We argue that algorithms absorb uncertainty in organisational decision-making and thus can indeed take responsibility but cannot be made accountable for errors. By using algorithms, responsibility is fragmented across people and technology, while assigning accountability becomes highly controversial. This creates new discrepancies between responsibility and accountability , which can be especially consequential for organisations' internal trust and innovation capacities. K E Y W O R D S accountability and responsibility, algorithmic accountability, algorithmic decisions, Niklas Luhmann, organisation theory
... Ces contextes regroupent les recherches dans lesquelles des ruptures de nature diverse ont conduit à des épisodes cosmologiques exceptionnels (Weick, 1993) liés au fonctionnement même de certaines entités : aviation (Weick, 1990 ;Berthod et Müller-Seitz, 2017), navettes spatiales (Starbuck et Farjoun, 2005 ;Vaughan, 1996), secours incendie (Weick, 1993), expéditions (Kayes, 2004 ;Roberto, 2002 ;Tempest, Starkey et Ennew, 2007). Sous l'impulsion de problèmes non seulement techniques, mais également sociaux, d'erreurs, d'incapacité à modifier les routines ou encore de communications défaillantes (Schakel et al., 2016), petites déviations ou défauts d'expérience peuvent s'enchaîner en séquences défavorables jusqu'à produire des effets incontrôlables et des accidents. ...
... La reconstruction d'accidents, par exemple, est une stratégie de recherche éprouvée dont les limites sont également bien connues. Elle s'avère néanmoins précieuse et fiable notamment lorsque des événements sont enregistrés en temps réel et que de nombreuses sources complémentaires sont accessibles par la suite (Berthod et Müller-Seitz, 2017). ...
... Automation practices, for instance, can reduce the amount of experience people have directly operating, rather than monitoring, systems. As the 2009 Air France 447 crash illustrates, when automation technologies failed, pilots with little experience flying manually in unusual conditions entered a state of growing panic and were unable to vividly represent information about their altitude and pitch, leading to the loss of 228 lives (Berthod and Müller-Seitz, 2018; Oliver et al., 2017). As previously discussed in relation to roles, interactions that emphasize experience and expertise can erode vividness. ...
Article
Full-text available
This article develops the attention-based view of crises. Crises implicate the failure of structures that shape attention throughout a system and initiate attempts to transform these structures. As crises are so influential, case studies of crisis provide rich details from which to build theory. Synthesizing insights from 80 qualitative case studies of crises, we build a theoretical framework that revitalizes scholarly understanding of how structure and attention relate in today’s complex systems. This framework reveals how everyday social practices instantiate structure, compose systems, and shape the quality of attention, such that practices constitute both a source and solution to crises. Understanding the systemic nature of attention through practices might therefore advance our collective capacity to face crises. It also contributes more broadly to ongoing conversations about how to apply the attention-based view in today’s world, where important organizations look less and less like traditional big businesses, notions of structure implied by formal organization charts are diminishingly relevant, and the quality of attention matters more than its quantity.
... Berthod and Müller-Seitz (2018) showed in the article "Making Sense in Pitch Darkness" that leadership played a role in organizational insentience (46). According to the results of research carried out in the field of organizational insentience, creating and strengthening organizational sense should be considered part of the main activities in organizations. Based on this research, we can point to different components, including human resource management (38), meritocracy (39), organizational culture (34), the manager-employee relationship (37), and work alienation (17). ...
Article
Full-text available
Abstract: INTRODUCTION: The phenomenon of organizational insentience has been one of the main challenges in recent years and refers to a situation in which the employees of an organization have become indifferent to their organizational environment. The current research was conducted to investigate the causes and contexts of the formation of organizational insentience in the Red Crescent Society of Tehran province, Iran. METHODS: This applied study was conducted based on the descriptive-correlational method of data collection. The statistical population of this research included all the managers of the Red Crescent Organization in Tehran province in 2021. The samples (n=90) were selected using Cochran's statistical formula and stratified random sampling. The required data were collected using a researcher-made organizational insentience questionnaire. The validity and reliability of the questionnaire were checked and confirmed. The collected data were analyzed in Smart PLS software using structural equation modeling with a partial least squares approach. FINDINGS: The results of the research showed that causal conditions had a direct, positive, and significant effect on the central category with a standard beta coefficient of 0.725, t=17.625, and P=0.001 at the one percent error level. Moreover, contextual conditions, with a beta coefficient of 0.410, t=3.107, and P=0.002, had a direct, positive, and significant effect on the category of strategies at the one percent error level. Intervening conditions had a direct, positive, and significant effect on strategies at the one percent error level (standard beta coefficient=0.221, t=3.008, and P=0.003). The central category had a direct, positive, and significant effect on the strategy category with a beta coefficient of 0.334, t=2.282, and P=0.023 at the five percent error level.
Based on the results, at the one percent error level, the strategies presented in the research had a direct, positive, and significant effect on the outcomes (standard beta coefficient=0.347, t=3.769, and P=0.000). In total, the results showed the existence of organizational insentience in the Red Crescent Organization of Tehran province, which can be reduced by applying appropriate methods. CONCLUSION: According to the findings, by reducing the sense of meaninglessness towards work, eliminating double standards and organizational discrimination, applying appropriate procedures in strategies and their implementation, paying attention to the structural dimensions of the Red Crescent Society in Tehran province, and implementing meritocratic management, it would be possible to increase organizational social capital; the level of interest, commitment, and responsibility of employees towards the Society and their work; organizational independence; organizational justice; the atmosphere of trust; and supportive behaviors from the Society and employees. By adopting these measures, a step would be taken toward preventing the formation of organizational insentience in the Red Crescent Society of Tehran province. Keywords: Causal conditions; Organizational insentience; Red Crescent Society; Strategies; Tehran province
Article
Full-text available
High reliability organizations (HROs) are rare organizations that manage established technologies to avoid catastrophic errors. The concept of reliability, however, has become attractive to other organization types. This expansion creates scholarly questions about what reliability is outside of HROs. The COVID‐19 pandemic challenged new organizations to create reliability by also creating alternative meanings and practices of reliability that could adequately address an unknown, evolving health threat. This study draws on semistructured interviews and virtual ethnography during the first year of the COVID‐19 pandemic to examine how organizations communicatively defined reliability. The study finds that organizations engage in datafication of hazards to demonstrate they are performing reliably and proposes the practice of “evidencing reliability” as an important step in constituting reliability. However, datafication of hazards can also lead to skewed understandings of organizational performance and potential success biases.
Article
For a ski guide, updating on the ever-changing natural conditions and group dynamics is essential to stay safe and provide a good experience for clients. In this paper, we explore how guides update their understanding in the mountains. Our data arise out of a one-season participant ethnography of ski guiding in Norway.
Article
Full-text available
Despite previous efforts to deal with the ontological split between human subjects and reality, sensemaking has remained human-centered. We argue that human-centered sensemaking risks omitting constitutive elements of reality. To escape the ontological split, we decenter sensemaking and thus extend it in such a way that it allows seemingly unrelated and independent humans and nonhumans to become connected and interdependent with what is made sense of. Doing so allows us to demonstrate how a decentered understanding of reality can produce a radically different understanding of research phenomena. As a means to show the consequences of a decentered sensemaking, we revisit the Mann Gulch disaster and show that not all disasters can be avoided by better sensemaking or good management.
Article
Full-text available
Background: Considering the high goals of an organization, one of the most important tasks of organizations is to motivate employees as the strategic assets of the organization. The phenomenon of organizational insentience has been one of the main challenges in recent years and refers to a situation in which employees of the organization are not motivated toward their organizational environment. Accordingly, the present study aims to develop a scale of organizational insentience in the Red Crescent Society of the Islamic Republic of Iran. Methods: The present study was conducted using a mixed method in 2022. The qualitative part was carried out using the grounded theory approach, based on semi-structured, in-depth interviews with 21 experts selected through purposive and snowball sampling. The quantitative part used structural equation modeling with a partial least squares approach and Smart PLS3 software. The statistical population of the research in the quantitative part included all the executives (110 people) of the Tehran Red Crescent Organization. With a random sampling method based on Cochran's formula, 85 people were selected; for further confidence and to reduce sampling error, 90 individuals were included in the sample. To collect data in the quantitative part, a researcher-made questionnaire on organizational insentience derived from the research model was used. It included 34 items in the form of a 5-point Likert scale. The reliability of the questionnaire was assessed through Cronbach's alpha coefficient and composite reliability, and validity was assessed through the construct validity method; both were confirmed based on the results.
Results: In the qualitative part, after three stages of open, axial, and selective coding, the research model, including 6 main categories (the central category; causal, intervening, and background conditions; strategies; and consequences of organizational insentience) and 29 sub-categories, was presented. The results of the quantitative part showed that causal conditions, with a significance value of 0.001 and a path coefficient of 0.725, had a significant and positive effect on the central category. Intervening conditions (significance value 0.003, path coefficient 0.221), background conditions (significance value 0.002, path coefficient 0.410), and the central category (significance value 0.023, path coefficient 0.334) had a positive and significant effect on strategies. Moreover, strategies, with a significance value of 0.001 and a path coefficient of 0.347, had a positive and significant effect on the individual and organizational consequences of insentience in the Tehran Red Crescent Society. Conclusion: The results demonstrated that the model presented in this study is suitable for building managers' knowledge and awareness of the concepts and categories affecting organizational insentience. Therefore, it is recommended that the managers of the Red Crescent Society use the model presented in this study to minimize the phenomenon of organizational insentience and increase employees' productivity.
Article
Full-text available
Abstract: The question of the shifting of responsibility caused by the increased use of data-intensive algorithms has occupied social science research for several years. The debate centers above all on which persons or bodies should be held responsible when decisions turn out to be discriminatory, factually wrong, or unjust. From a sociological perspective, it is striking that these debates do not draw a sharp distinction between responsibility and accountability. In our contribution, we take up this distinction as formulated by Niklas Luhmann in order to analyze the changes and continuities in organizing that are associated with the use of data-intensive algorithms. We show that algorithms absorb uncertainty in organizational decision-making processes and thus indeed assume responsibility, but cannot be held accountable for errors. Through the use of algorithms, responsibility is decomposed into assemblages of persons and technology, while the attribution of accountability becomes highly controversial. This gives rise to new discrepancies between responsibility and accountability, which can be especially consequential for intra-organizational trust and the innovative capacity of organizations.
Article
Full-text available
The authors reflect on ways increased prevalence of technology and digital natives entering the workplace influence how work is approached. They talk about competencies of the digital workforce and suggests both digital natives and digital immigrants could have the skills needed to utilize technology for manipulating data, problem solving, and new product creation. They comment on interpersonal relating and identity development in digital workforces, and the utilization of technology at work.
Chapter
Full-text available
This paper reports an ethnographic study of the initiation of a strategic change effort in a large, public university. It develops a new framework for understanding the distinctive character of the beginning stages of strategic change by tracking the first year of the change through four phases (labeled as envisioning, signaling, re-visioning, and energizing). This interpretive approach suggests that the CEO’s primary role in instigating the strategic change process might best be understood in terms of the emergent concepts of ‘sensemaking’ and ‘sensegiving’. Relationships between these central concepts and other important theoretical domains are then drawn and implications for understanding strategic change initiation are discussed. © Gerry Johnson, Ann Langley, Leif Melin and Richard Whittington 2007 and Cambridge University Press, 2010.
Article
Full-text available
As more and more information becomes available in larger, ever more rapid flows, the skills of sensemaking are no longer required just of specialists in intelligence analysis, but increasingly of everyone. We all live in a data world with continually flowing streams of information. These articles are the beginning of a coordinated effort to build better tools and better theories of how to cope. Sensemaking is increasingly a part of all our lives; this collection shows a few ways to understand what's happening to us.
Book
This book explores the human contribution to the reliability and resilience of complex, well-defended systems. Usually the human is considered a hazard - a system component whose unsafe acts are implicated in the majority of catastrophic breakdowns. However there is another perspective that has been relatively little studied in its own right - the human as hero, whose adaptations and compensations bring troubled systems back from the brink of disaster time and again. What, if anything, did these situations have in common? Can these human abilities be ‘bottled’ and passed on to others? The Human Contribution is vital reading for all professionals in high-consequence environments and for managers of any complex system. The book draws its illustrative material from a wide variety of hazardous domains, with the emphasis on healthcare reflecting the author's focus on patient safety over the last decade. All students of human factors - however seasoned - will also find it an invaluable and thought-provoking read.
Article
On April 14, 1994, two U.S. Air Force F-15 fighters accidentally shot down two U.S. Army Black Hawk Helicopters over Northern Iraq, killing all twenty-six peacekeepers onboard. In response to this disaster the complete array of military and civilian investigative and judicial procedures ran their course. After almost two years of investigation with virtually unlimited resources, no culprit emerged, no bad guy showed himself, no smoking gun was found. This book attempts to make sense of this tragedy--a tragedy that on its surface makes no sense at all. With almost twenty years in uniform and a Ph.D. in organizational behavior, Lieutenant Colonel Snook writes from a unique perspective. A victim of friendly fire himself, he develops individual, group, organizational, and cross-level accounts of the accident and applies a rigorous analysis based on behavioral science theory to account for critical links in the causal chain of events. By explaining separate pieces of the puzzle, and analyzing each at a different level, the author removes much of the mystery surrounding the shootdown. Based on a grounded theory analysis, Snook offers a dynamic, cross-level mechanism he calls "practical drift"--the slow, steady uncoupling of practice from written procedure--to complete his explanation. His conclusion is disturbing. This accident happened because, or perhaps in spite of everyone behaving just the way we would expect them to behave, just the way theory would predict. The shootdown was a normal accident in a highly reliable organization.
Article
What difference does robotic telepresence make to the coordination of complex, dynamic, and distributed knowledge work? We explored this question in a post-surgical intensive care unit where medical workers struggled to coordinate their work in the face of different assessments of their extremely sick patients. Our in-depth field study examined night rounds, a central routine for coordinating work in this unit that was performed remotely through different technologies. We found that night rounds that are materially enacted through robotic telepresence intensify coordination outcomes both positively and negatively, resulting in contrary implications for subsequent coordination of work. We further found that these differences in intensification depend on whether preparatory work is more or less distanced from the bedside. We develop a theoretical account of these findings by explaining how the coordination of complex, dynamic, and distributed work is crucially related to how that work is materially enacted over time.
Article
Karl Weick's classic study of "sensemaking" showed that there is much to be learned from a wildland fire. In this tradition, we present an ethnographic tale from the subarctic to introduce the concept of ecological sensemaking-the process used to make sense of material landscapes and ecological processes. We then reanalyze data from the Mann Gulch fire and conclude that ecological sensemaking and ecological materiality were underappreciated dimensions of this historic tragedy. Comparisons of incidents and actors suggest that ecological embeddedness enables sensemaking and that inability to make sense of subtle ecological cues introduces hidden vulnerability.