Human Performance in Uncertain Environments
By LAURA MAGUIRE, M.Sc.
Graduate Researcher
Cognitive Systems Engineering Lab
(This piece will soon be published in The Avalanche Review, the periodical of the American Avalanche Association, and builds upon her original work presented in Innsbruck, Austria, in fall 2018.)
How we perceive risk and make decisions can be subject to flawed thinking and dangerous biases, but this is only one small dimension of a much broader understanding of what constitutes expert performance in uncertain, changing environments.
This article will describe the conditions that make snow safety work difficult, drawing comparisons to highly skilled practice in other high-risk/high-consequence domains, and then make a pitch for using this perspective to explore some useful avenues.
Looking into a field of practice and asking ‘what makes this hard?’ is a fundamental question for understanding what it means to be an expert in that kind of work. It’s a frame of reference that provides insights into the kinds of challenges faced by practitioners where simple proceduralization is not possible (or desired).
In other words, how do experts cope when the data is ambiguous, analysis remains uncertain, and rules are underspecified, insufficient, or inapplicable? The avalanche community already recognizes the limitations of strict rule-following in making judgements about the snowpack. For example, in the Canadian Avalanche Association’s Observation Guidelines and Recording Standards for Weather, Snowpack and Avalanches (OGRS), there are seven instances of the word “rule,” and six of these indicate that a definitive rule is impossible! (The seventh instance describes a rule of thumb and notes that variability should be expected.)
If the rules by themselves cannot prescriptively define safe decisions, yet many outcomes are successful, then avalanche professionals must be doing something right! Given this paradox, we can reasonably infer that there is sophisticated cognitive work - in perception, reasoning, evaluation, and judgement - that goes into successfully managing the ambiguity in forecasting and guiding work.
Given the extensive literature on the technical aspects of forecasting, you might think this understanding already exists. Yes and no. There is a solid foundation of work based on introspective self-reports from interviews and surveys of experienced practitioners, but “cognitive psychologists have noted there are limitations to what people can actually tell us about their mental processes” (Nisbett & Wilson, 1977, p. 232). This means studies based on self-reports provide only a partial understanding. Observational studies can be similarly limited. As the ‘fluency law’ (Woods, 2005) notes, most experts are really good at what they do, which often makes their work look easy to observers - so it’s hard to ‘see’ just how difficult it actually is!
We therefore need to triangulate across other methods to uncover the cognitive aspects of managing risk in the mountains. A good place to start is by examining ‘the hard stuff.’
Challenges to performance in avalanche forecasting
Given what we know from studies in healthcare, air traffic management, and mission control in space exploration (Cook & Rasmussen, 2005; Patterson, Watts-Perotti, & Woods, 1999; Smith, McCoy, & Orasanu, 2001), there are common patterns to what can make a job hard. In mountain environments, salience & discriminability, change, goal conflicts, and coordination are of particular interest. Studying how experienced practitioners handle these difficult aspects of their work helps provide insight into the nature of domain-specific expertise.
Salience & discriminability of cues.
Salience refers to how noticeable or discriminable information is against the backdrop of the environment and all the other possible sources of information. For example, a rapidly propagating fracture is a very salient cue - there is the movement of the snow shifting (even slightly) as the crack propagates across a slope, changes in light and texture on the surface of the snow as the fracture line opens up, and perhaps even an auditory signal.
But that example is a bit of an outlier, because to uncover many of the meaningful signals or cues about what is happening in the snowpack you literally have to go digging to extract the right kinds of information. In other high-consequence monitoring environments, like nuclear power plants or intensive care units, there are often layered sensor networks providing real-time data to aid the operator in monitoring the state of the system across multiple variables. In those environments, threshold alarms or visual displays can highlight minor variations to inform the plant operator or nurse that the state is changing. Out in the field, snow safety professionals have to accurately perceive often very subtle cues. And even when assessing existing data from previous assessment reports or online databases, the information is presented in a way that requires mental effort to extract the meaningful data from the background information.
This represents the first of the challenges in forecasting: data has varying levels of salience and is collected at different points in time, so variations in that data require ongoing interpretation to recognize when conditions change in a meaningful way.
Change is a constant.
Notice that in that last paragraph I didn’t say if conditions change. This is another key factor in what makes avalanche forecasting cognitively demanding work: conditions are continually changing, often in unpredictable and interactive ways.
While we may understand conceptually how wind loading can influence the snowpack, there are only imprecise measurements of how much and where the loads are setting up. Even telemetry devices can fail or become rimed over, providing false information and adding to uncertainty. And the measurements taken are discrete (a point in time), not continuous. This means relevant cues take time to accumulate, which can slow decisions about the trajectory of stability (Is it increasing? Decreasing? How slowly or quickly? What might this mean for my guiding plans today? What other factors will inform not only my assessment but my planning or revision when conditions change?)
Noticing change (and rates of change) and interpreting its meaning is best supported by continuous telemetry with low time delay. However, field data will always come with some form of time delay (for instance, waiting for the sun to come up to visually inspect a cornice, or the time it takes to ski out to a slope and dig a pit). In any system where the hazard, once triggered, is largely unstoppable, lag severely compromises the capacity to manage the variability (and its corresponding risk) common in that system.
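For readers who like to see the lag problem concretely, here is a minimal sketch in Python. It is a toy illustration only - the “stability index,” sampling interval, reporting lag, and alarm threshold are all invented numbers for the sake of the example, not real forecasting quantities - showing how discrete, delayed observations create a blind spot between when conditions change and when a decision maker can know about it:

```python
# Toy illustration (not a forecasting tool): a hypothetical "stability
# index" degrades continuously, but the observer only sees discrete,
# delayed samples - e.g., a pit dug every few hours, reported an hour late.

def true_stability(t_hours: float) -> float:
    """Hypothetical index: steady until hour 6, then degrading."""
    return 1.0 if t_hours < 6 else 1.0 - 0.1 * (t_hours - 6)

SAMPLE_INTERVAL = 4.0   # hours between field observations (assumed)
OBSERVATION_LAG = 1.0   # hours from observation to usable report (assumed)
ALARM_THRESHOLD = 0.8   # index value that would prompt a plan revision

t = 0.0
while t <= 24.0:
    observed = true_stability(t)          # value at the moment of sampling
    available_at = t + OBSERVATION_LAG    # when the decision maker sees it
    if observed < ALARM_THRESHOLD:
        print(f"Change sampled at hour {t:.0f}, "
              f"but not known until hour {available_at:.0f}")
        break
    t += SAMPLE_INTERVAL

# The index reaches 0.8 at hour 8, but with 4-hour sampling and a 1-hour
# lag the first sample to reveal the change is at hour 12, known at hour
# 13 - a 5-hour blind spot. Continuous, low-lag telemetry shrinks that
# window; sparse, delayed field observations widen it.
```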
Multiple competing demands.
Forecasting work, like most high-demand practice, occurs within a system subject to constraints. For instance, control work must be completed before the hill can open to guests anxiously waiting for first chair, or incoming weather is limiting the window for the helicopter, meaning you have to reprioritize the plan.
Even the most safety-conscious organization does not exist solely to eliminate risk - instead, it controls risk in order to meet other objectives. These goals - running a successful ski hill, keeping a highway open, or enabling crews safe transit to keep a construction project on schedule - are subject to tradeoffs in order to maintain safe operations.
Managing these competing demands is part of successful expert performance. Pilots, at the bare minimum, are obviously expected not to crash the plane, but they also balance responsibilities for on-time departures, passenger comfort, fuel costs, and accurate flight records. These additional demands have to be considered when making or revising plans as disruptions occur. Similarly, the analysis a ski guide has to do to ensure clients who’ve paid thousands of dollars get to safely ski great lines adds to the cognitive burden.
Coordination is key.
The multiple goals of the work system go hand-in-hand with another aspect that makes forecasting work hard: the need to coordinate across a distributed network.
Coordination may be needed so others can provide information (like when a team calls in the results of their control work), adjust their actions (say, changing the pick-up location with the cat driver), provide approvals, or communicate with the public and other impacted users.
Well-coordinated groups run smoothly - minimizing downtime and unnecessary risk - and help to proactively identify issues. Coordination breakdowns increase the cognitive work by introducing lag or requiring more effort to determine what others are doing and how that may impact your plans.
So, what does this mean?
In outlining the characteristics that make the work hard, it becomes clear that snow safety work is cognitively demanding. Describing the “hard work” in this way provides more specific explanations of why things sometimes go wrong. Research into the cognitive work of avalanche forecasting in ski resort operations (Maguire & Percival, 2018) provides an example of how this perspective can reframe how we think about professional practice.
Further studies can help generate rich descriptions that allow for comparing and contrasting performance across a variety of conditions and work environments. This gives us a more nuanced understanding of when things like heuristics and biases help us cope with dynamic and demanding environments and when they can get us into trouble. This kind of data also suggests promising design directions for engineering tools and technologies to better support practitioners. Whatever your opinion on why people make mistakes, avoiding oversimplifications about how work gets done is critical.
While we know a lot about the technical expertise in snow safety professions, making visible the strategies used to get work done in context - with all the messy details, goal conflicts, and complexities - can help improve training programs, develop new technologies, refine procedures, and enhance teamwork to support more successful outcomes.
Many thanks to Greg Gagne for the conversation that inspired this article and for his feedback, and to Jesse Percival for his feedback.
References:
Cook, R., & Rasmussen, J. (2005). “Going solid”: a model of system dynamics and consequences for patient safety. Quality & Safety in Health Care, 14(2), 130–134.
Maguire, L. M., & Percival, J. (2018). Sensemaking in the snow: Examining the cognitive work of avalanche forecasting in a Canadian ski operation. Presented at the International Snow Science Workshop, Innsbruck, Austria.
Nisbett, R. E., & Wilson, T. D. (1977). Telling more than we can know: Verbal reports on mental processes. Psychological Review, 84(3), 231–259.
Patterson, E. S., Watts-Perotti, J., & Woods, D. D. (1999). Voice loops as coordination aids in space shuttle mission control. Computer Supported Cooperative Work (CSCW), 8(4), 353–371.
Smith, P. J., McCoy, E., & Orasanu, J. (2001). Distributed cooperative problem-solving in the air traffic management system. In E. Salas & G. Klein (Eds.), Linking expertise and naturalistic decision making (pp. 369–384). Erlbaum.
Woods, D. (2005). Studying cognitive systems in context: the cognitive systems triad. Institute for Ergonomics, The Ohio State University, Columbus, OH. Retrieved from http://csel.eng.ohio-state.edu/productions/woodscta/media/triad_intro_final.pdf
Author Bio:
Laura Maguire “pre-tired” at age 18 and spent 10 years ski bumming around Western Canada. Now she is a PhD student and researcher with the Cognitive Systems Engineering Lab at The Ohio State University, where she studies adaptive human performance in high-risk, high-consequence environments. She has a master’s degree in Human Factors & Systems Safety from Lund University in Sweden. When not in a library, she is out skiing, climbing, biking, or reading in a hammock.