LINE OPERATIONS SAFETY AUDIT (LOSA)
(Last Revision: Dec. 10, 2010)
1.1 INTRODUCTION
1.1.1 Historically, the way
the aviation industry has investigated the impact of human
performance on aviation safety has been through the
retrospective analyses of those actions by operational
personnel that led to rare and drastic failures. The
conventional investigative approach is for investigators to
trace an event back to the point where they discover
particular actions or decisions by operational personnel
that did not produce the intended results and, at that
point, to conclude that human error was the cause. The weakness
in this approach is that the conclusion is generally
formulated with a focus on the outcome, with limited
consideration of the processes that led up to it. When
analyzing accidents and incidents, investigators already
know that the actions or decisions by operational personnel
were "bad" or "inappropriate", because the bad
outcomes are a matter of record. In other words,
investigators examining human performance in safety
occurrences enjoy the benefit of hindsight. This is,
however, a benefit that operational personnel involved in
accidents and incidents did not have when they selected what
they thought were "good" or "appropriate" actions or
decisions that would lead to good outcomes.
1.1.2 It is inherent to
traditional approaches to safety to consider that, in
aviation, safety comes first. In line with this, decision
making in aviation operations is considered to be 100
percent safety-oriented. While highly desirable, this is
hardly realistic. Human decision making in operational
contexts is a compromise between production and safety
goals. The optimum decisions to achieve the actual
production demands of the operational task at hand may not
always be fully compatible with the optimum decisions to
achieve theoretical safety demands. All production systems,
and aviation is no exception, generate a migration of
behaviors: due to the need for economy and efficiency,
people are forced to operate at the limits of the system's
safety space. Human decision making in operational contexts
lies at the intersection of production and safety and is
therefore a compromise. In fact, it might be argued that the
trademark of experts is not years of experience and exposure
to aviation operations, but rather how effectively they have
mastered the necessary skills to manage the compromise
between production and safety. Operational errors are not
inherent in a person, although this is what conventional
safety knowledge would have the aviation industry believe.
Operational errors occur as a result of mismanaging or
incorrectly assessing task and/or situational factors in a
specific context, thus causing a failed compromise between
production and safety goals.
1.1.3 The compromise between
production and safety is a complex and delicate balance.
Humans are generally very effective in applying the right
mechanisms to successfully achieve this balance, hence the
extraordinary safety record of aviation. Humans do, however,
occasionally mismanage or incorrectly assess task and/or
situational factors and fail in balancing the compromise,
thus contributing to safety breakdowns. Successful
compromises far outnumber failed ones; therefore, in order
to understand human performance in context, the industry
needs to systematically capture the mechanisms underlying
successful compromises when operating at the limits of the
system, rather than those that failed. It is suggested that
understanding the human contribution to successes and
failures in aviation can be better achieved by monitoring
normal operations, rather than accidents and incidents. The
Line Operations Safety Audit (LOSA) is the vehicle endorsed
by ICAO to monitor normal operations.
1.1.4 The Line Operations Safety Audit
Program describes the process by which all airline flight
crewmembers are evaluated on professional standards. This
section is designed to provide instructions, guidance, and
regulatory requirements for evaluating flight crewmembers
during these observations. As professionals, airline flight
crewmembers are expected to exhibit the highest degree of
airmanship, integrity, professionalism, proficiency, and
safety. Each flight crewmember should be a master of the
airplane and demonstrate an ability to operate under
complex circumstances throughout the range and scope of
his/her duties. Additionally, the flight crewmember bears
the final responsibility for the safe conduct of the flight.
This standard, more than any other, distinguishes the flight
crewmember as a professional. Mastery of complex
problems, good judgment, situational awareness, crew
resource management, and leadership skills are all necessary to
ensure that safety is never compromised. Flight Manual Part
1, the appropriate Aircraft Operating Manual, and the Line
Operations Safety Audit Program provide the framework for
ensuring standardized flight operations. However, when
situations arise that are not specifically addressed by
these manuals or FARs, the Flight Crew is expected to
exercise professional judgment while maintaining safety of
flight as the first priority. The Line Operations Safety
Audit Program is the responsibility of the Manager of Flight
Safety. Written comments and suggestions may be submitted
via board mail to the Safety Department. All flight
operations are subject to the Line Operations Safety Audit
Program. The determination of whether a flight crewmember's
performance is acceptable is derived from the experience and
judgment of the LOSA Observer. The LOSA Observer must
evaluate carefully, consistently, and in accordance with the
operating procedures outlined in the appropriate Aircraft
Operating Manual.
1.2 BACKGROUND
Reactive strategies
Accident investigation
1.2.1 The tool most often used
in aviation to document and understand human performance and
define remedial strategies is the investigation of
accidents. However, in terms of human performance, accidents
yield data that are mostly about actions and decisions that
failed to achieve the successful compromise between
production and safety discussed earlier in this chapter.
1.2.2 There are limitations to
the lessons learned from accidents that might be applied to
remedial strategies vis-à-vis human performance. For
example, it might be possible to identify generic
accident-inducing scenarios such as Controlled Flight Into
Terrain (CFIT), Rejected Takeoff (RTO), runway incursions
and approach-and-landing accidents. In addition, it might
be possible to identify the type and frequency of external
manifestations of errors in these generic accident-inducing
scenarios or discover specific training deficiencies that
are particularly related to identified errors. This,
however, provides only a tip-of-the-iceberg perspective.
Accident investigation, by definition, concentrates on
failures, and in following the rationale advocated by LOSA,
it is necessary to better understand the success stories to
see if they can be incorporated as part of remedial
strategies.
1.2.3 This is not to say that
there is no clear role for accident investigation within the
safety process. Accident investigation remains the vehicle
to uncover unanticipated failures in technology or bizarre
events, rare as they may be. Accident investigation
also provides a framework. If only normal operations
were monitored, defining unsafe behaviors would be a task
without a frame of reference. Therefore, properly focused
accident investigation can reveal how specific behaviors can
combine with specific circumstances to generate unstable and
likely catastrophic scenarios. This requires a contemporary
approach to the investigation. Should accident
investigation be restricted to the retrospective analyses
discussed earlier, its contribution in terms of human error
would be to increase existing industry databases, but its
usefulness in regard to safety would be dubious. Worse
still, the information could provide the foundations for
legal action and the allocation of blame and punishment.
Combined reactive/proactive strategies
Incident investigation
1.2.4 A tool that the aviation
industry has increasingly used to obtain information on
operational human performance is incident reporting.
Incidents tell a more complete story about system safety
than accidents do because they signal weaknesses within the
overall system before the system breaks down. In addition,
it is accepted that incidents are precursors of accidents
and that some number (N) of incidents of one kind take place
before an accident of the same kind eventually occurs. The
basis for this can be traced back almost 30 years to
research on accidents from different industries, and there
is ample practical evidence that supports this research.
There are, nevertheless, limitations to the value of the
information on operational human performance obtained from
incident reporting.
1.2.5 First, reports of
incidents are submitted in the jargon of aviation and,
therefore, capture only the external manifestations of
errors (for example, "misunderstood a frequency", "busted
an altitude", and "misinterpreted a clearance").
Furthermore, incidents are reported by the individuals
involved, and because of biases, the reported processes or
mechanisms underlying errors may or may not reflect reality.
This means that incident-reporting systems take human error
at face value, and, therefore, analysts are left with two
tasks. First, they must examine the reported processes or
mechanisms leading up to the errors and establish whether
such processes or mechanisms did indeed underlie the
manifested errors. Then, based on this relatively weak
basis, they must evaluate whether the error management
techniques reportedly used by operational personnel did
indeed prevent the escalation of errors into a system
breakdown.
1.2.6 Second, and most
important, incident reporting is vulnerable to what has been
called "normalization of deviance". Over time, operational
personnel develop informal and spontaneous group practices
and shortcuts to circumvent deficiencies in equipment
design, clumsy procedures or policies that are incompatible
with the realities of daily operations, all of which
complicate operational tasks. These informal practices are
the product of the collective know-how and hands-on
expertise of a group, and they eventually become normal
practices. This does not, however, negate the fact that they
are deviations from procedures that are established and
sanctioned by the organization, hence the term
"normalization of deviance". In most cases, normalized
deviance is effective, at least temporarily. However, it
runs counter to the practices upon which system operation is
predicated. In this sense, like any shortcut to standard
procedures, normalized deviance carries the potential for
unanticipated downsides that might unexpectedly trigger
unsafe situations. However, since they are "normal", it
stands to reason that neither these practices nor their
downsides will be recorded in incident reports.
1.2.7 Normalized deviance is
further compounded by the fact that even the most willing
reporters may not be able to fully appreciate what are
indeed reportable events. If operational personnel are
continuously exposed to substandard managerial practices,
poor working conditions, and/or flawed equipment, how could
they recognize such factors as reportable problems?
1.2.8 Thus, incident reporting
cannot completely reveal the human contribution to successes
or failures in aviation and how remedial strategies can be
improved to enhance human performance. Incident reporting
systems are certainly better than accident investigations in
understanding system performance, but the real challenge
lies in taking the next step: understanding the processes
underlying human error rather than taking errors at face
value. It is essential to move beyond the visible
manifestations of error when designing remedial strategies.
If an airline is to be successful in modifying system
and individual performance, errors must be considered as
symptoms that suggest where to look further. In order to
understand the mechanisms underlying errors in operational
environments, flaws in system performance captured through
incident reporting should be considered as symptoms of
mismatches at deeper layers of the system. These mismatches
might be deficiencies in training systems, flawed
person/technology interfaces, poorly designed procedures, corporate
pressures, poor safety culture, etc. The value of the data
generated by incident reporting systems lies in the early
warning about areas of concern, but such data do not capture
the concerns themselves.
Training
1.2.9 The observation of
training behaviors (during flightcrew simulator training,
for example) is another tool that is highly valued by the
aviation industry to understand operational human
performance. However, the production component of
operational decision making does not exist under training
conditions. While operational behaviors during line
operations are a compromise between production and safety
objectives, training behaviors are absolutely biased towards
safety. In simpler terms, the compromise between production
and safety is not a factor in decision making during
training. Training behaviors are "by the book".
1.2.10 Therefore, behaviors under
monitored conditions, such as during training or line
checks, may provide only an approximation of the way operational
personnel behave when unmonitored. These observations may
help flesh out major operational questions, such as
significant procedural problems. However, it would be
incorrect and perhaps risky to assume that observing
personnel during training would provide the key to
understanding human error and decision making in unmonitored
operational contexts.
Surveys
1.2.11 Surveys completed by operational
personnel can also provide important diagnostic information
about daily operations and, therefore, human error. Surveys
provide an inexpensive mechanism to obtain significant
information regarding many aspects of the organization,
including the perceptions and opinions of operational
personnel: the relevance of training to line operations, the
level of teamwork and cooperation among various employee
groups, problem areas or bottlenecks in daily operations,
and potential areas of dissatisfaction. Surveys can also
probe the safety culture. For example, do personnel know the
proper channels for reporting safety concerns and are they
confident that the organization will act on expressed
concerns? Finally, surveys can identify areas of dissent or
confusion, for example, diversity in beliefs among
particular groups from the same organization regarding the
appropriate use of procedures or tools. On the minus side,
surveys largely reflect perceptions. Surveys can be likened
to incident reporting and are therefore subject to the
shortcomings inherent to reporting systems in terms of
understanding operational human performance and error.
Flight data recording
1.2.12 Digital Flight Data Recorder (DFDR)
and Quick Access Recorder (QAR) information from normal
flights is also a valuable diagnostic tool. There are,
however, some limitations about the data acquired through
these systems. DFDR/QAR readouts provide information on the
frequency of exceedances and the locations where they occur,
but the readouts do not provide information on the human
behaviors that were precursors of the events. While DFDR/QAR
data track potential systemic problems, pilot reports are
still necessary to provide the context within which the
problems can be fully diagnosed.
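
To make the idea concrete, the following is a minimal sketch of an
exceedance scan over QAR-style samples. It is illustrative only: the
record layout, parameter names, and thresholds are invented for the
example and do not represent any particular FDM/FOQA product or
event set.

# Illustrative only: a toy exceedance scan over hypothetical QAR samples.
from dataclasses import dataclass

@dataclass
class QarSample:
    time_s: float      # elapsed time, seconds
    alt_agl_ft: float  # height above ground, feet
    vs_fpm: float      # vertical speed, ft/min (negative = descending)
    lat: float         # latitude, degrees
    lon: float         # longitude, degrees

def find_descent_exceedances(samples, floor_ft=1000.0, limit_fpm=-1000.0):
    """Return (time, location) tuples for samples below floor_ft that
    are descending faster than limit_fpm (thresholds are invented)."""
    return [(s.time_s, (s.lat, s.lon))
            for s in samples
            if s.alt_agl_ft < floor_ft and s.vs_fpm < limit_fpm]

# Two hypothetical samples; the second exceeds the illustrative limit.
flight = [
    QarSample(100.0, 1500.0, -800.0, 40.64, -73.78),
    QarSample(130.0, 900.0, -1400.0, 40.66, -73.80),
]
for t, loc in find_descent_exceedances(flight):
    print(f"exceedance at t={t}s near lat/lon {loc}")

Note what such a scan cannot do: it reports when and where a limit
was exceeded, but nothing in the data explains why the crew arrived
there - which is precisely the gap that pilot reports and LOSA
observations fill.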
1.2.13 Nevertheless, DFDR/QAR
data hold high potential in terms of cost-efficiency. Although
probably underutilized because of cost considerations as
well as cultural and legal reasons, DFDR/QAR data can assist
in identifying operational contexts within which migration
of behaviors towards the limits of the system takes place.
Proactive strategies
Normal line operations monitoring
1.2.14 The approach proposed in this
manual to identify the successful human performance
mechanisms that contribute to aviation safety and,
therefore, to the design of countermeasures against human
error focuses on the monitoring of normal line operations.
1.2.15 Any typical routine flight - a
normal process - involves inevitable, yet mostly
inconsequential errors (selecting wrong frequencies, dialing
wrong altitudes, acknowledging incorrect read-backs,
mishandling switches and levers, etc.). Some errors are due
to flaws in human performance while others are fostered by
systemic shortcomings; most are a combination of both. The
majority of these errors have no negative consequences
because operational personnel employ successful coping
strategies and system defenses act as containment nets. In
order to design remedial strategies, the aviation industry
must learn about these successful strategies and defenses,
rather than continue to focus on failures, as it has
historically done.
1.2.16 A medical analogy may be helpful
in illustrating the rationale behind LOSA. Human error could
be compared to a fever: an indication of an illness but not
its cause. It marks the beginning rather than the end of the
diagnostic process. Periodic monitoring of routine flights
is therefore like a periodic physical examination: proactively checking
health status in an attempt to avoid getting sick. Periodic
monitoring of routine flights indirectly involves
measurement of all aspects of the system, allowing
identification of areas of strength and areas of potential
risk. On the other hand, incident investigation is like
going to the doctor to fix symptoms of problems; possibly
serious, possibly not. For example, a broken bone sends a
person to the doctor; the doctor sets the bone but may not
consider the root cause(s): weak bones, poor diet, high-risk
lifestyle, etc. Therefore, setting the bone is no guarantee
that the person will not turn up again the following month
with another symptom of the same root cause. Lastly,
accident investigation is like a postmortem: the
examination made after death to determine its cause. The
autopsy reveals the nature of a particular pathology but
does not provide an indication of the prevalence of the
precipitating circumstances. Unfortunately, many accident
investigations also look for a primary cause, most often
"pilot error", and fail to examine the organizational and
system factors that set the stage for the breakdown.
Accident investigations are autopsies of the system,
conducted after the point of no return of the system's
health has been passed.
1.2.17 There is emerging consensus within
the aviation industry about the need to adopt a positive
stance and anticipate, rather than regret, the negative
consequences of human error in system safety. This is a
sensible objective. The way to achieve it is by pursuing
innovative approaches rather than updating or optimizing
methods from the past. After more than 50 years of
investigating failures and monitoring accident statistics,
the relentless prevalence of human error in aviation safety
would seem to indicate a somewhat misplaced emphasis in
regard to safety, human performance, and human error - unless
it is believed that the human condition is beyond hope.
1.3 A CONTEMPORARY APPROACH TO
OPERATIONAL HUMAN PERFORMANCE
AND ERROR
1.3.1 The implementation of normal
operations monitoring requires an adjustment of prevailing
views of human error. In the past, safety analyses in
aviation have viewed human error as an undesirable and
wrongful manifestation of human behavior. More recently, a
considerable amount of operationally oriented research,
based on cognitive psychology, has provided a very different
perspective on operational errors. This research has proven,
in practical terms, a fundamental concept of cognitive
psychology: error is a normal component of human behavior.
Regardless of the quantity and quality of regulations the
industry might promulgate, the technology it might design,
or the training people might receive, error will continue to
be a factor in operational environments because it simply is
the downside of human cognition. Error is the inevitable
downside of human intelligence; it is the price human beings
pay for being able to think on their feet. Practically
speaking, making errors is a conservation mechanism
afforded by human cognition to allow humans the flexibility
to operate under demanding conditions for prolonged periods
without draining their mental batteries.
1.3.2 There is nothing inherently wrong
or troublesome with error itself as a manifestation of human
behavior. The trouble with error in aviation is the fact
that negative consequences may be generated in operational
contexts. This is a fundamental point in aviation: if the
negative consequences of an error are caught before they
produce damage, then the error is inconsequential. In
operational contexts, errors that are caught in time do not
produce negative consequences and therefore, for practical
purposes, do not exist. Countermeasures to error, including
training interventions, should not be restricted to avoiding
errors, but should rather aim at making them visible and
trapping them before they produce negative consequences. This is the
essence of error management: human error is unavoidable but
manageable.
1.3.3 Error management
is at the heart of LOSA and reflects the previous argument.
Under LOSA, flaws in human performance and the ubiquity of
error are taken for granted and, rather than attempting to
improve human performance, the objective becomes to improve
the context within which humans perform. LOSA ultimately
aims - through changes in design, certification, training,
procedures, management, and investigation - at defining
operational contexts that include buffer zones or time delays
between the commission of an error and the point at which
its consequences become a threat to safety. The
buffer zone or time delay allows for recovery from the
consequences of errors. The more resistant the buffer
or the longer the time delay, the stronger the intrinsic
resistance and tolerance of the operational context to the
negative consequences of human error. Operational
contexts should be designed in such a way that allows
front-line operators second chances to recover from the
consequences of errors.
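
To illustrate why longer buffers help, consider a deliberately
simple model - an assumption made for this example, not a LOSA
formula - in which opportunities to catch an error (cross-checks,
alerts, monitoring scans) arrive at a constant rate while the
error's consequences are still pending:

# Illustrative model only: if chances to catch an error arrive at a
# constant rate while its consequences are pending, the probability
# of trapping it grows with the length of the buffer.
import math

def p_trapped(buffer_s: float, detect_rate_per_s: float = 0.02) -> float:
    """Probability of detecting the error before its consequences land
    (assumed exponential model with an invented detection rate)."""
    return 1.0 - math.exp(-detect_rate_per_s * buffer_s)

for buffer_s in (5, 30, 120):
    print(f"{buffer_s:>4} s buffer -> {p_trapped(buffer_s):.0%} chance of trapping")

The numbers themselves are meaningless; the behavior is the point:
every additional second between the commission of an error and its
consequences is another chance for a defense to trap it.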
1.3.4 In making an analogy with
flight instruments, human performance can be considered as
falling into three bands: a green band, a
yellow band, and a red band. Within the
green band, the operational context demands are low.
Task and situational factors are compatible with cognitive
resources; operational personnel make the fewest errors and,
as indicated by the high recovery rate, they have
ample cognitive resources in reserve to
recover from the negative consequences of errors. Task
and situational factors put human performance into the
yellow band when the operational context demands
increase and become more complex and, consequently, errors
increase in number and the recovery rate decreases. As
operational context demands continue to increase and
eventually peak, task and situational factors force human
performance into the red band. In this band, the
number of errors sharply jumps and the recovery rate dips to
a point at which cognitive control is lost. At this
point, cognitive resources are no longer available to cope
with the situation at hand; the mental batteries are
totally depleted.
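
As one way to picture how LOSA observations might be mapped onto
these bands, the sketch below classifies hypothetical per-flight
data. The thresholds are invented for illustration; an organization
would calibrate its own from its observation data.

# Illustrative banding of observed performance per the description above.
# Thresholds are invented for the example, not validated LOSA cut-offs.
def performance_band(errors_per_flight: float, recovery_rate: float) -> str:
    """Map error frequency and recovery (trap) rate to a band.
    recovery_rate = fraction of errors detected and managed (0.0-1.0)."""
    if errors_per_flight <= 2 and recovery_rate >= 0.8:
        return "green"   # low demand: few errors, ample reserve to recover
    if errors_per_flight <= 5 and recovery_rate >= 0.5:
        return "yellow"  # rising demand: more errors, falling recovery
    return "red"         # peak demand: errors jump, recovery collapses

# Hypothetical observations: (errors per flight, recovery rate)
for obs in [(1, 0.9), (4, 0.6), (8, 0.2)]:
    print(obs, "->", performance_band(*obs))

In aggregate, the share of operations falling into each band gives an
organization the kind of human performance envelope picture that
paragraph 1.3.5 describes.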
1.3.5 This classification of human
performance into bands helps organizations
apply LOSA data. As an example, the term "coffin
corner" is used to describe the point in the operational
envelope of an aircraft at which the (low) stall speed and
the (high) buffet speed are the same and the aircraft
exhibits bizarre behavior and eventually goes out of
control. Weight-versus-altitude-and-speed capability
charts and other tools provide flightcrews with the
necessary information to avoid operating aircraft in this
condition and, therefore, to stay within a safe operating
envelope. LOSA generates the information necessary for
organizations to define the green band of safe
operations in the human performance envelope, thus avoiding
taking operational human performance into the coffin corner of cognition.
1.4 THE ROLE OF
ORGANIZATIONAL CULTURE
1.4.1 In order to understand how an
organization can effectively implement approaches to error
management, it is essential to examine the organization's
daily processes, the kind of corporate culture such
processes generate, and the organization's attitudes toward
error and punishment. This will make it possible to
assess the effectiveness of the controls that the
organization has in place to ensure that its processes
foster the green band of operational human performance.
It is good to remember the following points: humans do
not live in a vacuum, so their behaviors are affected by many
external factors; corporate culture is an organizational
mandate that conditions decision making by operational
personnel; and humans exhibit the kinds of behaviors that an
organization fosters and that they therefore assume the
organization expects of them.
1.4.2 In closing this
section, it is important to clearly point out the
distinction between errors, which are products of human
limitations, and violations, which have a motivational
component. While errors should be considered the
inevitable downside of human intelligence and flexibility,
and the aviation industry must learn to live with them,
violations should be considered from a different
perspective. Violations are an emerging topic of
research, and in due time, the aviation industry might need
to change prevailing attitudes towards them. However,
for the purposes of this manual, violations should not be
condoned.
1.5 CONCLUSION
1.5.1 There is no
denying that monitoring normal operations on a routine basis
poses major challenges. Significant progress has been
achieved in tackling some of these challenges. From a
methodological point of view, some of the early problems in
defining, classifying and standardizing the data obtained
have been solved with this program revision. From an
organizational perspective, there is a need to consider
using multiple data collection tools, including line
observations, surveys, self-reports such as ASAP Reports,
and more refined safety incident reporting and Flight Data
Analysis systems such as FOQA. Each tool can provide
its own unique part of the whole picture and, taken
together, these tools provide an airline with a comprehensive
look at its actual operations.