Safety Management System: Week 3
Course Content
1. Introduction
2. Development of safety concept – Accident statistics
3. Approaches/models to accident investigations
4. Human Performance and Limitations
5. Positive Safety Culture
6. Introduction to safety management
7. Hazards
8. Safety risks
9. ICAO Safety Management SARPs
10. SMS-Safety Management System
11. SMS - Planning
12. SMS - Operation
13. Phased implementation approach to SMS
14. SSP-State Safety Programme
Approaches/Models to Accident
Investigation
Reason Model* - Cause / Result Relationship
• The Swiss Cheese Model, developed by Professor James Reason, illustrates that accidents involve successive breaches of multiple system defences. These breaches can be triggered by a number of enabling factors such as equipment failures or operational errors.
• Since the Swiss Cheese Model contends that complex systems such as aviation are extremely well defended by layers of defences, single-point failures are rarely consequential in such systems.
*http://patientsafetyed.duhs.duke.edu/module_e/swiss_cheese.html
Reason Model* - Cause / Result Relationship
• Breaches in safety defenses can be a delayed consequence of
decisions made at the highest levels of the system, which may
remain dormant until their effects or damaging potential are activated
by specific operational circumstances.
• Under such specific circumstances, human failures or active
failures at the operational level act to breach the system’s inherent
safety defences.
• The Reason Model proposes that all accidents include a
combination of both active and latent conditions.
Swiss Cheese
The original source for the Swiss Cheese illustration is the "Swiss Cheese" Model by James Reason, 1990. The book reference is: Reason, J. (1990). Human Error. Cambridge: Cambridge University Press.
• Active failures are actions or inactions, including errors and
violations, which have an immediate adverse effect. They are
generally viewed, with the benefit of hindsight, as unsafe acts.
Active failures are generally associated with front-line personnel
(pilots, air traffic controllers, aircraft maintenance engineers, etc.)
and may result in a harmful outcome.
• Latent conditions are those that exist in the aviation system well
before a damaging outcome is experienced. The consequences of
latent conditions may remain dormant for a long time.
• Initially, these latent conditions are not perceived as harmful, but
will become evident once the system’s defenses have been
breached. These conditions are generally created by people far
removed in time and space from the event.
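The layered-defence idea above can be sketched in code. In this illustrative Python sketch, each defence layer has "holes" (latent conditions or active failures), and a hazard becomes an accident only if it passes through a hole in every layer. The layer names and hole probabilities are invented for illustration, not taken from any real data.

```python
import random

# Illustrative sketch of the Swiss Cheese Model: a hazard results in an
# accident only when the "holes" in every defence layer line up.
# Layer names and hole probabilities are hypothetical.
LAYERS = {
    "organizational decisions": 0.10,  # latent conditions
    "supervision": 0.05,
    "preconditions": 0.05,
    "front-line acts": 0.02,           # active failures
}

def hazard_penetrates(layers, rng):
    """A hazard causes an accident only if it passes every layer."""
    return all(rng.random() < p for p in layers.values())

rng = random.Random(42)
trials = 100_000
accidents = sum(hazard_penetrates(LAYERS, rng) for _ in range(trials))
print(f"accident rate: {accidents / trials:.6f}")
```

Multiplying the (hypothetical) per-layer probabilities gives 0.10 × 0.05 × 0.05 × 0.02 = 5 × 10⁻⁶, far below any single layer's failure rate, which illustrates why single-point failures are rarely consequential in well-defended systems.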
A concept of accident causation
The organizational accident
• The notion of the organizational accident underlying Reason’s Model can
be best understood through a building-block approach, consisting of five
blocks. The top block represents the organizational processes.
• These are activities over which any organization has a reasonable degree
of direct control. Typical examples include policy making, planning,
communication, allocation of resources, and supervision.
• Unquestionably, the two organizational processes most fundamental to safety are the allocation of resources and communication. Deficiencies in these organizational processes are the breeding ground for a dual pathway towards failure.
The organizational accident
Latent Conditions
• Latent conditions have all the potential to breach aviation system
defenses. Typically, defenses in aviation can be grouped under three
large headings: technology, training and regulations.
• Defenses are usually the last safety net to contain latent conditions, as
well as the consequences of lapses in human performance.
• Most, if not all, mitigation strategies against the safety risks of the
consequences of hazards are based upon the strengthening of existing
defenses or the development of new ones.
Remember:
• Error
An action or inaction by an operational person that
leads to deviations from organizational or the
operational person’s intentions or expectations. …
• Violation
A deliberate act of willful misconduct or omission
resulting in a deviation from established
regulations, procedures, norms or practices…
Organizational Accidents (five main causes)
1. Organizational processes
2. Latent conditions
3. Defences
4. Workplace conditions
5. Active failures
Other Causes
• Incorrect resource allocation
• Inadequate hazard identification and risk management
• Inadequate technology and training
• Inadequate communication
• Insufficient regulatory oversight
• Normalization of deviations
• Insufficient reporting feedback
Reason Model
In brief:
• Accidents have multiple causes/factors.
• The occurrence of an accident cannot be attributed solely to operational errors/violations; the organization itself is also an important influence.
• Active failures (errors and violations) combine with latent conditions (organization, workplace, people) to produce accidents.
• As a precaution, processes must be well monitored and defence mechanisms established consistently.
Examples of active errors/violations:
• They generally originate with the people at the front line of the activity.
• Not performing a braking-action test despite heavy snowfall (violation).
• An air traffic controller directing a pilot onto the wrong track (error).
• An operator's carelessness while bringing the passenger bridge up to the aircraft (error).
• A ground handling vehicle driver's carelessness (error).
The Practical Drift
• Drift from "baseline performance" always happens.
• Reasons:
• Technology does not always operate as predicted.
• Procedures cannot always be executed as planned under certain conditions.
• By capturing and analysing this drift, we can introduce safety adaptations to the system.
Why was the SCM so successful?
• The previous section shows this idea particularly well: the metaphors of resident pathogens (from the medical domain), defence in depth (from engineering) and Swiss cheese (food) are combined to produce the most popular safety model.
• The SCM, in particular, owes its success to a systemic foundation that broadens the scope of the analysis to the organisation's complexity, environment, management and defences.
Case Study
• There are many such accidents in aviation history.
• You need to analyse the accident assigned to you through the SCM and identify active failures and latent conditions.
• You need to submit a 1- or 2-page report.
• You need to present it in class in a maximum of 5 minutes, using either slides or a picture.
https://www.aviationfile.com/swiss-cheese-model/#:~:text=An%20Imaginary%20Example%20of%20Appliance%20of%20The%20Swiss%20Cheese%20Model&text=Think%20about%20two%20planes%20colliding,be%20so%20clear%20and%20precise).
SHEL Model
• The SHEL Model is a conceptual tool used to analyse the interaction of multiple system components; it provides a basic depiction of the relationship between humans and other workplace components.
• People make mistakes.
• To achieve the expected performance, it is necessary to know the influencing factors (such as relations with other people, the environment, hardware, etc.).
• The SHEL model, developed by Edwards in 1972 and presented in its current form by Hawkins in 1984, is a conceptual approach used to analyse organizational/operational structures and their interaction with humans.
Understanding the relationship between
people and operational contexts
SHELL* MODEL
(First L: the human at the centre; second L: other people…)
SHEL MODEL
"You cannot change people; change the terms and conditions."
• The model takes its name from the first letters of its components.
• Liveware-Hardware (L-H)
• Liveware-Software (L-S)
• Liveware-Liveware (L-L)
• Liveware-Environment (L-E)
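The four interfaces listed above can be represented as a small lookup structure. This Python sketch paraphrases the descriptions from the following slides; it is an illustrative data structure, not an official taxonomy.

```python
# SHEL(L) interfaces: Liveware (the human) paired with each other component.
# Descriptions paraphrased from the lecture slides; structure is illustrative.
SHEL_INTERFACES = {
    "L-H": ("Hardware", "physical attributes of equipment, machines and facilities"),
    "L-S": ("Software", "regulations, manuals, checklists, procedures, computer software"),
    "L-L": ("Liveware", "other people: pilots, technicians, engineers and their roles"),
    "L-E": ("Environment", "internal/external conditions: temperature, light, noise, fatigue"),
}

def describe(interface: str) -> str:
    """Render one interface as 'Liveware-<component>: human vs <detail>'."""
    component, detail = SHEL_INTERFACES[interface]
    return f"Liveware-{component}: human vs {detail}"

for key in SHEL_INTERFACES:
    print(describe(key))
```

Putting the human (the central L) on one side of every pair mirrors the model's point that it is the interfaces with the human, not the components in isolation, that must be analysed.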
a. Liveware-Hardware (L-H)
• Technology vs human performance
• The relationship between the human and the physical attributes of equipment, machines and facilities…
b. Liveware-Software (L-S)
• Supporting Systems vs Human
• Relationship between the human
and the regulations, manuals,
checklists, publications, standard
operating procedures and
computer software, including
formats and symbology
c. Liveware-Liveware (L-L)
• People vs People
• Relationship between the human
and the other people (such as
relations between the pilots,
technicians, engineers… their
roles/functions in the groups…)
d. Liveware-Environment (L-E)
People vs Workplace Environmental
Conditions
Relationship between the human and
both internal and external
environments (such as temperature,
ambient light, noise, vibration, air
quality, etc.), including psychological
and physiological forces (such as
illness, fatigue, social/financial
problems, career concerns, etc.)
• According to the SHEL Model, the humans are in the center of the
operations. Although humans are remarkably adaptable, they do not
interface perfectly with the various components of the workplace.
• Physical factors; necessary factors for the person to perform the required
tasks (power, height, reach distance, sight, hearing, etc...)
• Physiological factors; Factors influencing the physical performance of the
individual (general health and form status, illness, tobacco, drug, alcohol use,
stress, fatigue, pregnancy, etc…)
• Psychological factors; Coping with the conditions that may arise (education,
knowledge, experience and workload, etc…)
• Psycho-social factors; job and non-workplace pressures (workplace disputes, financial problems, family problems, etc…)
For example, gaps in the Liveware-Hardware interface can be closed as follows:
• The designer can ensure the reliability of the hardware performance under
specified operating conditions.
• During the certification process, the regulatory authority can identify the
realistic conditions under which the equipment can be used.
• Management can develop standard operating practices and develop and deliver training for the safe use of equipment.
• Individual hardware operators ensure that the equipment can be safely
used in all desired operating conditions ....
ERRORS
• As indicated before, an error is defined as "an action or inaction by an operational person that leads to deviations from organizational or the operational person's intentions or expectations".
• In the context of SMS, both the State and the product or service provider
must understand and expect that humans will commit errors regardless of
the level of technology used, the training or the existence of regulations,
processes and procedures.
• An important goal, then, is to set and maintain defences to reduce the likelihood of errors and, just as importantly, to reduce the consequences of errors when they do occur.
ERRORS
• To effectively accomplish this task, errors
must be identified, reported and analysed so
that appropriate remedial action can be taken.
Errors can be divided into two categories:
• Slips and lapses are the failures in the
execution of the intended action. Slips
are actions that do not go as planned.
Lapses are memory failures. For
example, operating the flap lever instead
of the (intended) gear lever is a slip.
Forgetting a checklist item is a lapse.
• Mistakes are failures in the plan of action. Even if execution of the plan was correct, it would not have been possible to achieve the intended outcome.
Aviation businesses use technology intensively.
Even the most educated and experienced staff can make mistakes.
Statistically, millions of human operational errors occur before a significant safety failure results.
The flight crew performs the after-engines-start checklist but does not detect the incorrect flap setting, and the aircraft begins taxiing for departure. The crew starts the take-off roll, does not identify the warning, and continues the take-off roll. ACCIDENT
[Diagram: error → deviation → amplification vs. return to normal flight]
Reduction strategies act directly on the source of the error, for example:
• Human-centred design
• Ergonomic factors
• Training
• …
• Capturing strategies assume the error will be made. The intent is to "capture" the error before any adverse consequences are felt. Capturing strategies differ from reduction strategies in that they rely on checklists and other procedural interventions rather than directly eliminating the error.
• Checklists
• Task cards
• Flight strips
• …
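The capturing idea can be illustrated with a trivial checklist verifier in Python: the error (a wrong flap setting) is still made, but a procedural check catches it before take-off. All item names and values here are invented for illustration.

```python
# Hypothetical capturing strategy: a pre-take-off checklist that catches an
# error (wrong flap setting) before it has consequences. Items are invented.
CHECKLIST = {
    "flaps": "15",        # required configuration for departure
    "trim": "neutral",
    "transponder": "on",
}

def run_checklist(actual_state: dict) -> list:
    """Return the items whose actual state deviates from the checklist."""
    return [item for item, required in CHECKLIST.items()
            if actual_state.get(item) != required]

# The crew made an error (flaps left at 0); the checklist captures it.
state = {"flaps": "0", "trim": "neutral", "transponder": "on"}
deviations = run_checklist(state)
if deviations:
    print("HOLD: resolve before departure ->", deviations)
```

Note that the checklist does not prevent the error from being made; it intercepts the error before its consequences propagate, which is exactly what distinguishes capturing from reduction strategies.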
• Tolerance strategies refer to the ability of a system to accept that an error will be made without suffering serious consequences. The incorporation of redundant systems or multiple inspection processes are examples of measures that increase system tolerance to errors.
• System redundancies
• Structural inspections
• …
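System redundancy, the first tolerance example above, can be sketched as a value taken from several independent sensors: one erroneous unit does not produce a serious consequence. The sensor values below are invented for illustration.

```python
# Hypothetical tolerance strategy: triple-redundant sensors with median voting,
# so a single erroneous reading does not propagate into the system.
def voted_reading(readings: list) -> float:
    """Return the median of redundant readings; tolerates one faulty unit."""
    ordered = sorted(readings)
    return ordered[len(ordered) // 2]

# One of three airspeed sensors fails high; the voted value stays sane.
readings = [251.0, 250.0, 999.0]   # knots; third unit is faulty
print(voted_reading(readings))      # the erroneous 999.0 is outvoted
```

The error (the faulty 999.0 reading) is neither prevented nor captured; the system simply tolerates it, which is the defining feature of this class of strategies.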
VIOLATIONS
Violations (depending on workplace conditions) can be categorized as follows:
Routine violations become the normal way of doing business within a work group. Such violations are committed in response to situations in which compliance with established procedures makes task completion difficult. These deviations are called "drift"; although safety is not immediately compromised, over time they can result in severe consequences.
EXAMPLES
• If the minimum separation between two aircraft is 20 NM and the controller applies 18 NM separation;
• Service vehicles entering areas of the aircraft parking apron where they should not;
• In the modern approach, for improved system reliability:
Example*
Investigation - Findings
• Crew did not use the weather radar
• Crew did not consult the emergency check-list
• Demanding situation requiring decisive thinking and clear action
• Conditions exceeded certification condition for the engines
• Crew did not request diversion to a closer aerodrome
Investigation – Findings (cont…)
• Crew did not use correct phraseology to declare emergency
Causes
• Multiple engine failures
• Incomplete performance of emergency drills
• Crew actions in securing and re-starting engines
• Drag from unfeathered propellers
• Weight of ice
• Poor CRM
• Lack of contingency plans
• Loss of situational awareness
Safety recommendations
• Authority should remind pilots to use correct phraseology
• Authority should research the most effective form of presentation of emergency reference material
Same Example / Modern System / The Facts
• An old-generation twin-engine turboprop commuter aircraft engaged in a regular passenger transport operation is conducting a non-precision approach in marginal weather conditions at an uncontrolled, non-radar, remote airfield
• The flight crew conducts a straight-in approach, not following the published approach
procedure
• …
The Facts (cont…)
• Upon reaching MDA, the flight crew does not acquire visual references
• The flight crew descends below MDA without having acquired the visual references required to pursue the landing
• Findings
• The crew made numerous mistakes
• But
• Crew composition legal but unfavourable in view of demanding flight
conditions
• Following company practice, the pilot made a straight-in approach, which was against regulations
• …
… But
• The company had consistently misinterpreted regulations
• The level of safety was not commensurate with the requirements of a scheduled passenger operation
• The aerodrome operator had neither the staff nor the resources to ensure regularity of operations
• …
… But
• Lack of standards for commuter operations
• Lack of supervision of air traffic facilities
• Authorities’ disregard of previous safety violations
• Legislation out of date
• …
But
Causes
Safety recommendations
• "Tip-of-the-arrow" recommendations
• But:
• Review the process of granting AOCs
• Review the training system
• Define an aviation policy which provides support to the task of the aviation administration
• …
But
• Reform aviation legislation
• Reinforce existing legislation as interim measure
• Improve both accident investigation and aircraft and airways inspection processes
• The classical (result-focused) approach saved the day and closed the file.
• The modern (process-oriented) approach points the way towards a safer future…
• The controllers are fully independent, with all the resources they need.
[Diagram: poor design, inadequate training, inadequate defences and conflicting goals link errors to accidents]