
HUMAN ERROR JAMES REASON PDF

Tuesday, December 3, 2019


The problem of human error can be viewed in two ways: the person approach and the system approach. Each has its own model of error causation, and each model gives rise to quite different philosophies of error management. Reviewers have described the book as "an in-depth analytical framework of human error" (Journal of Perinatal & Neonatal Nursing) and as "a comprehensive and often innovative treatment of human error". Human Error, by James Reason (University of Manchester), is published by Cambridge University Press.



PDF | After the Fukushima accident, a new concept of nuclear safety arose around the SCM of James Reason (psychologist and theorist of human error) and John Wreathall. Related reading: James Reason, "Human error: models and management", BMJ; and Reason's taxonomy of the causes of workplace error: "To say accidents are due to human failing is like saying falls are due to gravity."

1 Introduction

The Fukushima nuclear accident that occurred in Japan in March 2011, and its aftermath, reinforced the need for theoretical and pragmatic studies of industrial and social resilience. Since there is no end in sight to the accident, it also raised the issue of engineering thinking in the face of extreme situations [1]. In this paper we therefore try to shed light on the collaborative process whereby social scientists and engineers come to work together, rather than merely side by side, in order to identify the determinants of successful collaborations.

The paper focuses on a historical case: the so-called Swiss Cheese Model of accidents. Since the early 1990s, the Swiss Cheese Model (SCM) of the psychologist James Reason has established itself as a reference model in the etiology, investigation and prevention of industrial accidents. Its success in many fields (transport, energy, medicine) has made it the vector of a new paradigm in Safety Science: the organizational accident.

Under a dualistic premise in which research and industry are two interacting but still separable entities, this collaboration would be understood as the appropriation of research work by the industrial world.

These meetings shed new light on a prolific era in the field of Safety Science.


We therefore hope to guard against a retrospective bias that tends to smooth over and simplify the facts. This chapter deals with the effects induced by the collaboration between a psychologist and an engineer in terms of model production.

In the second section, we focus on the effects of this collaboration on their intellectual and scientific output: Reason, a psychologist of human error, and Wreathall, a nuclear engineer. After presenting their backgrounds, we examine their joint work. Early in his career, Reason worked on sensory disorientation and motion sickness.


He later became professor of psychology at the University of Manchester. One day, Reason made a small action slip that would shape his scientific career.

While preparing tea, he began to feed his cat, which was screaming with hunger, and confused the cat's bowl with the teapot. The slip was of great interest to him, and he started keeping a daily diary of his errors. After he became a recognized authority on the issue, he was invited as a keynote speaker at various international conferences on human error.

The first version of the SCM would be drawn from their collaboration. Since then, Reason has kept working on human and organizational factors in many industrial fields. Wreathall's path, for his part, brought the young engineer to human factors and systems thinking. He went on to work on British nuclear submarine design, which gave him access to confidential reports on human reliability analysis (HRA) by Swain.

Wreathall then worked for the CEGB (the English energy company), first as a design reviewer for control systems, then as an engineer on human factors in nuclear safety. After meeting Reason there, the two began a professional collaboration on accident prevention models, including the SCM.

Nor can we take account of only those human failures that were the proximal causes of an accident, or only of the interactions between different professional groups. Major accident inquiries (for example, those into Three Mile Island, the shuttle explosion, the King's Cross underground fire, the Herald of Free Enterprise capsizing, the Piper Alpha explosion and fire, the Clapham rail disaster, the Exxon Valdez oil spill, the Kegworth air crash, etc) make it apparent that the human causes of major accidents are distributed very widely, both within an organisation as a whole and over several years before the actual event. In consequence, we also need to distinguish active failures from latent failures, the latter combining with local triggering events to penetrate the system's defences.

For present purposes, an error is the failure of planned actions to achieve their desired goal. Planned actions can fail in two ways. In the first, the actions do not go as intended: these are failures of execution, commonly termed slips and lapses. Slips relate to observable actions and are associated with attentional failures; lapses are more internal events and relate to failures of memory. In the case of slips, lapses, trips and fumbles, actions deviate from the current intention, so the failure occurs at the level of execution. They are almost invariably associated with some form of attentional capture, and they are also provoked by change, either in the current plan of action or in the immediate surroundings. Figure 2 shows the further subdivisions of slips and lapses; these have been discussed in detail elsewhere.

In the second way of failing, the actions may go entirely as planned, but the plan itself deviates from some adequate path towards its intended goal. These are failures of intention, termed mistakes, and here the failure lies at a higher level: with the mental processes involved in planning, formulating intentions, judging, and problem solving. Plans continue to run in their original form until a problem has been detected; a problem is anything that requires a change or alteration of the current plan. Mistakes can be further subdivided into rule based mistakes and knowledge based mistakes. In the knowledge based case, the problem has to be worked out on the spot without the help of preprogrammed solutions, using what is often an incomplete "mental model" of the problem and its possible causes. Under these circumstances the human mind is subject to several powerful biases, of which the most universal is confirmation bias, described by Sir Francis Bacon more than three centuries ago. It is particularly evident when trying to diagnose what has gone wrong with a malfunctioning system: we "pattern match" a possible cause to the available signs and symptoms and then seek evidence that confirms it. Other biases have been discussed elsewhere.

Errors can also be classified by their consequences or by their causes. Consequential classifications are already widely used in medicine: the error is described in terms of the proximal actions contributing to a mishap (for example, administration of a wrong drug or a wrong dose, or a wrong intubation). Causal classifications, on the other hand, make assumptions about the psychological mechanisms implicated in generating the error; the position taken here is outlined below.

Deliberate violations differ from errors in several important ways. Violations are more generally associated with motivational problems: low morale, poor supervisory example, a perceived lack of concern, the failure to reward compliance and to sanction non-compliance, and so on. The distinction between active and latent failures is equally important. Generally, in medicine, active failures are committed by those people in direct contact with the patient, and latent failures occur within the higher echelons of the institution, in the organisational and management spheres. The contribution of latent failures is made very clear in the inquiry report into the capsize of the Herald of Free Enterprise: "But a full investigation into the circumstances of the disaster leads inexorably to the conclusion that the underlying or cardinal faults lay higher up in the Company ... From top to bottom the body corporate was infected with the disease of sloppiness."

Human error: models and management

Violations, unlike errors, require motivational and organizational remedies. A further difference between active and latent failures concerns the length of time that passes before human failures are shown to have an adverse impact on safety. For active failures the negative outcome is almost immediate, but for latent failures the consequences of human actions or decisions can take a long time to become apparent. The technological advances of the past 20 years, particularly in regard to engineered safety features, provide greater opportunities for the insidious accumulation of latent failures within the system as a whole. Medical systems and items of equipment have become more opaque to the people who work with them and are thus especially prone to the rare, but often catastrophic, organizational accident.

The stages in the development of an organizational accident run from left to right in the figure: management decisions and organizational processes; the local workplace conditions they create (task, situation, local climate, corporate culture); the error producing and violation producing conditions; the errors and violations themselves; and finally the defences and barriers. The accident sequence begins with the negative consequences of organizational processes. The latent failures so created are transported to the local workplace, where they produce the conditions that promote the commission of errors and violations (for example, understaffing, high workload, poor human equipment interfaces). Many of these unsafe acts are likely to be committed, but only very few of them will penetrate the defences to produce damaging outcomes. The possibility that organizational processes can weaken the defences directly, as well as through active failures, is shown in the figure by the arrow connecting organizational processes directly to defences.

The model presents the people at the sharp end as the inheritors rather than as the instigators of an accident sequence. This may seem as if the "blame" for accidents has been shifted from the sharp end to the system managers. But this is not the case, for the following reasons. Blame implies delinquency, and delinquency is normally dealt with by exhortations and sanctions; yet those who take strategic decisions are, like everyone else, appreciably prone to error, and their decisions are nearly always a compromise. It is thus axiomatic that all strategic decisions will carry some negative safety consequences for some part of the system. This is not to say that all such decisions are flawed, though some of them will be. But even those decisions judged at the time as being good ones will carry a potential downside; there are nearly always losers, and in judging uncertain futures some of the shots will inevitably be called wrong. The crux of the matter is that we cannot prevent the creation of latent failures; we can only make their adverse consequences visible before they combine with local triggers to breach the system's defences. These organizational root causes are further complicated by the fact that the health care system as a whole involves many interdependent organizations: manufacturers, government agencies, professional and patient bodies. The model shown in figure 3 relates primarily to a given institution.

Two case studies are presented here, applying the organizational accident framework and emphasising some important points regarding the practice of high tech medicine. Both are radiological, but organisational accidents are not confined to radiology: an entirely comparable anaesthetic case study has been presented elsewhere, although the range of suitable case studies is limited.

Case 1: Therac accident at East Texas Medical Centre

A 33 year old man was due to receive his ninth radiation treatment after surgery for the removal of a tumour on his left shoulder. The radiotherapy technician positioned him on the table and then went to her adjoining control room. The Therac machine had two modes: a high power "x ray" mode and a low power "electron beam" mode. The high power mode was selected by typing an "x" on the keyboard of the VT terminal; this put the machine on maximum power and inserted a thick metal plate between the beam generator and the patient, transforming the 25 million volt electron beam into therapeutic x rays. The low power mode was selected by typing "e" and was designed to deliver a low dose beam directly to the tumour.

The intention on this occasion was to deliver the low power beam, but the technician made a slip and typed in an "x" instead of an "e". She corrected the entry, and the screen now confirmed that the machine was in the low power mode. She returned the cursor to the bottom of the screen in preparation for the "beam ready" display showing that the machine was fully charged. As soon as the "beam ready" signal appeared she depressed the "b" key to activate the beam, and a blast of some 25 000 rads was delivered to the patient's unprotected shoulder. He saw a flash of blue light (Cherenkov radiation), heard his flesh frying, and felt an excruciating pain. He called out to the technician, but both the voice and video intercom were switched off. The machine halted with a "malfunction 54" message, which meant little to the technician. She took it to mean that the beam had not fired, so reset the machine to fire again. Once again she received the "malfunction 54" signal, and once more she reset and fired the machine. As a result, the patient received three such blasts to his neck and upper torso, although the technician's display gave no indication of what had actually been delivered.

This case study provides a clear example of what has been called "clumsy automation". Engineered safety features mean that complex systems are largely protected against single failures, but they also render the workings of the system more mysterious to the humans who operate it, and they hide the build up of latent failures behind high technology interfaces and within the interdepartmental interstices of complex organizations. The case illustrates how a combination of active failures and latent systemic weaknesses can conspire to penetrate the many layers of defences that are designed to protect both patients and staff.
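The clumsy automation point can be made concrete with a toy model. The sketch below is hypothetical, simplified Python and is not the Therac software: the class, the "plate_inserted" flag, and the consistency check in fire() are invented for illustration. It shows how one extra engineered defence, refusing to fire when the power setting and the shielding plate disagree, would turn a half-corrected mode slip into a harmless fault message instead of an overdose.

```python
class ModeError(Exception):
    """Raised when the machine state is internally inconsistent."""


class TreatmentMachine:
    """Toy two-mode machine: high power x rays (with plate) or low power electrons."""

    def __init__(self) -> None:
        self.high_power = False      # True in "x ray" mode
        self.plate_inserted = False  # metal plate between beam generator and patient

    def select_mode(self, key: str) -> None:
        """'x' selects the high power mode, 'e' the low power mode."""
        if key == "x":
            self.high_power, self.plate_inserted = True, True
        elif key == "e":
            self.high_power, self.plate_inserted = False, False
        else:
            raise ValueError(f"unknown mode key: {key!r}")

    def fire(self) -> str:
        # The hypothetical defence: power level and plate must agree before firing.
        if self.high_power and not self.plate_inserted:
            raise ModeError("high power selected but plate not inserted")
        return "x rays through plate" if self.high_power else "low power electron beam"


machine = TreatmentMachine()
machine.select_mode("x")  # the slip: 'x' typed instead of 'e'

# Simulate a latent software weakness: a hasty correction retracts the plate
# but, in this toy model, leaves the power level unchanged.
machine.plate_inserted = False

try:
    print(machine.fire())
except ModeError as exc:
    print(f"beam inhibited: {exc}")  # the defence traps the inconsistent state
```

Defences in depth work in the same spirit: each check is there to catch states that upstream checks and careful operators were expected to prevent but occasionally do not.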

Latent failures

The second case study has all the causal hallmarks of an organizational accident, but it differs from most medical mishaps in having adverse outcomes for nearly everyone involved. Among its latent failures: the technicians routinely ignored alarms and did not survey patients, the afterloader, or the treatment room after high dose rate procedures, and the Nuclear Regulatory Commission did not adequately address the problems and dangers associated with high dose rate procedures.

The person approach, in contrast, focuses on the unsafe acts of individuals at the sharp end and considers them in isolation from their system context. As a result, 2 important features of human error tend to be overlooked. First, it is often the best people who make the worst mistakes: error is not the monopoly of an unfortunate few. Second, far from being random, mishaps tend to fall into recurrent patterns. The same set of circumstances can provoke similar errors, regardless of the people involved. The pursuit of greater safety is seriously impeded by an approach that does not seek out and remove the error-provoking properties within the system at large.

Defenses, barriers, and safeguards occupy a key position in the system approach.

High-technology systems have many defensive layers: some are engineered (alarms, physical barriers, automatic shutdowns), some rely on people at the sharp end, and others depend on procedures and administrative controls. Their function is to protect potential victims and assets from local hazards. They are mostly effective at this, but there are always weaknesses.

In an ideal world, each defensive layer would be intact. In reality, they are more like slices of Swiss cheese, having many holes—although, unlike in the cheese, these holes are continually opening, shutting, and shifting their location.


The presence of holes in any one "slice" does not normally cause a bad outcome. Usually this can happen only when the holes in many layers momentarily line up to permit a trajectory of accident opportunity, bringing hazards into damaging contact with victims (see figure). The holes in the defenses arise for 2 reasons: active failures and latent conditions. Nearly all adverse events involve a combination of these 2 sets of factors.
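A small simulation can make the idea of a "trajectory of accident opportunity" concrete. The sketch below is a toy Monte Carlo model of the Swiss cheese picture, written for illustration only: the layer names, the number of positions, and the hole probability are invented, not taken from Reason's work. Each layer has a few holes that shift on every trial, and a hazard reaches the victim only when one position happens to be open in every layer at once.

```python
import random

random.seed(1)

N_POSITIONS = 50    # points at which a hazard might try to pass through a layer
HOLE_RATE = 0.08    # chance that any given position in a layer is open right now
LAYERS = ["alarms", "procedures", "supervision", "physical barriers"]
TRIALS = 100_000


def layer_holes() -> set:
    """Current holes in one defensive layer; they open, shut, and drift each trial."""
    return {p for p in range(N_POSITIONS) if random.random() < HOLE_RATE}


accidents = 0
for _ in range(TRIALS):
    # One hazard follows one trajectory; it becomes an accident only if that
    # position is momentarily open in every layer simultaneously.
    trajectory = random.randrange(N_POSITIONS)
    holes_now = [layer_holes() for _ in LAYERS]
    if all(trajectory in holes for holes in holes_now):
        accidents += 1

print(f"{len(LAYERS)} layers, each ~{HOLE_RATE:.0%} holes at any moment")
print(f"accidents: {accidents} in {TRIALS:,} hazard encounters")
```

With a single layer that is 8% holes, roughly 1 hazard in 12 would get through; with four independent layers the toy model lets through only a handful per hundred thousand. That multiplicative effect is the system-approach argument for adding and strengthening defenses rather than relying on individual infallibility.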

Active failures are the unsafe acts committed by people who are in direct contact with the patient or system. They take a variety of forms: slips, lapses, fumbles, mistakes, and procedural violations. At Chernobyl, for example, the operators violated plant procedures and switched off successive safety systems, thus creating the immediate trigger for the catastrophic explosion in the core.

Followers of the person approach often look no further for the causes of an adverse event once they have identified these proximal unsafe acts. But, as discussed later, virtually all such acts have a causal history. They arise from decisions made by designers, builders, procedure writers, and top-level management. Such decisions may be mistaken, but they need not be. All such strategic decisions have the potential for introducing pathogens into the system. Latent conditions have 2 kinds of adverse effect: they can translate into error-provoking conditions within the local workplace (for example, time pressure, understaffing, inadequate equipment, and fatigue), and they can create long-lasting holes or weaknesses in the defenses. Latent conditions, as the term suggests, may lie dormant within the system for many years before they combine with active failures and local triggers to create an accident opportunity.

Unlike active failures, whose specific forms are often hard to foresee, latent conditions can be identified and remedied before an adverse event occurs.

Understanding this leads to proactive rather than reactive risk management. To use another analogy: active failures are like mosquitoes. They can be swatted one by one, but they still keep coming. The best remedies are to create more effective defenses and to drain the swamps in which they breed.


The swamps, in this case, are the ever-present latent conditions. In the past decade, researchers into human factors have been increasingly concerned with developing the tools for managing unsafe acts.

Error management has 2 components: limiting the incidence of dangerous errors and, since this will never be wholly effective, creating systems that are better able to tolerate errors and to contain their damaging effects. Whereas followers of the person approach direct most of their management resources at trying to make individuals less fallible or wayward, adherents of the system approach strive for a comprehensive management program aimed at several targets: the person, the team, the task, the workplace, and the institution as a whole. High-reliability organizations, systems operating in hazardous conditions that have fewer adverse events, offer important models for what constitutes a resilient system.

Just as medicine understands more about disease than health, so the safety sciences know more about what causes adverse events than about how they can best be avoided. In the past 15 years or so, a group of social scientists based mainly at Berkeley, California, and at the University of Michigan at Ann Arbor has sought to redress this imbalance by studying safety successes in organizations rather than their infrequent but more conspicuous failures.

Although such high-reliability organizations may seem remote from clinical practice, some of their defining cultural characteristics could be imported into the medical domain. Most managers of traditional systems attribute human unreliability to unwanted variability and strive as far as possible to eliminate it. In high-reliability organizations, it is recognized that human variability in the shape of compensations and adaptations to changing events represents one of the system's most important safeguards.

High-reliability organizations can reconfigure themselves to suit local circumstances. In their routine mode, they are controlled in the conventional hierarchic manner.

But in high-tempo or emergency situations, control shifts to the experts on the spot—as it often does in the medical domain. The organization reverts seamlessly to the routine control mode once the crisis has passed. Paradoxically, this flexibility arises in part from a military tradition—even civilian high-reliability organizations have a large proportion of ex-military staff.

Military organizations tend to define their goals in an unambiguous way and, for these bursts of semiautonomous activity to be successful, it is essential that all the participants clearly understand and share these aspirations.

Although high-reliability organizations expect and encourage variability of human action, they also work hard to maintain a consistent mindset of intelligent wariness. Perhaps the most important distinguishing feature of high-reliability organizations is their collective preoccupation with the possibility of failure.

On the left-hand side of the SCM figure, the white plates represent the organizational and managerial levels and the human failures (unsafe acts): the contribution of the psychologist.

Ideally, each defense mechanism should be intact, but in practice these mechanisms contain individually harmless flaws.

On the other hand, the technical and organizational sides of safety often confuse academic researchers. Effective risk management depends crucially on establishing a reporting culture.
