Risk Perception - The "Macho" Syndrome

Risk Perception in High Risk Environments

The “Macho” Syndrome

Lessons Learned from Fatal Events

Based on the article "See & Flee in the Petrochemical Industry" by Jim Whiting, and on 32 years of personal experience in the design, start-up, operation, and troubleshooting of direct reduction plants worldwide.

Investigations of fatal events in industrial plants in Mexico, Malaysia, Indonesia, India, and Europe, as well as in Australia and South America, bring a question to mind.

Were the workers aware that the mine conditions were unsafe and was that awareness translated into actions to reduce the risk? Or was the risk accepted as part of the job?

The question is especially pressing after the recent evidence at Pasta de Conchos (an explosion in a coal mine in Northern Mexico) and the "heroic" declarations of some politicians: "We will not rest until we have found and punished the person responsible!"

In particular, in one of the companies where I was employed, two operators died a few years ago in an explosion: a ball of fire engulfed them during one of the product discharges. Unfortunately, this was not an isolated event.

I remember the work environment and the spirit that moved us. Production came first; personal safety was never mentioned, even though the official line said otherwise. We thought we were expected to be "machos".

Nobody wanted to appear as a “chicken”.

In that period, even the use of dust masks was socially unacceptable. The use of fall arresting harnesses indicated fear or insecurity, and was taken as a bad example for the rest of the workers.

"Nothing is going to happen! We have worked like this for more than 35 years; we are not going to back down now, are we?" This was said to me by one of the company's VPs, when he wanted to install a video camera in the open discharge of one of the new reactors. When I explained to him that the camera had exposed electronic components, that the atmosphere was explosive, and that the dust was electrically conductive, he said to me, "We're not going to chicken out, are we?"

I had to think fast, and I melted the fuses in the camera so that it could not be used. I then designed a rig to hold the camera with minimum risk. In the end there was indeed an explosion but, fortunately, nobody was hurt, as everyone was using the appropriate equipment.

I lost the opportunity to move up the administrative ladder, as there were two Vice Presidents and one Manager looking down into the reactor that day! They knew the risk, as we had just measured the explosivity of the surrounding air, but they were willing to face it.

Explosions at the reactor's charge and discharge were frequent; the balls of fire were, literally, fireworks. The personnel would play, throwing objects from high platforms at the new engineers. We would laugh at their fear when they heard an (for them) unexpected explosion.

Intoxicated employees and flames during maintenance and operation were part of the environmental conditions, especially in some of the loading bins and quench towers, from which, more than once, we retrieved unconscious, burnt and, sometimes, deceased personnel.

Reports and discussions were confronted with opinions from people in higher positions with no on-site experience, or who were too brave to visualize the danger. A report on a risky situation was a sign of not wearing the company's T-shirt, and it was a terminal career move.

But we never ran; we would always return to work once the ball of flame had passed. Actually, we were never really afraid. Nobody reported anything; we all knew the environment we worked in, and we considered it exciting and part of our work. Those who stayed away were criticized and stigmatized.

The question now is, how do we train the operators and maintenance personnel on when to run or when to be “macho”?

When is it O.K. to be afraid?

Unfortunately, in Mexico, the consequences for a company after a fatal accident are few, even though the official rhetoric may deny it. So there are hardly any effective motivational and disciplinary policies, beyond signs and useless statistics.

In some cases, the consequences have been limited to the loss of production during the time it took to clean up.

In Australia, by comparison, a plant can be shut down indefinitely.

In other countries, the insurance company can refuse to pay the damages, and fines for a fatality are charged automatically, even before the investigation, and they can be considerable.

Even so, the culture of "production first" is universal… and it extends to nuclear installations and even to NASA.

And so is the phobia of using the emergency shutdown button.

Many years ago, when we were beginning to use computers to control chemical plant operations, we decided to install controlled shutdown buttons, baptized "panic" buttons.

Few of us could clearly see the conditions under which they should be activated.

We only knew that shutting down a complex installation without assistance from the control system was very difficult, due to the number of variables that the operator had to keep in mind. (This button was installed after a catastrophic incident, similar in sequence, but not in consequence, to the one at Chernobyl.)

Each operator would establish his own criteria for activating the "panic" button.

An unfortunate name, as it implied a lack of ability in the person who activated it. Our people were reluctant to press the "panic" button, as doing so meant that they were afraid of something.

Years later an effort was made to change its name to CSB (Controlled Shutdown Button), a difficult task in the petrochemical industry. At least the name was changed in some drawings!

Later on, due to the persistence of one of the production managers, the button was used every time they had to shut down the plant. So the "panic" of activating the button was finally lost, and it became a very comfortable addition to the operating environment.

A related incident happened not long ago: two operators, followed by a Manager, went to a platform to correct a problem. They all knew they had the option of shutting down the plant; nevertheless, they decided to solve the problem without using the panic button.

The apparent loss of process control, and the record of recent repeated plant shutdowns, led them to take a high-risk decision that cost them their lives.

The situation had never been analyzed or drilled.

They made the wrong decision as a team (or rather, they did not work as a team; they supported the wrong decision due to an incorrect concept of solidarity).

The technical problem was finally traced to poor control loop tuning (the plant had been built using multiple contractors, each supplying its own control system); the conflicts between control systems led them to block all but one computer during start-up, a shortcut that would cost them dearly.

But the issue here is: what led them to decide to put their lives on the line, when there was the option of shutting the plant down and starting up again?

As with all investigations of human performance problems, the principal challenge is to:

•Analyze what the "good reasons" were that made them take that risk.

•See what they took into account that made them believe their chances of winning were better than their chances of losing.

All this behavior depends on the perception of risk. What was, and is, the risk perception of miners in the coal mining industry?

At least three perceptions have been considered for this incident, to try to interpret the motivation of the operators who died and of the rest of the personnel involved.

1.One setting could be that they, intentionally or unintentionally, did or did not do something that created the problem, and they believed they could correct it before anyone else found out, possibly to avoid a professionally embarrassing situation or for fear of disciplinary action. Activating a stop button and shutting down the plant would make everyone aware of the problem. They believed that, if they could solve the problem without stopping the process, nobody would find out what had happened. In that particular case, somebody had pushed the start-up button again, resetting the computer block-out during start-up. (The buttons were too close together, and the operator's attention was on controlling a dome.)

2.The second setting is that the emergency stop button had been given the name "panic button", or "chicken button".

During our investigations we discovered:

•There was a cultural expectation, established among all the workers and management, that the plant should not be shut down for any change, unless you were "chicken".

•Management would reward those who were known to have made a change without shutting down the plant. (They would also give more importance to the people who spoke English, no matter how inefficient they were.)

•This perception was reinforced when management gave contradictory or ambiguous messages about when to use the emergency stop button or the controlled shutdown button to stop the plant.

•Management honestly said that they had never told the operators not to use the stop button, and they probably frequently said things like "your safety is more important than the process", "you have the power to shut down the plant when you consider it necessary", and other similar euphemisms.

•BUT there had been previous incidents when someone had shut the plant down and management had commented almost simultaneously, "Yes, you should have pressed the stop button, BUT we lost $250,000 worth of production." The message relayed to the operators was ambiguous, and they were left with the feeling that they had done something wrong.

•Adding to this, the personal image of the plant manager was that of a GI at work; people were afraid of discussing any issue with him.

You will agree that, frequently, the influence or pressure of our peers or bosses can have a great effect on how we perceive risk and how we perform in our jobs.

3.The third perception is familiar to all who have started up plants. A plant start-up requires many actions; there is much equipment involved and, normally, few people on hand. Therefore, the decision to shut down the plant also takes into account the perceived amount of work and time it will take to restart it.

This could be the main reason why the organization, not to say the operators, avoided activating the controlled shutdown button and did whatever possible to keep the plant operating, in many cases without foreseeing the risk being taken.

The message should be clear, and clear in the minds of Management: all levels of the organization should frequently discuss risk perceptions and risk expectations, and how to handle tolerable and intolerable risk situations in specific, real settings in their immediate work environments.

Fatal accidents, in particular, offer provocative and valuable lessons, to be considered always in the meetings prior to the work and in risk evaluations. Although we cannot give the names of the companies or people involved, we can describe the conditions that gave rise to them, and their consequences, hoping the lessons will be useful to all, and that you will never need to live through these situations.

One of these cases can illustrate the point.

The operators knew, through smell, sounds and vibrations that an explosion was imminent and “saw and fled”; the supervisors stayed behind and died.

a)What determined that the supervisors should stay behind?

b)What was their perception of risk, conscious or subconscious?

c)And, their perception of duty?

d)What finally determined their behavior, and why did they stay?

e)Why do only the good employees get hurt and, sometimes, die?

f)Did the supervisors believe that their bosses wanted them to stay and die?

g)Did the supervisors ignore the risk?

h)Did they accept it consciously?

Some questions come to mind:

1)Was it ever discussed among themselves when they should stay and solve the problems, and when they should run? Was this ever on the agenda?

2)Did their bosses ever discuss and analyze this setting at a previous meeting?

3)Did the operators consider themselves less responsible for the plant's safety, and therefore run?

4)Did they ever have a meeting with Management addressing this situation explicitly?

5)Was the company implicitly encouraging the supervisors to be in risky situations?

Using the knowledge we have gained after the incident, we ask:

Why didn’t we do what we were supposed to have done?

And now that we know what we have to do, why are we not doing it?

Confucius said at one point:

“If you know what you need to do and you do not do it,

then you are worse than before"

“To see what is right and not to do it, is want of courage”

The questions are actually simple.

From the answers, we should obtain actions to carry out, and not only assign guilt.

Actually, finding a guilty person or persons only serves to satisfy the desire for vengeance; trying to destroy the opponent's image little by little will not translate into substantial improvements.

The person selected as guilty will simply be replaced by someone else, possibly with less training and surely with less experience, who will be afraid of making any decision.

Some questions that can be asked are:

•How many more "see and flee" settings can be identified in our workplaces, before we have to make that decision when the time comes?

•Is there a cross-functional communication protocol that will inform us of other dangerous conditions?

All these questions should be asked and discussed regularly, evaluating the answers, and planning, practicing and rehearsing the actions and reactions within the work groups of your installation.

In the theatre, war and emergencies…

“You are what you drill”

•Systematic incident analysis is now used in many mining, airline, oil and nuclear operations to find root causes and the appropriate solutions. It has proven, over and over, how useful it is in the proactive risk evaluation process. "Macho" countries generally do not use systematic approaches. Their analyses, generally reactive, most of the time conclude that the person who died is the one to blame, and life goes on the same way.

A final challenge, to be used in safety and hygiene group discussions at work, is what it means to be a hero (especially in the case of underground mining management and nuclear and petrochemical operations):

•Have you taken risks you should not have taken?

•Have you failed to report deviations that could cause a tragedy, for fear of being seen as a coward?

To center a fatal incident analysis on the search for someone to blame, and to finish the analysis when that person is found, is a recipe for disaster.

Finding the root cause and taking the appropriate corrective actions is the only way out of the spiral. Following up the corrective actions closely, and making sure they have done their job properly, is the first step. If the flow of information between operations and management is kept open and objective, this will ensure further improvement and a safer working environment.

M. C. Marco A. Flores V.



TECMEN Consultant in: Sponge Iron (DRI) Handling, Sponge Iron (DRI) Automated Storage, Firefighting and Root Cause Analysis, Pneumatic Conveying. Phone 5281 8300 4456.

Risk Perception

Posted on 9. Jun. 2008 - 04:32

To continue the previous article...

Risk perception is a complicated issue: the people who do not perceive the risk will not perceive the risk.

I have recently faced two new situations.

A worldwide oil company had to clean settling tanks, in which the sand separates, to an extent, from the oil. These tanks are closed, so to clean them they need to open them. The way they are opened: first, they bypass the explosive-atmosphere alarm and the plant shutdown interlock, because there are burners in the plant and they do not want the plant to shut down just because they are cleaning one or more tanks.

Now, when they opened the tanks, what they considered normal was to have an explosive atmosphere surrounding the tanks. So they went ballistic when an operator, doing his job, reset the alarm and the shutdown interlock when he found out they had been bypassed.

Then the plant shut down, as was to be expected.

They lost many barrels and caused long-lasting trouble to the installations, as an emergency shutdown of this facility affects everything, including the utilities that supply them with electricity.

When we made a preventive analysis of their operation, they wanted to fire the operator because he had reconnected the alarm. They could not understand that the alarm was there for a purpose, and that they were defeating that purpose and putting the whole installation in jeopardy.

A couple of years later, one of their installations had a fire related to this issue. They just could not see the risk! They were used to doing it that way, and even if you pinpointed the problem, they would not take heed.

The second situation was a large installation in South East Asia, where they had finished the engineering for a new plant. Next move: purchasing.

They specified and selected the equipment. Then came the HAZOPs! (Too late, my friends!) The engineering was finished and purchasing was under way.

So, although dozens of problems were detected during the HAZOPs, the cost of changing the engineering and stopping the purchasing was stratospheric. Suppliers do tend to get greedy when this happens. So no changes were made.

A preventive analysis was done afterwards, once the plant was due to start up, and we found that the equipment piping was designed for earthquake loads, but the valves were neither specified nor selected to withstand the stresses related to seismic loading. This means the valves are now the weakest link in the chain, and you do not want your valves to be the weakest link in the chain.

Now, in this plant, nothing has happened yet. They have not had any earthquakes. Let's hope it stays that way for a long time.

The answer, when the report was delivered, was:

"I don't think I'm supposed to know this."


"Why are you telling me this?"

Now, ostrich behavior is not rare. It is present in many countries, where proactive accident analysis is sometimes dismissed as over-meticulousness.

Let me attach a picture and you will see what I mean. This was the connection between two gas pipelines. Two different companies, each owning its own pipeline. At this point they connected to each other.

See the picture and you tell me what you find in it.

This is one pipeline that is being built, while the second pipeline is at full pressure.


Goggle Plate

Posted on 29. Oct. 2008 - 04:35

The goggle plate is out of spec; it should be the same thickness as the flanges.

This is a real photo of a Mexican gas pipeline, high pressure as you can imagine. The company, a large gas supplier in the country, decided to install an automated actuator to open the valve if the connecting pipeline's supply pressure failed.

The actuator was spring operated: a piston kept the valve closed, and the piston was actuated by the downstream pipeline pressure… which was under construction… believe it or not. So, as there was no pressure in the downstream line, they installed a nitrogen tank… to keep the valve closed. By the way, the gas line management is not Mexican…

So the valve was trying to open; several dozen people were working on commissioning the downstream pipeline, and on the other side you have high-pressure natural gas… This is a call for disaster.

Fortunately, we were able to convince them to remove the actuator (not without some hassle) before anybody got killed. The unbelievable part is that they wanted to re-install the actuator: if the pipeline, or any of its branches, ruptured, the valve would open. Talk about Piper Alpha!

People do dumb things and hold on to them… This is real, and it happens everywhere.


Full Picture

Posted on 29. Oct. 2008 - 04:41

Here is a picture to illustrate the case.


gasoducto (JPG)


Piper Alpha

Posted on 12. Jun. 2009 - 11:08

I just received a copy of the video "Spiral to Disaster". Not a very good copy, but it shows that a blind flange was used to close off the place where a compressor's safety vent valve had been.

This was enough evidence to suspect that the people involved in the incident, either had no perception of the risk involved, or were deeply involved in the "macho" syndrome.

You never, ever, substitute a blind flange for a safety valve.

Although the video shows otherwise, there is room to speculate that the blind flange was put there deliberately, with the purpose of allowing the compressor to go on line without the safety valve. This again is a no-no. The flange in question was probably not even a specification flange, but a makeshift blind flange, just to cover the hole. Something similar to the case exposed in the previous post.

In another company, one of the engineering managers decided to boost the pressure of large vessels because he considered that the ASME code had a very large safety factor. It seems to me that he never understood the purpose of the ASME code. Fortunately, we were able to stop him before he went ahead and boosted the pressure.

An in-depth analysis of this engineer's training showed that he was an electronics engineer who had never been exposed to the design of pressure vessels, or to the way allowable stresses are specified. His selection as Engineering Manager was due mainly to the fact that he posed no threat to the Engineering Vice President, who was in fact the final decision maker, also a Control and Electronics Engineer, and who had a long-standing personal friendship with the President of the company. The way things were in this company, friendship and loyalty took precedence over knowledge and safety.

In many oil companies, production premiums tend to override risk perception. Piper Alpha cost 167 lives and several billion dollars. The company learned its lesson well, but it was an expensive lesson after all.


Re: Risk Perception - The "Macho" Syndrome

Posted on 14. Sep. 2009 - 11:58

Many great lessons in this post! Thanks for another great article. (I'm always searching for more learning.)

John Burke

Re: Risk Perception - The "Macho" Syndrome

Posted on 8. Feb. 2010 - 10:54

Update: the Engineering Vice President of the aforementioned company was promoted to the Sales Department, where he was less dangerous, and later, much later, he was forced to retire. The whole set-up was later sold to another holding company, which will effectively shield the stockholders from any pending liabilities. The new holder would be smart to check the overall qualifications of the technical personnel involved in the decision making for the new installations.
