By Ryan Nabozniak, Application Consulting Engineer at Aucotec
At the midpoint of my career, I worked in a small Northern Ontario community. The firm itself comprised six people: the owner, his wife, a civil engineer related to the owner, a structural engineer, a mechanical/process engineer, and me (an instrumentation engineering technologist). The firm was small and intimate, and I became fairly good friends with the mechanical/process engineer, Peter.
After a few years with the firm, we went our separate ways. Peter joined a much larger, global EPC in a city farther south, and I left to go back to school to finish my degree.
One day after we’d started our new lives, Peter called me. He had been fired from his new job after 6 months. I was shocked!
Peter was one of the most conscientious, ethical, and detailed engineers I had met. He was a graduate of the University of Waterloo, one of the top engineering schools in Canada. And over the time I had worked with him, I had learned a lot from him, specifically about his specialization.
Naturally, I asked him what had happened. The story he told exemplified a classic ethical dilemma for many people in engineering and engineering-related professions.
Peter was sent to a local pulp and paper mill by his employer. His role was to support the existing engineering team at the mill site as a consulting process and mechanical engineer.
One of Peter’s projects was to design a crane and hoist system for use on site. He knew the minimum design specifications for this type of system, and for safety reasons, he chose to reinforce portions of the crane and hoist as well as size the motor to be larger than originally specified.
After completing the design, Peter submitted it to the client representative for review. The client immediately requested a meeting with Peter and a site union representative to discuss the design. The client was concerned about costs — he wanted them to be lower. Peter explained that his design made the system safer than originally planned. The union representative agreed.
A few days after the meeting, Peter was informed that he wasn’t allowed on site anymore. He was fired shortly thereafter.
Well, Peter ran into the engineer’s classic dilemma – the min/max problem. He weighed costs versus safety, and safety won out for him. But costs were more important to his client. He made the right choice based on his training and ethics, but it cost him his job.
This is not a new dilemma. Many engineers and engineering firms have had to wrestle with this question: Do you sacrifice your livelihood for your ethics?
The unfortunate fact is that cost pressure from clients and the need for engineers to make a living often win out. As a result, we’ve seen some very large and public disasters. My goal here isn’t to point the finger at any person or any company. My goal is to start a discussion about a systemic problem and look at how some high-profile companies have solved it internally so we can start solving it on a larger scale.
So, how do we go about discussing this critical dilemma within our own organizations?
Let’s first look at two disasters that arose from this same dilemma.
On March 24, 1989, the oil tanker Exxon Valdez struck a reef in Prince William Sound, opening the hull and spilling 11 to 38 million US gallons of oil. The remoteness of the location made cleanup incredibly difficult, resulting in much of the ocean and local area becoming contaminated.
At the time, much of the blame was placed on the shoulders of the Exxon Valdez captain. But time and perspective have largely absolved him of responsibility.
Investigations revealed that the disaster was ultimately caused by a series of decisions in favor of cutting costs at the expense of safety — exactly like Peter’s dilemma above.
- The crew of the Exxon Valdez was overworked (shifts lasting 12-14 hours).
- Staff was reduced on transport ships.
- The Raytheon Collision Avoidance System, the ship’s collision-avoidance radar, was disabled.
- The U.S. Coast Guard had stopped monitoring ships and routes in the area (showing that governments are also susceptible to the engineer’s dilemma).
Let’s fast-forward 21 years to explore another example: the Deepwater Horizon, an offshore rig drilling for BP.
On the night of April 20, 2010, the well blew out, killing 11 people, destroying the oil platform, and again causing a massive oil spill, one whose environmental effects are still being felt.
U.S. government investigations again found a series of cost-cutting decisions coupled with an inadequate safety system. The primary cause of the disaster was that the cement around the well failed, a problem they would have found if they’d simply tested the cement after it was poured. The failure to perform this simple safety test led to BP incurring the largest corporate settlement in U.S. history: $18.7 billion. The cleanup, loss of business, and negative press have cost BP an additional $90+ billion.
That’s more than $100 billion in losses, all of which could have been avoided with one simple test.
These were both terrible accidents. And it’s difficult to point fingers at any one individual, corporation, or institution. Many factors were in play in both situations.
What’s important to understand is that the decisions made in these situations weren’t unique. While they might not lead to such highly publicized disasters, these types of decisions are made every day in many types of organizations as they seek to cut their operating costs.
So, what can we learn from these stories? How do we approach the ethical dilemma of weighing safety against cost?
Here’s what I propose: We change the rules of the game.
To see how to do this, we can look to the very companies involved in the disasters above.
Let me explain.
Following the Exxon Valdez spill, Exxon (now ExxonMobil) started to look at their projects and culture differently. No longer were project go-aheads determined by the current price of their commodity. Instead, they were determined by an artificial, much lower price that was internal to the company. This lower price insulated the company from the vagaries of oil price fluctuations and allowed them to better determine whether a project should proceed.
By lowering their commodity cost internally, ExxonMobil effectively removed cost as a factor in their decisions. As a result, projects typically went ahead during recessions and supply gluts because materials and labor were cheap. This, in turn, allowed them to spend more money and time on the design, construction, operation, and commissioning of the projects.
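The hurdle-price idea above can be sketched in a few lines of code. This is only an illustration of the general decision rule, not ExxonMobil’s actual method, and every number below is invented for the example:

```python
# A minimal sketch of an internal hurdle-price go/no-go check.
# The project is evaluated at a fixed, conservative internal price
# rather than the current (volatile) market price of the commodity.

def project_go_ahead(annual_barrels: float, annual_cost: float,
                     internal_price: float) -> bool:
    """Approve a project only if it is profitable at the internal price."""
    return annual_barrels * internal_price > annual_cost

# Hypothetical figures for illustration only.
market_price = 95.0    # $/barrel today; deliberately ignored by the rule
internal_price = 40.0  # conservative, company-internal planning price

# A marginal project that only works at high market prices is rejected...
print(project_go_ahead(1_000_000, 60_000_000, internal_price))  # False
# ...while a project that is robust even at the low internal price proceeds.
print(project_go_ahead(2_000_000, 60_000_000, internal_price))  # True
```

The point of the rule is that the decision no longer changes when the market price swings: a project either survives the conservative internal price or it doesn’t.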
Further, they changed their safety culture. Safety became a priority within the company, and they spent more time documenting their projects and plants to more accurately determine risk.
What ExxonMobil did was to change its approach to focus on stability, safety, and security, rather than on a desire to capitalize on high commodity prices. They may have sacrificed the opportunity to make more money in the short term, but they’re already profiting over the long term. Their stock price has nearly doubled over the last 10 years, and their employees and management no doubt sleep better at night knowing that safety is their first priority.
BP and the rest of the oil industry reacted in a similar way following the Deepwater Horizon spill. Oil majors started to seriously look at cultural attitudes towards safety. The price and liability of not doing this was simply too high to ignore.
It’s human nature to react and change things after large disasters like these. But it’s also human nature to forget the lessons we’ve learned in the past. We need to create a system where safety always takes first priority. And that starts with changing the corporate cultures we all work in.
I propose we — meaning engineers, managers, and other stakeholders — start having regular discussions about the ethical dilemma of cost versus safety. The choices we make based on these discussions will have consequences. But they are important because ultimately they will make our entire field stronger and the world we create safer.
Having these discussions will also open up the lines of communication, so the next time Peter (or another engineer just like him) faces the dilemma, he will have a framework to talk about it within his company and possibly with his clients. Hopefully, they’ll be able to come to a decision that meets both cost and safety requirements. Everyone will be happy. And Peter will keep his job.