Luminact Insights by Andrew Tomaro – 22 June 2022
Engineers today commonly work on highly complex programs and mission systems with many sub-systems and products, comprising varying constraints, stakeholders and interfaces. Ambiguous requirements, tight schedules, scarce resources, corporate pressures, supply chain delays and the constant push for cost reductions can challenge engineers to make decisions that go against their code of ethics and sound engineering practice. It is vital that engineers make informed decisions to ensure the delivered product is safe and fit for purpose. This Luminact Insights article provides an overview of professional engineering ethics and a selection of real-world examples of the consequences of incorrect ethical decisions. The lessons from these examples allow engineers to integrate integrity into their everyday engineering practice.
Each professional engineering body has a code of ethics that underpins the expectations of its members and how they conduct themselves within the profession. The following are two sample extracts from Engineers Australia and International Council on Systems Engineering (INCOSE).
Engineers Australia’s Code of Ethics and Guidelines on Professional Conduct (Engineers Australia, 2019) defines the values and principles that shape the decisions engineers make in practice. These guidelines provide a framework for engineers to use when exercising their judgment in the practice of engineering, focussing on demonstrating integrity, practising competently, exercising leadership and promoting sustainability. The guideline states that ethical engineering practice requires judgment, interpretation and balanced decision-making in context. Significant justification is required for any departure from the code of ethics.
Similarly, INCOSE’s Code of Ethics (INCOSE – International Council on Systems Engineering, n.d.) Fundamental Principles include that Systems Engineers uphold and advance the integrity, honour and dignity of the engineering profession by being honest and impartial, maintaining the highest level of integrity and striving to increase the competence and prestige of the engineering profession. Fundamental Duties to Society include accepting responsibility for actions and engineering results, being open to ethical scrutiny and assessment, and proactively mitigating unsafe practices.
D’Entremont, in his textbook Engineering Ethics and Design for Product Safety (D’Entremont, 2021), developed the Four Rules of Applied Engineering Ethics as a logical and practical extension of an engineering code of ethics. Rule One – Work hard and earn your wages from your employer. Rule Two – Do the right thing (integrity) and be able to sleep at night; this captures the safety, health and welfare of the public. Rule Three – Make managers and executives earn their salaries; ensure they ‘put skin in the game’ by weighing in on important product-safety decisions, so that blame does not fall on the engineer alone. Finally, Rule Four – Do NOT go to jail! This is the engineer’s obligation to self.
D’Entremont’s third rule raises an essential point: the engineer should not be a solo operator. In a joint white paper between INCOSE and the Project Management Institute, it was observed that “While program management has overall program accountability and systems engineering has accountability for the technical and systems elements of the program, some systems engineers and program managers have developed the mindset that their work activities are separate from each other rather than part of the organic whole” (Langley, Robitaille, & Thomas, 2011). The following real-world examples highlight how this type of corporate silo environment can lead to unethical decisions and their significant consequences.
Volkswagen’s Emissions Scandal
In the early 2000s, Volkswagen was rising to become one of the most profitable automotive companies in the world, but it needed to increase its market share in the U.S., which stood at only 2% at the time. The challenge Volkswagen faced was that U.S. clean air standards targeted the pollutants produced by diesel engines such as those in Volkswagen vehicles. A Volkswagen engineering program team was charged with developing a vehicle that could run on clean diesel and meet U.S. emission standards. The team was unable to produce a satisfactory result, and in the meantime U.S. emission standards became stricter.
Instead of developing a compliant technical solution, software engineers developed a ‘defeat device’ that would sense emission-testing conditions and put the vehicle into a low-emission mode with reduced power and performance, masking the actual on-road emission output so the vehicle would pass the tests. Production rolled out, and sales climbed over the next six years. Then the truth came out: first a whistleblower, followed by the inconsistency between stationary and on-road test results found by the California Air Resources Board and the U.S. Environmental Protection Agency. The unethical decision to engineer a device to circumvent emission regulations was made because failure was not an option within the corporation. The consequence was a significant undermining of integrity (Rebentisch, 2017).
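The defeat-device concept described above can be illustrated with a deliberately simplified sketch. This is purely hypothetical code, not Volkswagen’s actual software; the inputs and thresholds are invented for illustration. The core idea is a heuristic that recognises test-bench conditions and switches emission controls accordingly:

```python
# Hypothetical illustration of a 'defeat device' -- NOT Volkswagen's actual
# software. All signal names and thresholds here are invented.

def looks_like_dyno_test(steering_angle_deg, wheel_speed_kmh, duration_s):
    """Crude heuristic: wheels perfectly straight while 'driving' for a
    sustained period suggests a dynamometer test rather than a real road."""
    return steering_angle_deg == 0.0 and wheel_speed_kmh > 0.0 and duration_s > 60.0

def emissions_control_level(steering_angle_deg, wheel_speed_kmh, duration_s):
    """Select the emission-control mode based on detected conditions."""
    if looks_like_dyno_test(steering_angle_deg, wheel_speed_kmh, duration_s):
        return "full"     # full controls: pass the regulator's measurement
    return "reduced"      # on the road: favour power and economy instead
```

The ethical failure is visible in a single branch: the software behaves one way when it believes it is being observed and another way in normal use, which is exactly the deception the regulators later uncovered by comparing stationary and on-road results.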
Boeing 737 MAX
In 2018 and 2019, two catastrophic crashes of the Boeing 737 MAX passenger aircraft resulted in the deaths of 346 people and the grounding of the entire global fleet. To compete with Airbus, Boeing pursued the integration of larger, fuel-efficient engines into the existing 737 design, considered a cheaper and faster option than developing a new-generation 737. The larger engines were mounted further forward and higher on the wing to gain the required ground clearance. This changed the aircraft’s aerodynamics and introduced a tendency to pitch nose-up towards a stall. To prevent this in-flight stall situation, Boeing implemented the Manoeuvring Characteristics Augmentation System (MCAS), a software solution that reads data from one of the two angle-of-attack sensors and commands the horizontal stabiliser in the tail to push the nose down for stall avoidance. In both accidents, the MCAS, acting on faulty sensor data, forced the aircraft to dive and ultimately crash.
The key ethical issue was using the MCAS software to mask a questionable hardware design. Furthermore, the MCAS software was not disclosed to the pilots until after the first crash, making the situation worse. Initially, Boeing was reluctant to admit to a design flaw in its aircraft, instead blaming pilot error (Herkert, Borenstein, & Miller, 2020). Boeing also emphasised that this was not a new function but merely a tweak of the original 737 system, so that little to no new training would be required. The user manuals did not even include details of the MCAS. Pilots were not adequately trained, not informed of the design flaw, and not told how to respond when the MCAS activated (Prentice, n.d.). The certification process and its analysis engineers were also under considerable pressure from Boeing management to treat any changes as minor (Herkert, Borenstein, & Miller, 2020).
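The single point of failure at the heart of the MCAS design, a flight-critical function driven by one of two available angle-of-attack sensors, can be sketched in a few lines. This is a hypothetical simplification for illustration, not Boeing’s actual MCAS logic; the threshold values and function names are invented. It contrasts the single-sensor approach with a cross-checked design that disengages when the two sensors disagree:

```python
# Hypothetical, simplified sketch of the single-sensor risk -- NOT Boeing's
# actual MCAS logic. Thresholds and names are invented for illustration.

STALL_AOA = 15.0       # assumed angle-of-attack threshold, degrees
DISAGREE_LIMIT = 5.0   # assumed allowable disagreement between sensors, degrees

def single_sensor_command(aoa_primary):
    """Acts on one sensor only: a single faulty reading can command nose-down trim."""
    return "nose_down" if aoa_primary > STALL_AOA else "none"

def cross_checked_command(aoa_left, aoa_right):
    """Requires both sensors to agree before acting; disagreement disengages."""
    if abs(aoa_left - aoa_right) > DISAGREE_LIMIT:
        return "disengage"   # sensor fault suspected: hand control back to pilots
    if min(aoa_left, aoa_right) > STALL_AOA:
        return "nose_down"
    return "none"
```

With a failed left sensor reading 40° while the right reads a normal 3°, the single-sensor design commands a dive, whereas the cross-checked design detects the disagreement and disengages. The engineering lesson is that redundancy only protects the system if the software is actually required to consult it.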
Space Shuttle Challenger
On January 28, 1986, Space Shuttle Challenger broke apart 73 seconds into its flight due to the failure of O-ring seals in a joint of the right Solid Rocket Booster (SRB), killing all seven crew members on board. What shocked the follow-on investigations was the pre-launch meeting between the National Aeronautics and Space Administration (NASA) and SRB supplier Morton-Thiokol at which Challenger was cleared for launch.
Prior to the meeting, engineers from Morton-Thiokol had begun to feel cut off from the decision-making process regarding the launch parameters for their boosters (Bergin, 2007). Morton-Thiokol was concerned that the cold launch temperatures could cause catastrophic failure of the O-rings. To gain NASA’s attention, Morton-Thiokol even sent a memo titled ‘HELP!’ (Committee on Science and Technology House of Representatives, 1986), followed by a ‘no launch’ recommendation. The problem escalated in the urgent teleconference that followed: NASA could not launch against a contractor’s ‘no launch’ recommendation, yet a delay would destroy the schedule. With NASA on mute, Morton-Thiokol’s general manager turned to his three senior managers and asked what they wanted to do. Two agreed to proceed with the launch, and one (Roger Boisjoly) refused. The general manager turned to Boisjoly and said, ‘take off your engineering hat and put on your management hat’ (Rensberger, Klose, Pincus, & Isikoff, 1986). Boisjoly changed his mind, NASA was informed and did not challenge the decision, and the launch was cleared to proceed. After lift-off, the engineers thought they had dodged a bullet, having expected an explosion on the launchpad (Bergin, 2007).
Engineers must frequently make decisions that involve trade-offs between cost and safety, but bad things often happen when they are embedded in an organisational culture that emphasises profits over safety and punishes those who dissent (Prentice, n.d.).
It is not surprising, then, that engineers face ethical issues and dilemmas in their professional practice. Whether deciding on the best solution or simply trying to do their job without causing harm, they must make decisions about what is right and wrong in accordance with their profession’s code of ethics (Khan, 2022). Referring to the codes of ethics of Engineers Australia and INCOSE, integrity is a common value, and it is important that integrity is integrated into an engineer’s daily processes.
I close out this Luminact Insight with Morton-Thiokol engineer Roger Boisjoly’s account of arriving home after his teleconference with NASA, which reinforces D’Entremont’s Rule Two – do the right thing, be able to sleep at night.
‘I went home, opened the door and didn’t say a word to my wife,’ added Boisjoly. ‘She asked me what was wrong, and I told her, “Oh nothing honey, it was a great day, we just had a meeting to go launch tomorrow and kill the astronauts, but outside of that it was a great day.”’ (Bergin, 2007).
The examples raised in this article can be explored in greater detail through the references below.
1. Bergin, C. (2007, January 28). Remembering the mistakes of Challenger. Retrieved June 14, 2022, from NASA Spaceflight.com: https://www.nasaspaceflight.com/2007/01/remembering-the-mistakes-of-challenger/
2. Committee on Science and Technology House of Representatives. (1986). Investigation of the Challenger Accident. Ninety-Ninth Congress Second Session. House Report 99-1016. Washington: U.S. Government Printing Office.
3. D’Entremont, K. L. (2021). Engineering Ethics and Design for Product Safety. McGraw Hill.
4. Engineers Australia. (2019, November). Code of Ethics and Guidelines on Professional Conduct. Barton, ACT, Australia.
5. Herkert, J., Borenstein, J., & Miller, K. (2020). The Boeing 737 MAX: Lessons for Engineering Ethics. Science and Engineering Ethics, 2957-2974.
6. INCOSE – International Council on Systems Engineering. (n.d.). Code of Ethics. Retrieved June 13, 2022, from INCOSE: https://www.incose.org/about-incose/Leadership-Organization/code-of-ethics
7. Khan, A. (2022, March 6). Disastrous Engineering Failures Due to Unethical Practices of Engineers. Retrieved June 13, 2022, from Engineering Passion: https://www.engineeringpassion.com/disastrous-engineering-failures-unethical-engineering-practices/
8. Langley, M., Robitaille, S., & Thomas, J. (2011, September). Toward a New Mindset: Bridging the Gap Between Program Management and Systems Engineering. INSIGHT, 4-8.
9. Prentice, R. (n.d.). Engineering Ethics and the Boeing Scandal. Retrieved June 13, 2022, from Ethics Unwrapped: https://ethicsunwrapped.utexas.edu/engineering-ethics-and-the-boeing-scandal
10. Rebentisch, E. S. (2017). Integrating Program Management and Systems Engineering. New Jersey: John Wiley & Sons.
11. Rensberger, B., Klose, K., Pincus, W., & Isikoff, M. (1986, February 26). Thiokol Engineers Tell of Being Overruled. The Washington Post. Retrieved from https://www.washingtonpost.com/archive/politics/1986/02/26/thiokol-engineers-tell-of-being-overruled/