Boeing 737 Max: Major Design Flaws, Not a Software Failure


The 737 Max crashes stem from severe design flaws and flagrant cost-cutting, not a software failure.

The Seattle Times reports that Boeing's safety analysis of the 737 MAX flight control system had crucial flaws.

Boeing’s safety analysis of the flight control system called MCAS (Maneuvering Characteristics Augmentation System) understated the power of this system, the Seattle Times said, citing current and former engineers at the U.S. Federal Aviation Administration (FAA).

Last Monday Boeing said it would deploy a software upgrade to the 737 MAX 8, a few hours after the FAA said it would mandate “design changes” in the aircraft by April.

A Boeing spokesman said the 737 MAX was certified in accordance with the identical FAA requirements and processes that have governed certification of all previous new airplanes and derivatives. The spokesman said the FAA concluded that MCAS on the 737 MAX met all certification and regulatory requirements.

That sketchy report isn't worth commenting on. This Tweet thread, which I picked up from ZeroHedge, is. The thread is by Trevor Sumner, whose brother is a pilot and software engineer.

Flawed Analysis, Failed Oversight

The Seattle Times now has this update: How Boeing, FAA Certified the Suspect 737 MAX Flight Control System

Federal Aviation Administration managers pushed the agency's engineers to delegate wide responsibility for assessing the safety of the 737 MAX to Boeing itself. But safety engineers familiar with the documents shared details showing the analysis included crucial flaws.

As Boeing hustled in 2015 to catch up to Airbus and certify its new 737 MAX, Federal Aviation Administration (FAA) managers pushed the agency’s safety engineers to delegate safety assessments to Boeing itself, and to speedily approve the resulting analysis.

The safety analysis:

  1. Understated the power of the new flight control system, which was designed to swivel the horizontal tail to push the nose of the plane down to avert a stall. When the planes later entered service, MCAS was capable of moving the tail more than four times farther than was stated in the initial safety analysis document.
  2. Failed to account for how the system could reset itself each time a pilot responded, thereby missing the potential impact of the system repeatedly pushing the airplane’s nose downward.
  3. Assessed a failure of the system as one level below “catastrophic.” But even that “hazardous” danger level should have precluded activation of the system based on input from a single sensor — and yet that’s how it was designed.
The people who spoke to The Seattle Times and shared details of the safety analysis all spoke on condition of anonymity to protect their jobs at the FAA and other aviation organizations.

Black box data retrieved after the Lion Air crash indicates that a single faulty sensor — a vane on the outside of the fuselage that measures the plane’s “angle of attack,” the angle between the airflow and the wing — triggered MCAS multiple times during the deadly flight, initiating a tug of war as the system repeatedly pushed the nose of the plane down and the pilots wrestled with the controls to pull it back up, before the final crash.

The FAA, citing lack of funding and resources, has over the years delegated increasing authority to Boeing to take on more of the work of certifying the safety of its own airplanes.
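To make that failure mode concrete, here is a toy Python sketch of how a system that resets after every pilot response can keep ratcheting the nose down off a single stuck sensor. This is purely my own illustration; every name and number in it is a made-up assumption, not Boeing's actual logic.

```python
# Toy model of the failure mode described above: a single stuck
# angle-of-attack (AoA) vane repeatedly re-triggers nose-down trim.
# All names and numbers are illustrative assumptions, not Boeing's.

AOA_STALL_THRESHOLD_DEG = 15.0   # hypothetical activation threshold
TRIM_INCREMENT = 1.0             # nose-down trim units per activation

def faulty_aoa_sensor() -> float:
    """A stuck vane: always reports a dangerously high angle of attack."""
    return 22.5

def mcas_like_loop(cycles: int) -> float:
    """Each cycle: activate on high AoA, pilot counters, system resets."""
    total_nose_down = 0.0
    for _ in range(cycles):
        if faulty_aoa_sensor() > AOA_STALL_THRESHOLD_DEG:
            total_nose_down += TRIM_INCREMENT    # system pushes the nose down
        total_nose_down -= TRIM_INCREMENT * 0.5  # pilot pulls back, only partially
        # The reset: nothing remembers the previous activation, so the next
        # cycle starts fresh and the nose-down trim keeps ratcheting up.
    return total_nose_down

print(mcas_like_loop(cycles=10))  # 5.0: the tug of war keeps losing ground
```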

Comments From Peter Lemme, former Boeing Flight Controls Engineer

  • Like all 737s, the MAX actually has two of the sensors, one on each side of the fuselage near the cockpit. But the MCAS was designed to take a reading from only one of them.
  • Boeing could have designed the system to compare the readings from the two vanes, which would have indicated if one of them was way off.
  • Alternatively, the system could have been designed to check that the angle-of-attack reading was accurate while the plane was taxiing on the ground before takeoff, when the angle of attack should read zero.
  • “They could have designed a two-channel system. Or they could have tested the value of angle of attack on the ground,” said Lemme. “I don’t know why they didn’t.”
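Both of Lemme's safeguards are simple to express in code. Here is a minimal sketch, with thresholds and function names of my own invention, of a two-channel cross-check plus an on-ground sanity check:

```python
# Sketch of the two safeguards Lemme describes: compare the two AoA vanes
# and stand down if they disagree, and sanity-check each reading on the
# ground, where the angle of attack should read roughly zero.
# Thresholds and names are illustrative assumptions, not Boeing's values.

DISAGREEMENT_LIMIT_DEG = 5.0   # hypothetical cross-check tolerance
GROUND_AOA_LIMIT_DEG = 1.0     # while taxiing, AoA should be near zero

def cross_check(left_aoa: float, right_aoa: float) -> bool:
    """Two-channel check: trust the vanes only if they roughly agree."""
    return abs(left_aoa - right_aoa) <= DISAGREEMENT_LIMIT_DEG

def ground_check(aoa: float, on_ground: bool) -> bool:
    """Pre-takeoff check: a vane reading far from zero on the ground is broken."""
    return (not on_ground) or abs(aoa) <= GROUND_AOA_LIMIT_DEG

def mcas_allowed(left_aoa: float, right_aoa: float, on_ground: bool) -> bool:
    """Permit activation only when both safeguards pass for both vanes."""
    return (cross_check(left_aoa, right_aoa)
            and ground_check(left_aoa, on_ground)
            and ground_check(right_aoa, on_ground))

# A stuck vane fails the cross-check, so the system would stand down:
print(mcas_allowed(left_aoa=22.5, right_aoa=4.0, on_ground=False))  # False
```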

Short Synopsis

  1. Boeing 737 Max aircraft have aerodynamic and engineering design flaws.
  2. The sensors that can detect potential problems were not reliable. There are two sensors, but the Boeing design only used one of them.
  3. Boeing cut corners to save money.
  4. To save even more money, Boeing allowed customers to order the planes without warning lights. The planes that crashed didn't have those warning lights.
  5. There were pilot training and maintenance log issues.
  6. Finally, according to the Seattle Times, the regulators got into bed with the companies they were supposed to regulate.

Trump was quick to blame software, calling the planes too complicated to fly.

If the above analysis by Trevor Sumner is correct, the planes were too complicated to fly because Boeing cut corners to save money, then did not even have the decency to deliver them with the needed warning lights and operating instructions.

There may be grounds for a criminal investigation here, not just civil.

Regardless, Boeing's decision to appeal to Trump to not ground the planes is morally reprehensible at best. Trump made the right call on this one, grounding the planes, albeit under international pressure.

Expect a Whitewash

Boeing will defend its decision. The FAA will whitewash the whole shebang to avoid implicating itself for certifying design flaws.

By the way, if the timelines presented are correct, the FAA got in bed with Boeing, under Obama.

Mike "Mish" Shedlock

Comments
Igorlord

Ok, I am a software engineer from MIT. And I write software to control planet-scale systems of great complexity. If the Boeing systems behaved as described in the media, there was a big problem with both the software and the MCAS system design. It looks like Boeing is trying to address some of these, but not others.

Rule 1. Any software system will fail.

Rule 2. Do not allow a failing system to blindly keep reasserting its output. I.e. always build a feedback system to check whether the expectations of the system's behavior are what is actually happening and stand down if not.

Rule 3. People are usually better equipped to deal with ongoing unexpected problems. I.e. a computer is faster to react at first, but people make better decisions over the longer term. Stand down immediately if a person is asserting control. (At some point, there were mechanical interlocks that would snap off if the computer's controls were contrary to the pilots' controls; what happened to them?)

Rule 4. Be very careful/afraid of software control oscillations. I.e. software is very often designed with "if something is wrong, retry" logic on many layers. That includes "if some error happened, restart the software system." This tends to cause the same bad thing to keep happening. Count the number of retries and very aggressively force the software to stand down (or first go into "safe" mode) when any indication of repeated retries for an unknown reason is seen.
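As a minimal sketch of Rules 2 and 4 (names and limits here are illustrative, not from any real flight system): verify that a command had the expected effect, count the interventions, and force a stand-down after too many retries.

```python
# Sketch of Rules 2 and 4: check expectations against reality, count
# repeated interventions, and stand down instead of blindly reasserting.
# All names and limits are illustrative assumptions.

MAX_ACTIVATIONS = 3  # hypothetical cap before the automation gives up

class GuardedController:
    def __init__(self) -> None:
        self.activations = 0
        self.standing_down = False

    def command_nose_down(self, aoa_before: float, aoa_after: float) -> None:
        if self.standing_down:
            return  # safe mode: leave control with the pilots (Rule 3)
        self.activations += 1
        # Rule 2: did the world respond to our output as expected?
        command_had_effect = aoa_after < aoa_before
        # Rule 4: repeated retries without resolution mean our inputs or
        # our model are wrong, so stop reasserting the same output.
        if not command_had_effect or self.activations >= MAX_ACTIVATIONS:
            self.standing_down = True

ctrl = GuardedController()
ctrl.command_nose_down(aoa_before=22.5, aoa_after=22.5)  # stuck sensor: no effect
print(ctrl.standing_down)  # True: the automation stops fighting the pilot
```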

Sechel

Normally when a new jet goes into service, a simulator is produced to assist in pilot training. Boeing rushed it and no simulator/trainer was ready, so instead they apparently gave pilots an iPad course. In addition to a poorly designed MCAS system that relied on a single sensor (who builds a critical system with no redundancy?), the pilots received inadequate training on the new system. The plane that crashed had experienced the same problem a day earlier, but fortunately, according to the NY Times, a third pilot was on board who knew how to deactivate the system. The next day the pilots weren't so fortunate.

Brother

Great, we have the average Joe commenting on airframe design and advanced engineering.

SubhashKunnath

The Boeing 737 Max needs to go back to the drawing board. It is scary to think that Boeing tried to correct a major design flaw by letting a software-triggered mechanism (MCAS) take control and lower the nose when the angle of attack exceeds the limit. And it is even more shocking to learn that Boeing looks to upgrade the software instead of overcoming the fundamental design flaws that were the primary reason to introduce MCAS in the first place. Stall prevention software should ideally come into action only at high altitudes, to overcome a pilot's instinctive reaction to pull up instead of lowering the nose while experiencing an aerodynamic stall. Risky maneuvers like those activated by MCAS should never be used at low altitudes, from which recovery would be impossible even if the pilot deactivates the system.
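A minimal sketch of that altitude gating (the threshold is a made-up number, purely for illustration):

```python
# Sketch of altitude gating for an automated stall-recovery pushover:
# only intervene when there is room to recover. The floor is made up.

MIN_RECOVERY_ALTITUDE_FT = 10_000.0  # hypothetical recovery floor

def pushover_permitted(altitude_ft: float, stall_warning: bool) -> bool:
    """Permit an automated nose-down input only above the recovery floor."""
    return stall_warning and altitude_ft >= MIN_RECOVERY_ALTITUDE_FT

print(pushover_permitted(altitude_ft=1_500.0, stall_warning=True))   # False: too low
print(pushover_permitted(altitude_ft=12_000.0, stall_warning=True))  # True
```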

justanotherone

Haven't actually seen a picture of the plane, just empty ground and pictures of other planes that are not the 737 MAX. A little suspicious...