Boeing 737 Max Major Design Flaws, Not a Software Failure

The 737 Max crashes stem from severe design issues and flagrant cost-cutting efforts, not software issues.

The Seattle Times reports Boeing's Safety Analysis of 737 MAX Flight Control had Crucial Flaws.

Boeing’s safety analysis of the flight control system called MCAS (Maneuvering Characteristics Augmentation System) understated the power of this system, the Seattle Times said, citing current and former engineers at the U.S. Federal Aviation Administration (FAA).

Last Monday Boeing said it would deploy a software upgrade to the 737 MAX 8, a few hours after the FAA said it would mandate “design changes” in the aircraft by April.

A Boeing spokesman said the 737 MAX was certified in accordance with the same FAA requirements and processes that have governed certification of all previous new airplanes and derivatives. The spokesman said the FAA concluded that MCAS on the 737 MAX met all certification and regulatory requirements.

That sketchy report isn't worth commenting on. This Tweet thread I picked up from ZeroHedge is. The thread was by Trevor Sumner, whose brother is a pilot and software engineer.

Flawed Analysis, Failed Oversight

The Seattle Times now has this update: How Boeing, FAA Certified the Suspect 737 MAX Flight Control System

Federal Aviation Administration managers pushed the agency's engineers to delegate wide responsibility for assessing the safety of the 737 MAX to Boeing itself. But safety engineers familiar with the documents shared details that show the analysis included crucial flaws.

As Boeing hustled in 2015 to catch up to Airbus and certify its new 737 MAX, Federal Aviation Administration (FAA) managers pushed the agency’s safety engineers to delegate safety assessments to Boeing itself, and to speedily approve the resulting analysis.

The safety analysis:

  1. Understated the power of the new flight control system, which was designed to swivel the horizontal tail to push the nose of the plane down to avert a stall. When the planes later entered service, MCAS was capable of moving the tail more than four times farther than was stated in the initial safety analysis document.
  2. Failed to account for how the system could reset itself each time a pilot responded, thereby missing the potential impact of the system repeatedly pushing the airplane’s nose downward.
  3. Assessed a failure of the system as one level below “catastrophic.” But even that “hazardous” danger level should have precluded activation of the system based on input from a single sensor — and yet that’s how it was designed.

The people who spoke to The Seattle Times and shared details of the safety analysis all spoke on condition of anonymity to protect their jobs at the FAA and other aviation organizations.

Black box data retrieved after the Lion Air crash indicates that a single faulty sensor — a vane on the outside of the fuselage that measures the plane’s “angle of attack,” the angle between the airflow and the wing — triggered MCAS multiple times during the deadly flight, initiating a tug of war as the system repeatedly pushed the nose of the plane down and the pilots wrestled with the controls to pull it back up, before the final crash.

The FAA, citing lack of funding and resources, has over the years delegated increasing authority to Boeing to take on more of the work of certifying the safety of its own airplanes.

A simplified sketch of that repeated, single-sensor activation loop follows.
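
To make the reported failure mode concrete, here is a minimal sketch, in Python, of a trim loop that trusts a single stuck angle-of-attack vane and re-arms after every pilot input. It is purely illustrative: the trigger threshold, trim increments, and recovery fraction are invented numbers, not Boeing's values, and the logic is a heavy simplification of what the Seattle Times describes.

```python
# Illustrative only: a toy model of the repeated-activation behavior described
# above. All numbers are made-up assumptions, not Boeing values.

AOA_TRIGGER_DEG = 10.0    # assumed angle-of-attack threshold for activation
TRIM_INCREMENT_DEG = 2.5  # assumed nose-down stabilizer increment per activation
MAX_CYCLES = 5            # how many activation cycles to simulate


def faulty_aoa_sensor() -> float:
    """A stuck vane that always reports an excessive angle of attack."""
    return 22.0  # degrees; nonsense value from a failed sensor


def simulate_single_sensor_mcas() -> None:
    stabilizer_trim = 0.0  # net nose-down trim, degrees (0 = in trim)

    for cycle in range(1, MAX_CYCLES + 1):
        aoa = faulty_aoa_sensor()  # only ONE vane is consulted
        if aoa > AOA_TRIGGER_DEG:
            stabilizer_trim += TRIM_INCREMENT_DEG  # system pushes the nose down
            print(f"cycle {cycle}: MCAS trims nose down, net {stabilizer_trim:.1f} deg")

        # The pilot counters with electric trim; the system then resets and,
        # because the faulty sensor still reads high, it fires again next cycle.
        stabilizer_trim -= TRIM_INCREMENT_DEG * 0.6  # pilot only partially recovers
        print(f"cycle {cycle}: after pilot counter-trim, net {stabilizer_trim:.1f} deg")


if __name__ == "__main__":
    simulate_single_sensor_mcas()
```

Each pass through the loop leaves the stabilizer a little more nose-down than before, which is the ratcheting tug of war the excerpt describes.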

Comments From Peter Lemme, former Boeing Flight Controls Engineer

  • Like all 737s, the MAX actually has two of the sensors, one on each side of the fuselage near the cockpit. But the MCAS was designed to take a reading from only one of them.
  • Boeing could have designed the system to compare the readings from the two vanes, which would have indicated if one of them was way off.
  • Alternatively, the system could have been designed to check that the angle-of-attack reading was accurate while the plane was taxiing on the ground before takeoff, when the angle of attack should read zero (a sketch of both checks follows this list).
  • “They could have designed a two-channel system. Or they could have tested the value of angle of attack on the ground,” said Lemme. “I don’t know why they didn’t.”
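
Here is a minimal sketch of the two sanity checks Lemme describes: comparing the two vanes, and verifying a near-zero reading on the ground. The disagreement limit, ground tolerance, and function names are assumptions made for the example, not values from the actual aircraft.

```python
# Illustrative only: the kind of cross-checks Lemme describes. The thresholds
# and function names are assumptions for the sketch, not the 737 implementation.

DISAGREE_LIMIT_DEG = 5.5   # assumed max allowed split between the two vanes
GROUND_ZERO_TOL_DEG = 1.0  # assumed tolerance while taxiing, where AoA should read ~0


def aoa_cross_check(left_vane_deg: float, right_vane_deg: float) -> bool:
    """Return True if the two angle-of-attack vanes agree closely enough
    for an automatic trim system to trust them."""
    return abs(left_vane_deg - right_vane_deg) <= DISAGREE_LIMIT_DEG


def aoa_ground_check(vane_deg: float) -> bool:
    """On the ground, the angle of attack should read approximately zero;
    a large reading while taxiing indicates a failed or miscalibrated vane."""
    return abs(vane_deg) <= GROUND_ZERO_TOL_DEG


if __name__ == "__main__":
    print(aoa_cross_check(4.2, 4.9))   # True: both vanes agree
    print(aoa_cross_check(4.2, 22.0))  # False: one vane is wildly off, stand down
    print(aoa_ground_check(22.0))      # False: stuck vane caught before takeoff
```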

Short Synopsis

  1. Boeing 737 Max aircraft have aerodynamic and engineering design flaws.
  2. The sensors that can detect potential problems were not reliable. There are two sensors, but the Boeing design only used one of them.
  3. Boeing cut corners to save money.
  4. To save even more money, Boeing allowed customers to order the planes without warning lights. The planes that crashed didn't have those warning lights.
  5. There were pilot training and maintenance log issues.
  6. Finally, according to the Seattle Times, the regulators got into bed with companies they were supposed to regulate.

Trump was quick to blame software, calling the planes too complicated to fly.

If the above analysis by Trevor Sumner is correct, the planes were too complicated to fly because Boeing cut corners to save money, then did not even have the decency to deliver them with needed warning lights and operation instructions.

There may be grounds for a criminal investigation here, not just civil.

Regardless, Boeing's decision to appeal to Trump to not ground the planes is morally reprehensible at best. Trump made the right call on this one, grounding the planes, albeit under international pressure.

Expect a Whitewash

Boeing will defend its decision. The FAA will whitewash the whole shebang to avoid implicating itself for certifying design flaws.

By the way, if the timelines presented are correct, the FAA got in bed with Boeing, under Obama.

Mike "Mish" Shedlock

Comments (42)
Bam_Man

Apparently in the Lion Air crash, the First Officer had less than 200 hours of flight experience. To be flying a plane as complicated as the 737 Max with so little flight experience is completely insane.

AWC

As usual, we're being baffled with BS by multiple factions involved, who have some financial/political special interest in the issue.

tz1

MCAS = Self-Piloting Plane. What could go wrong? Especially if they thrift the sensors.

thimk

Wow, the plane's software uses a sensor feedback loop to compensate for a hardware/engineering/structural/design issue. Another US legacy company bites the dust.

bradw2k

The regulatory state is doomed to fail because it has a design flaw: the best judgment of individuals is replaced by calculations regarding government guns. But just watch how there will be calls for more regulations.

Mish (Editor)

"Every experienced pilot once had only 200 hours of experience."

Correct. We do not know if there was a training issue or not. But every pilot starts with zero hours of passenger flight time.

Sechel

Just like the SEC has no capability to produce financial modelling and risk assessment for derivatives, I suspect that the FAA is incapable of determining the safety of a plane. All they can do is speak to Boeing and maybe audit the process. They don't have the engineering team Boeing does.

Interesting article. I'm surprised you didn't link to the original Seattle Times piece, Mish, but instead chose a Reuters summary. If we assume MCAS was not capable of doing its job, the question becomes: did Boeing run adequate simulations and test flights? How does the FAA supervise Boeing?

As far as the plane design, it sounds hokey, but I'm not an aeronautical engineer. I also don't perceive Zero Hedge as a legitimate news source. Lots of garbage on it. I question their editorial supervision and vetting of stories.

HubbaBuba

Moving the (bigger) engines forward created some new dynamics for the plane. The vertical pivot point between the engines (power) and the wings (lift) moved forward; likely the center of gravity as well. Either, much less both, would have created different handling characteristics. Trying to offset those dynamics at the back via the tail would add to a more unstable dynamic. Since Boeing was pinching pennies, I wonder whether, in the fullness of time, it will turn out that MCAS was designed based upon the old airframe's behavior, when that model needed to change to accommodate the new pivot and balance characteristics.

abend237-04

Don't think so, Mish. Too many smart people touched this Engineering Change. If I had to guess, testing with the one-sensor configuration was agreed to as a time saver, and the dual-sensor configuration phase-in got lost somewhere in the shuffle between alpha/beta testing and final, approved EC release.

hmk

Seems hard to believe they were pinching pennies on a piece of equipment that is inconsequential cost-wise. Many pieces of the puzzle are still missing, along with a lot of wild speculation. Patience is required for the truth.

baldski

Why does the 737 Max require an MCAS system? Does a 747 require an MCAS system? Not that I know of. 757? 767? 777? 787? Is this system on any of these? What is so wrong with this airplane that it requires this system?

Webej

Read pretty much the same analysis 5 days ago (link in the comment below).

Boeing (which is in fact subsidized by a hose from military spending) wanted to compete with China and Airbus without designing a new airframe. The more it sinks in that it is the airframe and engines themselves that demand software and other corrective fixes, the fewer people are going to want to buy it.

Webej

moonofalabama.org/2019/03/boeing-the-faa-and-why-two-737-max-planes-crashed.html

killben

"the regulators got into bed with companies they were supposed to regulate"

Part of a regulator's job description.

Sechel

Seems for the last 10 years the FAA has let Boeing certify its own planes. Why bother?

KidHorn

The explanation seems plausible as the plane went on a roller coaster ride prior to crashing.

justanotherone

Haven't actually seen a picture of the plane, just empty ground and pictures of other planes that are not the 737 MAX. A little suspicious....

SubhashKunnath

The Boeing 737 Max needs to go back to the drawing board. It is scary to think that Boeing tried to correct a major design flaw by letting a software-triggered mechanism (MCAS) take control and lower the nose when the angle of attack exceeds the limit. And now it is even more shocking to learn that Boeing looks to upgrade the software instead of overcoming the fundamental design flaws that were the primary reason to introduce MCAS in the first place. Stall-prevention software should ideally come into action only at high altitudes, to overcome a pilot's instinctive reaction to pull up instead of lowering the nose while experiencing an aerodynamic stall. Risky maneuvers like those activated by MCAS should never be used at low altitudes, from where recovery would be impossible even if the pilot deactivates the system.
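
A rough sketch of the kind of altitude gate this comment argues for follows; the altitude floor and parameter names are assumptions for illustration only, not anything from the real MCAS design.

```python
# Illustrative only: a simple activation-envelope gate of the sort the comment
# argues for. The altitude floor and names are invented assumptions.

MIN_ACTIVATION_ALT_FT = 5000.0  # assumed floor below which automatic nose-down trim is inhibited


def nose_down_trim_permitted(radio_altitude_ft: float, stall_warning_active: bool) -> bool:
    """Allow automatic nose-down trim only when a stall condition is actually
    indicated and there is enough altitude to recover from the maneuver."""
    return stall_warning_active and radio_altitude_ft >= MIN_ACTIVATION_ALT_FT


if __name__ == "__main__":
    print(nose_down_trim_permitted(1200.0, True))   # False: too low to risk it
    print(nose_down_trim_permitted(12000.0, True))  # True: altitude margin exists
```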

Brother

Great, we have the average Joe commenting on airframe design and advanced engineering.

Sechel

Normally when a new jet goes into service, a simulator is produced to assist in pilot training. Boeing rushed it and no simulator/trainer was ready, so instead they apparently gave pilots an iPad course. In addition to a poorly designed MCAS system that relied on a single sensor (who builds a critical system with no redundancy?), the pilots received inadequate training on the new system. The same plane had experienced the same problem a day earlier, but fortunately, according to the NY Times, a third pilot was on board who knew how to deactivate it. The next day the pilots weren't so fortunate.

Igorlord

OK, I am a software engineer from MIT, and I write software to control planet-scale systems of great complexity. If the Boeing systems behaved as described in the media, there was a big problem with both the software and the MCAS system design. It looks like Boeing is trying to address some of these problems, but not others.

Rule 1. Any software system will fail.

Rule 2. Do not allow a failing system to blindly keep reasserting its output. I.e., always build a feedback system to check whether the system's expected behavior is what is actually happening, and stand down if not.

Rule 3. People are usually better equipped to deal with ongoing unexpected problems. I.e., the computer is faster to react at first, but people make better decisions in the longer term. Stand down immediately if a person is asserting control. (At some point, there were mechanical interlocks that would snap off if the computer's controls were contrary to the pilots' controls; what happened to them?)

Rule 4. Be very careful/afraid of software control oscillations. I.e., software is very often designed with "if something is wrong, retry" logic on many layers. That includes "if some error happened, restart the software system." This tends to cause the same bad thing to keep happening. Count the number of retries and very aggressively force the software to stand down (or first go into a "safe" mode) when any indication of repeated retries for an unknown reason is seen.
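
A minimal sketch of Rules 2 and 4, assuming a simple automation loop: count how many times the automation re-asserts a command that the pilot keeps countering, and force it to stand down past a small limit. The class name, threshold, and structure are invented for the example, not taken from any real flight-control code.

```python
# Illustrative only: a watchdog implementing the spirit of Rules 2 and 4 above.

class AutomationWatchdog:
    """Tracks how often an automatic command is re-asserted after being
    countered, and forces the automation to stand down past a limit."""

    def __init__(self, max_reassertions: int = 2):
        self.max_reassertions = max_reassertions
        self.reassertions = 0
        self.standing_down = False

    def record_command(self, pilot_countered_last_command: bool) -> bool:
        """Return True if the automation may issue its command, False if it
        must stand down and hand control back to the pilot."""
        if self.standing_down:
            return False
        if pilot_countered_last_command:
            # The human is fighting the automation and it keeps retrying:
            # count it instead of blindly re-asserting the output (Rule 4).
            self.reassertions += 1
            if self.reassertions > self.max_reassertions:
                self.standing_down = True  # stop and let the pilot fly (Rules 2 and 3)
                return False
        else:
            self.reassertions = 0  # expectation matched reality; reset the counter
        return True


if __name__ == "__main__":
    watchdog = AutomationWatchdog(max_reassertions=2)
    # The pilot keeps countering the automatic trim command:
    for attempt in range(5):
        allowed = watchdog.record_command(pilot_countered_last_command=True)
        print(f"attempt {attempt + 1}: command allowed = {allowed}")
```

Run against a pilot who keeps countering, the watchdog permits two re-assertions and then refuses all further commands instead of oscillating indefinitely.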

SkyLord

There was one more problem, too. When the plane is in a badly out-of-trim state, the aerodynamic forces on the stabilizer are large enough that the backup manual trim system requires superhuman strength to use. This is a latent problem that goes back to the 707. There were some procedural workarounds for this, but pilots stopped being trained on them because no one ever used them; everyone knew that when you fly a 737 you have to stay on top of the trim and never get too far out of trim. Then a new system is installed that has a new failure mode that can leave you in a badly out-of-trim state.

I would say this is a systems engineering/regulatory oversight failure to allow a backup system that doesn't work well under the conditions when you need it most.