It is, of course, inappropriate to speculate on the possible cause of the tragic Ethiopian Airlines crash. However, I believe the reaction offers lessons.
The authorities in Ethiopia and China have grounded the same model virtually instantly until root causes are understood and appropriate measures have been taken.
It may be significant that the aircraft is the same new model as the one that crashed in Indonesia in late 2018, or it may not. Risk management is the name of the game.
One of the contributing factors in Indonesia was a new safety feature built into the aircraft software. The concept was to improve the way the aircraft responds to an unwanted state; it is supposed to be an aid.
It is called the Manoeuvring Characteristics Augmentation System (MCAS). It is designed to prevent the aircraft from falling out of the sky in an aerodynamic stall.
However, it appears that the manufacturer, Boeing, did not incorporate knowledge of the system into its training for crews. Many qualified crews around the world have stated that they knew nothing about the system's existence.
To compound the problem, it appears that the Indonesian aircraft may have been despatched with a known technical problem which the ground engineers may not have been able to reproduce on the ground.
In other words, the safety system responded correctly, but the suspicion is that it was being fed erroneous data from a faulty sensor: in this case the AoA (angle of attack) sensor, which measures the angle of the airflow over the wing.
Some of you may be aware of a cost-saving feature: this latest model of the Boeing 737 is deemed to require only 'differences' training if a crew is qualified on the earlier model. They do not complete a full technical knowledge course; they learn, and are assessed on, only the differences.
It seems the new safety feature was not included. Human error by the manufacturer? The result, in non-technical terms, is a loss of situation awareness: not knowing how and why the aircraft is reacting.
Some of you may remember the tragic Kegworth crash of January 1989, which began with the crew identifying the wrong engine in an unclear flight situation and ended with the aircraft crashing beside the M1 motorway in England's East Midlands.
Of the 126 people aboard, 47 died, and 74 sustained serious injuries.
One factor implicated was that the pilots were not aware that in this model the right engine supplied the air conditioning to the flight deck, as opposed to the left engine in previous models.
The smell of burning was a small misleading clue. They had only completed a very brief 'differences' course.
Airbus has also suffered accidents in the past where the aircraft was so advanced and complicated that the crew struggled to understand how it was reacting.
History can repeat, but we learn the lessons the hard way.
I flew six different airliners from four different manufacturers in my career. Comprehending what was going on was sometimes tricky and fundamentally down to how well the crew knew the aircraft. In other words, training.
Other professions, such as healthcare, similarly involve people trying to comprehend what is going on.
Healthcare professionals face a much more difficult job because no two patients are the same; the variability is even greater than with aircraft. Knowledge and understanding are crucial, especially of how any one of us can get it wrong, and right.
Human factors has been mandatory training in aviation since 1995, and even that can't prevent everything. But it sure helps.
I've presented training packages to several NHS Boards and senior management teams. Not surprisingly, cost is a factor. Some get it; some seem to struggle.
After all, finding a direct correlation between safety and training is not easy.
But precisely who first said, "If you think training is expensive, try having an accident"? I'm not sure; it has been attributed to many wise people over the years.