
341 points cjr | 1 comments | | HN request time: 0.21s | source
decimalenough ◴[] No.44536914[source]
> The aircraft achieved the maximum recorded airspeed of 180 Knots IAS at about 08:08:42 UTC and immediately thereafter, the Engine 1 and Engine 2 fuel cutoff switches transitioned from RUN to CUTOFF position one after another with a time gap of 01 sec. The Engine N1 and N2 began to decrease from their take-off values as the fuel supply to the engines was cut off.

So the fuel supply was cut off intentionally. The switches in question are also built so they cannot be triggered accidentally: they must first be unlocked by pulling them out.

> In the cockpit voice recording, one of the pilots is heard asking the other why did he cutoff. The other pilot responded that he did not do so.

And both pilots deny doing it.

It's difficult to conclude anything other than murder-suicide.

replies(25): >>44536947 #>>44536950 #>>44536951 #>>44536962 #>>44536979 #>>44537027 #>>44537520 #>>44537554 #>>44538264 #>>44538281 #>>44538337 #>>44538692 #>>44538779 #>>44538814 #>>44538840 #>>44539178 #>>44539475 #>>44539507 #>>44539508 #>>44539530 #>>44539532 #>>44539749 #>>44539950 #>>44540178 #>>44541039 #
alephnerd ◴[] No.44536951[source]
> It's difficult to conclude anything other than murder-suicide.

Is it possible it could have been an accident or a mistake by one of the pilots? How intention-proofed are engine cutoffs?

replies(2): >>44537006 #>>44537365 #
xenadu02 ◴[] No.44537365[source]
It could be defective switch springs, a fatigue-induced muscle-memory error, or something else. The pilot who flipped the switches may genuinely not have realized he did it. Under high workload, it's pretty common to flip the wrong switch or move a control the wrong way and believe you did what you intended, not what you actually did.

That said, Boeing could take a page out of the Garmin GI 275's book. When power is removed, it pops up a "60 s to shutdown" dialog that you can cancel. Even if you accidentally press SHUTDOWN, it only switches to a 10 s countdown with a CANCEL button.

They could insert a delay when weight-on-wheels is off: the first engine shuts down when commanded, but the second goes on a 60 s delay with an EICAS warning countdown. Or always insert a delay unless the fire handle is pulled.

Still... that has its own set of risks and failure modes to consider.
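The delayed-second-engine idea above could be sketched roughly like this. Everything here is illustrative (the 60 s figure, the class and method names, the guard conditions); it is not real FADEC or EICAS logic.

```python
# Hypothetical sketch of a delayed second-engine cutoff with a
# cancellable countdown. All names and values are assumptions.

CUTOFF_DELAY_S = 60

class FuelCutoffGuard:
    def __init__(self, weight_on_wheels=False, fire_handle_pulled=False):
        self.weight_on_wheels = weight_on_wheels
        self.fire_handle_pulled = fire_handle_pulled
        self.engines_running = {1: True, 2: True}
        self.pending = {}  # engine -> seconds until cutoff completes

    def request_cutoff(self, engine):
        # On the ground, or with the fire handle pulled: cut immediately.
        if self.weight_on_wheels or self.fire_handle_pulled:
            self.engines_running[engine] = False
            return "CUTOFF"
        # In flight, the first engine cut is honored at once...
        if all(self.engines_running.values()):
            self.engines_running[engine] = False
            return "CUTOFF"
        # ...but cutting the remaining engine starts a cancellable
        # countdown with an EICAS-style warning.
        self.pending[engine] = CUTOFF_DELAY_S
        return f"DELAYED {CUTOFF_DELAY_S}s (EICAS WARNING)"

    def cancel(self, engine):
        # Crew cancels the pending cutoff; the engine keeps running.
        self.pending.pop(engine, None)

    def tick(self, dt_s):
        # Called periodically; completes any expired countdowns.
        for engine in list(self.pending):
            self.pending[engine] -= dt_s
            if self.pending[engine] <= 0:
                del self.pending[engine]
                self.engines_running[engine] = False
```

Note how the fire-handle path bypasses the delay entirely, matching the "unless the fire handle is pulled" caveat.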

replies(4): >>44537836 #>>44538111 #>>44538204 #>>44541826 #
pixl97 ◴[] No.44538204[source]
When your engine catches on fire/blows apart on takeoff you want to cut fuel as fast as possible.
replies(3): >>44538267 #>>44538687 #>>44538730 #
p1mrx ◴[] No.44538687[source]
Proposed algorithm: If the flight computer thinks the engine looks "normal", then blare an alarm for x seconds before cutting the fuel.

I wonder if there have been cases where a pilot had to cut fuel before the computer could detect anything abnormal? I do realize that defining "abnormal" is the hardest part of this algorithm.
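The proposal above could look something like this toy version: cut immediately when the engine looks abnormal, otherwise alarm first. Every threshold below is made up for illustration — defining "abnormal" properly is, as noted, the hard part.

```python
# Toy gate for a fuel-cutoff request. Thresholds and parameter names
# are illustrative assumptions, not real engine limits.

def looks_abnormal(p):
    return (p["fire_loop"]            # fire detection loop tripped
            or p["egt_c"] > 1050      # exhaust gas over-temperature
            or p["n2_pct"] < 20       # core has wound down
            or p["vibration"] > 4.0)  # severe vibration

def handle_cutoff_request(params, alarm_s=5):
    # Abnormal engine: honor the cutoff immediately.
    if looks_abnormal(params):
        return "CUTOFF_NOW"
    # Engine looks normal: blare an alarm before cutting fuel.
    return f"ALARM_{alarm_s}S_THEN_CUTOFF"
```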

replies(3): >>44539457 #>>44539593 #>>44539663 #
lxgr ◴[] No.44539457[source]
If the computer could tell perfectly whether the engine “looks normal” or not, there wouldn’t be any need for a switch. If it can’t, the switch most likely needs to work without delay in at least some situations.

In safety-critical engineering, you generally either automate things fully (i.e. to exceed human capabilities in all situations, not just most), or you keep them manual. Half-measures of automation kill people.

replies(2): >>44539682 #>>44541579 #
7952 ◴[] No.44541579[source]
But humans can't tell perfectly either, and they would be responding to much of the same data the automation would.

I wonder if they could have buttons framed around the situation rather than the technical action: a fire-response button, or a shut-down-on-the-ground button.

But it does seem like half-measure automation could be a contributing factor in a lot of crashes. Handing control back to a pilot in a stressful situation is a risk, as is placing too much faith in individual sensors. In a sense this problem applies both to planes internally and to the whole air-traffic system: it's a mess of expiring data being consumed and produced by a mix of humans and machines. Maybe the missing part is good statistical modelling of that; if systems can make better predictions, they can be more cautious in their responses.
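The situation-button idea could be sketched as a mapping from crew intent to sequenced low-level actions, with guards at the intent level. Every name and step here is an assumption for illustration, not any real avionics interface.

```python
# Hypothetical "situation buttons": the crew names the situation,
# software sequences the technical steps. All names are assumptions.

SITUATION_PROCEDURES = {
    "ENGINE_1_FIRE": [
        ("throttle_idle", 1),
        ("fuel_cutoff", 1),
        ("fire_bottle_discharge", 1),
    ],
    "SHUTDOWN_ON_GROUND": [
        ("fuel_cutoff", 1),
        ("fuel_cutoff", 2),
    ],
}

def actions_for(situation, weight_on_wheels):
    # Intent-level guard: a ground shutdown is refused in the air,
    # so a mis-press can't take out both engines on climb-out.
    if situation == "SHUTDOWN_ON_GROUND" and not weight_on_wheels:
        return []
    return SITUATION_PROCEDURES[situation]
```

The point of the guard is that intent-level commands can be sanity-checked in a way raw switch throws cannot: "shut down on the ground" is simply meaningless in flight, whereas a bare fuel-cutoff switch has no context to check against.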