When was the last time you asked yourself “What is it doing now?” 🤷🏽♂️ Automation complacency has contributed to many fixed wing and helicopter accidents, and has proven it can affect even the best of us.
Using an Autopilot (AP) safely requires several different disciplines, like solid system knowledge and adherence to standard operating procedures (SOPs).
What are the threats of automation complacency, and what can we do to stay on top of what the autopilot is or isn’t up to?💡
We don’t publish all our Notes from the Cockpit (like this one) publicly, some are shared only by email. Get the next one sent straight to your inbox ⤵️
What is Automation Complacency?
Automation complacency was formally defined by Lyman et al. back in 1976. Stripped of the academic language, it boils down to this: you overly trust, and stop checking, the onboard systems, including the autopilot!
There have been countless accidents, in both the rotary and fixed wing industries, that have automation complacency as a root cause.
If you become overly reliant on the autopilot, or other on board systems, it doesn’t take much for things to go horribly wrong.
“Shit in = shit out” comes to mind as well! 💩
The issues here range from entering the wrong take-off weight into an FMS, resulting in incorrect V1 and VR speeds, to completely misunderstanding which autopilot modes are engaged.
If you’re not aware of what the AP is actually doing and what information it’s using to “aid” your flight, it’s only a matter of time before something goes wrong.
It’s one of those cases where an aid becomes a threat if used incorrectly.
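To make the FMS example above concrete, here’s a minimal sketch of how a mistyped take-off weight flows straight through to a wrong V1. The weight-to-speed table, the numbers, and the function are entirely hypothetical, purely for illustration, and are NOT real performance data for any aircraft type.

```python
# Hypothetical V1 lookup: take-off weight (tonnes) -> V1 (knots).
# Illustrative numbers only, NOT real performance data.
V1_TABLE = {50: 125, 60: 135, 70: 145, 80: 155}

def v1_for_weight(weight_t: float) -> float:
    """Linearly interpolate V1 from the (hypothetical) table."""
    weights = sorted(V1_TABLE)
    if weight_t <= weights[0]:
        return float(V1_TABLE[weights[0]])
    if weight_t >= weights[-1]:
        return float(V1_TABLE[weights[-1]])
    for lo, hi in zip(weights, weights[1:]):
        if lo <= weight_t <= hi:
            frac = (weight_t - lo) / (hi - lo)
            return V1_TABLE[lo] + frac * (V1_TABLE[hi] - V1_TABLE[lo])

actual_weight = 75.0   # tonnes: what was really loaded
entered_weight = 55.0  # tonnes: a plausible finger-trouble entry
print(v1_for_weight(actual_weight))   # → 150.0 kt, the V1 the crew needed
print(v1_for_weight(entered_weight))  # → 130.0 kt, the V1 the FMS computed
```

The system does exactly what it was told, with no warning: only a cross-check of the entered weight against the loadsheet catches the error. Shit in, shit out.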
So let’s dive into the ins and outs of automation complacency. What are the main causes?💡
What Causes Automation Complacency?
Where to start? There are a lot of ways that we can get sucked into being complacent. Let’s go over the most important ones:

1) Over-Reliance on Automation
Especially in aircraft that are known for ‘taking care of the pilots’, it becomes easier and easier to slide from a vigilant attitude towards the automation into a more trusting one.
Yea, it’s important to be able to use the AP without constantly worrying whether it’s going to do what you asked of it. However, a certain amount of caution and cross-checking is something you should maintain to avoid situations where the AP is basically in charge of you.
2) Lack of Monitoring
Particularly in multi-crew operations, things can go south if each pilot doesn’t do the job they’re supposed to.
While the job of the flying pilot might sound like the more important one, that’s not necessarily the case, as many fixed wing and helicopter accidents have shown.
Mistakes will happen, things will go wrong. It’s how long we take to detect these things and how we deal with them that makes the difference. Dealing with them is only an option though, if they get detected!
Pro-active monitoring and challenging of onboard systems is crucial to staying ahead of the aircraft, in both single pilot and multi pilot environments.
3) Boredom
Remember the arousal curve from your ATPL exams? Well, that’s quite relevant here. If things are so quiet that we become bored, effective monitoring drops off, and we become less engaged with what’s going on.

This is where complacency can set in, and where errors related to the AP can go undetected for a long time.
4) Task Overload
The opposite end of the scale is task overload. If things pile up too quickly, to the point where you no longer have an organised perspective on what’s going on, it’s easy to take the ‘background stuff’ for granted and focus on what seems more important.
Issues with the automation can then escalate, without being detected by the flight crew.
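The arousal curve behind points 3 and 4 (often called the Yerkes–Dodson curve) can be sketched as an inverted U: monitoring performance peaks at moderate arousal and falls away at both extremes. The shape, the optimum, and the width below are made-up illustrative values, not a fitted model of human performance.

```python
import math

def monitoring_performance(arousal: float, optimum: float = 0.5, width: float = 0.2) -> float:
    """Illustrative inverted-U curve: performance peaks at moderate arousal.

    arousal is a 0..1 scale (0 = bored stiff, 1 = completely overloaded).
    The optimum and width values are arbitrary, for illustration only.
    """
    return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

print(round(monitoring_performance(0.1), 2))  # → 0.14, bored: monitoring degrades
print(round(monitoring_performance(0.5), 2))  # → 1.0, the sweet spot
print(round(monitoring_performance(0.9), 2))  # → 0.14, overloaded: monitoring degrades
```

Boredom and task overload sit at opposite ends of the same curve, which is why both show up as causes of complacency.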
5) Reduced Manual Flying Skills
Relying too much on automation is also a natural side-effect of overusing it and no longer maintaining your manual flying skills.
This is currently a huge issue in the airlines, where pilots are actively requesting more hand flying, but SOPs often don’t allow for it. Annoyingly, this also works the other way around, but more on that later.
6) Fatigue
Flying outside of your normal circadian rhythm, or simply having a long day after a poor night’s sleep can cause havoc in lots of different ways.
Unfortunately one of those ways is the effect it has on your vigilance and overall alertness.
Detecting errors and rectifying them in time requires alertness. Fatigue kills alertness and therefore the ability to cross-check an AP properly.
7) Poor SOPs and CRM
Especially for the helicopter pilots here, appropriate use of automation and SOP go hand in hand.
While the airlines have a much more established culture on autopilot disciplines, the helicopter industry can lag behind a little in this area.
It’s so important to nail the basics, like engaging a new mode in multi-pilot settings:
- The PF requests a mode
- The PM engages the mode and announces what they are doing
- The PF then verifies (out loud) what mode has just been engaged
It’s a closed loop. There will be pilots who roll their eyes at this principle, but if something this basic is not covered properly, you are setting yourself up for failure.
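The closed loop above can be sketched as a tiny state machine. The class, method names, and mode labels are all illustrative (not any real avionics interface); the point is simply that the loop only closes once the PF’s verification matches both the request and what the PM actually engaged.

```python
class ModeChangeLoop:
    """Illustrative closed-loop for engaging an AP mode in a multi-crew cockpit."""

    def __init__(self):
        self.requested = None   # what the PF asked for
        self.engaged = None     # what the PM actually selected
        self.verified = False   # has the PF confirmed it out loud?

    def pf_requests(self, mode: str):
        self.requested = mode
        self.verified = False

    def pm_engages(self, mode: str):
        # The PM engages and announces; note this may differ from the request!
        self.engaged = mode

    def pf_verifies(self, observed_mode: str) -> bool:
        # The loop only closes if what the PF observes matches both the
        # engaged mode and the original request.
        self.verified = (observed_mode == self.engaged == self.requested)
        return self.verified

loop = ModeChangeLoop()
loop.pf_requests("ALT")        # "Select altitude hold"
loop.pm_engages("VS")          # PM mis-selects vertical speed instead
print(loop.pf_verifies("VS"))  # → False: the verification call-out catches the slip
```

Skip the verification step and `verified` never gets checked against reality, which is exactly how a mis-selected mode stays engaged unnoticed.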
8) Low Conscientiousness in Pilots
Various research studies have found that conscientiousness (one of the Big Five personality traits we’ve covered here) is directly linked to the ability to resist complacency:
“Conscientious operators should be less susceptible to complacency and misuse”
This is something that has to be (and is) selected for in pilot assessments. People who score high on the conscientiousness trait find it easier to stay organised and react effectively to changing circumstances.
9) High Level of Automation and System Complexity
There is a massive difference in automation complexity between different aircraft types. This applies to both fixed wing aircraft and helicopters.
Changing types from a training aircraft like a Cabri G2 (with a simple electronic RPM governor), to an AW169 with a 4-axis autopilot and various layers of automation, is quite a jump. Just like going from a Cessna to an A320.
10) Lack of System Knowledge
Having a high level of automation awareness is impossible without the system knowledge to back it up. Knowing what each autopilot mode actually does, and what their limitations are, is a crucial building block for combatting complacency.
This is not an exhaustive list of course, but these are the most common reasons aircraft have crashed because automation and pilots weren’t working together properly.
So how do any of these pose an actual threat to flight safety? That’s what we’ll look at next ⤵️
What are the Threats of Automation Complacency?
On to the threats then. What are the dangers of trusting the system too much? Let’s have a look:

1) Delayed Detection of Malfunctions
The most obvious threat is that malfunctions with the automation, big or small, can be undetected for long periods.
During my graduation thesis, I programmed an A320 simulator NOT to show the pilots any warnings during an engine failure, beyond the normal indications you’d see (like engine speed reducing).
I tested some very experienced captains with more than 15,000 flight hours. Some of them took over a minute to detect that something was wrong.
This is not to judge these captains, it’s human nature to become adjusted to what you ‘expect’ to happen.
They ‘expected’ a warning of some sort if shit hits the fan, but it never came…
Ask yourself: How dependent are you on the system telling you something is wrong?
2) Reduced Manual Flying Skills
Even between the larger aircraft types, there can be huge differences in just how much the automation is doing for you in the background. In the AW169 for instance, pedal inputs are mostly handled by the AP, even during ‘manual’ flying.
Great right?
Well yes, but it will reduce your manual flying skills over time, and you become reliant on it if you don’t actively manage your thought processes.
There have been instances of pilots moving back to more simple helicopters, and noticing how bad they’ve become at flying in balance!
3) Inaccurate Assumptions
Confirmation bias, a lack of monitoring, and complacency can be a deadly combination.
If plans are made on inaccurate assumptions, you’ll walk into a wall eventually. Complacency makes it easier to make mistakes in what data we choose to trust, and what data we don’t.
You might assume the correct AP mode is engaged, but without actually verifying it, how do you know for sure? Planes have crashed over something as simple as this.
4) Increased Workload During Emergencies
So what if things do actually go horribly wrong? From engine failures to hydraulic failures and anything else we are trained to deal with: automation complacency will make things way harder.
Let’s not forget why the AP is there: it’s to aid us with repetitive tasks, to free up our cognitive capacity so we can focus on more important things.
If you’re hand-flying an aircraft, you will have less spare cognitive capacity than you would with all AP modes engaged.
If you’re complacent about automation, you could find yourself in a situation where your mental model does not match reality. This means that when things go wrong, your diagnosis will not only be a lot more difficult to carry out, there’s a good chance it’ll also be wrong.
5) Poor Monitoring
Why would you actively monitor a system that, in your opinion, ‘is perfect anyway’?
Automation complacency directly links to poor monitoring and verification.
6) Lack of Critical Thinking
Relying on automation too much takes away your authority regarding what the aircraft is doing. Your situational awareness reduces, and therefore your ability to critically think about whether what is happening is what should be happening.
7) Overconfidence
Another killer: overconfidence. This is a snowballing relationship. Overconfidence breeds automation complacency, but automation complacency also increases your confidence in the fact that ‘everything’s fine’.
8) The AP Being in Charge of Pilots
As mentioned before, overconfidence and complacency in regard to automation increases the risk of situations where the AP is in charge of the pilots, instead of the other way around.
If you combine a complex system, a complacent pilot, and an increase in workload, it’s very possible that the AP is doing things that the pilot did not intend.
9) Not Understanding the Actions of the AP
On top of the threats we’ve already discussed, the pilot may not even understand what exactly is going on behind the scenes, which makes dealing with unforeseen circumstances even harder.
Again, if you blindly trust a system to work perfectly, you’re less motivated to understand exactly how that system works. Complacency results in a lack of understanding.
How to Prevent Automation Complacency
As we’ve talked about before in our complacency article, consciously increasing your vigilance counteracts complacency.

We’ll discuss the ways to prevent automation complacency here.

1) Evidence Based Training (EBT)
As we’ve covered in this article, EBT is a great tool to make sure pilots are equipped with the skills to analyse and assess how an AP might be malfunctioning.
Instead of covering every single exercise individually, it becomes more and more important to make sure we’re all demonstrating the competencies that will cover us in any circumstance.
This includes dealing with the tendency to become complacent, or not fully understanding how the automation works.
You won’t find this ‘exercise’ in any training syllabus. It’s the competencies that result in a good outcome that we should focus on instead!
2) Follow SOPs
Pilot Flying ➡️ Fly the Aircraft, whether manual or with the AP
Pilot Monitoring ➡️ Check the Pilot Flying AND the AP!
For the pilots here who fly single-pilot ops, you’ll have to divide your attention between these two tasks. Action – Verification – Action – Verification, especially in aircraft with high levels of automation!
3) Stick to CRM Principles
A flat cockpit hierarchy is the foundation of a safe multi-crew environment. If the First Officer does not feel like they can actively challenge either the captain or the AP engaged by the captain, you’re going to have a bad time…
Teamwork, synergy, and a shared mental model between crew members are the antidote to automation complacency. If one person’s perception of reality does not match the other crew member’s or reality in general, it needs to be challenged!
4) Develop your System Knowledge
It can be hard to feel included in the control loop if the aircraft has high levels of automation, as so many processes are done for you. Staying on top of system knowledge and understanding what EXACTLY is being done for you is crucial here.
However, it’s not all down to the pilot. Aircraft manufacturers should design systems in a way that keeps the pilot in the control loop, rather than excluding them from it.
5) Manage your Workload
Whether you fly single or multi pilot operations, managing workload is a great way to stay on top of things and not become complacent.
As we mentioned earlier, if you’re on either extreme of the arousal curve, the onset of complacency becomes more common. So, making sure you stay ‘adequately aroused’ (yea that sounds ridiculous…) is a great way to fight automation complacency.
Task overload and boredom both push you away from that sweet spot.
6) Stay Ahead of the Aircraft
Like with so many other safety related subjects, staying ahead of the aircraft is a great way to manage threats to flight safety.
It’s not just about having a plan A and a plan B. It’s knowing what exactly the AP will do X amount of time from now. If you’re expecting a right turn but it’s turning left, it’s time to crosscheck what is going on, or intervene.
This won’t be possible if your mental capacity in the moment does not allow for you to stay ahead. If you’re not sure what to expect from the system, then how can you realise something is off?
7) Foster a Safety Culture
Effective CRM and adherence to SOP becomes harder if you’re in a culture where none of those values are emphasised and encouraged.
There is a massive difference in how different operators across the globe manage their internal cultures. While AP, CRM, and SOP related issues also happen within strong safety cultures, they are much less common.
If you feel empowered to speak up, to check, and to stand for what is right, you’ll have an easier time not blindly trusting any system or colleague. Some amount of trust is required, of course; it’s a balance!
8) Find your weak spots
Your susceptibility to be complacent around automated systems depends on a lot of factors. Mainly your experience, your personality profile, the culture you’re in, and the kind of operation you’re flying.
Try to identify your weak spots. You might be very agreeable, for instance, and find it hard to challenge others or to cross-check complex systems. Identify them and look for ways to develop yourself!
9) Effective System Monitoring
A lot of automation complacency related accidents could have been prevented if one of the crew members had a more active role in monitoring the system.
We’re learning more and more how important the role of the pilot monitoring is, but even flying single pilot requires a pro-active attitude towards monitoring your onboard systems.
10) Not Sure? Take Control
One day, the AP might do something that goes completely against what you expected it to do. If this is the case and you have a colleague sitting next to you, the first thing to do is ‘compare mental models’.
Ask whether your colleague’s expectations match yours. If you both agree that this isn’t expected behaviour and you’re unsure why it’s happening, take manual control!
Automation Complacency Related Accidents
Boeing 737 NG rotated with only 260 m of runway left and passed over the end at 10 feet: the crew did not detect an incorrect thrust setting caused by a fault in the auto-thrust software.
Boeing 777-200ER, HL7742, Descent Below Visual Glidepath and Impact With Seawall: insufficient monitoring, non-standard crew communication when selecting the automation, and a lack of manual flying led to mismanagement of the vertical profile. The main landing gear struck a seawall, causing the tail to break off. Sliding along the runway, the plane partially lifted and spun 330°. The airplane was destroyed and three of the passengers were fatally injured.
Eurocopter AS332 L2 Super Puma, G-WNSB Crash into Sea on Approach: a combination of poor monitoring, SOPs not being clearly defined and not optimising the automation available resulted in an unnoticed airspeed reduction during the latter stages of the approach. This helicopter then entered a critically low energy state that could not be recovered from. Four of the passengers did not survive.
Boeing 777-7D7, HS-TKD Descent Below Approach Path: the crew were caught by surprise by the aircraft’s automation. An incorrect mode for the approach was then used, resulting in a rate of descent that was too high. The tower controller noticed the plane was too low and after an initial request to check altitude, the controller instructed a go-around.
Airbus A320, VH-VQT Mishandled Go-Around: whilst in fog at the decision altitude (DA) the crew were unaware of the aircraft’s current flight mode for the required go-around. The aircraft descended to within 38 ft of the ground before climbing.
Conclusion
Automation complacency is still a significant threat to aviation safety despite a lot of progress over the years.
If we overly trust autopilot systems and reduce our overall vigilance, we risk losing critical thinking and manual flying skills. This complacency can lead to delayed detection of malfunctions, inaccurate assumptions, and increased workload during emergencies.
There are plenty of ways we can manage this, but it requires a pro-active attitude towards self development and flight safety.