Mayday Mark 2! More Software Lessons From Aviation Disasters
by Adele Carpenter
Learn crucial software development lessons from aviation disasters: human factors, automation design, alert fatigue, and preventing systemic failures in complex systems.
- Human factors and cognitive limitations play a crucial role in system failures: limited working memory, irrational decision making under stress, and habituation to warnings are key human vulnerabilities.
- Redundancy is not always better: too much redundancy can increase system complexity and the cognitive load on operators, potentially making systems less safe in practice.
- Automation should be designed to work with human operators, not against them: systems need to account for how humans actually behave under stress rather than how we wish they would behave.
- Consider the full context and environment users operate in: factors like fatigue, sensory overload, poor connections, and high-pressure situations significantly affect user behavior and decision making.
- Clear communication channels and shared understanding between team members are critical: breakdowns in communication often contribute to disasters even when the technical systems are functioning.
- Alerts and warnings need careful design: too many alerts lead to alert fatigue and ignored warnings, while unclear or conflicting alerts increase confusion in critical situations (see the alert-throttling sketch after this list).
- Regular training and practice are essential but can breed complacency: users need to stay ready for unexpected situations while avoiding “going through the motions”.
- Complex systems rarely fail from a single cause: cascading failures across multiple systems, processes, and human factors typically contribute to major incidents (see the circuit-breaker sketch after this list).
- “Users” includes everyone who interacts with the system: developers, operators, and end users all need to be considered in system design.
- Learning from failures requires moving beyond blame to understand systemic causes: focusing on individual blame prevents learning valuable lessons about system design.
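
To make the alert-design point concrete, here is a minimal sketch of alert throttling in Python. The `AlertThrottler` class, its `should_fire` method, and the five-minute cooldown are illustrative assumptions rather than anything from the talk: the idea is simply that suppressing duplicates of the same alert within a window keeps warnings rare enough to stay meaningful.

```python
import time
from dataclasses import dataclass, field


@dataclass
class AlertThrottler:
    """Hypothetical helper: suppress repeats of the same alert within a
    cooldown window, so operators only see a warning when it is news."""
    cooldown_seconds: float = 300.0  # assumed five-minute window
    _last_fired: dict = field(default_factory=dict)

    def should_fire(self, alert_key: str) -> bool:
        now = time.time()
        last = self._last_fired.get(alert_key)
        if last is not None and now - last < self.cooldown_seconds:
            return False  # duplicate within the window: stay quiet
        self._last_fired[alert_key] = now
        return True


throttler = AlertThrottler(cooldown_seconds=300)
for _ in range(3):
    if throttler.should_fire("disk-usage-high:db-01"):
        print("ALERT: disk usage high on db-01")  # fires once, not three times
```

Real alerting systems add severity levels and escalation on top of this, but even this much deduplication attacks the habituation problem: a warning that fires constantly is a warning nobody hears.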
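
On the cascading-failures point, a common software counterpart is the circuit breaker pattern, sketched below. This is a generic illustration, not a technique named in the talk: after repeated failures the breaker "opens" and fails fast for a cool-down period, so one unhealthy dependency does not drag its callers down with it.

```python
import time


class CircuitBreaker:
    """Generic sketch: fail fast after repeated errors so a failure in
    one dependency does not cascade through the systems that call it."""

    def __init__(self, max_failures: int = 3, reset_seconds: float = 30.0):
        self.max_failures = max_failures
        self.reset_seconds = reset_seconds
        self.failures = 0
        self.opened_at = None  # timestamp when the breaker tripped

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.time() - self.opened_at < self.reset_seconds:
                raise RuntimeError("circuit open: skipping call")
            # Cool-down elapsed: half-open, allow one trial call.
            self.opened_at = None
            self.failures = 0
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.time()  # trip the breaker
            raise
        self.failures = 0  # success resets the failure count
        return result


breaker = CircuitBreaker(max_failures=2, reset_seconds=5.0)

def flaky():
    raise TimeoutError("downstream service unavailable")

for attempt in range(4):
    try:
        breaker.call(flaky)
    except Exception as exc:
        print(f"attempt {attempt}: {exc}")
# After two failures the breaker opens and later attempts fail fast
# without touching the unhealthy dependency at all.
```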