Programming’s Greatest Mistakes by Mark Rendle
Explore infamous programming disasters, their multimillion-dollar consequences, and enduring lessons from Y2K to modern DevOps. Learn from history's costliest coding mistakes.
- The Y2K bug wasn't originally a mistake but a practical limitation: memory was so expensive in the 1950s and 60s that years were stored as two digits. Fixing it in the 1990s cost an estimated $500 billion.
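A minimal Python sketch (the function and field names are mine, purely illustrative) of why two-digit years break at the century boundary:

```python
def years_of_service(hired_yy: int, current_yy: int) -> int:
    # Two-digit year arithmetic, as used when every byte of storage counted.
    return current_yy - hired_yy

# Fine through 1999...
print(years_of_service(85, 99))  # 14
# ...but in 2000 the "current year" rolls over to 00:
print(years_of_service(85, 0))   # -85: the Y2K bug in miniature
```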
- The Ariane 5 rocket explosion ($580M loss) was caused by reusing Ariane 4 code without adapting it to the new rocket's flight profile: a 64-bit floating-point value was converted to a 16-bit signed integer, causing an overflow
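The narrowing hazard is easy to reproduce. In the real Ada code the out-of-range conversion raised an unhandled exception; this Python sketch (using `ctypes` to get fixed-width behavior) shows the silent-wraparound variant of the same mistake:

```python
import ctypes

def to_int16_unchecked(x: float) -> int:
    # Narrowing a value to 16 bits without a range check:
    # anything outside [-32768, 32767] silently wraps around.
    return ctypes.c_int16(int(x)).value

print(to_int16_unchecked(300.0))    # 300: fine within Ariane 4's envelope
print(to_int16_unchecked(40000.0))  # -25536: a larger Ariane 5 value wraps to garbage
```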
- The Mars Climate Orbiter crashed due to a units mismatch between imperial (pound-force seconds) and metric (newton-seconds) measurements, highlighting the importance of standardization
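A hedged sketch of the units trap (constant and function names are mine, not NASA's): the defense is converting at the boundary instead of passing bare numbers between systems.

```python
LBF_S_TO_N_S = 4.448222  # newton-seconds per pound-force second

def impulse_to_metric(value_lbf_s: float) -> float:
    # Convert a ground-software reading (imperial) into the units
    # the trajectory code expects (metric) at the system boundary.
    return value_lbf_s * LBF_S_TO_N_S

reading_lbf_s = 10.0  # hypothetical thruster impulse reading
as_metric = impulse_to_metric(reading_lbf_s)
# Using the raw 10.0 as if it were N·s understates the impulse ~4.45x,
# roughly the class of error that doomed the orbiter.
print(as_metric)
```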
- The Knight Capital trading disaster ($440M loss) happened when a code deployment left stale code running on one of its servers, and there were no controls or limits on the algorithmic trading that followed
- Null references, invented by Tony Hoare for ALGOL W, are his self-described "billion-dollar mistake" and continue to cause problems in modern programming
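Python's `None` carries the same hazard. A minimal sketch (names are illustrative) of how a nullable return pushes a check onto every caller:

```python
from typing import Optional

def find_role(users: dict[str, str], name: str) -> Optional[str]:
    # Returning None for "not found" means every caller must remember to check.
    return users.get(name)

users = {"ada": "admin"}
role = find_role(users, "grace")
# Calling role.upper() here would raise AttributeError, the Python cousin
# of a NullPointerException; the guard below is the tax Hoare described.
display = role.upper() if role is not None else "UNKNOWN"
print(display)
```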
- JavaScript was created in roughly ten days, leading to quirks and inconsistencies that still affect web development today
- Infrastructure as code and automated deployments have made programming mistakes potentially more catastrophic, capable of taking down entire data centers
- Copy-pasting code without properly understanding and adapting it remains a common source of serious bugs
- The Hartford Civic Center roof collapse ($90M) occurred because the structural analysis software only accounted for vertical snow loads, not lateral forces
- Modern agile/scrum methodologies and "enterprise" development processes often work against core software engineering principles, making development harder rather than easier