Broken windows theory: why code quality and simple design are non-negotiable

A friend once told me about an experiment where someone left a brand-new car parked on the street — it stayed untouched for an entire week. They repeated the experiment, but this time they deliberately cracked the windscreen. Within a few days, the car was completely destroyed and eventually burnt out.
Because of a single crack, the car went from perfect condition to total ruin. This is exactly what can happen to your software if you’re not careful.
The Broken Windows Theory is a criminological theory stating that visible signs of crime, anti-social behavior, and disorder create an environment that encourages further crime and disorder, including serious offenses. The idea is that policing minor infractions like vandalism, fare evasion, or public drinking fosters a sense of order that prevents more serious crimes. — Wikipedia
Each person who damaged the car didn’t think they were doing much harm. The window was already cracked — so what’s the harm in adding another crack? But compare that to making the very first crack in a pristine car; that’s a big difference. Once the initial damage exists, more damage feels easier and less significant. Eventually, small acts accumulate until the car is a wreck.
When we apply the Broken Windows Theory to code and software design, the lesson is clear: there’s great value in maintaining quality and policing the small stuff.
If you work on a project with flaky tests, you’re more likely to add more flaky tests. If there’s a hacky design, you’re more likely to add another hack. If there’s a utils package, you’re more likely to dump unrelated utilities in there. If one HTTP handler uses a global variable for state, why wouldn’t yours? Projects tend toward self-similarity and consistency — which means bad patterns spread just as easily as good ones.
Individually, each of these decisions may not feel like a big deal, but they all push the project toward instability. Just as in the criminology theory, addressing the “small crimes” in your codebase is essential. This could mean refactoring, improving developer documentation, maintaining clear and minimal Makefiles, using a logical package layout, or writing high-quality test code. None of these should be overcomplicated — keep them as simple as possible, but no simpler, as the saying often attributed to Einstein goes.

Isn’t perfection the enemy of progress?

No project will ever be perfect, and there will always be things the team isn’t entirely happy with but decides aren’t worth fixing right now. This is fine — as long as it’s an intentional, positive choice. Complaining may be cathartic, but deliberately leaving something for later in order to make progress can be a healthy decision and a learning opportunity.
Limiting scope ensures teams have time to get the important things right, and this needs to be understood across the entire company — not just within the development team. Your software might not do everything, but what it does should be done well.

Conclusion

When you next mark a ticket as “done,” think about the Broken Windows Theory. Have you just made the first crack in an otherwise pristine window? If so, get input from the team and take a little extra time to refactor until it meets your standards.
When reviewing code, police the small stuff. Pay attention to the implicit decisions being made along the way and make sure you’re comfortable with them — because small cracks can quickly spread.