Premature intervention
In systems and software, premature intervention can be harmful. The term “premature optimization” names a concern broader than is often understood: making critical changes, or expending effort, before the system’s parameters (requirements, constraints, trade-offs, …) are understood, thereby adding complexity and reducing adaptability.
Instances of premature intervention include (but are not limited to):
- Premature abstraction (≈ introducing abstractions before behavior stabilizes, obscuring intent and increasing indirection)
- Premature modularization (≈ imposing module boundaries before cohesion and coupling are evident, which can restrict flexibility)
- Premature implementation (≈ committing to code before key programming decisions are settled, resulting in rework)
- Premature architecting (≈ applying patterns or frameworks before goals and constraints are clear, adding rigidity)
- Premature scaling (≈ designing for performance, scale, or concurrency beyond foreseeable need, wasting effort)
- Premature optimization (≈ optimizing code without profiling or identifying bottlenecks, which can reduce clarity)
- Premature generalization (≈ engineering for speculative use cases that are imagined without basis, increasing complexity without payoff)
- Premature tooling (≈ building automation or infrastructure before workflows stabilize, risking misalignment)
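To make the premature-optimization item concrete, here is a minimal Python sketch of the recommended order of operations: profile first, and only replace a function once measurement identifies it as the hotspot. The function names (`slow_sum`, `fast_sum`) and the workload are hypothetical illustrations, not from the original text.

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Naive O(n) loop: a candidate for optimization only if
    # profiling shows it actually dominates the runtime.
    total = 0
    for i in range(n):
        total += i
    return total

def fast_sum(n):
    # Closed-form replacement, justified once profiling has
    # identified slow_sum as a real bottleneck.
    return n * (n - 1) // 2

# Step 1: measure before changing anything.
profiler = cProfile.Profile()
profiler.enable()
result = slow_sum(100_000)
profiler.disable()

# Step 2: inspect where time is actually spent.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())

# Step 3: only now swap in the optimized version, verifying behavior is preserved.
assert fast_sum(100_000) == result
```

The point is the sequence, not the arithmetic: the profiler output, not intuition, supplies the “strong reasons” that make the intervention timely.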
Intervention is good and timely when grounded in strong reasons.
Related ideas
Variants of this idea have echoed across disciplines. In medicine, iatrogenesis (and iatrogenics more broadly) names the observation that interventions intended to help can cause harm when applied without understanding or necessity. In software, the same caution appears as YAGNI (≈ avoid speculative generalization), the Unix philosophy (≈ simplicity and doing one thing well), and ‘worse is better’ (≈ prefer minimalism and iteration over preemptive completeness).
Still, no heuristic applies universally, and sometimes a best guess suffices.