My friend and colleague Alex Yakyma, from Kiev, Ukraine, wrote this interesting whitepaper, which argues from the underlying mathematics that software complexity tends to be inherently higher than one might expect. He also describes how refactoring, our key weapon in this battle, can be used to continuously manage complexity over time and thereby keep our maintenance burden within controlled bounds.
Here’s the abstract.
The inherent complexity of software design is one of the key bottlenecks affecting speed of development. The time required to implement a new feature, fix defects, or improve system qualities such as performance or scalability depends dramatically on how complex the system design is. In this paper we build a probabilistic model of design complexity and analyze its fundamental properties. In particular, we show the asymmetry of design complexity, which implies its high variability. We explain why this variability is important and how it can, and must, be efficiently exploited by refactoring techniques to considerably reduce design complexity.
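To give a feel for the abstract's core idea, here is a minimal, hypothetical sketch (not the paper's actual model): if per-module complexity follows a right-skewed distribution such as a lognormal (an assumption chosen purely for illustration), a few very complex modules pull the mean well above the median, and targeting just those outliers with refactoring lowers the system's average complexity noticeably.

```python
import random
import statistics

random.seed(42)

# Illustrative assumption: per-module design complexity is modeled as a
# right-skewed (lognormal) random variable. This distribution is NOT taken
# from the paper; it is only a stand-in to show asymmetry and variability.
complexities = [random.lognormvariate(0, 1) for _ in range(10_000)]

mean = statistics.mean(complexities)
median = statistics.median(complexities)

# Asymmetry: a small number of very complex modules drag the mean
# above the median.
print(f"mean={mean:.2f}, median={median:.2f}")

# Caricature of refactoring: cap only the top 5% most complex modules
# and observe how much the system-wide average drops.
cap = sorted(complexities)[int(0.95 * len(complexities))]
refactored = [min(c, cap) for c in complexities]
print(f"mean after refactoring top 5%: {statistics.mean(refactored):.2f}")
```

The point of the toy model is the asymmetry itself: because complexity is concentrated in a long tail, effort spent refactoring a small fraction of the worst modules yields a disproportionate reduction in overall complexity.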
Here’s the whitepaper: Refactoring and Software Complexity Variability (v1_35). Feedback would be very welcome. You can also ping Alex directly at email@example.com.