Quality in Software
Quality
As in almost every other industry, we hear the word quality a lot in software. What company does not display it proudly on its website or advertise it in its marketing material? Ask any sales rep of any software company and they will tell you that quality matters most and is what they deliver on every project.
Yet in reality, most projects seriously lack quality, and most members of our profession have a hard time giving a convincing definition of what quality means. In the best-case scenario, a project starts out looking pretty and in good shape while it is young, but it does not age gracefully: sooner than anyone would hope, development slows down significantly, and many changes result in unexpected (and apparently unrelated) failures, sometimes even costly bugs.
How it’s built is just as important as what it does
So what does quality mean? What are the characteristics of a genuinely well-built software product and why do they matter? According to Robert C. Martin (Clean Architecture) any software product has two main values:
- first, the more obvious value: the functionality it provides to the business (the customer). This is the what that the customer has hired us to deliver. Hopefully, at go-live, our software does exactly what was asked of it and exactly what is needed at that moment.
- however, there is a second, maybe even more important value: how easy the software is to change, to accommodate new requirements in the future. This is not very obvious at first, when ideally most of the requirements are already built into the product. But if that was achieved at the expense of flexibility, the ability to change easily over time, the first value will diminish very rapidly until the whole project becomes virtually useless and needs to be rebuilt.
In real life, most projects are so focused on what they deliver that how it is achieved is given a seat somewhere back in the stands, at nosebleed distance from the main stage. This is partly because how well a software system is built is not easy to measure, especially if the customer is not particularly savvy in software architecture, clean code, and the best practices of our profession.
How to achieve quality
Building software that remains relatively easy and safe to change as years (and tons of new requirements) go by is no trivial thing. It takes constant care, from the start of the project and throughout its lifetime. Here are some of the things that can help, if applied diligently.
According to Dave Farley (Modern Software Engineering), in order to deliver quality in our day to day work, we need to focus on two main competences:
- we need to become experts at learning
- we need to become experts at managing complexity
Learning
Besides being one of the best predictors of job motivation, constant learning is the essence of what software development is all about. Beyond the most obvious subjects of our learning, the technology we employ and the techniques for applying our knowledge to the problem domain, ours is a profession of learning mainly because, before we build anything useful, we must first be sure we understand very well what we are supposed to build.
According to Dave Farley, this is accomplished using the behaviours of:
- Working iteratively
- Employing fast, high-quality feedback
- Working incrementally
- Being experimental
- Being empirical
“Software development is an exercise in exploration and discovery. We are always trying to learn more about what our customers or users want from the system, how to better solve the problems presented to us, and how to better apply the tools and techniques at our disposal.
[..] Learning is at the heart of everything that we do. These practices are the foundations of any effective approach to software development, but they also rule out some less effective approaches.” (Dave Farley)
Complexity
The real enemy of a software developer trying to do a good job, in terms of the quality they produce, is often the complexity of the systems we need to work on. If we were asked to work on simple, disposable apps, with short lifespans and no prospect of ever changing, then this whole discussion about how things are done would have no point.
However, in enterprise software at least, systems are used for many years (sometimes decades), with the expectation that they remain flexible and adapt to changing business requirements. And one element that adds the most complexity to our software in an enterprise environment is the integration with other systems, upstream and downstream.
Most modern software systems are large and complex, too large and complex to fit completely into our brain at any one time, regardless of how smart our engineers are. So we need techniques to properly manage this complexity, among which, according to Dave Farley, the top candidates are:
- Modularity
- Cohesion
- Separation of concerns
- Information hiding/abstraction
- Coupling
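Several of these techniques often show up together in even a very small piece of code. The sketch below is purely illustrative (the `ShoppingCart` example and its method names are not from Farley's book): the internal data representation is hidden behind a narrow, cohesive interface, so callers stay loosely coupled to it and the internals can change freely.

```python
# Minimal illustration of information hiding, cohesion, and low coupling.
# ShoppingCart and its methods are hypothetical names, used only as an example.

class ShoppingCart:
    def __init__(self):
        # Internal detail, hidden from callers: a dict of name -> (price, qty).
        # It could become a list, a database row, etc. without breaking callers.
        self._items = {}

    def add(self, name, unit_price, quantity=1):
        _, qty = self._items.get(name, (unit_price, 0))
        self._items[name] = (unit_price, qty + quantity)

    def total(self):
        # Cohesive: all pricing logic lives here, not scattered across callers.
        return sum(price * qty for price, qty in self._items.values())


cart = ShoppingCart()
cart.add("book", 20.0, 2)
cart.add("pen", 1.5)
print(cart.total())  # 41.5
```

Callers depend only on `add` and `total`; nothing outside the class touches `_items`, which is exactly the separation that keeps future change cheap.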
Maintaining Flexibility
Maintaining the flexibility of a solution, after years of evolving requirements following ever-changing business needs, is made possible by taking constant care of the quality of the design (using principles such as Clean Architecture, DDD, modularity/cohesion, low coupling, etc.) as well as the quality of the code (making constant use of SOLID principles, OOP paradigms, design patterns, etc.).
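As one concrete sketch of how those code-level principles buy flexibility, consider the Dependency Inversion Principle (the "D" in SOLID). The class and method names below (`Notifier`, `EmailNotifier`, `OrderService`) are illustrative assumptions, not taken from any of the cited books:

```python
# Sketch of the Dependency Inversion Principle: high-level policy code
# depends on an abstraction, not on a concrete infrastructure detail.
from abc import ABC, abstractmethod


class Notifier(ABC):
    @abstractmethod
    def send(self, message: str) -> None: ...


class EmailNotifier(Notifier):
    def send(self, message: str) -> None:
        print(f"EMAIL: {message}")


class RecordingNotifier(Notifier):
    """A test double, proving how cheap it is to swap implementations."""
    def __init__(self):
        self.messages = []

    def send(self, message: str) -> None:
        self.messages.append(message)


class OrderService:
    # Depends on the Notifier abstraction: switching email for SMS
    # (or a test double) requires no change to this class at all.
    def __init__(self, notifier: Notifier):
        self._notifier = notifier

    def place_order(self, order_id: str) -> None:
        self._notifier.send(f"Order {order_id} placed")


fake = RecordingNotifier()
OrderService(fake).place_order("42")
print(fake.messages)  # ['Order 42 placed']
```

When a new requirement arrives (say, push notifications), it lands as a new `Notifier` implementation; the business logic in `OrderService` is untouched, which is precisely the kind of cheap change this section argues for.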
Good architecture is key to the future flexibility of any software product, and our team took great care to keep the architecture clean at all times. Architecture is not just what the team decides at the beginning of the project; it is what evolves out of the code written every day by every developer. So it is essential that all team members are aware of their role in keeping the code easy to maintain and understand in the future, as well as in delivering the current requirement from the sprint backlog.
One way to ensure the architecture can support such flexibility over the long term is to design using principles from Hexagonal Architecture/Ports and Adapters (concepts first introduced by Alistair Cockburn) and DDD (Domain-Driven Design, developed by Eric Evans) – with a rich domain model containing all of the business logic (handling the essential complexity) and isolated from the infrastructure details (handling the accidental complexity of our system).
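The shape of that isolation can be sketched as follows. This is a minimal, hypothetical example (the `Invoice`/`InvoiceRepository` names and the 20% VAT rule are assumptions for illustration, not part of Cockburn's or Evans's texts): the domain owns both the business rule and the port, while the adapter lives on the infrastructure side.

```python
# Minimal ports-and-adapters sketch: the domain model carries the business
# rule and reaches the outside world only through a port (an interface).
from abc import ABC, abstractmethod
from dataclasses import dataclass


# --- Domain (essential complexity): no infrastructure imports here ---
@dataclass
class Invoice:
    amount: float

    def with_vat(self, rate: float = 0.2) -> float:
        # The business rule lives inside the rich domain model.
        return round(self.amount * (1 + rate), 2)


class InvoiceRepository(ABC):
    """Port: declared by the domain, in the domain's own terms."""
    @abstractmethod
    def save(self, invoice: Invoice) -> None: ...


# --- Infrastructure (accidental complexity): adapters implement the port ---
class InMemoryInvoiceRepository(InvoiceRepository):
    """Adapter: could equally be backed by a SQL database or a REST call."""
    def __init__(self):
        self.stored = []

    def save(self, invoice: Invoice) -> None:
        self.stored.append(invoice)


repo = InMemoryInvoiceRepository()
invoice = Invoice(amount=100.0)
repo.save(invoice)
print(invoice.with_vat())  # 120.0
```

Because the dependency points from infrastructure toward the domain, the database, message broker, or upstream integration can be replaced without touching the business logic – which is what keeps the system changeable after years of evolving requirements.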