Graduating out of maturity models

“Truth is a pathless land” – J. Krishnamurti

Maturity models are a popular way of describing how organizations can improve. Simple models have a single dimension: first do A, then do B, finally achieve C. More complicated models may have two dimensions, identifying a variety of goals to accomplish in the initial stage of development, moving on to a variety of more complex goals, and so on. Highly developed models such as the Scaled Agile Framework might include complex patterns to mold your organization into. All of these maturity models are a vast improvement over having no idea how to improve or progress. They are analogous to the structured education process that guides individuals from kindergarten through high school.

There are two approaches to problem-solving: algorithmic and heuristic. An algorithmic approach assumes there are known patterns and formulas that, if followed, will yield a particular outcome. A heuristic approach is one where you try a particular experiment and then check whether it succeeded.
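To make the contrast concrete, here is a minimal sketch; the functions and numbers are purely illustrative and not drawn from the article. The algorithmic function applies a known formula and always returns the same answer, while the heuristic one proposes a change, measures whether it helped, and adjusts accordingly.

```python
# Algorithmic: a known formula yields the result directly and repeatably.
def compound_interest(principal, rate, years):
    return principal * (1 + rate) ** years

# Heuristic: propose a change, measure whether it helped, keep it only if it did.
# `score` stands in for any measurement of success; higher is assumed to be better.
def hill_climb(score, start, step=1.0, max_trials=50):
    current, current_score = start, score(start)
    for _ in range(max_trials):
        candidate = current + step
        candidate_score = score(candidate)
        if candidate_score > current_score:   # the experiment succeeded: adopt it
            current, current_score = candidate, candidate_score
        else:                                 # it failed: learn from it and adjust
            step = -step / 2
    return current

# Example: the heuristic converges on the peak of a curve it knows nothing about.
print(hill_climb(lambda x: -(x - 3) ** 2, start=0.0))  # converges to 3.0
```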

Following a maturity model is an example of an algorithmic approach: it prescribes a particular set of practices, and leaders or teams apply those practices to their organization in hopes of making the needed improvements. Many enterprises lack awareness of the full range of approaches to improving their development processes and IT organization. Skilled consultants are well positioned to share a vision of what’s possible and a map of how to get there, and to provide tactical assistance with the methods needed to make progress. In many cases, these enterprises may have been taking a simple heuristic approach by trying different improvements, but without a clear vision or plan for making systematic progress. In other cases, the approach may have been more haphazard. A maturity model framework is well suited to serving these customers by giving them structure and a vision for a range of improvements.

But it would be a grave mistake to believe that a formulaic approach can yield repeatable results when applied across different organizations, filled with different people, serving different markets at different times. As much as we may try to demystify the DevOps journey, it is in the end a journey into the unknown. It is entirely possible to invest an enormous amount of time, money and energy in checking the boxes in a maturity framework, yet deliver little net benefit. In some cases, the investment in process improvements may exceed the value returned. Therefore it’s critical to assess the impact of improvements as you go. Truly mature organizations build a strong foundation of basic practices and then graduate into a heuristic approach of iterative continuous improvement.

The hallmark of continuous improvement is a learning culture, one where all members of the team are aligned with the organization’s overall goal but nevertheless actively experiment with methods to reach that goal in the most effective way possible. The Cynefin framework, developed by Dave Snowden at IBM, divides systems into Simple, Complicated, Complex, and Chaotic. Problems in Simple and Complicated systems can be addressed through formulaic approaches, but problems in Complex systems cannot.

A complex system is, by definition, a system that changes when you act on it, such that the situation must be reassessed after each step, and the same action cannot be guaranteed to yield the same results if repeated. Organizations are complex systems, as are individuals. Complex systems require a heuristic approach, otherwise known as the scientific method.

The scientific method is not and must not be seen as the exclusive domain of scientists! It simply means having the humility to acknowledge in advance that we do not know what will happen when we take action. Changes we make to a system are in fact hypotheses: we believe that if we implement this change we will see a particular result. Those hypotheses must be followed by investigation to determine whether we achieved the desired results or not. Such investigations speak to the power of tracking metrics, such as the four key DevOps metrics (lead time, deployment frequency, change fail rate, and time to recover), so that we can determine whether our efforts actually led to positive results.
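As a concrete sketch of what that tracking might look like, here is a minimal example that summarizes the four key metrics from a list of deployment records. The record shape and field names are assumptions made for illustration, not part of any particular tool or platform.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from statistics import median
from typing import List, Optional

@dataclass
class Deployment:
    """Hypothetical deployment record; the field names are assumptions for this sketch."""
    committed_at: datetime                   # when the change was committed
    deployed_at: datetime                    # when the change reached production
    failed: bool = False                     # did the change cause a production failure?
    restored_at: Optional[datetime] = None   # when service was restored, if it failed

def four_key_metrics(deployments: List[Deployment], period: timedelta) -> dict:
    """Summarize lead time, deployment frequency, change fail rate, and time to recover."""
    lead_times = [d.deployed_at - d.committed_at for d in deployments]
    failures = [d for d in deployments if d.failed]
    recoveries = [d.restored_at - d.deployed_at for d in failures if d.restored_at]
    return {
        "lead_time": median(lead_times) if lead_times else None,
        "deployments_per_day": len(deployments) / max(period.days, 1),
        "change_fail_rate": len(failures) / len(deployments) if deployments else None,
        "time_to_recover": median(recoveries) if recoveries else None,
    }
```

Comparing these numbers before and after a process change is what turns the change into a testable hypothesis rather than an act of faith.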

Unsuccessful experiments are entirely normal in the context of science. They may not have yielded the desired results, but they nevertheless brought value: they taught us something that doesn’t work. That knowledge is valuable in itself, and our goal in a scientific approach is to learn what does not work as quickly and cheaply as possible. This is the underlying premise of the Lean Startup, as well as of other Lean methodologies that treat validated learning as their end goal.

If an experiment proves effective, the change can be integrated into the workflow of the organization. This approach is known as Plan-Do-Study-Act, often called the ‘Deming Cycle’. There is no map for this part of the journey, nor are there experts who know in advance what will work for every organization. On the contrary, the best guides for this phase of the journey are those who can help the organization develop a learning culture.
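As a rough sketch of the shape of that cycle, one pass might look like the following; every function here is a placeholder the team would supply, not a prescribed practice.

```python
def pdsa_cycle(measure, plan, do, adopt, revert):
    """One pass of Plan-Do-Study-Act driven by a measurable outcome.

    All of the callables are placeholders supplied by the team; this sketch only
    captures the shape of the loop, not any particular improvement practice.
    """
    baseline = measure()       # where are we today?
    change = plan(baseline)    # Plan: a hypothesis about what will move the metric
    do(change)                 # Do: try the change, ideally on a small scale
    outcome = measure()        # Study: did the metric actually move?
    if outcome > baseline:     # assumes higher is better for this metric
        adopt(change)          # Act: fold the improvement into the standard workflow
    else:
        revert(change)         # an unsuccessful experiment still taught us something
    return baseline, outcome
```

Running the loop repeatedly, with honest measurement in the Study step, is what distinguishes continuous improvement from simply piling on new practices.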

A learning culture is the epitome of a successful knowledge-work organization: it is one where every contributor is using their mind to the fullest, observing the actual situation, imagining possible alternatives, and designing solutions that move the organization toward its goals. While we may guide organizations through the initial stages of a maturity model, we should recognize that real maturity is the ability to explore, learn, invent, and succeed in a world of the unknown.
