Enterprise IT Management Under DevOps: Everything as Code
(in the age of Cloud, Big Data, and AI)
Introduction
Enterprise IT Management evolves in spurts, adapting to each IT revolution. Traditionally, with each revolution the industry invents new approaches rather than retrofitting the existing legacy ways. The result is collective amnesia, as each era rewrites the jargon and tactics for Applications, Systems, and Network Management. In the past, holding onto legacy software or thinking in the jargon of previous eras often led to catastrophic failure. This has created five distinct eras, based on the underlying IT revolutions:
- 1970-1990 Mainframe & Big Iron
- 1991-2000 Client Server & Dotcom
- 2001-2010 Globalization & Corporate Mergers
- 2011-2020 Cloud & DevOps
- 2021-2030 AI Enablement & Datasets
The current era and the next are changing both the industry and its traditions. The new social and technological fabric is battering down the silos of knowledge that have caused the periodic bouts of industrial amnesia. Specifically, the cultural shift to globalization, open source, and DevOps is shattering company privacy, job isolation, and generation gaps. Additionally, virtualization has enabled a method of exchange between proprietary hardware and software silos. Across this exchange flow application requirements, infrastructure configuration, and business processes. DevOps abstracts these flows into languages that can be written down, amended, and reapplied. In short, everything is becoming code.
Much like what the invention of money did for commerce and what written language did for knowledge building, the concept of everything as code eradicates the periodic amnesia that infects the industry. It enables the bureaucratic and infrastructure scaffolding to be recorded, shared, and evolved. This article describes the changes in the industry and why industrial amnesia is fading.
How Industry Amnesia Shapes the Eras of Enterprise IT Management
The industry’s recurring amnesia is rooted in its socioeconomics and in how it adapts to IT revolutions. Specifically, the industry relies on startups and fresh recruits to push new innovations, which breaks existing IT management. This younger perspective provides the energy required for a rebirth, at the expense of connection to the past and the ability to fully leverage past knowledge and experience. The three forces driving the industry’s amnesia from era to era are:
- Socioeconomic dynamics of the industry
- Arms race between IT management and open IT
- Lack of common exchanges
The industry’s socioeconomic dynamics drive new companies to leverage new technologies to create new monitoring and management products and industry standards. Due to limited budgets, they hire new recruits: book-smart but experience-poor engineers. Their youthful energy creates new cultures and solutions that don’t always benefit from lessons learned in the field. Traditionally, this young energy is siloed into startup companies, without the ability to bounce ideas and approaches off the more experienced engineers who work for the deeper pockets of the established customer base. Within this context, the standards bodies and industry experts are generally book-smart academics, a few years removed from the day-to-day business of adapting IT management to new technological trends. They too live within their own organizations, protecting proprietary insights. This siloed perspective makes them oblivious to the social dynamics and true meta trends that determine the ultimate fate of their initiatives. Many of the top-down standards imposed impossible constraints on the bottom-up, gritty development and operational practices.
Beneath the industry’s socioeconomic mechanics is a traditional arms race, similar to the one that drove the gigantification of predator and prey dinosaurs for millions of years, or stealth versus detection technology in military aircraft. For Enterprise IT Management, the arms race pits imposing security against allowing freedom, expressed as IT Management versus Open IT. As market pressures drive down cost and provide value, they also diversify hardware, software, and configuration, and add layers of abstraction and degrees of configuration freedom through cloud, containers, and other forms of virtualization. Management and monitoring complexity explodes. Most recently, to regain control, the industry has pushed back with cloud and configuration management as code to commoditize change and reduce diversity, in order to control costs and increase situational awareness. Together these forces have created DevOps, both a collection of tools and approaches and a new culture that attempts to enable creativity within the constraints of a general process flow. Culturally, DevOps did learn from the mistakes of past eras: it unified development and operations through standard tools, which in turn unified tasks and processes through simple GUIs and APIs. However, DevOps often missed the lessons learned that were infused in the legacy tools as it created new tools to replace them.
What enabled DevOps is the emergence of exchanges. The invention of virtualization creates an abstraction from IT’s hardware, software, and networks. This enables the normalization of practices while respecting the integrity of those lower layers. Both normalization and integrity are required for a whole to emerge, which is why it hasn’t happened until now. The emergence of a whole enables the creation of language. The consequence of language is the concept of a “mind” that thinks in that language. In the case of DevOps, this mind provides a place to imagine, practice, and test without affecting IT production. Moreover, it provides a place to record memories, recall them, and amend them in the dialog of the created language. Much like the human mind, this provides the necessary space for AI to play. Instead of telling the machine “how” to do things, humans can now provide examples of “what” they want and ask the machine to figure out “how” to do it. All of this is enabled through the creation of common exchanges.
Historically, adaptation to change has driven the dance between Enterprise IT Management and Open IT. This unchoreographed dance, set to outdated industry practices, created the periodic bouts of industry amnesia: the participants danced independently and interacted only loosely. The industry is evolving, however, and the dance is becoming cooperative. DevOps attacks the problem by creating shared ownership between development and operations. Dispensing with finger pointing and sharing knowledge and skills eradicated the stagnation of solutions and enabled iteration through product cycles. Open source, through its public repositories (GitHub, Docker Hub, language package registries, etc.), further breaks down the walls by allowing corporate protections to relax. Additionally, virtualization methods, from virtual machines to containers to public clouds, have provided ways to translate and exchange solutions in common currencies, breaking the silos created by proprietary hardware and software. Finally, over this common virtual fabric, requirements and tasks flow as code. Specifically, requirements transform into repeatable unit and system tests (as sketched below), infrastructure becomes evolving code, and end-to-end deployments are stored in repositories as versionable configuration. Together these create a written language of lessons learned and application context that can follow a solution throughout its lifecycle. This disruptive meta pattern will only grow stronger as we move from the era of Cloud into the era of AI. But before we discuss this, context is needed in the form of a history lesson as we walk through the paradigms of eras past.
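Before that history lesson, here is a minimal sketch of a requirement captured as an executable test, in Python’s pytest style. The service, its health check, and the stub below are all invented for illustration; a real suite would probe the actual deployment.

```python
# requirement_test.py: the requirement "a deployment must expose a
# health endpoint that reports OK" recorded as executable code.

def check_health(service):
    # Stand-in for an HTTP probe of the deployed service (hypothetical).
    return service.get("status") == "ok"

def test_health_endpoint_reports_ok():
    deployed_service = {"status": "ok"}  # stub for a real deployment
    assert check_health(deployed_service)
```

Because the requirement lives in the repository beside the code, it is versioned, amended, and replayed on every build, which is exactly how lessons learned stop evaporating between eras.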
Mainframe Era of Big Iron 1970-1990
Life was simpler in the beginning. In the early days of the Internet, and prior to the PC takeover of corporate America, device and software variance was held hostage by the mainframe market and limited to a handful of vendors, programs, and approaches. Networks were hierarchical and proprietary, which greatly simplified things. As a result, “what you didn’t know that you didn’t know” was minimal. Since management was handled from the top, everything from network and device discovery to metric proliferation and problem delegation was trivial. In its own way this was a golden age for Enterprise IT Management, a simple Garden of Eden, prior to IT’s bite of the Apple, PC, and the Internet, which introduced free will and kicked IT out of the garden.
Client Server Era and the Dot-com Boom 1991-2000
The golden years of Enterprise IT Management were the 1990s. Budgets were plentiful, and as a result the industry, companies, and individuals all did due diligence when it came to planning enterprise IT solutions. Proof-of-concept (POC) bake-offs were often paid for by the customer, and plenty of time and resources were provided by management.
The breakup of the monolithic mainframes into networks, midrange servers, and PCs drove the fast evolution of Enterprise IT Management. It introduced new concepts such as device discovery, security validation, and fault detection as devices became peer-to-peer rather than hierarchical elements of the same solution.
Due to the dot-com boom, money was plentiful. The ageless arms race relaxed a bit as enough resources were thrown at the problem to provide broad management over the evolving mess. The Gartners and Network Worlds were also well funded, which enabled objectivity and intense research. Still, despite all the money and energy available for solutions, Gartner reported in 1999 that the majority of network management solutions failed within the first three years. Though the plentiful funds enabled creativity and out-of-the-box thinking, they also made the industry lazy, creating solutions that assumed noncompetitive environments. When the dot-com crash came, resources dried up and approaches became constrained. Things went from hard to incredibly difficult.
Globalization Era of Corporate Mergers 2001-2010
The golden years were followed by the equivalent of the dark ages. As the dot-com boom came to a close, no one was willing to pay for planning at the customer level, much less the industry level. Customers would not fund POCs. As a result, large vendors such as IBM, HP, and CA cobbled together hierarchies by gobbling up the smaller niche players. Once absorbed into their respective “Borgs”, all innovation ground to a halt. Worse, the same polarization that occurred on the news networks started to appear among the Gartners and Network Worlds of the day. With their funding sources drying up, these industry knowledge bases adapted by becoming increasingly biased and shallow in their reporting.
This posture fueled the rumor that the industry was a commodity market, and the trend toward globalization only exacerbated this delusion. Through the lens of globalization, the Walmart model prevailed: sell cheap commodities cheaply. Industry failure rates spiked closer to 90%. Worse, future solutions were crowded out. Venture capital stayed away from a market perceived as a commodity. New companies found it impossible to enter a market dominated by towering players like IBM, HP, BMC, and CA. The one move left for these small players was to provide a subset of services for the maintenance cost of the larger vendors. Yet this level of funding proved insufficient to fuel innovation at the previous decade’s level. At the end of 2010, the world of Enterprise IT Management seemed a very dark place.
Cloud Era Enabled by DevOps 2011-2020
In previous eras, Enterprise IT Management tried to solve the war between secure and open IT from within. As standard-setting institutions spun conflicting paradigms, the marketplace, in deference to the top-down standards, settled on three primary products for Enterprise IT Management: fault management, performance management, and configuration management. Of these three, organizations implemented fault management first to address immediate issues, followed by performance management to feed reports to management and operations. In rare cases, configuration management was considered as a way to organize IT and reduce IT diversity. Ironically, though last in line, only configuration management could reduce variance and impose structure, and as a result shrink the scope of work for both fault and performance management by an order of magnitude, fixing the root cause. However, prior to the cloud, few organizations could staff sufficiently for fault and performance management, and they never got around to implementing configuration management effectively. So the problems compounded as operations tried to adapt to, rather than manage, the exploding diversity and complexity of IT.
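To make the configuration-management point concrete, here is a minimal, hypothetical sketch of the core idea: compare each host’s actual settings against a versioned baseline and flag drift. Every setting brought back to baseline is one less special case for fault and performance tools to chase. All names and values below are invented.

```python
# Hypothetical desired state, of the kind kept under version control.
BASELINE = {"ntp_server": "time.corp.example", "ssh_root_login": "no"}

# Invented survey of actual per-host settings.
ACTUAL = {
    "web01": {"ntp_server": "time.corp.example", "ssh_root_login": "no"},
    "web02": {"ntp_server": "pool.ntp.org", "ssh_root_login": "yes"},
}

def find_drift(baseline, actual):
    """Return {host: {setting: (expected, found)}} for every deviation."""
    drift = {}
    for host, settings in actual.items():
        diffs = {
            key: (want, settings.get(key))
            for key, want in baseline.items()
            if settings.get(key) != want
        }
        if diffs:
            drift[host] = diffs
    return drift

# web01 is compliant; web02 is reported with both deviations.
print(find_drift(BASELINE, ACTUAL))
```

Real tools such as Puppet or Ansible generalize this loop: the desired state is versioned as code, and enforcement closes the drift automatically.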
The solution came from the outside, with the evolution of IT itself. Even as devices, applications, and programming languages diversified, the advent of virtual machines, clouds, and modern configuration management tools mitigated the sprawl. These management constructs created an abstraction layer that could be altered without impacting reality. This made it possible to non-intrusively audit, develop, test, and release improvements, and it ultimately reduced IT diversity and organized what remained. Together these technologies unified the disparate devices under common rules and management. Further, virtualization normalized the IT fabric and abstracted the end-to-end solutions from both hardware and software, enabling even more IT management choices. The DevOps processes that emerged recognized that clouds resembled unified, living, breathing collectives with self-healing features such as redundancy, automatic fault detection, and failover. In short, the cloud caught up to the industry hype: it commoditized, normalized, and automatically managed the components living within it. As a result, many big players faded, including IBM, HP, BMC, and CA.
A big bang event sparked the Cloud Era and eclipsed the previous era. It came from a collection of white papers written by Google. These papers reinvented Enterprise IT Management from open source rather than from the big vendors of the previous age. The papers advanced several technological approaches, enabled by faster and cheaper hardware: horizontal scaling across cheap hardware, database diversification from relational into NoSQL and other models, and simpler protocols, moving from XML and SOAP to RESTful APIs, JSON, and YAML. Both within and without Google, products spun up based on these approaches, such as Splunk for fault management and Grafana/Prometheus for performance management. However, in many cases these new tools did not recall the lessons of the past era, because those lessons, more often than not, were baked deep into configurations and products rather than spelled out explicitly through tests, infrastructure, and builds as code, as they are now in DevOps approaches.
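As a small illustration of the protocol shift, the snippet below serializes a monitoring event as JSON using Python’s standard library; the event fields are invented. The same payload in a SOAP envelope would be several times the size and would require a schema and toolkit to parse.

```python
import json

# A hypothetical monitoring event, in the lightweight style that replaced
# SOAP envelopes: a plain JSON document POSTed to a RESTful endpoint
# (endpoint and fields invented for illustration).
event = {
    "host": "web01",
    "metric": "cpu.utilization",
    "value": 97.5,
    "severity": "critical",
}

payload = json.dumps(event)
print(payload)                          # the wire format: short and readable
print(json.loads(payload)["severity"])  # trivially parsed on the other side
```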
AI Enablement Era and Dataset Centricity 2021-2030
The Cloud Era will likely conclude with the same list of hard problems the era of Globalization had, as the hype of the technical changes fades. Specifically, the tasks that require abstract thought and human rule creation will remain challenging: event normalization, enrichment, and correlation, as well as aspects of managed element discovery. However, the AI Enablement era is well equipped to deal with these issues by shifting the focus of IT from the Turing machine to the dataset. In short, the seat of the soul for IT is in the long-term memory of the dataset, not in the living, breathing subjective experience of the processor.
To see this shift, Enterprise IT Management needs to be viewed as three layers: why, what, and how. Previous eras dealt primarily in “how” to do things, coding at a low level. If an application was to do something, it was told exactly what code instructions to follow, in its language of choice and on its hardware. Next came virtualization and solution as code, which provided a common language, but the application still required specific instructions, in that common language, for “how” to do things. The era of AI enables Enterprise IT Management to move one layer up: humans will state “what” is needed, and the application will figure out “how” to make that happen. An early forerunner of this is the declarative language SQL, which declares to the database what data is desired without specifying exactly “how” to get it (which machine, which files, etc.).
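A minimal demonstration of that declarative idea, using Python’s built-in sqlite3 module and a hypothetical table of monitoring alerts:

```python
import sqlite3

# An in-memory database with a hypothetical table of monitoring alerts.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE alerts (host TEXT, severity TEXT, count INTEGER)")
db.executemany(
    "INSERT INTO alerts VALUES (?, ?, ?)",
    [("web01", "critical", 4), ("web02", "warning", 9), ("db01", "critical", 2)],
)

# Declarative: we state WHAT we want (critical alert totals per host);
# the database engine decides HOW to scan, filter, and aggregate.
rows = db.execute(
    "SELECT host, SUM(count) FROM alerts "
    "WHERE severity = 'critical' GROUP BY host ORDER BY host"
).fetchall()
print(rows)  # [('db01', 2), ('web01', 4)]
```

The query states what rows and aggregates are wanted; the engine chooses how: which indexes to use, how to scan, and how to group.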
The AI era will take this one step further. For example, imagine a human states that what they want is to determine whether or not a cat appears in a picture. The human provides a dataset of pictures, each of which either does or does not contain a cat and is labeled as such. The machine analyzes the pictures to derive a set of rules that correctly makes this prediction for the provided pictures. Going forward, using that model, the machine can answer whether or not a cat appears in a new picture. The human only had to state “what” they wanted and provide enough examples of it; the machine, using supervised learning, determined “how” to make it happen. As this approach to solving problems becomes more common, AI-based companies will emerge that focus on building and grooming an effective dataset.
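A toy sketch of this supervised-learning loop, assuming scikit-learn is installed. The three-number feature vectors stand in for real image features and are entirely made up:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: each row is a crude feature vector extracted
# from a picture (invented here), and each label says whether a cat appears.
X_train = [
    [0.9, 0.1, 0.8],  # cat
    [0.8, 0.2, 0.7],  # cat
    [0.1, 0.9, 0.2],  # no cat
    [0.2, 0.8, 0.1],  # no cat
]
y_train = [1, 1, 0, 0]

# The human supplies "what" (labeled examples); fit() derives "how"
# (the decision rule) from the data.
model = LogisticRegression().fit(X_train, y_train)

# Apply the learned rule to a new, unlabeled picture's features.
print(model.predict([[0.85, 0.15, 0.75]]))  # e.g. [1] -> cat
```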
Besides the development of improved AI and broader access to it, there is an additional shift in IT. Enterprise IT to date has largely revolved around the identity of living machines that persist and run multiple applications. With the advent of these core datasets, identity shifts from the living machine to the static data, and the “computer’s life” becomes much more fleeting than it is now.
For example, in the future when the cat identification problem is presented, a dataset will be identified and a cloud of 1000 Linux servers will be spun up. The problem will be broken into pieces and distributed. Each machine will solve its individual piece in seconds. The dataset will then be updated, and the 1000 Linux servers will be torn down and forgotten as their CPU and memory resources return to the greater cloud for future use. Notice that the only thing that remains is the dataset itself. This concept of data-centric identity is a paradigm shift from all past eras and will impact the future of Enterprise IT Management in ways that are hard to anticipate. Further, like the inventions of money and written language that came before it, this modernization will change our perspective on life and likely how we perceive ourselves.
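A local stand-in for that fleeting-compute pattern, using Python’s multiprocessing pool as a miniature cloud of short-lived workers; the chunking and the per-item work are invented for illustration:

```python
from multiprocessing import Pool

def label_chunk(chunk):
    # Hypothetical per-worker job; in the real scenario this would be the
    # cat-detection model applied to one slice of the picture dataset.
    return [(item, item % 2 == 0) for item in chunk]

if __name__ == "__main__":
    dataset = list(range(20))                   # the persistent artifact
    chunks = [dataset[i::4] for i in range(4)]  # split the work four ways

    # "Spin up" workers, distribute the pieces, and gather the results.
    with Pool(processes=4) as pool:
        results = pool.map(label_chunk, chunks)

    # Fold the results back into the dataset. The workers are torn down and
    # their CPU and memory return to the pool; only the dataset remains.
    labeled = sorted(pair for chunk in results for pair in chunk)
    print(labeled[:3])  # [(0, True), (1, False), (2, True)]
```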
Summary
As we near the end of the Age of Cloud, the final hard problems in Enterprise IT Management remain unsolved. Managing large rule sets, definitive element discovery, and root cause analysis all remain relatively human-intensive and difficult problems outside a single homogeneous cloud. However, just over the horizon the answer to these problems can be seen. The silos separating people and eras are eroding. New languages have developed to encode the experience and lifecycle around the application. The combination of the two enables collaboration at unprecedented levels. This will usher in the AI age and allow humans to start asking the IT infrastructure for “what” they want rather than telling machines “how” to get the answers.
Another lens through which to view this evolution and predict its outcome is the concept of emergence in complex systems. Complex systems such as life and knowledge start out as disparate, independent parts that don’t play well together. As time progresses, they develop trust. They create standard methods of communication and cooperation. They work together until they become a whole, wrap themselves in a skin, and create new methods of communication and cooperation with larger pieces like themselves. This process starts at the bottom, with the lowest and most atomic pieces, and slowly progresses upward into larger and larger groupings.
We are experiencing the same growth and evolution in IT. As the pieces of IT become commoditized, from programs to computers, larger and larger groupings will work as a whole rather than as individual parts. We are on the cusp of the creation of a language for all things IT, and with it a method of exchange. Just as with written language and currency, this emergence of language and exchanges will usher in a paradigm shift in the social fabric of not only the industry, but of mankind as a whole.