DevOps: How will the different forms of AI transform it for the better?
The convergence of multiple Artificial Intelligence (AI) technologies to automate DevOps processes is directly accelerating the creation and deployment of applications, even at scales that were once considered unattainable.
Most recent advances in AI focus on using large language models (LLMs) to improve developer productivity. Valuable as that is, the volume of code passing through a continuous integration/continuous delivery (CI/CD) platform grows in step with the productivity of the development teams. Since most companies cannot recruit additional software engineers to manage the additional pipelines, applying AI to DevOps processes is becoming a real need.
The complementarity between developers and AI
AI makes it possible to analyze large amounts of data in real time to extract useful information and optimize work cycles. For example, pipeline data analyzed by algorithms can reveal bottlenecks in a DevOps cycle and yield recommendations for resolving them. Teams can then apply machine learning to configure a platform, an approach known as AIOps, which eliminates many of the manual tasks that previously made management tedious. AI can also act as a digital “peer programmer”, helping to analyze anomalies and search for innovative solutions.
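As a concrete illustration, the sketch below shows the kind of bottleneck analysis an AIOps platform might automate: it flags pipeline stages whose latest run deviates sharply from their historical baseline. The stage names, durations and z-score threshold are illustrative assumptions, not data or parameters from any real system.

```python
# Minimal sketch of the kind of bottleneck analysis an AIOps platform might run.
# Assumes stage durations have been exported from a CI/CD system; the stage names,
# numbers, and z-score threshold below are purely illustrative.
from statistics import mean, stdev

# Hypothetical per-stage durations (seconds) collected over recent pipeline runs.
stage_durations = {
    "checkout":       [12, 11, 13, 14, 12],
    "unit-tests":     [240, 250, 245, 248, 610],   # latest run slowed down sharply
    "build-image":    [95, 90, 102, 98, 101],
    "integration":    [310, 305, 320, 315, 318],
    "deploy-staging": [45, 44, 47, 46, 45],
}

def flag_bottlenecks(durations, z_threshold=2.0):
    """Flag stages whose latest run deviates sharply from their historical mean."""
    flagged = []
    for stage, runs in durations.items():
        history, latest = runs[:-1], runs[-1]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (latest - mu) / sigma > z_threshold:
            flagged.append((stage, latest, round(mu, 1)))
    return flagged

for stage, latest, baseline in flag_bottlenecks(stage_durations):
    print(f"Bottleneck candidate: {stage} took {latest}s (baseline ~{baseline}s)")
```

A real AIOps platform would of course draw on far richer telemetry, but the principle is the same: let the model surface the anomaly so the team does not have to scan the pipeline history by hand.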
Generative AI tools go further still: natural-language instructions given to a trained LLM can generate the code and scripts needed to automate DevOps processes. These scripts can then be applied through an orchestration engine to automate continuous delivery (CD) processes at unprecedented scale.
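A minimal sketch of that workflow is shown below, assuming a generic LLM client: the LLMClient class, its complete() method, the prompt wording and the output path are hypothetical placeholders rather than any specific vendor's API.

```python
# Minimal sketch of prompting an LLM to generate a DevOps automation script.
# `LLMClient` and its `complete()` method are hypothetical stand-ins for whatever
# model endpoint the team actually uses; the prompt and file name are illustrative.
from pathlib import Path

class LLMClient:
    """Placeholder for a real LLM API client (hosted or self-trained)."""
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("Wire this up to your actual LLM endpoint.")

def generate_pipeline_script(client: LLMClient, instruction: str, out_path: str) -> Path:
    """Turn a natural-language instruction into a script an orchestrator can run."""
    prompt = (
        "You are a DevOps assistant. Generate a shell script that performs the "
        f"following task, with comments and error handling:\n{instruction}"
    )
    script = client.complete(prompt)
    path = Path(out_path)
    path.write_text(script)  # an orchestration engine can then pick this file up
    return path

# Example natural-language instruction (illustrative):
# generate_pipeline_script(LLMClient(),
#     "Build the Docker image, run the test suite, and push to the staging registry",
#     "scripts/deploy_staging.sh")
```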
At the same time, companies will be able to use complementary technologies such as vector databases to compare new code against samples that an LLM has been trained to recognize. The value lies in identifying defects in recently written code that could compromise the security of the application. It is then only a matter of time before the overall quality of the code reaching production improves to the point that the number of incidents the DevOps team has to handle decreases.
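The sketch below illustrates the idea under simple assumptions: embed() stands in for whatever code-embedding model is used, the similarity threshold is arbitrary, and in practice the reference vectors would live in a vector database rather than an in-memory dictionary.

```python
# Minimal sketch of comparing new code against embeddings of known defective samples.
# `embed()` is a hypothetical stand-in for a real code-embedding model; the
# threshold and the idea of an in-memory dict are illustrative simplifications.
import numpy as np

def embed(code: str) -> np.ndarray:
    """Placeholder: return a fixed-size embedding vector for a code snippet."""
    raise NotImplementedError("Replace with a real code-embedding model.")

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_risky_code(new_snippet: str, known_defects: dict, threshold: float = 0.85) -> list:
    """Return labels of known defective samples the new snippet closely resembles."""
    vec = embed(new_snippet)
    return [label for label, ref in known_defects.items()
            if cosine(vec, ref) >= threshold]

# Usage sketch: `known_defects` would map labels such as "hard-coded credentials"
# or "unsanitized SQL" to embeddings of previously identified insecure patterns,
# typically retrieved from a vector database during the CI run.
```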
Configuration and validation will remain necessary
Realizing these advantages, and others still to come, will require building and maintaining several types of AI models. Many companies are investing in machine learning operations (MLOps) practices to do so. Countless LLMs built by the open source community and by IT teams will have to be integrated into DevOps processes, and the ability to examine these models to promote transparency and trust will be crucial, particularly through the use of open CI/CD platforms.
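One way to make that examination concrete is a validation gate in the pipeline itself. The sketch below checks a model's metadata before the model is promoted; the model_card.json file name, its fields and the acceptance threshold are assumptions chosen for illustration, not an established standard.

```python
# Minimal sketch of a CI gate that inspects a model's metadata before the model
# is integrated into a DevOps pipeline. File name, fields, and threshold are
# assumptions for illustration only.
import json
import sys
from pathlib import Path

REQUIRED_FIELDS = ["name", "version", "training_data", "license", "eval_score"]
MIN_EVAL_SCORE = 0.90  # illustrative acceptance threshold

def validate_model_card(path: str) -> list:
    """Return a list of problems; an empty list means the model may be promoted."""
    card = json.loads(Path(path).read_text())
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if f not in card]
    if card.get("eval_score", 0) < MIN_EVAL_SCORE:
        problems.append(f"eval_score below {MIN_EVAL_SCORE}")
    return problems

if __name__ == "__main__":
    issues = validate_model_card("model_card.json")
    if issues:
        print("Model rejected:", *issues, sep="\n  - ")
        sys.exit(1)  # fail the CI job so the model is not integrated
    print("Model card checks passed.")
```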
Naturally, AI arouses as many fears and concerns as it does enthusiasm. No one knows for sure how far DevOps processes can be automated, but one thing is certain: engineers will always need to validate them. To err is human, but an error made at scale is far more serious, and the errors caused by a faulty AI model can prove catastrophic. There is, however, no turning back.
With the convergence of several AI technologies, teams will be able to manage DevOps processes at scale more easily. And since one of the recurring criticisms of DevOps is precisely the difficulty of managing cycles at scale, advances in AI should soon close that debate.
In the meantime, DevOps teams are encouraged to start reviewing their current cycles to determine which processes can be automated. AI will be ubiquitous in the future of software development, and it promises to significantly accelerate the speed at which better applications can be created, deployed and updated. Thanks to the rise of AI, DevOps practices will also be democratized: more people will be able to adopt them as the level of programming expertise required drops significantly. As these changes take hold, the number of applications will keep increasing, and if the impact of all that software proves positive, DevOps engineers will be the ones behind it.