MLOps will enable the scalability of AI

February 22, 2022
#AI22 - No. 7 of 10

#AI22 is a series of articles highlighting 10 developments that we believe will impact AI this year.
This series is co-written by
Dr. Johannes Otterbach, Dr. Rasmus Rothe and Henry Schröder.

---

Production and scaling are key challenges AI faces in an operational context. Gartner research suggests that more than 80% of AI projects never make it into production. Key obstacles include a lack of the necessary data, non-existent integrated development environments and inconsistent model execution. The era of manual AI will be upended by the introduction of MLOps: a framework for automating the industrialized use of AI, from gathering data and building models to training, deploying, monitoring and retraining them. Implementing such a structured framework will shorten development life cycles and bring automated AI into everyday applications.
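To make this loop concrete, here is a minimal sketch of such a pipeline in Python, using scikit-learn and synthetic data. The function names, accuracy threshold and retraining rule are illustrative assumptions, not the API of any particular MLOps product.

```python
# Minimal sketch of the MLOps loop described above: gather data, train,
# "deploy" (here: keep the model in memory), monitor and retrain.
# All names and thresholds are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

def gather_data(seed: int):
    """Stand-in for the data-gathering stage (synthetic data here)."""
    X, y = make_classification(n_samples=1_000, n_features=20, random_state=seed)
    return X, y

def train_model(X, y):
    """Build and train a model - the build/train stages of the pipeline."""
    return LogisticRegression(max_iter=1_000).fit(X, y)

def monitor(model, X_live, y_live, threshold=0.85):
    """Monitoring stage: flag the model for retraining if live accuracy drops."""
    acc = accuracy_score(y_live, model.predict(X_live))
    return acc, acc < threshold

# Initial training and roll-out
X, y = gather_data(seed=0)
model = train_model(X, y)

# Simulated production loop: monitor incoming data, retrain on drift
for batch in range(3):
    X_live, y_live = gather_data(seed=batch + 1)  # new data each batch
    acc, needs_retraining = monitor(model, X_live, y_live)
    print(f"batch {batch}: accuracy={acc:.2f}, retrain={needs_retraining}")
    if needs_retraining:
        # Retraining stage: fold the new data in and redeploy
        X, y = np.vstack([X, X_live]), np.concatenate([y, y_live])
        model = train_model(X, y)
```

In a production MLOps setup, each of these stages would be backed by dedicated tooling such as data versioning, experiment tracking, model registries and deployment monitors, but the control flow of train, monitor and retrain stays the same.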

In the last few years, there has been considerable investment in MLOps to address these difficulties, with more than $3.8bn invested in the industry by the end of 2021 and more to follow. Compared to DevOps, it is difficult in MLOps to serve companies with point solutions, i.e. software for single use cases. Clients generally prefer end-to-end, use-case-specific applications, which favors highly specialized providers - yet from an MLOps provider's perspective, this is only economical in verticals that are large enough. While the combination of use-case-generic and end-to-end solutions is the holy grail, it is unclear when or if this will be achievable; do, however, expect end-to-end solutions with a vertical focus for large industries such as manufacturing.

As companies massively expand their AI development, they will benefit from the same technical and operational framework that DevOps has brought to software development. MLOps can support and streamline all phases of model building and management by automating laborious and inefficient procedures. Companies will, however, almost certainly need to expand their AI teams with new talent whose skills complement those of data scientists and broaden the team's expertise beyond model construction to operationalization. These teams will be better equipped to tackle challenges related to accountability, regulation and compliance, and other matters linked to managing and running AI models. Furthermore, this streamlining allows companies to focus on AI research and innovation with new technologies that go beyond basic implementation, helping enterprises scale their AI projects while keeping stable models running in production.

