CNCF Drives Efforts To Standardize AI Functions In Our Cloud-Native Future


Software is diverse. Enterprise software systems are built using a wide variety of code libraries, internal architectural structures, development methodologies and, of course, different programming languages, so programmers can dip into a vast toolkit. From this diversity comes breadth, adaptability and flexibility but, equally perhaps, from this breadth also comes complexity, (some) confusion and a lack of standardization.

When we move any technology (or indeed any industrialized process or physical product) to a point of standardization, we generally enjoy a simpler route to usage, a higher ability to integrate… and an option to abstract lower-level complexities and move towards a utility model (not dissimilar to the way we consume gas & electricity today) where we can just plug in when needed. When a technology progresses to that point of standardization (other word processors do exist, but most users just adopt Word or Google Docs and take them for granted), it becomes more eminently usable and more widely considered to be a de facto part of the way we live.

Cloud Native Computing Foundation (CNCF)

Keen to promote, enable and underpin these operational economic principles in the technology space is the Cloud Native Computing Foundation (CNCF), an organization that falls under the parent umbrella of the Linux Foundation and so logically focuses on enterprise open source technologies, with container orchestration platform Kubernetes at the fore.

Always vocal on emerging techniques, tracks and tools across this now expansive area of technology is CNCF executive director Priyanka Sharma. Speaking to press and analysts in Europe this month, Sharma made note of 2024 being the 10-year anniversary of Kubernetes, a milestone which, she and others believe, sees this technology now established as a de facto standard for cloud-native applications. Together, she enthuses, the cloud-native community has built the majority of web applications that exist today and, significantly, the technology base here has extended to new and unique workloads. That ‘new workloads’ element is key (i.e. actual working cloud-native apps), especially if we remember how much complexity we have to deal with here.

“I believe the reason we have grown so much is the scalability and extensibility of Kubernetes i.e. give us an [Internet of Things] edge device, give us a traditional server or any other computing entity and we can see how cloud-native Kubernetes is fit for purpose, at scale,” said Sharma. “As we now stand and see that the world of AI has impacted the technology universe, it’s not hard to see why people have referred to this period as the ‘age of irrational exuberance’ - and we can remember that the term itself came from ex-chairman of the Federal Reserve Alan Greenspan. But as we look back through the ages from steam, to early train travel, through automotive innovation and manufacturing, nothing important has ever really been innovated without a degree of irrational exuberance being involved, often where people have considered initial innovations in any given area to be outlandish or misguided at first.”

Prototyping AI is ‘easy’

Now that organizations are establishing their own AI Centres of Excellence (CoEs), the IT industry is seeing firms prototype AI applications. But Sharma is realistic, noting that this is also where so many businesses are experiencing a large number of challenges right now. She says that this is because prototyping AI is ‘easy’ when compared to operationalizing and deploying real-world, scalable AI solutions.

With so many ‘opinionated solutions’ offering AI based upon a proprietary ‘walled garden’ approach, shaped by the way Large Language Models (LLMs) and AI engines are created by the specialists that build them, businesses find it a struggle to deploy AI into production, stated Sharma, in something of a call for open source openness and wider collaborative exchange across communities, industries and teams.

Some of the proprietary cloud solutions that are less than open source might only let developers make a certain number of ‘calls’ (i.e. connections to an IT service). But Sharma advises that there is a place for everything, proprietary technology included; what matters is standardization, and that tends to happen more organically in open models.

The era of standardization

But we’ve seen this struggle before: the cloud-native world has established codifications and governance structures such as the OCI (Open Container Initiative) to offer essential standardization, which has helped us move forward. We will see the same thing happen with AI in the cloud-native arena (with the right guardrails in place) so that the platform engineering teams who build the infrastructure behind new, functional AI-empowered applications can do what they need to do. According to Sharma, the community is hard at work solving infrastructure challenges for AI. Echoing the voices of many in the tech industry, she notes that it’s still ‘early’ by some measures for AI, so the drive to standardization has several chapters still to come.

Another standard in this space (there are many, but let’s list one more) is Dapr (Distributed Application Runtime), a CNCF open source project that provides developers with a set of building-block Application Programming Interfaces (APIs) designed to abstract away the complexity of common challenges that developers encounter regularly when building distributed applications.
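
To make that ‘building block’ idea tangible, here is a minimal sketch of calling Dapr’s state management API over HTTP from Go. It assumes a Dapr sidecar listening on its default HTTP port (3500) and a state store component named "statestore", as in the Dapr quickstarts; the store name and the key are illustrative choices, not anything prescribed.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Assumed: a Dapr sidecar on localhost:3500 with a state store
	// component named "statestore" (quickstart defaults, not requirements).
	const base = "http://localhost:3500/v1.0/state/statestore"

	// Save state: the API accepts a JSON array of key/value entries, so the
	// same call works whatever backing store the component is configured with.
	payload := bytes.NewBufferString(`[{"key": "order-1", "value": {"qty": 3}}]`)
	resp, err := http.Post(base, "application/json", payload)
	if err != nil {
		panic(err)
	}
	resp.Body.Close()

	// Read the value back by key.
	resp, err = http.Get(base + "/order-1")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	data, _ := io.ReadAll(resp.Body)
	fmt.Println(string(data)) // prints: {"qty":3}
}
```

The design point is that the application talks to the sidecar’s stable API while the actual backing store (Redis, Postgres and so on) remains a swappable component behind it, precisely the kind of abstraction that the standardization argument favors.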

In addition to the thread of this discussion so far, CNCF executive director Sharma was also in media-facing conversations with Paige Bailey, lead product manager for generative models at Google DeepMind. The pair discussed the real pain points of evolving AI models in terms of their size and the capabilities that have surfaced.

The AI view from Google

“When I started working on machine learning back in 2009, we really saw software engineers faced with a whole new set of technology use cases that they had typically not been using before,” said Bailey. “They needed to work with a new tooling stack, new libraries, new infrastructure and so much more. As we now work to train our AI models with machine learning logic today, developers want to try out new models with new deployment constraints, new operational requirements and other factors spanning AI engineering tasks across training, fine-tuning and inference – in other words, so much is evolving right now.”

In terms of what kinds of tools the software industry should be using here, the overwhelming consensus of opinion pushes towards the use of open source software, open data models, open governance and open everything. This is perhaps because none of these AI models should be running in a vacuum, i.e. monitoring, observability, event logging, security scanning and a whole selection of infrastructure requirements need to be shared. While there are many paid technologies across the technology universe, many are urging teams and individuals to create open versions of the key facilitating technologies.

“Our goal has always been to champion openness and put the needs of the end user community forward first and foremost,” said Sharma. “We know that end users have asked us time and time again to ‘structure their engagement’ with the organization with all the training opportunities - and this whole effort leads us towards what we might call user-driven development. Cloud-native infrastructure offers the power needed to offer the resiliency that [our modern approach to] AI needs.”

She further notes that the CNCF knows that the engineers leading platform engineering functions are the ones being tasked with development in this space and so the organization wants to engender a real cross-pollination of talent going forwards, as it has always done in the past, but now with a renewed focus on the intersection point between AI and cloud-native technologies at the core.

"The advent of the 'AI everywhere; era has repositioned infrastructure as a central topic of discussion, with Kubernetes emerging as a focal point in these debates. The Kubernetes community is still in the process of identifying both the technical and cultural bridges that will connect AI to software infrastructure. This includes adapting to the stringent requirements imposed by AI ecosystem and people, such as GPU support, model deployment, workloads distribution, the tool ecosystem and methodologies, plus also the alignment of business and operational models of hardware and GPU manufacturers," said Cyrille Chausson, research manager for European application modernization strategies (lead) at IDC Europe. "The community is diligently working towards these goals, but the journey to standardize Kubernetes as the go-to solution for AI deployments will require time. This is particularly true given the rapid evolution of the AI and generative AI movement and the fact that a significant majority of organizations have yet to fully adopt cloud-native architectures."

Chausson further adds that aligning with AI discussions and constraints could potentially accelerate the transition from traditional lift-and-shift approaches to cloud-native ones in Europe (and elsewhere) with Kubernetes, thereby catalyzing application modernization with AI/GenAI functions.
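
For a flavor of what Chausson’s ‘GPU support’ bridge means at the Kubernetes API level, here is a minimal sketch, built with Kubernetes’ Go client libraries, of a pod spec that requests a single GPU as an extended resource. It is an illustration under assumed names, not anything prescribed by CNCF or IDC: the container image is hypothetical and the nvidia.com/gpu resource key depends on which device plugin a cluster runs.

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	pod := corev1.Pod{
		ObjectMeta: metav1.ObjectMeta{Name: "llm-inference"},
		Spec: corev1.PodSpec{
			Containers: []corev1.Container{{
				Name:  "server",
				Image: "example.com/llm-server:latest", // hypothetical image
				Resources: corev1.ResourceRequirements{
					Limits: corev1.ResourceList{
						// Extended resources such as GPUs are requested as
						// limits; the scheduler places the pod on a node
						// whose device plugin advertises this resource.
						"nvidia.com/gpu": resource.MustParse("1"),
					},
				},
			}},
		},
	}
	fmt.Printf("%+v\n", pod.Spec.Containers[0].Resources.Limits)
}
```

Because the request flows through the same resources API used for CPU and memory, the scheduler can place AI workloads on accelerator nodes with the machinery it already has, which is the kind of standardization the community is working towards.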

The Linux moment for Kubernetes?

If we had to sum up the zeitgeist and defining mood behind these cloud-native technologies right now, with microservices, containers, open source and the new application of AI and Large Language Models (LLMs) together, the CNCF and affiliated groups, including CNCF parent organization the Linux Foundation, would like to suggest that this could be the ‘Linux moment’ for Kubernetes.

What does that mean? It means the point when Linux started to really gain traction in the back office server space of working operational enterprises. It means the point at which Linux started to really worry Microsoft, way back when Linux was ‘a cancer’ (according to previous CEO Steve Ballmer), and even the enlightenment moment when Microsoft (according to current CEO Satya Nadella) came to love Linux. This quarter in 2024, some 45 companies have newly joined the CNCF and, further, the cloud-native development market is projected to grow (according to research from MMR) to $2.3 trillion by 2029, up from $547 billion in 2022. The Linux moment for Kubernetes is the point when its use becomes ‘assumed’, expected within any given deployment.

For Kubernetes to have that Linux moment means that when a company builds an app tomorrow, it goes cloud-native from the start, uses a distributed hybrid approach to application workload architecture, and then coalesces and manages all those elements using this technology as a standard.
