How AI Became A Cloud ‘Workload’

Good technologies disappear. Although they are still present, really useful and effective technologies slip into the fabric of the other software tools and data services we all use every day. Almost like a home utility that you don’t really think about (who ponders the state of the electricity grid when they turn on a light, or thinks about the water company’s supply lines when they draw a bath?), good technologies like the spellchecker in your word processor or the screen refresh utility on your PC become almost invisibly absorbed.

That process has not yet happened with Artificial Intelligence (AI). It is drawing far too much fanfare and enjoying its time in the limelight thanks to the arrival of generative AI (gen-AI) and the proliferation of Large Language Models (LLMs), but AI has the potential to become an assumed, consumed and subsumed function that makes all our apps smarter in a pleasingly automated way.

AI as a workload

If that time comes, we will start to talk about AI itself as a system ‘workload’, i.e. a function that our enterprise or consumer software carries out to perform smart predictive, generative or reactive actions on our behalf. In fact, the IT industry has already started to use the term; it surfaces in the latest enterprise AI study from hybrid multi-cloud platform company Nutanix.
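
To make the idea concrete, here is a minimal sketch (the framework, endpoint name and stubbed model call are illustrative assumptions, not anything described in the Nutanix report) of a gen-AI inference step packaged as an ordinary service workload that infrastructure can then schedule, scale and move like any other:

# Illustrative sketch only: a generative AI inference step wrapped as a plain
# HTTP service workload. The model call is stubbed; in practice it would
# invoke a hosted LLM or a locally served model.
from flask import Flask, request, jsonify

app = Flask(__name__)

def generate_reply(prompt: str) -> str:
    # Placeholder for a real model call (an assumption, not a specific product API).
    return f"[model output for: {prompt}]"

@app.route("/v1/generate", methods=["POST"])
def generate():
    payload = request.get_json(force=True)
    prompt = payload.get("prompt", "")
    return jsonify({"completion": generate_reply(prompt)})

if __name__ == "__main__":
    # Run locally; the same container image could be scheduled on-premises,
    # in a public cloud or at the edge, which is what makes it a 'workload'.
    app.run(host="0.0.0.0", port=8080)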

The Nutanix State of Enterprise AI Report suggests that AI is now a workload that will advance hybrid multi-cloud adoption. Its first job - even before it gets to work on the applications in your pocket - will be to modernize an organization’s IT infrastructure, which will often need to be improved to more easily support and scale AI workloads.

“In just one year, gen-AI has completely upended the worldview of how technology will influence our lives. Enterprises are racing to understand how it can benefit their businesses,” said Sammy Zoghlami, SVP EMEA at Nutanix. “While most organizations are still in the early stages of evaluating the opportunity, many consider it a priority. [Our] survey uncovered an important theme among enterprises adopting AI solutions: a growing requirement for data governance and data mobility across datacenter, cloud, and edge infrastructure environments making it even more important for organizations to adopt a platform to run all apps and data across clouds.”

Invisible cloud services

It was last year (before the arrival of gen-AI, even) that Nutanix talked about a dreamy vision for so-called ‘invisible cloud’ services, so this theme is arguably starting to validate itself and take shape. This year the company says the enterprises it speaks to now plan to upgrade their AI applications or infrastructure. Firms struggle with this in many areas, but the movement of workloads (AI and otherwise) between Cloud Services Provider (CSP) hyperscalers is typically among the usual suspects.

Today, hybrid and multi-cloud deployments are well established and are synonymous with modern IT infrastructure workloads. AI technologies, along with growing requirements for speed and scale, are likely to bring edge strategies and infrastructure deployment to the forefront of IT modernization.

“It's probably simultaneously exciting and terrifying to be a datacenter manager right now,” said Greg Diamos, a Machine Learning (ML) systems builder and AI expert. “You don't have enough compute in your datacenter, no matter who you are.” Diamos’ comment was made in the context of the Nutanix report and the wider proposition that AI itself is driving a need for a) spiralling consumption of cloud services and b) greater agility to move workloads across the cloud landscape (for want of a cloudier, more skyward analogy): to capture price-performance deals, to make use of diversified services, to meet local and regional compliance legislation and so on.
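
To illustrate the kind of placement decision that this agility implies, here is a rough sketch in Python (the providers, regions, prices and compliance tags are invented for the example and reflect no real offering): filter candidate regions by a data-residency requirement, then rank what remains by price-performance:

# Illustrative placement logic only; regions, prices and residency tags are
# made up for the example and do not describe any real provider's catalogue.
from dataclasses import dataclass

@dataclass
class Region:
    provider: str
    name: str
    price_per_gpu_hour: float   # hypothetical list price
    gpu_perf_score: float       # hypothetical relative performance
    data_residency: str         # e.g. "EU", "US"

def place_workload(regions: list[Region], required_residency: str) -> Region:
    # Keep only regions that satisfy the compliance constraint...
    eligible = [r for r in regions if r.data_residency == required_residency]
    if not eligible:
        raise ValueError("no region satisfies the data-residency requirement")
    # ...then pick the best price-performance ratio among them.
    return max(eligible, key=lambda r: r.gpu_perf_score / r.price_per_gpu_hour)

candidates = [
    Region("cloud-a", "eu-west", 3.20, 1.0, "EU"),
    Region("cloud-b", "eu-central", 2.80, 0.9, "EU"),
    Region("cloud-a", "us-east", 2.50, 1.0, "US"),
]
print(place_workload(candidates, required_residency="EU"))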

A unified cloud operating model

Organizations now looking to migrate their existing applications to the public cloud can make use of Nutanix Cloud Clusters (NC2) on AWS, which provides the same cloud operating model on-premises as in the public cloud. This is all part of what the company calls a unified cloud operating model, i.e. most organizations of any reasonable size will inevitably use more than one cloud, so they need a management model that gives them that control.

“Customers can jumpstart cloud usage without going through the costly and time-consuming process of newly architecting an application,” said Zoghlami. “Nutanix licences are truly portable, meaning customers can choose where to run their applications and move them later if needed, without needing to purchase new licences. Customers can also use their existing AWS credits and purchase licences on the AWS marketplace.”

In the company’s cloud market study, almost all organizations say that security, reliability and disaster recovery are important considerations in their AI strategy. Also key is the need to manage and support AI workloads at scale. In the area of AI data rules and regulation, many firms think that data governance requirements will force them to more comprehensively understand and track data sources, data age and other key data attributes.
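
As a loose sketch of what tracking data sources, data age and other key attributes can mean in practice (the schema and field names below are assumptions for illustration, not a formal standard), a governance record for a dataset feeding an AI workload might look something like this:

# Illustrative governance metadata for a dataset used by an AI workload.
# The schema is an assumption for the example, not an industry standard.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetRecord:
    name: str
    source: str                  # where the data came from
    collected_on: date           # used to reason about data age
    residency: str               # where it may be stored and processed
    contains_pii: bool           # flags extra protection and DR requirements
    tags: list[str] = field(default_factory=list)

    def age_in_days(self, today: date) -> int:
        return (today - self.collected_on).days

record = DatasetRecord(
    name="support-tickets-2023",
    source="crm-export",
    collected_on=date(2023, 6, 30),
    residency="EU",
    contains_pii=True,
    tags=["fine-tuning", "internal"],
)
print(record.age_in_days(date(2024, 1, 1)))   # -> 185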

“AI technologies will drive the need for new backup and data protection solutions,” said Debojyoti ‘Debo’ Dutta, vice president of engineering for AI at Nutanix. “[Many companies are] planning to add mission-critical, production-level data protection and Disaster Recovery (DR) solutions to support AI data governance. Security professionals are racing to use AI-based solutions to improve threat and anomaly detection, prevention and recovery while bad actors race to use AI-based tools to create new malicious applications, improve success rates and attack surfaces, and improve detection avoidance.”

Generative AI in motion

While it’s one thing to ‘invent’ gen-AI, putting it into motion evidently means treating it as a cloud workload in and of itself. With cloud computing still misunderstood in some quarters and the cloud-native epiphany not shared by every company, the additional strains (for want of a kinder term) that gen-AI puts on the cloud should make us think more directly about AI as a cloud workload and about how we run it.
