AWS Technology VP: What Cloud Does Next

Cloud solidified. Early notions of cloud computing deployment in pre-millennial times were beset with concerns about application and Intellectual Property (IP) ownership: perplexed customers asked, ‘you mean, we let someone else look after our data?’ Security provisioning was flaky, incomplete and in some cases too fragile to mention.

The very act of constructing and developing cloud computing services was beleaguered by spiralling complexity, and nobody had really written a manual covering how to do all this stuff in the first place. Cloud engineers were few and, where migration did happen, only a small percentage of the software involved was newly created as cloud-native.

But that all changed.

As we all know, cloud has evolved, broadened, enjoyed many levels of ubiquitous deployment and benefitted from automation and simplification. All of this has helped move our notion of cloud computing from an ethereal, esoteric technology practice to the norm. In short, cloud solidified.

As pleasing as this progression is for most of us, it wasn’t easy at times. So how did we get to where we are now and what does the cloud do next? One person who has lived through this brief history of tech time and understands how new tiers of virtualized computing are now being built is Mai-Lan Tomsen Bukovec. In her role as vice president of technology at Amazon Web Services, Inc. (AWS), Tomsen Bukovec works with a team of engineers tasked with creating and running the cloud services of today and tomorrow.

Starting from first principles, one change that Tomsen Bukovec calls out is that, for AWS at least, security has been ‘job zero’ since the company launched its first cloud service in 2006. When a customer starts using compute, storage or any other resource with AWS, it is always secure by default.

“It’s part of the culture and mental model for how we build and it is in every aspect of what we provide for customers from our AWS datacenters to any type of usage,” said Tomsen Bukovec. “But it’s true that cloud services have been evolving quickly. Very often, organizations don’t have to make any changes to their applications to take advantage of the improvements we are constantly adding – this means they can focus on how they want to differentiate the applications and experiences that they deploy.”

No compression algorithm for experience

Analyst house Gartner predicts that by 2026, 75% of organizations will adopt a digital transformation model predicated on cloud as the fundamental underlying platform. Tomsen Bukovec is resolute on where we stand today and insists that (as she puts it) ‘there’s no compression algorithm for experience’ in this field. AWS reminds us that it has learned a lot about building cloud services since it launched its first generally available service, Amazon S3, in 2006. There are more than 240 AWS services available today and CEO Adam Selipsky has spoken of his firm’s focus on making cloud ‘simpler’ to use, so the skies should be clearing.

“What [CEO] Adam said about making the cloud simpler to use is definitely a focus for us and has been for a while now. Here’s one example. In 2018, we launched a new storage class called Amazon S3 Intelligent-Tiering which automatically charges less for data that isn’t accessed in a given month,” explained Tomsen Bukovec. “The longer the data sits in storage without being accessed, the more an organization automatically saves, with discounts of up to 95% compared to data that is accessed every day. That makes it very simple for users who don’t know if the access patterns of their storage will change and it is one of the reasons why companies pick this storage class for data lakes, where data access patterns change all the time.”
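To make that concrete, here is a minimal sketch using the AWS SDK for Python (boto3); the bucket name, object key and payload are placeholders, not AWS-prescribed values. Opting an object into this behavior is a single parameter at upload time:

```python
import boto3

s3 = boto3.client("s3")

# Upload an object directly into the S3 Intelligent-Tiering storage class.
# S3 then moves the object between access tiers automatically based on how
# often it is read; the bucket, key and payload below are placeholders.
s3.put_object(
    Bucket="example-data-lake-bucket",         # placeholder bucket name
    Key="events/2023/11/clickstream.parquet",  # placeholder object key
    Body=b"...",                               # object payload
    StorageClass="INTELLIGENT_TIERING",
)
```

From that point on, the tiering, and the resulting discount, happens without further intervention.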

With an engineering team focused on providing choice, Tomsen Bukovec says that users must always be in control. For example, with the Amazon EC2 compute service, customers have over 575 instance types to choose from, ranging from basic compute to high-performance compute featuring AWS-designed chips optimized for Machine Learning (ML). In Amazon S3, users can choose to store their data in any of the company’s seven storage classes. The storage options in Amazon S3 range from archive storage that companies can use to store backups and historical data at less than a tenth of a US cent per gigabyte (GB) per month, to highly performant storage that is used for the pre-training of foundation models (FMs) for generative Artificial Intelligence (gen-AI), like Falcon 3 and Claude.
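To give a flavor of how those storage classes are exercised in practice, here is a similar hedged sketch (boto3 again, with a placeholder bucket, prefix and age threshold) of a lifecycle rule that moves aging backups into the deep archive tier mentioned above:

```python
import boto3

s3 = boto3.client("s3")

# Lifecycle rule that transitions objects under the 'backups/' prefix to
# S3 Glacier Deep Archive (the sub-cent-per-GB archive tier) once they are
# 90 days old; bucket name, prefix and age threshold are placeholders.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",  # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-backups",
                "Filter": {"Prefix": "backups/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 90, "StorageClass": "DEEP_ARCHIVE"},
                ],
            },
        ],
    },
)
```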

AWS Trainium and AWS Inferentia are the company’s custom-designed chips built for training and inference work with ML and AI applications, respectively. These chips also have higher performance per watt than comparable GPU-powered instances, so it’s clear that firms like AWS and the other Cloud Service Provider (CSP) hyperscalers have a defined focus on seeking out the most energy-efficient processors for these workloads.

The difference is inference

“Looking forward, most of the future ML costs will come from running inference, which is the process of applying an ML model to a dataset and generating an output or ‘prediction’. We engineer AWS Inferentia to give the best price for performance for running any type of inference,” said Tomsen Bukovec. “In fact, we use AWS Inferentia in many teams across Amazon, which has been doing machine learning at scale for over 20 years now. Whether it is an intelligent storage class that dynamically adjusts pricing based on data access or purpose-built custom processors for training and inference in ML/AI, AWS is helping simplify how organizations choose the cloud technology they need.”

Asked to pinpoint one of her favorite technologies now surfacing, Tomsen Bukovec admits being ‘excited’ about the company’s third-generation Arm-based processor, AWS Graviton3, in EC2. At a time when we’re all concerned about the environmental impact of cloud, she points out that this chip uses up to 60% less energy than comparable Amazon EC2 instances for the same performance.

“As an organization, we are very focused on sustainability and combating climate change. In fact, in November of 2023 we added 78 new wind and solar projects to our portfolio – and we’re up to 479 projects globally so far for 2023 – enough to power 6.7M US homes or 19.4M European homes. These projects are powering Amazon operations like AWS datacenters and office buildings, and each project is bringing us closer to powering our operations with 100% renewables by 2025. I am personally very excited about this commitment to sustainability,” enthused Tomsen Bukovec.

It’s tough to move far in technology circles this year without hearing about gen-AI and, of course, AWS is working extensively in this field. As a software engineering purist, Tomsen Bukovec points to some of the ‘breakthroughs’ she is seeing in the generalized capabilities of different AI ‘foundation models’ (a term which we have previously explained in detail here) in companies across all industry verticals. Speaking to chief data officers regularly, Tomsen Bukovec says that firms are taking their existing AWS data infrastructure, which has security controls already built in, and using AWS services to work with that data to give company-specific context to Large Language Models (LLMs) with techniques like fine-tuning and retrieval augmented generation (RAG).

Nectar in vector

“That is where vector databases come in,” urged Tomsen Bukovec. “Most of our customers don’t want to pre-train their own foundation model. They want to take an existing foundation model and use it with a much smaller, specialized data set. When you are using an existing foundation model, you can take your own data and create ‘embeddings’ for it that are stored in a vector database so that your LLM can use it to provide the best, most relevant response to a user prompt. This significantly improves the relevancy of the responses that your LLM provides, which really matters in the user experience for generative AI applications. You’ll see more and more customers using their own custom data this way because it means you can get the benefit of the generalized capabilities of the foundation model combined with the context of a company’s own specialized data in the generative AI experience, and it is much more cost-effective and faster than pre-training your own foundation model from scratch.”
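To make the mechanics concrete, here is a minimal, self-contained sketch of the retrieve-and-augment flow Tomsen Bukovec describes. The toy embed function below is a stand-in for a real embedding model, the in-memory list is a stand-in for a vector database, and the documents and question are invented for illustration:

```python
import hashlib
import math

def embed(text: str, dims: int = 64) -> list[float]:
    """Toy embedding: hash each word into a fixed-size unit vector.
    A stand-in for a real embedding model, which would capture meaning."""
    vec = [0.0] * dims
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors from embed() are unit length, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

# 1. Ingest: embed the company documents and keep them with their vectors.
documents = [
    "Refunds are processed within five business days.",
    "Our support line is open weekdays from 9am to 5pm.",
    "Enterprise plans include a dedicated account manager.",
]
index = [(doc, embed(doc)) for doc in documents]  # stand-in vector database

# 2. Retrieve: find the stored document closest to the user's question.
question = "How long do refunds take?"
q_vec = embed(question)
best_doc, _ = max(index, key=lambda item: cosine(q_vec, item[1]))

# 3. Augment: splice the retrieved context into the prompt sent to the LLM.
prompt = f"Answer using this context:\n{best_doc}\n\nQuestion: {question}"
print(prompt)
```

In a production system the toy pieces give way to a real embedding model, a real vector store and an LLM call, but the shape of the flow (embed, retrieve by similarity, splice into the prompt) stays the same.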

In terms of actual market adoption for these still-nascent technologies, the AWS team says that there is no shortage of highly relevant use cases for generative AI. The use cases that seem to be the most prevalent across industries are improved knowledge search, automatic document generation, highly personalized experiences and automated processing of data or resources like images.

Organizations are also putting AI models to work in the ‘back office’ of the enterprise, applying them to tasks ranging from code generation assistants to automated data preparation agents. Amazon CodeWhisperer sits in this space: it provides AI-powered code recommendations for developers building applications, filters out code suggestions that might be considered biased or unfair, flags suggestions that resemble particular open source training data, and scans for vulnerabilities, proposing code to remediate them.

Internally, AWS ran a productivity challenge. Participants who used Amazon CodeWhisperer were 27% more likely to complete tasks successfully and did so an average of 57% faster than those who didn’t use it. In many (or in fact most) of these back-office use cases, there is still a ‘human in the loop’ (a factor explained here), such as a customer support agent, data scientist or developer examining the response from the foundation model and then deciding how to use it. It is the act of generation itself, the heart of generative AI, that dramatically shortens iteration time and ultimately improves efficiency for workers in many organizations.

But there are questions to be asked. Given the breadth of open data being used in generative AI and the fact that LLMs essentially rely upon open streams of information in order to learn, how does AWS position its stance on the secure ringfencing and guardrails needed to protect mission-critical corporate data in these use cases and scenarios?

“We have built cloud services for so many years for the enterprise and we take that same security-first approach to the use of generative AI,” urged Tomsen Bukovec. “When customizing Amazon CodeWhisperer and Amazon Bedrock to generate company-specific, relevant responses, customers’ codebases and data are completely private and do not train the underlying models, protecting their valuable Intellectual Property (IP). Customers simply point Amazon Bedrock at a few labeled examples in Amazon S3 and the service can fine-tune the model for a particular task without having to go through large volumes of data, which is very time-consuming. None of the customer’s data is used to train the original base models. Organizations can configure Virtual Private Cloud (VPC) settings to access Amazon Bedrock APIs and provide model fine-tuning data in a secure manner and all data is encrypted. That was true from even the beta launch of Bedrock. We build security into our services from the ground up.”
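As a rough sketch of what ‘pointing Amazon Bedrock at a few labeled examples in Amazon S3’ can look like, the snippet below prepares a small training file. The bucket, key and examples are placeholders, and the prompt/completion JSON Lines layout shown is the commonly documented fine-tuning format, which should be verified against current AWS documentation for the chosen base model:

```python
import json

import boto3

# A handful of labeled examples in the prompt/completion JSON Lines layout
# commonly used for Amazon Bedrock fine-tuning (verify the exact schema for
# your chosen base model); the content here is invented for illustration.
examples = [
    {"prompt": "Summarize: Q3 revenue rose 12% on cloud demand...",
     "completion": "Revenue grew 12% in Q3, driven by cloud."},
    {"prompt": "Summarize: Support ticket volume fell 8% after the update...",
     "completion": "Ticket volume dropped 8% post-update."},
]
body = "\n".join(json.dumps(e) for e in examples).encode("utf-8")

# Upload the training set to S3; a Bedrock model-customization job is then
# pointed at this S3 URI, and the data never trains the underlying base model.
boto3.client("s3").put_object(
    Bucket="example-fine-tune-bucket",  # placeholder bucket name
    Key="training/examples.jsonl",
    Body=body,
)
```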

Technically, it’s worth remembering that in the case of AWS users, data is already secured using AWS controls so chief security officers (CSOs) don’t have to come up with a new security model for generative AI applications.

Cloud computing in 2030?

As we move to a point - now - where cloud becomes the new normal, Tomsen Bukovec says that for most organizations today the question is no longer if, but how fast they can move to the cloud. In the AWS camp, enterprises have the choice of running their applications ‘as-is’ on AWS. This means they can shift on-premises software to the same environment in AWS without requiring code changes.

“Looking immediately ahead and even towards the end of this decade, cloud computing will be pervasive in everything we do,” said Tomsen Bukovec. “Cloud computing will power everything from smart cities to banking infrastructure to government services to the next social experiences. You already see it happening and with the introduction of powerful new technologies like generative AI that introduce Natural Language Processing (NLP) into the customer experience of technology, it’s just going to pick up speed and happen faster.”

We appear to be at a point when all organizations will need to develop cloud skills as part of their IT department’s digital modernization efforts. Cloud skills will be the new standard for organizational agility, whether that is in software development, data analysis, or any other job function in an organization.

The cloud forecast is more cloud, but the skies are clear and you won’t need a raincoat.
