
Racing into the future: How AWS DeepRacer fueled my AI and ML journey

AWS Machine Learning Blog

In 2018, I sat in the audience at AWS re:Invent as Andy Jassy announced AWS DeepRacer, a fully autonomous 1/18th scale race car driven by reinforcement learning. AWS DeepRacer instantly captured my interest with its promise that even inexperienced developers could get involved in AI and ML.


Enhance customer support with Amazon Bedrock Agents by integrating enterprise data APIs

AWS Machine Learning Blog

They can ask questions like "What wiper blades fit a 2021 Honda CR-V?" or "Tell me about part number 76622-T0A-A01." Developer tools: the solution also uses AWS Powertools for Lambda, a suite of utilities for Lambda functions that generates OpenAPI schemas from your Lambda function code.
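As a rough illustration of that schema generation, the sketch below uses the Python version of Powertools for AWS Lambda to define a hypothetical /wiper_blades route and print the OpenAPI schema a Bedrock agent action group would consume. The route, parameters, and lookup logic are placeholders, not the post's actual implementation.

```python
from typing import Annotated

from aws_lambda_powertools.event_handler import BedrockAgentResolver
from aws_lambda_powertools.event_handler.openapi.params import Query
from aws_lambda_powertools.utilities.typing import LambdaContext

app = BedrockAgentResolver()


@app.get("/wiper_blades", description="Look up wiper blades that fit a given vehicle")
def get_wiper_blades(
    model: Annotated[str, Query(description="Vehicle model, for example CR-V")],
    year: Annotated[int, Query(description="Model year, for example 2021")],
) -> str:
    # Placeholder lookup; a real solution would call the parts catalog API here.
    return f"Compatible wiper blades for a {year} {model}: ..."


def lambda_handler(event: dict, context: LambdaContext) -> dict:
    return app.resolve(event, context)


if __name__ == "__main__":
    # Emit the OpenAPI schema that gets attached to the Bedrock agent.
    print(app.get_openapi_json_schema())
```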


Create a multimodal chatbot tailored to your unique dataset with Amazon Bedrock FMs

AWS Machine Learning Blog

In this post, we show how to create a multimodal chat assistant on Amazon Web Services (AWS) using Amazon Bedrock models, where users can submit images and questions, and text responses will be sourced from a closed set of proprietary documents. For this post, we recommend activating these models in the us-east-1 or us-west-2 AWS Region.
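At the core of such an assistant is a multimodal model call. The snippet below is a minimal sketch using the Bedrock Converse API; the image file, question, and model ID are illustrative assumptions, and the retrieval of answers from the proprietary document set described in the post is omitted.

```python
import boto3

# Minimal multimodal call to a Bedrock model via the Converse API.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical image submitted by the user.
with open("example_diagram.png", "rb") as f:
    image_bytes = f.read()

response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{
        "role": "user",
        "content": [
            {"image": {"format": "png", "source": {"bytes": image_bytes}}},
            {"text": "What does this diagram show?"},
        ],
    }],
)
print(response["output"]["message"]["content"][0]["text"])
```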


AWS machine learning supports Scuderia Ferrari HP pit stop analysis

AWS Machine Learning Blog

In this post, we share how Amazon Web Services (AWS) is helping Scuderia Ferrari HP develop more accurate pit stop analysis techniques using machine learning (ML). Since implementing the solution with AWS, track operations engineers can synchronize the data up to 80% faster than manual methods.


Easily deploy and manage hundreds of LoRA adapters with SageMaker efficient multi-adapter inference

AWS Machine Learning Blog

You can use open-source libraries, or the AWS managed Large Model Inference (LMI) deep learning container (DLC), to dynamically load and unload adapter weights. Prerequisites: to run the example notebooks, you need an AWS account with an AWS Identity and Access Management (IAM) role with permissions to manage the resources created.
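In the multi-adapter setup, each LoRA adapter is deployed as its own inference component on a shared base-model endpoint, and the adapter is selected per request. The following is a minimal sketch, assuming a hypothetical endpoint name, adapter component name, and LMI-style payload.

```python
import json

import boto3

runtime = boto3.client("sagemaker-runtime")

# The adapter is selected at invocation time via InferenceComponentName;
# endpoint and component names below are placeholders.
response = runtime.invoke_endpoint(
    EndpointName="base-model-multi-adapter-endpoint",
    InferenceComponentName="customer-support-lora-adapter",
    ContentType="application/json",
    Body=json.dumps({
        "inputs": "Summarize this support ticket: ...",
        "parameters": {"max_new_tokens": 128},
    }),
)
print(json.loads(response["Body"].read()))
```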


Use Amazon Q to find answers on Google Drive in an enterprise

AWS Machine Learning Blog

Google Workspace users and groups are mapped to the _user_id and _group_ids fields associated with the Amazon Q Business application in AWS IAM Identity Center. After you configure your identity source, you can look up users or groups to grant them single sign-on access to AWS accounts, applications, or both.
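Once the identity source is connected, those lookups go against the IAM Identity Center identity store. Below is a minimal sketch using the boto3 identitystore client; the identity store ID and user name are placeholders.

```python
import boto3

identitystore = boto3.client("identitystore")

# Find the IAM Identity Center user that corresponds to a Google Workspace
# user name (placeholder values shown).
resp = identitystore.list_users(
    IdentityStoreId="d-1234567890",
    Filters=[{"AttributePath": "UserName", "AttributeValue": "jane.doe@example.com"}],
)
for user in resp["Users"]:
    print(user["UserId"], user["UserName"])
```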


Derive generative AI powered insights from Alation Cloud Services using Amazon Q Business Custom Connector

AWS Machine Learning Blog

This blog post is co-written with Gene Arnold from Alation. This post shows how to configure an Amazon Q Business custom connector and derive insights by creating a generative AI-powered conversation experience on AWS using Amazon Q Business while using access control lists (ACLs) to restrict access to documents based on user permissions.
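A custom connector typically ingests documents with the Amazon Q Business BatchPutDocument API and attaches the ACL alongside the content. The sketch below is illustrative only; the application and index IDs, document body, and principal are placeholders rather than the post's Alation connector code.

```python
import boto3

qbusiness = boto3.client("qbusiness")

# Ingest one document and attach an ACL so only the allowed principal can
# see it in Amazon Q Business. IDs, content, and principal are placeholders.
qbusiness.batch_put_document(
    applicationId="my-qbusiness-app-id",
    indexId="my-qbusiness-index-id",
    documents=[{
        "id": "alation-article-42",
        "title": "Data dictionary: quarterly revenue tables",
        "contentType": "PLAIN_TEXT",
        "content": {"blob": b"Column definitions for the quarterly revenue tables..."},
        "accessConfiguration": {
            "accessControls": [{
                "principals": [{
                    "user": {
                        "id": "jane.doe@example.com",
                        "access": "ALLOW",
                        "membershipType": "DATASOURCE",
                    }
                }]
            }]
        },
    }],
)
```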
