
Securing MLflow in AWS: Fine-grained access control with AWS native services

AWS Machine Learning Blog

In a previous post, we discussed MLflow, how it can run on AWS, and how it integrates with SageMaker, in particular for tracking training jobs as experiments and deploying a model registered in MLflow to the SageMaker managed infrastructure. To automate the infrastructure deployment, we use the AWS Cloud Development Kit (AWS CDK).
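As a rough sketch of the tracking pattern described here (the tracking URI, experiment name, parameters, and model location below are placeholders, not values from the original post), a SageMaker training run might be logged to MLflow like this:

```python
import mlflow

# Placeholder tracking server URI; in the post this would be the MLflow
# server running on AWS, with its infrastructure defined in AWS CDK.
mlflow.set_tracking_uri("http://my-mlflow-server.example.com:5000")
mlflow.set_experiment("sagemaker-training-jobs")

with mlflow.start_run(run_name="xgboost-training"):
    # Log hyperparameters and metrics reported by the SageMaker training job.
    mlflow.log_param("instance_type", "ml.m5.xlarge")
    mlflow.log_param("max_depth", 6)
    mlflow.log_metric("validation:rmse", 0.42)

# Register the trained model artifact so it can later be deployed to SageMaker.
mlflow.register_model(
    model_uri="s3://my-bucket/model-artifacts/model.tar.gz",
    name="my-registered-model",
)
```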


How VirtuSwap accelerates their pandas-based trading simulations with an Amazon SageMaker Studio custom container and AWS GPU instances

AWS Machine Learning Blog

Prerequisites: To run this step-by-step guide, you need an AWS account with permissions to SageMaker, Amazon Elastic Container Registry (Amazon ECR), AWS Identity and Access Management (IAM), and AWS CodeBuild. Complete the following steps: Sign in to the AWS Management Console and open the IAM console.
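Before following the walkthrough, a quick way to confirm that your credentials cover the services listed above is to probe each one with a harmless read-only call. This is only a sketch (the region is a placeholder), not part of the original guide:

```python
import boto3
from botocore.exceptions import ClientError

# Read-only calls used purely to confirm the caller's credentials can
# reach each service named in the prerequisites.
checks = {
    "sagemaker": lambda c: c.list_domains(),
    "ecr": lambda c: c.describe_repositories(maxResults=1),
    "iam": lambda c: c.list_roles(MaxItems=1),
    "codebuild": lambda c: c.list_projects(),
}

for service, call in checks.items():
    client = boto3.client(service, region_name="us-east-1")
    try:
        call(client)
        print(f"{service}: access OK")
    except ClientError as exc:
        print(f"{service}: {exc.response['Error']['Code']}")
```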


Promote pipelines in a multi-environment setup using Amazon SageMaker Model Registry, HashiCorp Terraform, GitHub, and Jenkins CI/CD

AWS Machine Learning Blog

In the rapidly evolving landscape of artificial intelligence (AI) and machine learning (ML), building out a machine learning operations (MLOps) platform is essential for organizations to seamlessly bridge the gap between data science experimentation and deployment while meeting requirements around model performance, security, and compliance.


Use LangChain with PySpark to process documents at massive scale with Amazon SageMaker Studio and Amazon EMR Serverless

AWS Machine Learning Blog

By using the Livy REST APIs, SageMaker Studio users can extend their interactive analytics workflows beyond notebook-based scenarios, enabling a more comprehensive and streamlined data science experience within the Amazon SageMaker ecosystem. The same interface is also used for provisioning EMR clusters.
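For context, the Livy REST interface mentioned here is a plain HTTP API. A minimal sketch of calling it directly (the endpoint URL is a placeholder, and the original post drives this through SageMaker Studio rather than raw requests) looks like this:

```python
import json
import time

import requests

# Placeholder Livy endpoint; in practice this points at the cluster's Livy
# server (port 8998 by default), reachable from the Studio environment.
LIVY_URL = "http://emr-primary-node.example.com:8998"
HEADERS = {"Content-Type": "application/json"}

# Create an interactive PySpark session.
session = requests.post(
    f"{LIVY_URL}/sessions",
    headers=HEADERS,
    data=json.dumps({"kind": "pyspark"}),
).json()
session_id = session["id"]

# Poll until the session is ready to accept statements.
while requests.get(f"{LIVY_URL}/sessions/{session_id}").json()["state"] != "idle":
    time.sleep(5)

# Submit a Spark statement and print its initial status.
statement = requests.post(
    f"{LIVY_URL}/sessions/{session_id}/statements",
    headers=HEADERS,
    data=json.dumps({"code": "print(spark.range(10).count())"}),
).json()
print(statement["id"], statement["state"])
```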


Fine-tune LLMs with synthetic data for context-based Q&A using Amazon Bedrock

AWS Machine Learning Blog

Amazon Bedrock offers a serverless experience, so you can get started quickly, privately customize foundation models (FMs) with your own data, and integrate and deploy them into your applications using AWS tools without having to manage any infrastructure. Our dataset includes Q&A pairs with reference documents about AWS services.
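As an illustration of what a context-based Q&A training record could look like (the field names and sample text below are assumptions for the sketch, not the schema or data used in the post), each example can be flattened into a prompt/completion pair in JSONL:

```python
import json

# Illustrative records only: the prompt/completion layout follows a common
# JSONL convention for model customization, not the exact format of the post.
qa_pairs = [
    {
        "document": "Amazon S3 is an object storage service...",
        "question": "What kind of storage does Amazon S3 provide?",
        "answer": "Amazon S3 provides object storage.",
    },
]

with open("train.jsonl", "w") as f:
    for pair in qa_pairs:
        record = {
            "prompt": f"Context: {pair['document']}\n\nQuestion: {pair['question']}",
            "completion": pair["answer"],
        }
        f.write(json.dumps(record) + "\n")
```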


Use Amazon SageMaker Model Card sharing to improve model governance

AWS Machine Learning Blog

During AWS re:Invent 2022, AWS introduced new ML governance tools for Amazon SageMaker that simplify access control and enhance transparency over your ML projects. Depending on your governance requirements, the Data Science and Dev accounts can be merged into a single AWS account.
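As a minimal sketch of creating a model card through the SageMaker API (the card name and content are placeholders, and a real card would carry far more detail):

```python
import json

import boto3

sagemaker = boto3.client("sagemaker")

# Illustrative card content only; the model card schema supports many more
# sections (intended uses, training details, evaluation results, and so on).
content = {
    "model_overview": {
        "model_description": "Churn prediction model shared for governance review.",
    }
}

sagemaker.create_model_card(
    ModelCardName="churn-prediction-card",  # placeholder name
    Content=json.dumps(content),
    ModelCardStatus="Draft",  # Draft | PendingReview | Approved | Archived
)
```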


Use Amazon SageMaker Studio with a custom file system in Amazon EFS

AWS Machine Learning Blog

Amazon EFS provides a scalable, fully managed, elastic NFS file system for AWS compute instances. Each user within the domain has their own private space on the EFS file system, allowing them to store and access their own data and files, and a shared folder lets users exchange data between their private spaces.
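As a minimal sketch of attaching a per-user directory on the EFS file system to a Studio user profile (the domain ID, profile name, file system ID, path, and POSIX IDs below are placeholders, and the exact settings depend on how the domain is configured):

```python
import boto3

sagemaker = boto3.client("sagemaker")

# All identifiers below are placeholders, not values from the original post.
sagemaker.update_user_profile(
    DomainId="d-xxxxxxxxxxxx",
    UserProfileName="data-scientist-1",
    UserSettings={
        # POSIX identity the Studio space uses when mounting the file system.
        "CustomPosixUserConfig": {"Uid": 10000, "Gid": 10000},
        # Attach the user's private directory on the EFS file system.
        "CustomFileSystemConfigs": [
            {
                "EFSFileSystemConfig": {
                    "FileSystemId": "fs-0123456789abcdef0",
                    "FileSystemPath": "/users/data-scientist-1",
                }
            }
        ],
    },
)
```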
