
Crossing the demo-to-production chasm with Snorkel Custom

Snorkel AI

Today, I’m incredibly excited to announce our new offering, Snorkel Custom, to help enterprises cross the chasm from flashy chatbot demos to real production AI value. The Snorkel team has spent the last decade pioneering the practice of AI data development and making it programmatic like software development.


Accelerating AI development in manufacturing with Snorkel Flow and AWS SageMaker

Snorkel AI

phData Senior ML Engineer Ryan Gooch recently evaluated options to accelerate ML model deployment with Snorkel Flow and AWS SageMaker. His evaluation involved developing and tuning a model to extract rich insights from invoices using Snorkel’s data development platform, then deploying it with AWS SageMaker.


Accelerating AI development in manufacturing with Snorkel Flow and SageMaker

Snorkel AI

Whether it’s creating real-time reporting on manufacturing output, constructing sophisticated data engineering pipelines, architecting robust MLOps solutions, or building models to help predict when to perform maintenance, we empower leading manufacturers to bring ambitious data and analytics projects to life.


Enterprise LLM challenges and how to overcome them

Snorkel AI

in-house to build production-deployable models. Between the proliferation of available models and team upskilling, that’s no longer true. This summer, I gave a presentation titled “Leveraging Foundation Models and LLMs for Enterprise-Grade NLP” at Snorkel AI’s Future of Data-Centric AI virtual conference. Low accuracy.


Two approaches to distill LLMs for better enterprise value

Snorkel AI

Large language models (LLMs) have taken center stage in the world of AI. While impressive, these models demand a lot of infrastructure and generate high costs. Distilling LLMs can create models that are just as powerful, but cheaper to run and easier to deploy. Let’s dive in.
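
The teaser above refers to distilling a large model into a smaller one. One common form is response-based knowledge distillation, where a compact student model is trained to match the teacher's temperature-softened output distribution. A minimal sketch of that objective in plain Python (all function names here are illustrative, not from Snorkel's platform):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the
    student's, scaled by T^2 so gradients stay comparable across T."""
    p = softmax(teacher_logits, temperature)  # teacher "soft labels"
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

A higher temperature spreads probability mass over more classes, exposing the teacher's "dark knowledge" about how wrong answers relate to each other; the student minimizes this loss (often mixed with a standard cross-entropy term on hard labels).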


How Snorkel Flow users can register custom models to Databricks

Snorkel AI

Snorkel AI is thrilled to announce our partnership with Databricks and seamless end-to-end integration across the Databricks Data Intelligence Platform. The synergy between Snorkel and Databricks enables data scientists to navigate their entire machine learning pipeline—from data access to model deployment—all within Snorkel Flow.