
Ways to shoot yourself in the foot with Redis

Hacker News

Previously I discussed ways I've broken prod with PostgreSQL and with healthchecks. Now I'll show you how I've done it with Redis too. I've caused plenty of outages, and I hope that sharing them publicly helps some people bypass part one of the production-outage learning syllabus.


Roadmap to learning Large Language Models

Data Science Dojo

Embeddings, high-dimensional dense vectors that represent the semantic content of unstructured data, can remedy this issue.
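To make the idea concrete, here is a minimal sketch of how dense vectors capture semantic similarity. The four-dimensional vectors below are toy values invented for illustration (real embedding models emit hundreds or thousands of dimensions); the comparison function is standard cosine similarity.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product normalized by the two vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" (hypothetical values for illustration).
king = [0.8, 0.6, 0.1, 0.2]
queen = [0.7, 0.7, 0.2, 0.2]
banana = [0.1, 0.0, 0.9, 0.8]

# Semantically related items land close together in the vector space,
# so "king" scores higher against "queen" than against "banana".
print(cosine_similarity(king, queen) > cosine_similarity(king, banana))
```

In a real system the vectors would come from an embedding model and be compared at scale by a vector index rather than a pairwise loop.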



Liveblocks (real-time collaboration API) & Livekit (open source live video and audio API) - S01E04

Console DevTools podcast

Episode 4 of the Console DevTools Podcast, a devtools discussion with David Mytton (Co-founder, Console) and Jean Yang (CEO, Akita Software). Tools discussed: Liveblocks - real-time collaboration API. Livekit - open source live video and audio API. Zoom Bachelor. Read our selection criteria. Recorded: 2021-07-20.


Cracking the large language models code: Exploring top 20 technical terms in the LLM vicinity

Data Science Dojo

We will also discuss the different applications of LLMs, such as machine translation, question answering, and creative writing. Redis is an in-memory data store that can be used to store and retrieve data quickly. It is a popular choice for NLP applications because it is fast and scalable.
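A common way Redis is used in NLP serving is the cache-aside pattern: check the cache before recomputing an expensive result (such as an embedding), and write it back on a miss. The sketch below uses an in-memory dict stand-in for the Redis client so it runs without a server; the hypothetical `embed_text` function is a placeholder for a real model call.

```python
import json

class FakeRedis:
    """In-memory stand-in for a Redis client (GET/SET subset only)."""
    def __init__(self):
        self._store = {}
    def get(self, key):
        return self._store.get(key)
    def set(self, key, value):
        self._store[key] = value

def embed_text(text):
    # Placeholder for an expensive model call (hypothetical).
    return [float(len(word)) for word in text.split()]

def cached_embedding(client, text):
    # Cache-aside: check the cache first, compute and store on a miss.
    key = f"emb:{text}"
    hit = client.get(key)
    if hit is not None:
        return json.loads(hit)
    vector = embed_text(text)
    client.set(key, json.dumps(vector))
    return vector

client = FakeRedis()
v1 = cached_embedding(client, "fast scalable store")  # miss: computes and stores
v2 = cached_embedding(client, "fast scalable store")  # hit: served from cache
```

With a real deployment, `FakeRedis` would be replaced by a `redis.Redis` connection, and a TTL would typically be set so cached values expire.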


SQL vs. NoSQL: Decoding the database dilemma to perfect solutions

Data Science Dojo

Recapitulating the main points discussed, SQL databases provide strong consistency, ACID compliance, and robust query capabilities, making them ideal for transactional systems. Replication is the process of copying the data to multiple nodes. This ensures that the data is always available, even if one node fails.
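The atomicity half of ACID can be shown with a few lines of stdlib `sqlite3`: a transfer that would violate a constraint rolls back entirely, leaving both rows untouched. The table and balances are invented for the example.

```python
import sqlite3

# Two accounts with a CHECK constraint forbidding overdrafts.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE accounts (name TEXT PRIMARY KEY, "
    "balance INTEGER CHECK (balance >= 0))"
)
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])
conn.commit()

def transfer(conn, src, dst, amount):
    try:
        with conn:  # transaction: commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? "
                         "WHERE name = ?", (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? "
                         "WHERE name = ?", (amount, dst))
    except sqlite3.IntegrityError:
        pass  # overdraft hit the CHECK constraint; neither update is applied

transfer(conn, "alice", "bob", 30)   # succeeds: alice 70, bob 80
transfer(conn, "alice", "bob", 500)  # fails atomically: balances unchanged
balances = dict(conn.execute("SELECT name, balance FROM accounts"))
```

The same all-or-nothing guarantee is what makes SQL databases the default for transactional systems, as the excerpt notes.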


How we matured our ML-on-Kubernetes capabilities and saved on cloud costs

Snorkel AI

To address these challenges, we implemented autoscaling at multiple levels in our infrastructure, which we'll discuss in the following sections. Worker autoscaling: the Snorkel Flow platform abstracts compute into a paradigm where jobs wait in Redis queues and workers run as processes in worker pods.
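The queue-and-workers pattern described above can be sketched in a few lines. This is not Snorkel's implementation: `queue.Queue` and threads stand in for the Redis queue and worker pods, and the scale-by-queue-depth rule is an assumed illustration of how a worker autoscaler might size the pool.

```python
import queue
import threading

jobs = queue.Queue()   # stands in for the Redis job queue
results = []
lock = threading.Lock()

def worker():
    # Each worker drains jobs from the shared queue until it sees a sentinel.
    while True:
        job = jobs.get()
        if job is None:  # shutdown sentinel
            break
        with lock:
            results.append(job * 2)  # placeholder for real work

def desired_workers(queue_depth, jobs_per_worker=4, max_workers=8):
    # Assumed autoscaling rule: one worker per N queued jobs, capped,
    # never below one (ceiling division via negation).
    return min(max_workers, max(1, -(-queue_depth // jobs_per_worker)))

for j in range(10):
    jobs.put(j)

n = desired_workers(jobs.qsize())  # 10 jobs, 4 per worker -> 3 workers
threads = [threading.Thread(target=worker) for _ in range(n)]
for t in threads:
    t.start()
for _ in threads:
    jobs.put(None)  # one sentinel per worker
for t in threads:
    t.join()
```

A real autoscaler would run this sizing rule periodically against the Redis queue length and add or remove worker pods, rather than spawning threads once.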
