Creating Impact: A Spotlight on 6 Practical Retrieval Augmented Generation Use Cases

Abhinav Kimothi
3 min read · Dec 4, 2023

In 2023, RAG has become one of the most widely used techniques in the domain of Large Language Models. In fact, it is hard to find an LLM-powered application that doesn’t use RAG in one way or another. Here are 6 use cases in which RAG plays a pivotal role.

If you’re interested in finding out more about retrieval augmented generation, do give my blog a read —

Document Question Answering Systems

By giving an LLM access to proprietary enterprise documents, its responses can be grounded in, and limited to, the information those documents contain. A retriever searches for the most relevant documents and passes that information to the LLM. Check out this blog for an example —
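To make the pattern concrete, here is a minimal sketch of that loop in Python. The keyword-overlap retriever, the sample documents, and the final prompt hand-off are illustrative assumptions, not a specific product’s implementation; in practice you would swap in a vector store and a real LLM client.

```python
# Minimal, illustrative document QA sketch (not production code).
# The retriever below uses naive keyword overlap as a stand-in for
# embedding-based search, and the prompt is printed instead of being
# sent to an actual LLM.

DOCUMENTS = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Enterprise support is available 24/7 via the customer portal.",
    "The warranty covers manufacturing defects for two years.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by keyword overlap with the query and return the top-k."""
    query_terms = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(query_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Ground the model by restricting it to the retrieved context."""
    context_block = "\n".join(f"- {c}" for c in context)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {query}\nAnswer:"
    )

query = "How long do customers have to return a product?"
prompt = build_prompt(query, retrieve(query, DOCUMENTS))
print(prompt)  # In a real system, this prompt would be sent to your LLM of choice.
```

The key point of the sketch is the grounding step: the model is asked to answer only from the retrieved context, which is what keeps responses limited to the enterprise documents.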

Conversational Agents

Using RAG, LLMs can be customised with product/service manuals, domain knowledge, guidelines, and so on. The agent can also route users to more specialised agents depending on their query. SearchUnify has an LLM+RAG powered conversational agent for its users.

Real-time Event Commentary

Imagine a live event, such as a sports match or a breaking news story. A retriever can connect to real-time updates/data via APIs and pass this information to the LLM to create a virtual commentator. These can be further augmented with Text-to-Speech models. IBM leveraged this technology for commentary during the 2023 US Open.
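Here is a rough sketch of that pipeline, with the live feed simulated in code. The event data, the `live_event_feed` generator, and the commentary prompt are all hypothetical placeholders; a real deployment would poll a sports or news API and pass each prompt to an LLM and then a TTS model.

```python
# Illustrative real-time commentary sketch (not production code).
# A simulated feed stands in for a live sports/news API; each update is
# turned into a commentary prompt that would normally go to an LLM,
# with the generated text optionally passed to a Text-to-Speech model.

import time

def live_event_feed():
    """Stand-in for a real-time API; yields score updates as they arrive."""
    events = [
        {"minute": 12, "update": "Player A breaks serve to lead 3-2"},
        {"minute": 25, "update": "Player B saves two set points"},
        {"minute": 41, "update": "Player A takes the first set 6-4"},
    ]
    for event in events:
        yield event
        time.sleep(1)  # simulate waiting for the next live update

def commentary_prompt(event: dict) -> str:
    """Wrap the latest retrieved update in a commentary instruction."""
    return (
        "You are a live commentator. Describe this update in one vivid sentence:\n"
        f"Minute {event['minute']}: {event['update']}"
    )

for event in live_event_feed():
    prompt = commentary_prompt(event)
    print(prompt)  # In a real system: send to an LLM, then to a TTS model.
```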

Content Generation

The widest use of LLMs has probably been in content generation. Using RAG, the generation can be personalised to readers, incorporate real-time trends, and be contextually appropriate. Yarnit is an AI-based content marketing platform that uses RAG for multiple tasks.

Personalised Recommendations

Recommendation engines have been a game changer in the digital economy. LLMs are capable of powering the next evolution in content recommendations. Check out Aman’s blog on the utility of LLMs in recommendation systems.

Virtual Assistants

Virtual personal assistants like Siri, Alexa and others are planning to use LLMs to enhance the experience. Coupled with more context on user behaviour, these assistants can become highly personalised.

If you’re someone who follows Generative AI and Large Language Models let’s connect on LinkedIn — https://www.linkedin.com/in/abhinav-kimothi/

Also, please read a free copy of my notes on Large Language Models — https://abhinavkimothi.gumroad.com/l/GenAILLM

I write about Generative AI and Large Language Models. Please follow https://medium.com/@abhinavkimothi to read my other blogs —


Abhinav Kimothi

Co-founder and Head of AI @ Yarnit.app || Data Science, Analytics & AIML since 2007 || BITS-Pilani, ISB-Hyderabad || Ex-HSBC, Ex-Genpact, Ex-LTI || Theatre