RAGNAR: Retrieval-Augmented Generation using Networked and Advanced Relational Data
Abstract
Recent technological advances have enabled significant progress in several areas of Artificial Intelligence (AI), including Generative AI. Large Language Models (LLMs) are becoming increasingly capable, producing better results and broadening their real-world applicability. However, these models still suffer from issues such as hallucination and outdated information; the latter arises from the temporal gap between training and deployment. A Retrieval-Augmented Generation (RAG) architecture can mitigate these issues, since the information source is not part of the training phase, which also makes it easier to reuse models across applications. One of the challenges of RAG is applying it when the data source is a relational database, and the difficulty grows with the database's size and complexity. This article proposes an architecture and approach for addressing this problem and implementing RAG with a relational database as the data source.
Conference: 2024 8th International Symposium on Innovative Approaches in Smart Technologies (ISAS)
Read the full article: link.
Keywords: RAG, LLM, Generative AI
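The paper's full pipeline is not reproduced here, but one common way to realize RAG over a relational source is a text-to-SQL retrieval step followed by generation grounded in the retrieved rows. The sketch below is a minimal illustration under that assumption, not necessarily the architecture proposed in the article; it assumes SQLite, and the llm() function is a hypothetical stand-in for any LLM completion call.

```python
# Minimal sketch of RAG over a relational database via text-to-SQL retrieval.
# Assumptions: SQLite as the backend; llm() is a hypothetical completion call.
import sqlite3


def llm(prompt: str) -> str:
    """Hypothetical LLM call; replace with any completion API."""
    raise NotImplementedError


def answer(question: str, db_path: str) -> str:
    conn = sqlite3.connect(db_path)
    # Expose the schema so the model can ground its query in the actual tables.
    schema = "\n".join(
        row[0]
        for row in conn.execute("SELECT sql FROM sqlite_master WHERE type='table'")
        if row[0]
    )
    # Retrieval: ask the LLM to translate the question into SQL over that schema.
    sql = llm(f"Schema:\n{schema}\n\nWrite one SQLite query that answers: {question}")
    rows = conn.execute(sql).fetchall()
    # Generation: answer from the retrieved rows only, limiting hallucination
    # and decoupling the knowledge source from the model's training data.
    return llm(
        f"Question: {question}\nRetrieved rows: {rows}\n"
        "Answer using only these rows."
    )
```

Because the database is queried at inference time, the same model can serve different applications simply by pointing it at a different schema, which matches the reuse argument made in the abstract.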