Hallucination mainly happens when you rely on the LLM's general knowledge and ask questions that fall outside of it. For example, if you ask the LLM a question about proprietary data it has no access to, the LLM will still produce an answer, and that answer will be a hallucination. DGE, however, uses RAG (retrieval-augmented generation). This means the LLM is used mainly for formatting the content, because the actual context it needs to provide an answer is retrieved from your data and supplied as part of the prompt, as illustrated in the sketch below.
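To make the pattern concrete, here is a minimal, generic sketch of how a RAG-style prompt is assembled. This is not DGE's actual implementation; the document store, the `retrieve` scoring, and the `build_prompt` wording are all hypothetical stand-ins that only illustrate the idea of grounding the LLM in supplied context.

```python
from typing import List

# A toy in-memory "knowledge base" standing in for proprietary content.
DOCUMENTS = [
    "Acme Pro licenses include 24/7 phone support and a 99.9% uptime SLA.",
    "Acme Basic licenses are limited to email support during business hours.",
    "All Acme licenses can be upgraded from the customer portal at any time.",
]


def retrieve(question: str, documents: List[str], top_k: int = 2) -> List[str]:
    """Rank documents by naive keyword overlap with the question (real systems use vector search)."""
    terms = set(question.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_prompt(question: str, context: List[str]) -> str:
    """Instruct the model to answer only from the supplied context."""
    context_block = "\n".join(f"- {passage}" for passage in context)
    return (
        "Answer the question using ONLY the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {question}\nAnswer:"
    )


if __name__ == "__main__":
    question = "Does the Pro license include phone support?"
    context = retrieve(question, DOCUMENTS)
    prompt = build_prompt(question, context)
    print(prompt)  # This grounded prompt is what would be sent to the LLM.
```

Because the relevant facts travel inside the prompt, the LLM is not asked to recall proprietary information from its training data; it only rephrases and formats what it was given, which is what keeps hallucination to a minimum.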