📑 arXiv 2d ago
RAGognizer: Hallucination-Aware Fine-Tuning via Detection Head Integration
RAGognizer uses token-level hallucination annotations from real RAG outputs as a direct training signal, integrating a detection head during fine-tuning instead of treating hallucination detection as a post-hoc step. The model learns to flag generated tokens that are unsupported by the retrieved context, targeting closed-domain hallucinations in retrieval-augmented generation.
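The general idea of a joint LM + per-token detection head can be sketched as follows. This is a minimal toy illustration, not the paper's actual architecture: the model class, the GRU backbone, the `alpha` weighting, and the loss combination are all assumptions made for the sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DetectionHeadLM(nn.Module):
    """Toy decoder with two output heads (illustrative only, not RAGognizer itself):
    an LM head for next-token prediction and a per-token binary head that
    scores whether each generated token is unsupported by retrieved context."""

    def __init__(self, vocab_size=100, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)  # stand-in backbone
        self.lm_head = nn.Linear(hidden, vocab_size)
        self.detect_head = nn.Linear(hidden, 1)  # hallucination logit per token

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))          # (B, T, H)
        lm_logits = self.lm_head(h)                  # (B, T, vocab)
        det_logits = self.detect_head(h).squeeze(-1)  # (B, T)
        return lm_logits, det_logits

def joint_loss(lm_logits, det_logits, targets, halluc_labels, alpha=0.5):
    """Combine the usual LM cross-entropy with a BCE term over the
    token-level hallucination annotations; alpha is a hypothetical weight."""
    lm = F.cross_entropy(lm_logits.flatten(0, 1), targets.flatten())
    det = F.binary_cross_entropy_with_logits(det_logits, halluc_labels)
    return lm + alpha * det
```

Because the detection term is trained jointly rather than bolted on afterward, the shared hidden states are pushed to encode supportedness, which is the key contrast with post-hoc detectors drawn in the summary above.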