LLM Citation Loops
noun
Definition:
A recursive feedback phenomenon in which AI-generated content cites prior AI outputs, creating a loop of self-reinforcing references. Because later models may train on or retrieve this recycled material, errors can compound with each cycle. Over time, this can distort the factual record, inflate the authority of low-quality sources, or embed misinformation in future generative outputs.
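The compounding dynamic can be illustrated with a minimal simulation. The sketch below is not drawn from any real system; the function name and all parameters (`seed_ai_fraction`, `cites_per_doc`, and so on) are illustrative assumptions. Each generation of documents cites random earlier documents, and any document citing AI-influenced material is counted as AI-influenced itself, so the influenced share of the pool grows with each cycle.

```python
import random

def simulate_citation_loop(generations=8, pool_size=1000,
                           seed_ai_fraction=0.05, cites_per_doc=3):
    """Toy model of a citation loop: each generation of documents
    cites sources drawn at random from the existing pool, and any
    document citing AI-influenced material becomes AI-influenced."""
    random.seed(42)  # reproducible run; parameters are illustrative
    # True = AI-influenced document, False = human-sourced document
    pool = [random.random() < seed_ai_fraction for _ in range(pool_size)]
    for gen in range(1, generations + 1):
        # each new document inherits AI influence from any of its citations
        new_docs = [any(random.sample(pool, cites_per_doc))
                    for _ in range(pool_size)]
        pool.extend(new_docs)
        share = sum(pool) / len(pool)
        print(f"generation {gen}: {share:.1%} of the pool is AI-influenced")

simulate_citation_loop()
```

Even from a 5% seed, the influenced share rises with every generation in this toy model, because new documents only ever add references to the pool and nothing removes them; that one-way accumulation is the self-reinforcing loop the term names.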
Usage:
“Low-quality blogs repeated the same AI-written errors, contributing to an LLM citation loop that misrepresented the topic.”
Compare:
AI Hallucination, Information Cascade, Training Contamination