Original link: AI models collapse when trained on recursively generated data / Nature.
Sounds very logical.
If an LLM produces some percentage of bad output from good input and always produces bad output from bad input, then quality will obviously degrade when output is repeatedly fed back in as input: the share of good data can only shrink with each generation.
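A minimal sketch of that intuition, with hypothetical parameters (the `p_good` value and example count are assumptions, not figures from the paper): if each training generation keeps a good example good with probability p and never repairs a bad one, the expected share of good data after n generations is pⁿ, which decays geometrically toward zero.

```python
import random

def degrade(data: list[bool], p_good: float) -> list[bool]:
    """One generation of retraining: a good example stays good with
    probability p_good; a bad example stays bad forever."""
    return [example and (random.random() < p_good) for example in data]

# Hypothetical parameters: 10,000 examples, 95% of good input stays good.
p_good = 0.95
data = [True] * 10_000  # start from fully good data

for generation in range(1, 31):
    data = degrade(data, p_good)
    share = sum(data) / len(data)
    if generation % 5 == 0:
        # Expected share is p_good ** generation, a geometric decay.
        print(f"gen {generation:2d}: good share = {share:.3f} "
              f"(expected {p_good ** generation:.3f})")
```

Under these assumptions the good share falls below 50% within about 14 generations and keeps dropping, which matches the intuition above.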