Repetita Non Iuvant: Why Generative AI Models Cannot Feed Themselves
Language: English
Track 1 - Data Science
(Green House)
Time: 10:45 - 11:30
Abstract
As AI floods the digital landscape with content, what happens when it starts repeating itself?
This talk explores model collapse: the progressive degradation that occurs when LLMs and image generators are trained on their own outputs, eroding their ability to produce novel content.
We will show how self-training leads to bias and loss of diversity, examine the causes of this degradation, and quantify its impact on model creativity.
Finally, we will present concrete strategies to safeguard the future of generative AI, emphasizing the critical need to preserve innovation and originality.
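As an illustrative sketch (an assumption of ours, not material from the talk itself), the loss of diversity under self-training can be mimicked with a toy resampling loop: each "generation" is trained only on samples drawn from the previous one, so the vocabulary can never grow and tends to shrink over time.

```python
import random

def next_generation(pool, rng):
    # "Train" on the previous generation's output: sample with
    # replacement, so the new pool's support is a subset of the old one.
    return [rng.choice(pool) for _ in pool]

rng = random.Random(0)          # fixed seed for reproducibility
pool = list(range(50))          # 50 distinct "tokens" to start with
diversity = [len(set(pool))]    # track unique tokens per generation

for _ in range(200):
    pool = next_generation(pool, rng)
    diversity.append(len(set(pool)))

print(diversity[0], "->", diversity[-1])  # diversity only ever decreases
```

Because each generation can only reproduce tokens the previous one emitted, the unique-token count is monotonically non-increasing, a minimal analogue of the bias and diversity loss the abstract describes.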