operate following the same principles, the resulting problems are likely to
be very similar, too.
The main issue causing these problems is called “model collapse”.20 It was the subject of an article in Nature in July 2024,21 which triggered a number of other articles in popular science journals and on tech websites discussing
the phenomenon. However, some experts had already issued warnings about it a year earlier, when applications such as ChatGPT were still relatively new.
Carl Franzen, for example, asked in June 2023: “What happens as AI-generated content proliferates around the internet, and AI models begin to train on it, instead of on primarily human-generated content?”22
As discussed in the previous section, LLMs operate by scouring a very large database – ideally the entire World Wide Web – in order to determine what the most convincing response to a prompt might be. Model collapse occurs if a larger and larger share of the data consulted consists of data previously created by AI, as pointed out by Shumailov et al.: “indiscriminately learning from data produced by other models causes ‘model collapse’—a degenerative process whereby, over time, models forget the true underlying data distribution.”23 This means
that, over time, models start losing information about the true distribution, which first starts with tails disappearing, and learned behaviours converge over the generations to a point estimate with very small variance.24
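To make the mechanism described in this quotation concrete, the following short Python sketch – not a reproduction of Shumailov et al.’s experiments, merely a toy illustration under simplified assumptions – repeatedly re-estimates a one-dimensional Gaussian from data generated by the previous estimate, with each estimate standing in for a model trained on its predecessor’s output. Over successive generations the estimated standard deviation tends to drift towards zero and values in the tails of the original distribution stop appearing, i.e. the learned behaviour converges towards a point estimate with very small variance.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SAMPLES = 50        # small per-generation training sets make the effect visible quickly
N_GENERATIONS = 1000  # each generation is a "model" fitted to the previous generation's output

# Generation 0: "human-made" training data drawn from the true distribution N(0, 1).
data = rng.normal(0.0, 1.0, size=N_SAMPLES)

for gen in range(1, N_GENERATIONS + 1):
    # "Train" the next model: estimate mean and standard deviation from the
    # previous generation's output, then let that fitted model generate the
    # next generation's training data.
    mu, sigma = data.mean(), data.std()
    data = rng.normal(mu, sigma, size=N_SAMPLES)
    if gen % 200 == 0:
        tail = int((np.abs(data) > 2.0).sum())  # samples still in the original distribution's tails
        print(f"generation {gen:4d}: std = {sigma:.4f}, samples with |x| > 2: {tail}")
```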
20 Devin Coldewey, “‘Model Collapse’: Scientists Warn Against Letting AI Eat Its Own Tail,” Tech Crunch, July 24, 2024, https://techcrunch.com/2024/07/24/model-collapse-scientists-warn-against-letting-ai-eat-its-own-tail/.
21 Ilia Shumailov, et al., “AI Models Collapse When Trained on Recursively Generated Data,” Nature 631 (24 July 2024): 755–9, https://www.nature.com/articles/s41586-024-07566-y. Shumailov et al. discuss the impact of model collapse not only on LLMs but also on other AI systems such as Gaussian Mixture Models (GMMs) and Variational Autoencoders (VAEs), but these need not be discussed further in our context.
22 Carl Franzen, “The AI Feedback Loop: Researchers Warn of ‘Model Collapse’ as AI Trains on AI-Generated Content,” Venture Beat, June 12, 2023, https://venturebeat.com/ai/the-ai-feedback-loop-researchers-warn-of-model-collapse-as-ai-trains-on-ai-generated-content/. Franzen’s text was based on an earlier paper by the Shumailov team, submitted on 27 May 2023: Ilia Shumailov, et al., “The Curse of Recursion: Training on Generated Data Makes Models Forget,” arXiv:2305.17493 (27 May 2023), https://doi.org/10.48550/arXiv.2305.17493.
23 Shumailov, et al., “AI Models Collapse.”
24 Ibid.