Rex Kerr
Mar 5, 2024

I don't think that's necessarily the case. They probably just trained it on a data set that emphasized high diversity in output despite lower diversity in input, and the model dutifully hallucinated diversity everywhere.

Generative AI is really good at hallucinating. I found it pretty funny that, with all the worry about hallucinations, they ended up training the model to hallucinate more.
