One explanation is Hapsburg AI. AI-generated content has been splattered far and wide, deliberately made hard to identify and avoid. But when you train new AI on old AI-generated material, you tend to get gibberish. So they shot themselves in the foot there.
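To see why, here's a toy sketch (not anyone's actual training pipeline, just a numpy illustration): stand in for a "model" with a Gaussian fit to its training data, and have each generation train on a finite sample drawn from the previous generation's model rather than from the real data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "real" data from a rich distribution.
data = rng.normal(loc=0.0, scale=1.0, size=10_000)

for gen in range(1, 11):
    # Fit a crude "model" (here, just a Gaussian) to the current data...
    mu, sigma = data.mean(), data.std()
    # ...then build the next generation's training set from the model's
    # own output, using a small finite sample, as scraped AI text would be.
    data = rng.normal(loc=mu, scale=sigma, size=200)
    print(f"gen {gen}: mu={mu:+.3f}, sigma={sigma:.3f}")
```

Run it and sigma tends to drift downward while mu wanders: each resampling step loses the tails of the original distribution, and the errors compound. The sample size of 200 is an arbitrary choice to make the drift visible quickly; the same inbreeding happens more slowly with bigger samples.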
Another is that humans have become worse at many logical tasks, such as summarizing data, as education degrades due to low investment. I've seen some painfully bad summaries of scientific studies. You can't teach what you don't know, so AI learning from idiots will be inept.