AI-Hallucinated Citations in Libraries: A Discussion
With the advent and widespread use of large language models such as ChatGPT, hallucinations (fabricated information presented as factual, such as citations to sources that do not exist) have become more prevalent, even in academia.