Cambridge Dictionary's word of the year for 2023 is "hallucinate" – (0) ______________. The original definition of the chosen word is to "seem to see, hear, feel, or smell" something that does not exist, usually because of "a health condition or (21) ______________". It now has an additional meaning, relating to when artificial intelligence systems such as ChatGPT, which generates text that mimics human writing, "hallucinate" and (22) ______________.

The word was chosen because the new meaning "gets to the heart of why people are talking about AI", according to a post on the dictionary's site. Generative AI is a "powerful" but "far from perfect" tool, "one we're all still learning how (23) ______________ – this means being aware of both its potential strengths and its current weaknesses."

"AI hallucinations remind us that humans still need (24) ______________ to the use of these tools," the post continued. "Large language models (LLMs) are only as reliable as the information their algorithms learn from. Human expertise is arguably more important than ever, (25) ______________ that large language models can be trained on."

Henry Shevlin, an AI ethicist at the University of Cambridge, said it is striking that, rather than choosing a computer-specific term such as "bugs" to describe the mistakes that LLMs make, the dictionary team (26) ______________. He said this may be because "it's so easy to anthropomorphise these systems, treating them as if they had minds of their own."

The dictionary provides two example sentences of "hallucinate" relating to AI: "LLMs are notorious for hallucinating – (27) ______________, often supported by fictitious citations" and "The latest version of the chatbot is greatly improved, but it will still hallucinate facts."

It wasn't only Cambridge's decision (28) ______________: according to Collins Dictionary, the word of 2023 is "AI" itself.
