"Cambridge Dictionary names 'hallucinate' as word of the year, expanding its meaning
Cambridge Dictionary has declared 'hallucinate' as its word of the year, introducing an expanded definition.
Historically, to hallucinate meant to perceive something that is not actually there, such as seeing nonexistent pink elephants as a result of a head injury, drug use, or a medical condition.
Today, the concept extends beyond human experience. With tools like ChatGPT, Bard, Bing AI, and Snapchat's chatbot now part of daily life, artificial intelligence 'hallucinates' too. These AI systems, which analyze vast data sets to mimic human dialogue, occasionally produce inaccurate or unverified information. Recognizing this phenomenon, the Cambridge Dictionary now adds a definition for AI hallucination: "When an artificial intelligence, a computer system with human brain-like qualities, hallucinates, it generates false information."
Wendalyn Nichols, Publishing Manager at Cambridge Dictionary, emphasizes the importance of human critical thinking when using AI. Although AI excels at processing large volumes of data, its attempts to generate original output can produce errors, and AI tools can only be as reliable as the data they were trained on.
The new meaning of 'hallucinate' encapsulates the growing discourse around AI's role and reliability. It also serves as a prompt for tech companies to address AI inaccuracies, now wittily termed 'hallucinating'.