This year’s ‘Word of the Year’ is reflective of the times. What is it, and what does it say about our language landscape?
You’re not hallucinating…
The word ‘hallucinate’ is the Cambridge Word of the Year for 2023, a choice that highlights the growing impact of AI on language and society. At first, it may seem strange to pick such an old word.
The word ‘hallucinate’ has a fascinating etymology. It derives from the Latin verb ‘hallucinari’, meaning ‘to wander in the mind’, which in turn is thought to trace back to the Greek ‘alyein’, ‘to be distressed, to wander’. Its earliest recorded use in English dates to the early 1600s.
As language evolves, words acquire new meanings. In its latest sense, ‘hallucinate’ refers to an AI system generating false or misleading information.
A sign of the times
Editors noted that the word aptly captures the concerns surrounding AI, particularly generative AI, which can produce human-quality text. These systems, whilst powerful tools, are still prone to errors that can lead to the generation of fabricated facts and information.
The dictionary's decision aligns with Collins Dictionary's choice of ‘AI’ as its Word of the Year, further emphasising the pertinence of AI to our everyday lives. AI's influence on language is also evident in new terms such as ‘large language model’ and ‘generative AI’, which were added to the Cambridge Dictionary this year.
AI ethicist Henry Shevlin suggests that this usage stems from our tendency to anthropomorphise AI systems: we attribute human qualities to them, which can lead us to misinterpret their capabilities and limitations.
Despite these concerns, he expresses optimism, noting that AI companies are working to address the problem through human feedback and model specialisation. Users, too, are becoming increasingly aware of AI's limits and are learning to exercise caution when relying on AI-generated information.
This use of ‘hallucinate’ is particularly apt, harking back to the word's original meaning, ‘to wander in the mind’. Just as a hallucination is a disconnect from reality, an AI system can produce information that is false or distorted.
All of this underscores the transformative power of AI, but it also highlights the need for continued vigilance and critical thinking as we interact with these increasingly advanced systems.