
'Users should be wary of taking answers they receive from AI systems for granted'
The Peninsula
DOHA: Artificial Intelligence is no longer confined to the imagination or to fantasy films. Today, it can detect cancerous tumours, translate spoken languages, compose music, write text messages, recommend holiday destinations, and even chat with us.
Because of this, people have become increasingly reliant on AI. Some have found it a valuable companion and a good listener that does not argue with or judge them, while others have come to see it as a way of solving their problems or keeping their secrets — often without realising the potential risks. Many have grown accustomed to taking the advice of these highly intelligent systems in all aspects of their lives, from choosing outfits and meals to making decisions that affect their present and future. Some even rephrase their questions repeatedly until they get the answers they want, treating AI as a remedy for doubt and uncertainty.
However, while AI is reshaping human behaviours, responsibilities, and choices, it is still far from perfect. In fact, it sometimes fabricates answers simply to please the questioner. Dr. Wajdi Zaghouani, an Associate Professor at Northwestern University in Qatar — one of Qatar Foundation’s (QF) partner universities — defines AI hallucination as the production of information that appears to be true but is, in fact, false or fabricated. “Imagine it as someone confidently telling you a story that seems believable, but the events of that story are completely wrong.
“I’ve seen some fascinating cases in my research. One common example is when AI systems generate fake academic citations; they create paper titles that sound legitimate, with realistic author and journal names, but the papers don’t exist.
“During my work in Arabic Natural Language Processing, I came across systems that generate fake Arabic proverbs which sound authentic, but have no basis in the culture. They capture the linguistic style perfectly, yet produce entirely fictional cultural content,” he adds.