
AI 'hallucinations' could prove real problem for owner of fire-ravaged Vancouver property
CBC
As the owner of a fire-ravaged property he's been accused of neglecting, Fu De Ren has defended himself in both civil proceedings and bylaw trials.
But the East Vancouver landlord may be forced to pay for his latest attempt at self-representation: a bid to cut his property's assessed value by nearly half — from $19 million to $10 million.
According to the board that hears assessment appeals, Ren's arguments are riddled with fictitious case law — possible artificial intelligence (AI) "hallucinations" that sent B.C.'s assessment authority on a wild goose chase in search of legal precedent that doesn't exist.
Now, the board says Ren may have to pay for those mistakes.
"The unraveling of these falsehoods has required investigation and research by both the Assessor and the Board," board panel chair John Bridal wrote in an Oct. 7 decision.
"I find an order for costs may be warranted, reflecting the additional time of both the Board and the Assessor in addressing this matter."
Ren's battle with B.C. Assessment is the latest chapter in a story that began more than two years ago, when a massive fire rendered the low-rise apartment building in the 400 block of East 10th Avenue uninhabitable.
In the time since, another fire resulted in an order for the demolition of the remaining structure, and Ren has fought with tenants, bylaw officials and the city over his alleged neglect of the property and his reluctance to pay for tearing down what was left of it.
But beyond the specifics of Ren's fights with authorities, his submissions to the property assessment appeal board have landed him in the middle of an issue currently plaguing courts and tribunals across B.C.
CBC News has found multiple examples in the past year of judges calling out fictitious citations manufactured out of thin air — so-called AI hallucinations — in material filed by self-represented litigants, in proceedings ranging from the B.C. Court of Appeal and B.C. Supreme Court to small claims court and the Workers' Compensation Appeal Tribunal.
Last year, the situation led to a reprimand and an order for costs against a lawyer who used ChatGPT in her bid to help a divorced millionaire win the right to take his children to China.
And more recently, AI hallucinations have led to directives and warnings from court and tribunal administrators.
"Generative AI tools can be useful to self-represented parties to distill and understand complex legal principles. However, they are not designed to always provide truthful answers but instead to be human-like in their interactions," B.C. Court of Appeal registrar Timothy Outerbridge wrote in a decision posted online earlier this month.
"Not all humans, even those with natural intelligence, are always truthful. This Court is aware generative AI is being used by some, but like any litigation aid, the human behind the tool remains responsible for what comes before the Court. "