
Be wary of AI-generated content on Indigenous cultures, say experts
CBC
AI-generated Indigenous language dictionaries, elders’ teachings and history circulating online could be harming culture and language revitalization efforts, say experts.
Content can easily and convincingly be created by generative artificial intelligence. Large language models (LLMs) like ChatGPT are trained on massive amounts of data and use predictive guesswork to generate a response.
"These systems are especially likely, given the limited datasets available for many Indigenous languages, to produce invented words, fabricated cultural teachings, or generalized 'pan-Indigenous' representations that flatten distinct nations or communities into one interchangeable identity," said Michael G. Sherbert, a postdoctoral fellow at Queen's University in Kingston, Ont.
Sherbert, a member of Algonquins of Pikwakanagan First Nation, researches the ethics of using AI for cultural preservation of Indigenous languages and knowledge.
Sherbert said AI use in language and cultural preservation is still relatively new, and some communities are prioritizing structured knowledge system AI, which is curated and controlled by the community or organization.
Sherbert said generative AI is highly flexible and conversational but can "hallucinate," or fabricate, information. Although there are teams that review AI systems to try to make them more responsible and ethical, he said, these hallucinations or misrepresentations can be appropriative and harmful.
"You could say that the AI is inadvertently colonizing and hurting Indigenous language revitalization because [people] are taking information generated by an artificial intelligence and putting it out there for people to read," said Sherbert.
People who may not be connected to a community may put their trust in generative AI, he said, because it seems to be giving them pretty good answers.
Based on that assumption, people may use AI for educational purposes or ask it something like "give me a good elder story," Sherbert said, and a generative AI uses statistical prediction to offer one that is "completely constructed from false information."
"[Generative AI] is optimized for something like fluency to give you answers that are good. It's not optimized for truth or for ethical or cultural responsibility or accountability," he said.
When AI is structured around verified knowledge systems from a community "rather than probabilistic pattern-matching, the likelihood of fabricated language or misrepresentation drops significantly," Sherbert said.
"More importantly, the authority over what is included, excluded, or restricted remains with the community itself."
He's working on language and culture revitalization with his First Nation's education services using structured knowledge system AI in collaboration with kama.ai, an Indigenous-owned AI company that specializes in this type of architecture.
Kaitlyn Lazore works in Kanien'kéha language preservation as a program support officer for Mohawk Language and Culture with the Ahkwesáhsne Mohawk Board of Education.
