
AI minister wants more clarity on OpenAI’s changes after Tumbler Ridge
Global News
Artificial Intelligence Minister Evan Solomon says he wants more clarity on the safety protocol changes OpenAI has committed to after the Tumbler Ridge, B.C., mass shooting, and isn’t ruling out legislative changes to address the issue.
The company behind ChatGPT said Thursday it would strengthen its police referral and repeat-offender detection practices, after it failed to escalate the shooter’s AI chatbot activity to police in the months before she killed eight people and wounded dozens of others.
In a statement Friday, Solomon said OpenAI’s statement did not include “a detailed plan for how these commitments will be implemented in practice.”
He said he would be meeting with CEO Sam Altman next week to “seek further clarity” and assurances of “concrete action.”
“The tragedy in Tumbler Ridge has raised serious questions about how digital platforms respond when credible warning signs of violence emerge,” the minister said. “Canadians deserve greater clarity about how human review decisions are made, how escalation thresholds are applied, and how privacy considerations are balanced with public safety.
“We will be seeking further clarity on how human review is conducted and whether Canadian context and best practices are appropriately embedded in those decisions. I will also be consulting with my cabinet colleagues on additional options.”
Solomon added he would also be meeting with other AI companies in the coming weeks “to ensure there is a consistent and clear approach to escalation, local coordination, and youth protection.”
“Decisions affecting Canadians must reflect Canadian laws, Canadian standards, and Canadian expertise,” he said.