Daily Guardian

News
AI minister wants more clarity on OpenAI’s changes after Tumbler Ridge

By News Room | February 27, 2026 | 4 Mins Read

Artificial Intelligence Minister Evan Solomon says he wants more clarity on the safety protocol changes OpenAI has committed to after the Tumbler Ridge, B.C., mass shooting, and isn't ruling out legislative changes to address the issue.

The company behind ChatGPT said Thursday it would strengthen its police referral and repeat offender detection practices, after it failed to escalate the shooter's AI chatbot activity to police in the months before she killed eight people and wounded dozens of others.

In a statement Friday, Solomon said OpenAI’s statement did not include “a detailed plan for how these commitments will be implemented in practice.”

He said he would be meeting with CEO Sam Altman next week to “seek further clarity” and assurances of “concrete action.”

“The tragedy in Tumbler Ridge has raised serious questions about how digital platforms respond when credible warning signs of violence emerge,” the minister said. “Canadians deserve greater clarity about how human review decisions are made, how escalation thresholds are applied, and how privacy considerations are balanced with public safety.

“We will be seeking further clarity on how human review is conducted and whether Canadian context and best practices are appropriately embedded in those decisions. I will also be consulting with my cabinet colleagues on additional options.”

Solomon added he would also be meeting with other AI companies in the coming weeks “to ensure there is a consistent and clear approach to escalation, local coordination, and youth protection.”


“Decisions affecting Canadians must reflect Canadian laws, Canadian standards, and Canadian expertise,” he said.

“All options remain on the table as we assess what further steps may be necessary. Public safety must come first.”

Solomon and other federal ministers expressed frustration with OpenAI after the company did not present an action plan during a meeting in Ottawa on Tuesday.

The ministers said they would give OpenAI a chance to come back with one before considering a legislative response to the issue of how AI companies handle and address users’ violent behaviour.

Researchers and opposition MPs have urged the federal government to speed up efforts to regulate the AI industry in the wake of the Tumbler Ridge shooting.

OpenAI acknowledged on Thursday that, if it had detected Jesse VanRootselaar’s ChatGPT activity today, it would have flagged it to law enforcement under its current police referral thresholds, which were updated “several months ago.”

Instead, that activity was only referred to RCMP after the shooting occurred.

It also revealed that it found a second ChatGPT account linked to VanRootselaar after she was identified as the shooter in Tumbler Ridge — despite her first account having been shut down last June for "violent" activity, and despite a system meant to detect repeat violators of OpenAI's policies.


The company committed to further strengthening both of those protocols, as well as establishing direct points of contact with Canadian authorities and developing better practices for connecting users who exhibit troubling behaviour with local mental health supports.

B.C. Premier David Eby said Thursday he will also be meeting with Altman, calling OpenAI’s commitments “cold comfort for the people of Tumbler Ridge.”

He told reporters Friday in Vancouver there is no firm date yet for the meeting with the CEO, who has yet to comment publicly on the Tumbler Ridge tragedy or the changes his company says it will make in Canada.

“I want to recognize that OpenAI did come forward,” Eby said. “They did bring the information forward to police. They didn’t try to cover it up after the fact, but this was a colossal, horrific mistake, I guess, is the most generous interpretation I can offer, to fail to bring that information forward to authorities.

“It’s important that Mr. Altman realizes that, and I will be looking for his support for a national standard across Canada, a national threshold where all AI companies must report — and clear consequences for if they fail to report — incidents where people are planning violence, planning to hurt other people, and using these tools to develop those plans.”

—with files from the Canadian Press

© 2026 Global News, a division of Corus Entertainment Inc.

© 2026 Daily Guardian Canada. All Rights Reserved.