Feds need to bring in social media and AI protections for kids, B.C. AG Sharma says
VICTORIA – British Columbia’s attorney general says if the federal government doesn’t bring in protections on social media and AI chatbots for children, then the province will look to follow Manitoba with its own regulatory regime.
Niki Sharma says parents know firsthand the devastating impacts of social media platforms and AI chatbots on children, and the artificial intelligence link to the Tumbler Ridge shooting, in which eight victims died, is just one example.
The shooter in Tumbler Ridge used the ChatGPT chatbot in ways that drew concern from some staff at its maker, OpenAI, but the firm did not alert police before the killings in February.
Sharma says she wrote to the federal government after OpenAI didn’t report the suspected dangers posed by shooter Jesse Van Rootselaar, reinforcing B.C.’s call for clear national guardrails.
The attorney general says self-regulation isn’t working, and governments can’t have companies that control much of the world’s wealth deciding what’s safe for children and other vulnerable people.
She says she believes regulations would work best at the federal level, but the province has also reached out to Manitoba about its plans for implementing such a ban, and an alliance of protections among provinces could also work.
With the Tumbler Ridge mass shooting in mind, Sharma says in a statement that there’s a need for consistent, Canada-wide standards for all AI companies, including clear reporting thresholds when serious safety concerns are identified.
“Strong, enforceable federal rules will help ensure AI is used responsibly, support workers and families, and give people confidence that these powerful technologies are being developed and deployed with their safety and well-being at the forefront,” she says in a statement issued Tuesday.
B.C. has had several cases of sexual exploitation leading to suicide, and the platforms can also contribute to eating disorders and anxiety, she says.
“These companies say they share our goal of keeping our kids safe, and it is time we ask them to put their money where their mouth is and prove it if they want to continue operating in Canada,” Sharma says.
This report by The Canadian Press was first published April 28, 2026.