B.C. attorney general advises social media, adult-content sites to comply with intimate images act
Niki Sharma says she expects companies to 'do the right thing and align themselves with our legislation'
B.C. Attorney General Niki Sharma said on Thursday she has sent a letter to several social media and adult-content sites advising them to comply with the province's new expedited legal process aimed at preventing people from posting intimate images of others online without their consent.
"With this letter, I expect that social media platforms, dating applications and pornographic websites will do the right thing and align themselves with our legislation to better protect people from this type of sexualized violence and put people before profits," she said at a news conference in Victoria.
In March, Sharma outlined how the new Intimate Images Protection Act aims to provide a path to justice that allows victims to regain control of their private images and holds perpetrators accountable.
On Thursday she said the legislation has now passed into law and steps are being taken to bring it into force.
"We want everybody far and wide to understand its implications," she said.
AG wants follow-up meetings
Sharma said the letter, sent to companies such as Meta, Twitter, Tinder, Grindr, PornHub and OnlyFans, advises them that a judge or tribunal decision-maker can order a social media company, online platform or any website to stop distribution and remove an intimate image from its platform.
She said she expects to meet with the companies to discuss B.C.'s legislation and how the sites will create tools or systems to comply.
Sharma said she has already met with Google.
"We had a very productive conversation. I expect that a lot of companies want to protect people from this type of sexualized violence," she said.
"We had a discussion about the types of orders and how compliance would be best achieved through their platform and we really look forward to having more conversations with them and other companies."
The legislation will streamline the process for images to be taken down, Sharma said, and will give victims an avenue they can use to claim compensation from people who shared their photos without permission.
The province says the legislation will cover intimate images, near-nude images, videos, livestreams and digitally altered images and videos.
It will require perpetrators to destroy the images and remove them from the Internet, search engines and all forms of electronic communication.
How long to get images down?
Individuals seeking to have images removed will eventually be able to engage in a process with B.C.'s Civil Resolution Tribunal, which will have the power to order people to stop distributing or threatening to distribute intimate images.
Sharma was asked on Thursday how long it would take for images to be removed and compensation to be issued. She did not provide an exact timeline.
"The goal is to be a very fast-acting, trauma-informed process. We're working right now on launching a website with the Civil Resolution Tribunal, which will be the main forum for these types of orders," she said, adding that it will be accessible 24 hours a day, seven days a week.
If a social media company, online platform or website does not comply with a court order, it could face consequences such as administrative penalties and orders to pay damages, Sharma said on Thursday. She also said B.C. has the tools to enforce the new laws internationally.
B.C.'s legislation is expected to come into force through regulation in the coming months, according to the province.
The provincial tool, versions of which now exist in several provinces, provides civil legislation to complement the criminal offences enshrined in Canada's Criminal Code. In March 2015, the code was amended as part of the Protecting Canadians from Online Crime Act.
The law was drawn up in response to public outrage over the suicides of Canadian teenagers Amanda Todd and Rehtaeh Parsons, who were targets of cyberbullying and so-called sextortion.
Reports of online sexual violence against children have increased on the apps your children use daily:
TikTok: 86.3%
Instagram: 47.1%
Discord: 473.5%
Read more re: @MissingKids newest report: https://t.co/z8J0chlHz1
—@CdnChildProtect
On Thursday, the Canadian Centre for Child Protection released new numbers on suspected child sexual abuse material found by major technology platforms such as Facebook, Instagram, TikTok and Pinterest.
The companies, which are required by law to report suspected child abuse material when they become aware of it, collectively made more than 31.8 million reports of the crimes in 2022, an increase of more than 2.6 million over the previous year.
"These figures are rising either due to an increase in distribution of this material by users, or because companies are only now starting to look under the hood of their platforms, or both," says Lianna McDonald, executive director of the Canadian Centre for Child Protection (C3P) in a release.
C3P said electronic service providers are not legally required to make use of prevention tools designed to block the upload of known child sexual abuse material.
McDonald said the numbers "continue to underscore the urgent need for governments to step in and mandate online safety standards and duties of care in the technology sector."
With files from The Canadian Press and CBC's The Early Edition