
Proposed Canadian law puts burden on large internet providers to police child porn, hate

Designated social media providers, live-streaming services, and adult sites that allow users to upload content will have to scrutinize and delete objectionable messages, images, and videos if the Liberal government’s proposed Online Harms Act is passed. The act also includes the creation of a Digital Safety Commission to hear complaints.

The intent is to stop a range of online content on websites and services available to Canadians, including child bullying, sextortion, and inducements to self-harm such as suicide.

The act, Bill C-63, which was introduced today, says designated services must remove two categories of content within 24 hours: material that sexually victimizes a child or re-victimizes a survivor, and intimate content posted without the consent of the individual depicted (also called revenge porn).

The 24-hour deadline would be subject to oversight and review to allow for the screening out of frivolous complaints. It would also allow a person who posted allegedly objectionable content to appeal if they believe the material doesn’t meet the legislated definition of forbidden content.


The law would also make it clear that anyone who provides an internet service, including social media platforms, has to report to a designated law enforcement agency if child porn is posted on their service. To support those reports, service providers would have to hold user content for one year, up from the current 21 days, to ensure it is available for criminal investigation.

Designated providers would have a duty to put children’s mental and physical safety first when designing products and features. Regulations created by the Digital Safety Commission could require providers to implement parental controls, so parents can limit what content their children can see, and warning labels for certain types of content. The Commission could also set rules around targeted content or ads directed at children. Providers would also have to be more upfront about the measures they are taking to protect children.

In addition, the government plans to amend the Criminal Code to fight hate by creating a new hate crime offence punishable by up to life imprisonment, and by raising the maximum penalties for the four existing hate propaganda offences.


“We know that there are powerful organizations and people who may line up against this legislation,” Justice Minister Arif Virani told reporters in the lobby of the House of Commons. “People with money and people with influence. My message to these people and organizations is very simple: now is the time to work directly with us. Profit cannot be prioritized over safety. Right now it is too easy for social media companies to look the other way as hate and exploitation fester on their platforms. This bill will require platforms to do their part and to do better to keep people safe from harm and exploitation. Failure to do so will have a price: significant monetary penalties.”

The Canadian Human Rights Act would be amended to specify that posting hate speech online is discrimination, and to allow the Human Rights Tribunal to handle hate speech complaints, including giving it the power to order the removal of such content.


The Criminal Code would also be changed to create a new standalone hate crime. Currently, hate can be considered by a judge only as an aggravating factor in other crimes. To make clear what constitutes hatred, the Criminal Code would include a definition adapted from a Supreme Court ruling: the content must express vilification or detestation of an individual or group and incite a person or group to act. This definition would also apply to cases brought before the Human Rights Commission.

Only online services with a large enough number of users would be covered under the Online Harms Act. The threshold would be set in yet-to-be-announced regulations.

The proposed law would cover seven categories of harmful content:

— content that sexually victimizes a child or re-victimizes a survivor;

— content that could be used to bully a child;

— content that induces a child to harm themselves;

— content that incites violence;

— content that incites violent extremism or terrorism;

— content that foments hatred;

— intimate content communicated without consent, including deepfaked audio, images and videos.

The Digital Safety Commission would be composed of five people appointed by the government, with the power to order providers to remove content that sexually victimizes a child. It would also have the responsibility of setting norms for online safety.

Individuals could either ask a service provider to remove content or ask the Commission to order the removal of offensive content.


The proposed legislation would also create a Digital Safety Ombudsperson, also appointed by the government.

The overall goals of the proposed changes, the government says, are to reduce Canadians’ exposure to harmful content; give special protections to children, along with stronger reporting of child pornography; provide public oversight of, and accountability from, online services; and deliver “improved safety over time.”

The legislation would allow people to request the quick removal of child porn, submit complaints to the Digital Safety Commission, contact the Digital Safety Ombudsperson for support and direction to the right help resources, and file complaints with the Human Rights Commission when facing online hate.

The bill makes it clear it doesn’t cover private communications such as email and text messages or services such as WhatsApp. But it does apply to social media platforms such as Facebook and others where a poster can invite many people into a group.

 
