Canada

Status: Free
Overall Score: 86 / 100
A Obstacles to Access: 23 / 25
B Limits on Content: 32 / 35
C Violations of User Rights: 31 / 40
Last Year's Score & Status: 88 / 100, Free

Scores are based on a scale of 0 (least free) to 100 (most free). See the methodology and report acknowledgements.

Key Developments, June 1, 2023 – May 31, 2024

Canada remains one of the most open online environments in the world. Internet access is reliable and affordable for most of the population, but a notable digital divide persists for internet users in rural areas. Canadians enjoy strong protections for free expression on the internet. While the country’s federal data protection framework is inadequate, stronger safeguards have been enacted at the provincial level.

  • The April 2023 merger of two of Canada’s largest telecommunications companies, Rogers and Shaw, raised concerns that competition in the sector has diminished. To increase the competitiveness of smaller providers and lower consumer prices, in November 2023, the Canadian Radio-television and Telecommunications Commission (CRTC) ordered certain companies to offer wholesale access to their fiber networks in Quebec and Ontario (see A4).
  • In February 2024, the government introduced the long-anticipated Online Harms Act as part of Bill C-63. The bill would target seven categories of harmful online content and establish significant monetary penalties; as part of the broader legislation, it would also add provisions related to hate-motivated offenses to the Criminal Code (see B3, B6, and C2).
  • In June 2023, the government enacted the Online News Act, which requires tech companies to negotiate with Canadian media companies to compensate them for news content that appears on their platforms. In response, Meta blocked news content on Facebook and Instagram in Canada, threatening the survival of some small media outlets that depend on social media and severely curtailing access to reliable journalism on these platforms (see B6 and B7).
  • Multiple journalists who report online were arrested while covering protests or police activities, raising some concerns over the legal obstacles that affect journalists’ work (see C3).

Political Overview

Canada has a strong history of respect for political rights and civil liberties, though in recent years citizens have been concerned about laws relating to the administration of elections, government transparency, the treatment of inmates in prisons, and restrictions on public sector employees wearing religious symbols. While Black, Indigenous, and other marginalized Canadians still face discrimination and economic, social, and political challenges, the federal government has acknowledged these problems and made some moves to address them.

A Obstacles to Access

A1 (0–6 points)
Do infrastructural limitations restrict access to the internet or the speed and quality of internet connections? Score: 6.00 / 6.00

Both fixed-line and mobile internet penetration rates have remained steady in Canada, and access remains high. Mobile service providers continue to deploy several newer technologies to provide mobile broadband service. 5G network coverage reached 91.4 percent by 2022, up almost 4 percent from the previous year.1 According to 2023 data from the International Telecommunication Union (ITU), Canada has 43 fixed-broadband subscriptions per 100 inhabitants and 88 mobile-broadband subscriptions per 100 inhabitants.2

The CRTC, the regulatory body that oversees the communications industry, has sought to ensure that all Canadians have access to “high-quality” internet service as part of the “universal service objective” defined in a landmark 2016 policy decision.3 The universal service objective aims to give all Canadians access to 50 megabits per second (Mbps) download speeds, 10 Mbps upload speeds, and unlimited data transfers. The 2024 federal budget set a goal to provide 98 percent of Canadians with access to such service by 2026 and 100 percent by 2030.4 Canada is on track to achieve this goal, moving from 91.4 percent availability in 2021 to 93.1 percent in 2022. Additionally, 78.8 percent of all subscribers had service that provided at least 50 Mbps download speeds and 10 Mbps upload speeds by the end of 2023.5

Also in 2016, the CRTC declared high-speed internet access a “basic telecommunications service” and established a C$750 million (US$552 million) fund to reach those targets.6 In 2018, the CRTC announced criteria for the fund’s use.7 A third round of calls for project applications opened in November 2022,8 which focused on bringing high-speed internet access to remote transportation corridors and satellite-dependent communities. By April 2024, the government had awarded over C$300 million (US$220.8 million) from the fund.9 The CRTC’s fund is part of a larger commitment to broadband access through the C$3.225 billion (US$2.38 billion) Universal Broadband Fund (see A2).10

Robust infrastructure generally safeguards against power shortages or blackouts that limit internet access. However, a failure following a maintenance update caused customers of Rogers, a major internet service provider (ISP), to lose mobile and internet access across the country for 15 hours in July 2022.11 In response, the CRTC hired an outside consultant in May 2023 to investigate the outage.12 The regulator published an executive summary of the firm’s report in July 2024, after the coverage period.13

A2 (0–3 points)
Is access to the internet prohibitively expensive or beyond the reach of certain segments of the population for geographical, social, or other reasons? Score: 2.00 / 3.00

Internet access is not prohibitively expensive or beyond the reach of most Canadians, although a geographical digital divide persists and people with lower incomes struggle to afford internet service. The government named universal access as the first of 10 draft principles for a digitally connected Canada in its October 2019 Digital Charter.1

Mobile broadband data remains expensive compared to fixed-line broadband data. High-speed, fixed-line access remains relatively affordable due to more robust competition. The CRTC has aimed to increase competition further, and lowered prices for wholesale high-speed internet access in March 2023.2 In November 2023, the CRTC ordered large incumbent telephone companies to provide smaller, wholesale-based ISPs with “workable wholesale access” to their fiber-to-the-premises (FTTP) networks in Quebec and Ontario, a decision meant to lower consumer prices and spur competition (see A4).3 According to 2023 ITU data, a 5 gigabyte (GB) fixed broadband connection costs 1.05 percent of gross national income (GNI) per capita,4 while a 2 GB mobile broadband connection costs 0.77 percent of GNI per capita.5

Perhaps the most important obstacle to availability and ease of access is geography. While providing “reliable and affordable telecommunications services of high quality” to the 18 percent of Canadians who live in rural areas is enshrined in law,6 affordable high-speed internet service is less available in more isolated areas, especially in the vast northern territories. Connectivity projects initiated under the CRTC’s recent call for applications (see A1), however, will help to lessen this divide.

Urban areas also have better access to CRTC-defined high-quality broadband service. Service with 50 Mbps download speeds, 10 Mbps upload speeds, and unlimited data transfers was available to 99.4 percent of urban households in 2022, but only 67.4 percent of rural households.7 However, the gap between urban and rural access narrowed from 53 percentage points to 32 percentage points between 2019 and 2022.8

Although high-speed internet access has historically been more expensive in rural areas than in cities, in 2022, the CRTC reported that rural customers paid C$4 (US$2.94) less on average than their urban counterparts for high-quality broadband service.9

The government has provided several funding mechanisms to improve connectivity in remote communities. In 2019, the government pledged to spend between C$5 billion (US$3.7 billion) and C$6 billion (US$4.4 billion) to improve rural broadband service over 10 years.10 In November 2022, Prime Minister Justin Trudeau announced a C$475 million (US$350 million) increase to the Universal Broadband Fund, which now totaled C$3.225 billion (US$2.37 billion).11 The minister of rural economic development proposed a comprehensive strategy to improve connectivity in 2019,12 which finally seemed to achieve tangible results during the coverage period. In December 2023, for instance, the CRTC announced that its fund (see A1) would provide up to C$26.8 million (US$19.7 million) to bring high-speed satellite internet service to all the communities of Nunavut, Canada’s largest territory.13

Lower-income Canadians have less access to the internet. In 2022, 79.1 percent of Canadians in the lowest income quartile had access to an internet connection other than mobile data at home, compared to 93.4 percent of those with an income in the highest quartile.14 The government has launched the Connecting Families Initiative to provide lower-income families and senior citizens with subsidized, affordable internet packages that cost C$20 (US$14.72) or less per month.15

Major mobile data service providers generally offer services with data caps, resulting in increased fees for users who exceed the limit. The federal government’s 2024 budget promised a crackdown on “junk fees” charged by telecommunications companies, which may help to lower prices for both wireless and wired connections.16

A3 (0–6 points)
Does the government exercise technical or legal control over internet infrastructure for the purposes of restricting connectivity? Score: 6.00 / 6.00

The government does not exercise technical or legal control over the internet infrastructure for censorship. Authorities do not restrict access to any social media platforms or communications apps. In early 2023, the federal government and all provincial governments banned TikTok from government-issued phones over security and privacy concerns, but no restrictions were placed on personal devices (see B1).1

The government has not centralized the telecommunications infrastructure. However, given the vertical integration of the marketplace, the infrastructure is controlled by a small number of companies, which could theoretically facilitate greater control of content and enable surveillance.

A4 (0–6 points)
Are there legal, regulatory, or economic obstacles that restrict the diversity of service providers? Score: 5.00 / 6.00

There are some legal and economic obstacles that restrict the diversity of service providers, although the market remains relatively open. Service providers must be Canadian-owned, and entry and infrastructure costs are high, which has led to market concentration, especially for mobile service.

To operate as a Canadian telecommunications provider, a company must meet the requirements in Section 16 of the Telecommunications Act. The telecommunications market has been dominated in recent years by the five largest companies (Bell, TELUS, Rogers, Shaw, and Québecor), which accounted for approximately 91 percent of the market as of January 2023.1

The Canadian telecommunications market has become more concentrated following the merger of two of the five largest companies, Rogers and Shaw. After the CRTC approved the broadcasting portion of the merger in 2021,2 the government, through the minister of innovation, science and industry, gave final approval to the merger, valued at C$26 billion (US$19.1 billion), on March 31, 2023;3 the deal was finalized three days later, on April 3.4 A government-imposed condition to the merger excluded Shaw’s wireless division, Freedom Mobile, which was purchased by Québecor’s Vidéotron. The government claimed that this mitigation measure would increase competition and affordability in the telecommunications sector,5 but commentators remained unconvinced.6

These concerns have been borne out by recent developments. In January 2024, the innovation, science, and industry minister acknowledged that competition was lacking in the mobile marketplace, and the government responded to reported price increases at both Rogers and Bell by encouraging Canadians to “consider switching service providers.”7 In February 2024, a Competition Bureau official testified in Parliament that prices for certain mobile plans in Alberta and British Columbia had increased since the merger.8 In August 2023, Rogers filed a court challenge seeking to raise the fees the CRTC had determined Québecor must pay Rogers to access its cellular network.9

Despite concerns over waning competition and the dominant role of Bell, TELUS, and Rogers, Canadians have a choice of wireless internet providers, all of which are privately owned. With the launch of Freedom Mobile’s first nationwide plan in May 2023,10 there are at least four providers in all markets, although providers vary from region to region and some providers are restricted to urban areas. Restrictions on foreign investment and ownership impose barriers to entry, limiting competition in the telecommunications market.11 Rules on tower-sharing and domestic-roaming agreements regulate access services. In February 2024, Statistics Canada reported that prices for new plans were 26.5 percent lower than they were during the same month in 2023,12 helping to counteract recent price hikes for existing plans.

While Canadians generally enjoy a choice of fixed-line internet providers, the available choices vary from region to region. Though the government has taken some recent actions to boost competition in the fixed-line market, concerns remain that smaller ISPs are unable to compete with the major providers. As part of a review meant to increase competition and lower prices (see A2), in November 2023, the CRTC ordered large incumbent telephone companies to offer smaller providers access to their FTTP networks in Quebec and Ontario within six months, allowing wholesale providers to offer more competitive services in these markets.13 In response, Bell (the ISP most affected by the decision) announced that it would slash capital spending by C$1 billion (US$736 million),14 sought to appeal the decision through the courts,15 and petitioned the federal cabinet in February 2024 to overturn it.16 In August 2024, after the coverage period, the CRTC expanded the November 2023 decision to apply nationwide, requiring Bell, TELUS, and Saskatchewan Telecommunications (SaskTel) to allow wholesale access to their fiber networks by February 2025.17

A5 (0–4 points)
Do national regulatory bodies that oversee service providers and digital technology fail to operate in a free, fair, and independent manner? Score: 4.00 / 4.00

The CRTC largely operates independently of the government. The government appoints the CRTC chairperson and commissioners without public consultation, but they are not subject to political pressure. In some cases, the government has provided guidance on telecommunications regulations, but its input is nonbinding. Moreover, CRTC decisions can be appealed, or a government review can be requested. The government has rarely overturned CRTC decisions.

The CRTC regulates internet access but has not traditionally regulated internet content, a principle known as the “new media exemption.” The CRTC’s position to refrain from internet content regulation dates to 1999 and has been reinforced on numerous occasions,1 including by the Supreme Court of Canada (SCC).2 The Online Streaming Act passed in April 2023 threatens to alter Canada’s media landscape and expand the scope of the CRTC’s regulatory powers.3 It potentially allows for regulation of the internet and its content in new and myriad ways, effectively discarding the new media exemption and regulating content from non-Canadian sources (see B3).4

B Limits on Content

B1 (0–6 points)
Does the state block or filter, or compel service providers to block or filter, internet content, particularly material that is protected by international human rights standards? Score: 5.00 / 6.00

The government does not generally block or filter online content or require service providers to do so. Websites found to violate intellectual property rights, particularly those used for illegal streaming, are subject to blocking.

In February and March 2023, the federal government1 and all Canadian provinces2 banned TikTok on government-issued devices, citing cybersecurity and privacy concerns over the Chinese-owned video platform. The actions followed similar bans in the United States and European Union (EU). In March 2024, media reports revealed that in September 2023, the federal government secretly ordered a national security review of TikTok,3 which remained ongoing at the end of the coverage period.4

In November 2019, a court ordered all major ISPs to block several domains associated with a service that sold copyright-infringing material. The decision came after several large media companies petitioned the Federal Court to block the domains for rebroadcasting their programming without permission, in Bell Media Inc. v. GoldTV.Biz. Twelve domains and subdomains were blocked under the order, which permitted the media companies to seek further blocking orders against offending websites.5 The Federal Court of Appeal rejected the ISP TekSavvy’s appeal against the decision in May 2021.6 In March 2022, the SCC declined to hear TekSavvy’s second appeal, ending the case.7

The media companies from the Bell Media case ramped up their efforts to block copyright-infringing material when they requested a “dynamic” site-blocking order in court in 2021. In May 2022, the Federal Court granted a preliminary injunction that required ISPs to block internet protocol (IP) addresses of websites with pirated content (specifically, live-streamed professional hockey games) in real time.8 The temporary order, which lasted for the duration of the professional hockey season, was considered the first of its kind in North America.9 Similar decisions have followed. For example, Rogers and TVA, a Quebec-based broadcaster, obtained a site-blocking order in July 2023 for Toronto Blue Jays Major League Baseball (MLB) games,10 suggesting that dynamic site-blocking orders have become a regular fixture of efforts to combat copyright violations in Canada.

In January 2021, the CRTC launched a public consultation “to strengthen Canadians’ online safety” by blocking sites infected with botnets.11 Commentators criticized the plan and a broad range of industry actors opposed it.12 In June 2022, the CRTC released an enforcement decision that provided a framework for regulating botnets and required a CRTC working group to present a plan to block such websites within nine months.13 After several months of gathering submissions,14 the working group presented a draft plan in April 2023, which advised blocking botnet command-and-control servers at the IP layer and ensuring that the blocking framework would not be used for criminal or political purposes. The framework did not provide specifics on how the blocking system would work.15 The CRTC’s 2024–25 Departmental Plan, released in March 2024, stated that the CRTC “will continue to advance rules to authorize Canadian carriers to block botnets and other cyber-related threats at the network level, including spam and ransomware attacks.”16

B2 (0–4 points)
Do state or nonstate actors employ legal, administrative, or other means to force publishers, content hosts, or digital platforms to delete content, particularly material that is protected by international human rights standards? Score: 3.00 / 4.00

Large media companies have used legal means to force digital platforms to delete content, generally for copyright infringement. However, 2018 amendments to the Copyright Act reduced the misuse of the law’s notice-and-notice regime.

The previous notice-and-notice regime required ISPs to forward notices from copyright holders claiming infringement to the alleged copyright violator (see B3). In 2018, Parliament passed amendments that restricted the information that can be included in the notices, no longer allowing misstatements of Canadian copyright law that had been used to mislead Canadians. Further, ISPs are no longer required to forward notices to subscribers if they contain an offer to settle the infringement claim, a request or demand for payment or personal information, or a URL linking to such offers or demands.1

Media companies have continued to use the courts to shut down websites and other online services that violate copyright laws or facilitate such activities. In November 2019, a group of media companies obtained an order forcing ISPs to block certain websites that hosted copyright-infringing content, which was subsequently upheld by a court of appeal in May 2021 (see B1). In February 2022, the owner of TVAddons, a website that distributed software facilitating online piracy, admitted liability in court and agreed to pay C$25 million (US$18.4 million) to a coalition of major Canadian media companies. The offending site was also shut down.2

In 2017, the SCC upheld the decision by the British Columbia Court of Appeal in Google, Inc. v. Equustek Solutions, Inc.,3 ordering Google to remove URLs in its global index pointing to websites that infringed on the plaintiffs’ trademark (see B3).

Defamation claims may also result in content removal, as content hosts fear potential liability as publishers of the defamatory content (see B3). Defamation claims may also prevent the posting of content. A March 2022 court decision, for example, granted a temporary injunction against TikTok user Brooke Dietrich that ordered her to stop using the platform to advocate against the antiabortion group 40 Days for Life and prevented others from reposting her content (see B8 and C3).4 In March 2023, a Quebec court ordered Google to pay moral damages of C$500,000 (US$368,000) to a plaintiff and de-index links to defamatory websites in search results, but for Quebec users only.5

In October 2023, a parliamentary committee released a report recommending that large tech companies be held accountable for certain false and misleading information published on their platforms,6 potentially incentivizing them to remove such content. However, there were no further updates by the end of the coverage period.

In Quebec, where French is the official language, the law requires commercial websites to be available in French,7 but other languages can also be used. Violators may receive a warning from a government agency and are then subject to fines if they do not comply. Bill 96, which became law in June 2022, imposed even more onerous obligations on the use of French online and carries harsher fines for violators.8 Some website operators may choose to take their sites down rather than pay for translation or face fines. National or international operators of websites that do business in Quebec (and would therefore be subject to the law) sometimes block Quebec residents’ access to their websites rather than comply,9 and at least one company temporarily halted e-commerce in Quebec because of Bill 96.10 Draft regulations on Bill 96 published in January 2024 reinforced the French language requirements for websites.11

B3 (0–4 points)
Do restrictions on the internet and digital content lack transparency, proportionality to the stated aims, or an independent appeals process? Score: 4.00 / 4.00

Restrictions on the internet are generally fair and proportionate. However, the Online Streaming Act and some pending legislation have raised concerns about the transparency and proportionality of internet content restrictions. The full impact of the regulations remains unclear, and certain directives from the government may alleviate some concerns.

The Online Streaming Act passed in April 2023 allows the CRTC to regulate online streaming services on par with traditional radio and television broadcasters. The CRTC can require streaming platforms to promote Canadian content, amounting to a significant expansion of the CRTC’s regulatory powers (see A5).1 The law allows the CRTC to impose regulations requiring a certain proportion of Canadian programs to be available on streaming platforms and requires streaming services to make investments to support the Canadian broadcasting system.2

The final law did not include Senate amendments that would have shielded user-generated content, such as YouTube videos, from the law, raising concerns that the CRTC could potentially regulate such content.3 However, the government has consistently rejected claims that it intends to regulate user-generated content.4 In November 2023, the Department of Canadian Heritage published high-level policy directions for how the CRTC should implement the Online Streaming Act.5 These directions stated that social media content creators are not regulated by the law and clarified certain other matters.6 While these policy directions are binding on the CRTC, their effects remained unclear in practice during the coverage period, since the CRTC must develop a regulatory framework for the law. The Online Streaming Act will not likely be implemented for several years.7

After years of signaling that it would pursue legislation to regulate harmful online content, the Canadian government introduced the Online Harms Act as part of Bill C-63 in February 2024.8 The Online Harms Act targets seven types of harmful online content, including content that sexually victimizes children or revictimizes a survivor, content that foments hatred, and content that incites violence. The proposed law would require social media services to implement measures to limit users’ exposure to harmful content, protect children, block content that sexually victimizes a child or revictimizes a survivor, and keep records that ensure compliance.9 The act would impose large fines for noncompliance (see B6) and would create three new regulatory bodies: the Digital Safety Commission of Canada, the Digital Safety Ombudsperson of Canada, and the Digital Safety Office of Canada.10 Bill C-63 would also introduce hate-motivated offenses to the Criminal Code (see C2). Parliament is still debating Bill C-63 and the long-term effects of the Online Harms Act portion of the bill remain unclear.

Project Arachnid, an initiative launched in 2017, scans the internet for child sex abuse imagery and requests its removal.11 Part 4 of Bill C-63 would broaden the law that requires ISPs to report child pornography and would require internet access providers, internet content hosts, and services that facilitate “interpersonal communication” to report such content.

Bill S-210, An Act to Restrict Young Persons’ Online Access to Sexually Explicit Material, which was initially introduced in the Senate in November 2021, advanced through the House of Commons during the coverage period.12 It would mandate “age verification methods” for adult content on the internet, levy fines for noncompliance, and empower an enforcement authority to request that the Federal Court order Canadian ISPs to block noncompliant websites. Analysts have raised concerns that the bill’s broad language could lead to the blocking of sites that do not contain exclusively adult content or create restrictions on lawful content for adults.13 The bill had not passed by the end of the coverage period.14

In 2004, the SCC ruled that ISPs are not liable for copyright infringement violations committed by their subscribers,15 a principle now enshrined in law.16 Copyright law includes a notice-and-notice provision in effect since 2015, which was amended in 2018 (see B2). No online content is removed without a court order. Courts can order ISPs to block content (see B1). ISPs do not need to disclose subscriber information without court approval, although courts have granted more of these approvals in recent years.17

The SCC’s ruling in Google, Inc. v. Equustek Solutions, Inc.—which ordered Google to remove URLs for websites that infringed on the plaintiffs’ trademark from its global index—strictly focused on intellectual property laws and interlocutory injunctions. It is unclear whether the SCC will make worldwide orders pertaining to other areas of the law in the future, or whether they will have effect in foreign jurisdictions.18 However, there has been little evidence this is occurring. For instance, a March 2023 Quebec court ruling required Google to de-index defamatory search results for users in Quebec only (see B2).

Although platforms are legally protected from liability for copyright infringement by their users, they may be liable for defamation once they are aware of the offending content.19 A court may also order the removal of such content. The SCC has held that linking to defamatory content online is not defamation and the URLs would not be removed,20 but a website that repeats the defamatory material would be liable for defamation.

B4 (0–4 points)
Do online journalists, commentators, and ordinary users practice self-censorship? Score: 4.00 / 4.00

Online self-censorship is not widespread. Certain individuals may self-censor for fear of potential government surveillance under the Anti-Terrorism Act (see C5). However, there was no indication that the law has stifled online speech in recent years. During the coverage period, some individuals reported facing repercussions from employers or educational institutions for posting pro-Palestinian views on social media during the Israel-Hamas war, creating a potential chilling effect.1

B5 (0–4 points)
Are online sources of information controlled or manipulated by the government or other powerful actors to advance a particular political interest? Score: 4.00 / 4.00

Online sources of information are not widely controlled or manipulated by the government or other powerful actors.

The government advanced legislation to combat disinformation and foreign interference ahead of the October 2019 federal election. The Election Modernization Act, which went into effect in June 2019, imposed regulations on third-party online advertising and restrictions on how much campaigns can spend before a campaign season officially commences.1 In March 2021, provisions of the Election Modernization Act that prohibited misinformation about political candidates’ past criminal offenses and their place of birth were struck down by an Ontario court as unconstitutional for violating freedom of speech.2 In March 2022, a report by the Canadian Election Misinformation Project found that while false information proliferated on social media surrounding the 2021 election, the overall effects of misinformation and disinformation were minimal.3

In recent years, the Canadian government has attempted to reduce online disinformation, especially on COVID-19 and the 2022 Russian invasion of Ukraine. Notably, the government implemented the Digital Citizen Initiative, “a multi-component strategy that aims to support democracy and social inclusion in Canada by building citizen resilience against online disinformation.”4

An April 2024 study published by the Cable Public Affairs Channel (CPAC) found that 42 percent of 2,001 respondents believed that the spread of disinformation was a serious problem in Canada, with social media cited as the source most responsible for its spread.5

B6 (0–3 points)
Are there economic or regulatory constraints that negatively affect users’ ability to publish content online? Score: 3.00 / 3.00

There are few economic or regulatory constraints on users’ ability to publish legal content online. However, the passage of the Online News Act in June 2023 created some obstacles to posting online content.

In April 2022, the government introduced Bill C-18, the Online News Act,1 which requires digital news intermediaries, including Google and Meta, to negotiate agreements to compensate Canadian media companies for providing news content on the intermediaries’ platforms. In February 2023, open internet advocates raised concerns that Bill C-18 could limit the online content available to Canadians and create economic barriers for new companies looking to enter the digital market.2

The Online News Act became law on June 22, 2023.3 In response, Meta announced in August 2023 that it had restricted access to news content on Facebook and Instagram in Canada.4 The platforms continued to block news content throughout the coverage period,5 severely diminishing the availability of reliable news content on Facebook and Instagram (see B7). Google, on the other hand, negotiated a deal with the government in November 2023 that allows the company to broker a compensation structure with a single media representative, rather than individual news organizations. According to media reports, Google will compensate Canadian news outlets C$100 million (US$73.6 million) annually.6 In June 2024, after the coverage period, Google announced that the Canadian Journalism Collective will distribute the annual funds, which will be indexed to inflation, to news publishers.7

Meta’s decision to ban news content on Facebook and Instagram impacted small media outlets that had relied on Facebook to drive engagement and readership. In February 2024, for instance, the cofounder of Wreckhouse Press, a Newfoundland-based local news outlet, estimated that its online presence declined 60 percent due to Meta’s ban.8 That month, Wreckhouse Press announced that it would stop creating news content to instead focus on independent book publishing.9

Canada has strengthened its commitment to net neutrality as a matter of national policy. In 2017, the CRTC enacted a pair of telecommunications policies that prohibited differential pricing for some data services offered by ISPs and the zero-rating of certain media services, barring ISPs from offering such preferred media free of charge.10 With these policies, the CRTC has substantially completed a national framework that ensures the continuation of net neutrality.

If enacted, the Online Harms Act would impose large fines on social media services that fail to comply with the law, up to the greater of 6 percent of gross global revenue of the person responsible or C$10 million (US$7.4 million). Meanwhile, penal provisions in certain cases would establish fines up to the greater of 8 percent of an operator’s gross global revenue or C$25 million (US$18.4 million).11

The Online Streaming Act, which became law in April 2023 (see B3), will require streaming services to help fund Canadian content. After the coverage period, in June 2024, the CRTC enacted a policy that will require streaming services not affiliated with a Canadian broadcaster and making at least C$25 million (US$18.4 million) in Canadian revenues, as defined under the regulations, to contribute 5 percent of those revenues to certain media funds. These funds support local news and content related to minority communities, including Indigenous peoples.12

In June 2024, after the coverage period, the federal government enacted a long-awaited “Digital Services Tax” (DST), which requires online companies with annual worldwide revenues of over €750 million (US$820 million) to pay a 3 percent tax on Canadian revenues greater than C$20 million (US$14.7 million). The DST will apply retroactively from January 1, 2022 onwards.13 The government moved forward with the tax despite complaints from US tech industry groups and opposition from US trade officials.14

Numerous provinces, including British Columbia, Quebec, and Saskatchewan, had already levied sales taxes for several years on out-of-province digital platforms, including Netflix, Google, Amazon, and, in Quebec’s case, Spotify.15 In December 2021, the Manitoba provincial government also added a sales tax,16 and in July 2022, the British Columbia government began to apply sales taxes to online marketplaces such as eBay.17

B7 (0–4 points)
Does the online information landscape lack diversity and reliability? Score: 3.00 / 4.00

Score Change: The score declined from 4 to 3 due to Meta’s decision to block news content on its platforms, which severely eroded the diversity and reliability of online content during the coverage period.

The online environment in Canada is relatively diverse and internet users have access to a wide range of news and opinions on a variety of topics. However, Meta’s decision to ban the sharing of news content on Facebook and Instagram because of the Online News Act has reduced access to news on those platforms. The ban has also created operating challenges for some independent news publishers (see B6).

All major media organizations operate websites that feature articles and audio and video content. The public broadcaster maintains a comprehensive website that includes news articles and streamed video programming. Paywalls are increasingly used by newspapers that publish online, but many high-quality, independent news and commentary sites remain free. While some sites are partisan, a wide array of political viewpoints are available online. There is a wide range of content available in both official languages (English and French) and many other languages. Additionally, there are online outlets aimed at First Nations peoples1 and LGBT+ Canadians.2

The Online News Act, which became law in June 2023, requires digital news intermediaries, including Meta, to negotiate agreements to compensate Canadian media companies for providing news content on their platforms (see B6). To circumvent the requirement, Meta blocked news content on Facebook and Instagram beginning in August 2023.3 Such restrictions have eroded the reliability of content on these platforms. For instance, studies conducted after the ban found that Canadian Facebook users discussing political events had increasingly shared memes in place of reliable news sources.4 The effects of Meta’s decision have expanded across the online information space. A report published by the Media Ecosystem Observatory in August 2024, after the coverage period, found that 30 percent of Canadian local news outlets that had a presence on social media in June 2023 were no longer active on any platform—including Facebook, Instagram, TikTok, X, and YouTube.5

Misinformation surrounding COVID-19 proliferated in Canada throughout the pandemic,6 and Russian disinformation about the war in Ukraine has reportedly undermined Canadians’ ability to obtain accurate information about the war.7

B8 (0–6 points)
Do conditions impede users’ ability to mobilize, form communities, and campaign, particularly on political and social issues? Score: 6.00 / 6.00

Digital mobilization tools, including social media platforms and communication apps, are available to build support for political and social movements. Much online activism that targets the information and communications technology (ICT) sector is spearheaded by a popular nonpartisan, nonprofit organization called OpenMedia, which advocates for three pillars of internet rights—free expression, access, and privacy.1

Canadians were especially active as part of the online #MeToo movement,2 which prompted the justice minister to consider updating laws to ensure victims of sexual violence are treated more compassionately in courtrooms.3 This online activism also influenced the government to introduce Bill C-65,4 which became law in 2018 and dramatically updated the legal framework for harassment in the federal government and at federally regulated workplaces.5 Online activism likely played a role in the decision to legalize cannabis countrywide,6 which also went into effect in 2018. As the COVID-19 pandemic progressed, Canadians used the internet to help organize in-person protests once again, around issues ranging from Black Lives Matter7 to mask mandates and other pandemic-related public health measures.8 The so-called “Trucker Convoy” of early 2022 in Ottawa was fuelled by online activity, including crowdfunding efforts to financially support attendees.9

In March 2022, the Ontario Superior Court of Justice granted a temporary injunction against TikTok user Brooke Dietrich, ordering her to stop all activist activities on the platform against the antiabortion group 40 Days for Life. The case remains ongoing (see C3).10

C Violations of User Rights

C1 (0–6 points)
Do the constitution or other laws fail to protect rights such as freedom of expression, access to information, and press freedom, including on the internet, and are they enforced by a judiciary that lacks independence? Score: 5.00 / 6.00

The constitution includes strong protections for freedom of speech and freedom of the press. Freedom of speech is protected as a “fundamental freedom” by Section 2 of the Canadian Charter of Rights and Freedoms. Under the Charter, freedom of expression is “subject only to such reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society.”1 These protections apply to all forms of speech, whether online or offline. However, there are a few restrictions that apply to online speech (see C2).

C2 (0–4 points)
Are there laws that assign criminal penalties or civil liability for online activities, particularly those that are protected under international human rights standards? Score: 2.00 / 4.00

Users can face significant criminal penalties for some forms of online expression, as well as civil liability for defamation. Some provincial defamation laws and the general civil liability regime in Quebec also limit freedom of expression online.

Hate speech, advocating genocide, uttering threats, and defamatory libel are all regulated under the Criminal Code.1 Those found guilty of defamatory libel, advocating genocide, and uttering threats can face up to five years in prison. Hate speech is punishable by up to two years in prison. The proposed Bill C-63, which remained in the House of Commons at the end of the coverage period (see B3), would add provisions to the Criminal Code for an “offense motivated by hatred.” Offenders motivated by “hatred based on race, national or ethnic origin, language, colour, religion, sex, age, mental or physical disability, sexual orientation, or gender identity or expression” could face life in prison under the bill.2 Additionally, if an individual suspects that someone will commit a hate crime in the future, they may, with the approval of the attorney general, petition a provincial court judge to restrict the person’s movement or pursue other means to prevent the crime.3

Provincial human rights laws and the Canadian Human Rights Act (CHRA) provide mechanisms for individuals to file defamation complaints rooted in human rights violations.4 The controversial provision of the CHRA prohibiting online hate speech (section 13), which was criticized for being overly broad, was repealed in 2013.5 However, the proposed Bill C-63 would update section 13 and declare it discriminatory “to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication” in certain contexts.6 The inclusion of the Criminal Code and CHRA provisions in Bill C-63 has been widely criticized, with one analyst arguing that they detract from the more proportionate aims of the Online Harms Act.7

In January 2021, an Ontario court broadened the definition of defamation when it recognized a common law tort of “internet harassment” to address the defendant’s online conduct and publications in Caplan v. Atas (see C3 and C7). In this case, the court defined “internet harassment” as “serial publications of defamatory material” used to “harass, harry, and molest” the victim.8

Antispam legislation enacted in 2014 requires opt-in consent to send commercial electronic messages. Critics of the legislation have argued that it is overly broad and overregulates commercial speech.

C3 (0–6 points)
Are individuals penalized for online activities, particularly those that are protected under international human rights standards? Score: 5.00 / 6.00

Score Change: The score declined from 6 to 5 because multiple individuals who report online were arrested during the coverage period while covering protests or police activities.

Individuals do not typically face severe penalties for online activities that are protected under international human rights standards, though courts have recently increased awards in online defamation cases. During the coverage period, at least three digital reporters were arrested while covering protests or police-related developments.

Generally, writers, commentators, and bloggers are not subject to legal sanction for their online content. Internet users are free to discuss any political or social issues without risk of prosecution, unless the discourse violates the hate speech provisions in the Criminal Code or constitutes harassment, which is both a criminal offense1 and now an actionable civil tort in Canada (see C2 and C7).

During the coverage period, multiple journalists were arrested while reporting for digital media. In January 2024, Edmonton police arrested Indigenous journalist Brandi Morin, a reporter for the investigative outlet Ricochet,2 after she interviewed residents of a homeless encampment as it was raided by police. While police had designated the area as a “media exclusion zone,” Morin was inside the encampment before it was closed off, and certain Canadian courts have upheld the right to report in such areas. Morin was reportedly searched before she was detained at police headquarters for five hours. She was charged with obstruction, but the charges were dropped in March 2024.3

In April 2024, Savanna Craig, a reporter for Local 514, a Montreal-based television news show with a significant presence on YouTube,4 was arrested as she covered a pro-Palestinian sit-in at a bank. Craig was detained alongside the protesters, though she clearly identified herself as press to police and presented a press pass. Police recommended that Craig be charged with “mischief.”5 In October 2024, after the coverage period, it was reported that prosecutors would not move forward with charges against Craig.6

Canadian courts take a proactive approach to online defamation cases and have increasingly granted large monetary awards. In January 2020, an Ontario judge issued significant awards for defamation against anonymous online defendants for only the second time in Canadian legal history.7

More recently, in April 2023, an Alberta court awarded C$300,000 (US$221,000) in general damages for defamation, C$100,000 (US$74,000) in general damages for harassment, and C$250,000 (US$184,000) in aggravated damages to an employee of a regional health authority. The employee had been repeatedly defamed and harassed by a former candidate for mayor of Calgary on his online talk show.8 In June 2023, an Ontario court ordered a defendant to pay 53 plaintiffs a total of C$4.7 million (US$3.5 million), in what may have been the largest defamation award in Canadian history.9 The defendant was found liable for defamation after making tens of thousands of website posts falsely portraying the plaintiffs as “sexual predators, fraudsters, and criminals.”10

Canadian defamation cases are open to the defense of fair comment, notably in matters of public interest.11 The fair comment defense has been successful in defamation cases focused on both traditional media12 and online media,13 providing a significant safeguard against defamation penalties for speech protected under international human rights standards.

In September 2022, an Ontario court rejected Brooke Dietrich’s motion to dismiss a defamation case brought against her by the antiabortion group 40 Days for Life, in connection with Dietrich’s TikTok campaign against the group (see B8).14 Dietrich appealed the decision and challenged an injunction that prevents her from posting on TikTok (see B2).15 In August 2024, after the coverage period, Ontario’s highest court dismissed Dietrich’s appeal, allowing the defamation case against her to proceed.16

C4 (0–4 points)
Does the government place restrictions on anonymous communication or encryption? Score: 4.00 / 4.00

The government does not restrict anonymous communication or encryption. Canadians are free to use encryption services and communicate anonymously online without any fear of civil or criminal sanction. While Bill S-210 contains provisions requiring “age verification methods” for adult content on the internet, which could compromise online anonymity, this bill had not passed by the end of the coverage period (see B3).

In August 2019, the minister of public safety and emergency preparedness suggested that technology companies must actively combat the online exploitation of children, which he said is facilitated by encrypted communications.1 In October 2020, the “Five Eyes alliance”—five countries that maintain an intelligence operations agreement, including Canada—joined the governments of Japan and India in requesting a “backdoor” for law enforcement to access encrypted communications services.2 The joint statement expressed support for strong encryption while claiming that end-to-end encryption without a backdoor for law enforcement could undermine public safety.3 In October 2023, during the trial of a Royal Canadian Mounted Police (RCMP) officer accused of leaking secrets, it was revealed that other members of the Five Eyes alliance were concerned that Canada had apparently failed to prevent criminals from purchasing encrypted phones.4

C5 (0–6 points)
Does state surveillance of internet activities infringe on users’ right to privacy? Score: 4.00 / 6.00

State surveillance of internet users under limited circumstances may infringe on privacy rights. In 2015, the government passed Bill C-51, the Anti-Terrorism Act, which permits information sharing across government agencies for a wide range of purposes, many of which are unrelated to terrorism. Several efforts to reform Canada’s antiterrorism laws have subsequently materialized, most recently with Bill C-59.

Bill C-59, an Act Respecting National Security Matters,1 was passed in June 2019 to address some of the more problematic provisions of the Anti-Terrorism Act.2, 3 The law limits the broad criminal-speech provisions in Bill C-51. Bill C-59 is also meant to enhance parliamentary oversight through the creation of a National Security and Intelligence Review Agency and an Office of the Intelligence Commissioner.4 It still allows the government to engage in cyberoperations, but its powers are more limited.5 Civil society groups raised concerns that Bill C-59 does not fully address surveillance issues posed by previous legislation6 and still grants too much power to the government, including for mass data collection.7 In February 2021, judges began hearing related cases and have set limits on the Canadian Security Intelligence Service (CSIS), including its ability to spy on foreign countries.8

The Office of the Privacy Commissioner (OPC) provides an important oversight function concerning the privacy of users’ data. The privacy commissioner, Philippe Dufresne, is an officer of Parliament who reports directly to the House of Commons and the Senate. The commissioner’s mandate includes overseeing compliance with the Privacy Act,9 which covers the practices of federal government departments and agencies on the handling of personal information. A general right to privacy is not enshrined in Canadian law, though the Canadian Charter of Rights and Freedoms includes protections against unreasonable search or seizure, which are often interpreted as privacy rights.10

In December 2021, Prime Minister Trudeau announced that he would propose legislation to strengthen privacy protections for users, provide for significant monetary penalties for noncompliance, give massive enforcement powers to the federal privacy authorities, and establish a new privacy tribunal.11 The federal government seeks to catch up with provincial privacy laws, notably Quebec’s 2021 privacy reforms. Quebec’s reforms, which are similar to the General Data Protection Regulation (GDPR) of the EU,12 entered into force in September 2023.13

In June 2022, the government introduced Bill C-27, the Digital Charter Implementation Act.14 In addition to the new privacy protections, Bill C-27 also includes the Artificial Intelligence and Data Act to introduce a regulatory framework for AI systems.15 The bill remained under consideration in a parliamentary committee during the coverage period.16

In June 2022, the national police force disclosed that it uses spyware to hack suspects’ phones or laptops and collect data, including by turning on device cameras and microphones remotely. According to the RCMP, spyware is only used during serious criminal and national security investigations when less intrusive techniques are unsuccessful; its use always requires authorization from a judge. The force reported deploying spyware in 10 investigations between 2018 and 2020.17 In response to the disclosure, the Canadian Civil Liberties Association (CCLA) called for a moratorium on the RCMP’s use of spyware in August 2022.18 A report released by a parliamentary ethics committee in November 2022 recommended that the government create a list of banned spyware vendors and require government entities to submit privacy impact assessments prior to the use of “high-risk technological tools,” though it stopped short of reiterating calls for a spyware moratorium.19

In November 2023, media reports revealed that 13 federal departments and agencies were using software that can extract personal data, including information about online activities, from government-issued computers and mobile devices.20 The government bodies have used the software, referred to as “digital forensics tools” by the privacy commissioner,21 for internal investigations without conducting privacy impact assessments.22 A House of Commons committee launched an investigation the following month.23

The COVID-19 pandemic raised concerns that authorities could erode privacy rights. The OPC’s annual report released in September 2023 reiterated the need for heightened privacy protections and reforms to privacy laws in the aftermath of the pandemic.24 In May 2023, the OPC published an investigation into whether health authorities overreached when analyzing Canadians’ cell phone location data during the pandemic. The investigation found that the public health authorities took adequate measures for the de-identification of personal data and implemented protections to prevent re-identification, determining that privacy complaints were unfounded and the Privacy Act had not been violated. However, the OPC also provided the public health authorities with several recommendations to strengthen privacy protections.25

C6 (0–6 points)
Does monitoring and collection of user data by service providers and other technology companies infringe on users’ right to privacy? Score: 4.00 / 6.00

Both ISPs and mobile service providers may be legally required to aid the government in monitoring communications of their users.

The OPC and the privacy commissioner oversee compliance with the private sector privacy law,1 the Personal Information Protection and Electronic Documents Act (PIPEDA).2 PIPEDA was modified by the 2015 Digital Privacy Act,3 which made it easier for companies to make voluntary warrantless disclosures of personal information under certain circumstances by allowing for such disclosures to be made to any organization, not just law enforcement. The act also established new mandatory security breach disclosure requirements, which came into force in 2018.4 PIPEDA, however, provides regulators with relatively weak enforcement powers. A privacy protection bill presented in June 2022 (see C5), which would implement a new Consumer Privacy Protection Act to replace PIPEDA in large part, includes significant fines for noncompliance with the bill’s data protection framework, similar to penalties found in the GDPR.5

The OPC continues to call for changes to the Privacy Act, which has not been significantly amended since 1983.6 Notably, the OPC has called for legislation to require mandatory data-breach reporting by the government. The office argues that the act is outdated, does not reflect current digital privacy concerns, and allows the government too much latitude to collect personal information.7

The OPC shocked the legal community in 2018 when it released a draft position paper concluding that PIPEDA contained a European-style “right to be forgotten” provision.8 Commentators questioned the OPC’s conclusions and reasoning.9 In 2018, the OPC submitted a reference question to the Federal Court to clarify whether indexing web pages and presenting results about a person’s name in Google’s search function fall under PIPEDA, which would support the OPC’s position on the right to be forgotten. In July 2021, the Federal Court issued its decision and stated that Google searches fall under PIPEDA.10 Google appealed the decision, and in September 2023, the Federal Court of Appeal upheld the lower court’s ruling.11 However, Google can still appeal to the SCC.12

The OPC investigates major data breaches and other matters to determine whether private companies comply with PIPEDA. During the OPC’s investigation of the Cambridge Analytica scandal, in which Cambridge Analytica improperly accessed the personal data of Facebook users, Facebook refused to take significant corrective measures or implement the OPC’s recommendations.13 In February 2020, the OPC filed an application with the Federal Court seeking a declaration that Facebook had violated PIPEDA and an order requiring Facebook to take corrective action.14 In April 2023, the Federal Court ruled that Facebook had not violated PIPEDA.15 The OPC appealed the decision,16 and in September 2024, after the coverage period, the Federal Court of Appeal unanimously overturned the ruling on the grounds that Facebook had breached PIPEDA.17 In June 2022, the OPC, in a joint investigation with several of its provincial counterparts, found that Tim Hortons, a popular donut and coffee chain, had violated PIPEDA and provincial privacy laws by tracking the location of its mobile app users without proper consent.18 The OPC has also launched ongoing joint investigations into the privacy practices of both TikTok and OpenAI’s ChatGPT.19

The SCC has also expanded privacy rights relating to technology and digital communications. In 2018, the court ruled that privacy rights are still protected when a computer is shared with others.20 In 2017, the court extended the right to privacy to text messages in a pair of companion cases. In the first, the court held that there could be a reasonable expectation of privacy in received text messages, whereas previously, privacy protections applied only to sent messages.21 In the second, the court held that the sender of text messages has a reasonable expectation of privacy even when the messages are stored on the telecommunications provider’s computers.22 However, the SCC has not found a reasonable expectation of privacy on the internet in more egregious circumstances, such as Facebook messages and emails examined in a child-luring police sting.23 In March 2024, the SCC found that there is a reasonable expectation of privacy with regard to IP addresses only in some cases, on the grounds that an IP address alone does not necessarily reveal sensitive personal information.24 The court determined that the reasonable expectation of privacy is not absolute, but depends on the facts of the case and the totality of the circumstances.

Numerous court decisions have made it easier for Canadians to seek legal redress against foreign internet companies for privacy violations. In a landmark 2017 decision, the SCC ruled that residents of British Columbia could bring a class-action suit against Facebook for violating privacy rights in a British Columbia court, despite Facebook’s choice-of-forum clause specifying California as the jurisdiction.25 Other courts have issued similar decisions, including a Quebec court that found Yahoo’s choice-of-forum clause invalid because its terms and conditions formed part of a consumer contract, granting jurisdiction to Quebec.26 While Yahoo’s clause specified another Canadian province (Ontario) rather than another country, the court’s reasoning could clearly apply internationally. In a significant 2017 decision, the Federal Court found that PIPEDA could be applied globally and ordered a Romanian website to remove court decisions that contained easily searchable personal information of Canadian citizens. The site was ordered never to post such information again,27 and the court ordered it to pay damages to the plaintiff.

C7 1.00-5.00 pts0-5 pts
Are individuals subject to extralegal intimidation or physical violence by state authorities or any other actor in relation to their online activities? 5.005 5.005

There were no documented cases of violence or physical harassment in retaliation for online activities during the reporting period. However, cyberbullying, cyberstalking, and general online harassment, particularly affecting young people, are on the rise.1 A government study released in September 2023 found that one-quarter of Canadian teenagers had experienced cybervictimization.2 Some groups reported higher rates of cybervictimization, with close to 50 percent of transgender or nonbinary teenagers experiencing such harms.3 The Online Harms Act was introduced in part to address these issues (see B3).

Women, including journalists, activists, and politicians, have also reported facing online intimidation and misogynistic messages. Media reports from August 2022 noted a recent intensification in such threats against women. Women journalists, especially women journalists of color, shared anonymous emails they had received that threatened violence and sexual assault, which also contained misogynistic and racist language.4

While individuals have increasingly faced negative consequences for their online opinions on the Israel-Hamas war (see B4), there have been no reports of physical violence or widespread threats of violence. However, individuals have reported targeted threats and harassment. Journalist Saba Eitizaz, who moved to Canada from Pakistan, reported receiving an escalating stream of violent, intimidating messages in October 2023,5 including emails threatening violence against Muslim refugees in Canada.6

A landmark 2016 civil court decision, which ordered a man to pay C$100,000 (US$74,000) to his former partner for publishing intimate videos of her without her consent, has grown in significance in recent years.7 Though the details of the case remained in flux following the early 2016 decision,8 the privacy tort of “public disclosure of private facts” that the judge’s original decision established has since been adopted by several courts. The tort was applied in a 2018 decision that found an individual liable for posting a sexually explicit video of a person without their consent on a pornographic website; the individual was ordered to pay C$100,000 (US$74,000) in damages.9 The tort was also applied in a different province for the first time in September 2021, when the Court of King’s Bench of Alberta (then the Court of Queen’s Bench) used it to award C$185,000 (US$136,000) in damages to a victim of nonconsensual distribution of intimate images.10

The 2016 case continues to be cited by other plaintiffs, authors, and courts.11 The Saskatchewan Court of King’s Bench, for instance, first recognized the tort of “public disclosure of private facts” in a September 2022 decision, awarding damages of C$160,000 (US$118,000) to a victim whose husband uploaded intimate images of her to a pornographic website without her consent.12 There are also increasing calls for tech companies to do more to remove private material published without consent13 and to face criminal penalties for failing to do so.14 The proposed Online Harms Act addresses this issue (see B3), as “intimate content communicated without consent” is one of the seven targeted harms.

Additionally, many provinces, including Manitoba15 and Alberta,16 have previously passed laws that create civil torts for unauthorized distribution of intimate images and videos. British Columbia enacted such a law in March 2023,17 which came into effect in January 2024.18 Individuals are still prosecuted under Section 162.1 of the Criminal Code, which makes it a crime to publish, distribute, transmit, or sell intimate images without the consent of the person depicted.19 In 2023 alone, there were 3,064 criminal incidents of nonconsensual distribution of intimate images in Canada.20

C8 1.00-3.00 pts0-3 pts
Are websites, governmental and private entities, service providers, or individual users subject to widespread hacking and other forms of cyberattack? 2.002 3.003

Cyberattacks and data breaches have become a serious issue in Canada, generally rising in number every year. During the 2022–23 period, the OPC received 681 data breach reports under PIPEDA, an increase of 6 percent from the previous year.1 The number of reported breaches has increased significantly since a PIPEDA requirement that private companies report data breaches to the OPC came into effect in 2018 (see C6). It is unclear whether breaches are actually becoming more frequent or whether the mandatory reporting requirement has simply produced more accurate data. However, many observers believe that cybercrime in Canada remains a larger problem than the statistics reveal due to underreporting.2

Certain federally regulated industries are not covered by the mandatory breach reporting requirements found in PIPEDA. To fill this gap and provide for a more secure infrastructure, in June 2022, the federal government advanced Bill C-26, which would enact the Critical Cyber Systems Protection Act (CCSPA).3 The legislation would create new cybersecurity regulations, such as mandatory breach reporting and requirements to create cybersecurity programs for critical infrastructure designated “vital services” or “vital systems,” including telecommunications, energy, finance, and transportation.4 Analysts have noted that CCSPA is unprecedented because it would impose mandatory breach reporting requirements on national security grounds rather than to protect personal data, as is the case under PIPEDA.5 The bill progressed from the House of Commons to the Senate in June 2024, after the coverage period.6

Statistics Canada reported that 70 percent of internet users suffered a cybersecurity incident during 2022, up from 58 percent in 2020.7 A 2022 survey by the Canadian Internet Registration Authority (CIRA) indicated that 29 percent of respondent organizations had experienced a breach of customer or employee data, or both, within the previous year.8 A Statistics Canada report released in July 2023 found that some businesses have significantly increased their spending on cybersecurity prevention in the wake of the COVID-19 pandemic.9

An August 2023 report from the government’s Canadian Centre for Cyber Security (CCCS) warned that Canada’s national security and economy will remain threatened by organized cybercrime groups in the coming years.10 The report identified Russia and Iran as major hubs of cybercrime, serving as bases of operations for cybercriminals, and claimed that “Russian intelligence services and law enforcement almost certainly maintain relationships with cybercriminals and allow them to operate with near impunity.” Ransomware was identified as the most destructive form of cybercrime in the country.11

In December 2022, Amnesty International Canada announced that it had been the victim of a suspected Chinese state-sponsored cyberattack two months earlier. The attack was reportedly intended to surveil the organization’s work and to obtain the personal information of its collaborators.12 Amnesty stated that no donor or membership data had been breached in the attack.13

In March 2022, a University of Toronto–based Citizen Lab report on digital transnational repression in Canada found that foreign dissidents and activists living in Canada, after fleeing their countries of origin to evade repression, had increasingly been the targets of hacking and phishing attempts and experienced takeovers of their social media and email accounts in recent years. Some reported cutting off contact with friends and relatives in their countries of origin, out of concern for their safety.14

Cyberattacks and data breaches have also affected federal government agencies and actors in recent years. The OPC reported that breach reports received from the public sector dropped from 463 during the 2021–22 fiscal year to 298 in 2022–23, a decrease of 36 percent.15 The drop may reflect the absence of mandatory data breach reporting requirements in the public sector rather than an actual decline in breaches.

In January 2024, it was reported that a data breach at Global Affairs Canada had compromised the personal information of certain employees,16 with the breach reportedly unfolding over several weeks between December 2023 and January 2024.17 In February 2024, the OPC opened an investigation to assess Global Affairs Canada’s compliance with the Privacy Act.18 Additionally, the Financial Transactions and Reports Analysis Centre (FINTRAC), Canada’s financial intelligence agency, was forced to shut down some systems temporarily in March 2024 following a “cyber incident,” and the RCMP was also targeted by a “cyber event” around the same time.19 Given the sensitive nature of these agencies, few details about the incidents have been made public.

A report issued by the Auditor General after the coverage period, in June 2024, warned that Canadian security agencies, including the RCMP, lack the capacity and capabilities necessary to combat cybercrime.20
