
Are Nude Deepfakes Illegal in Canada Today?

  • Writer: Lina Zhang
  • Oct 27
  • 6 min read
[Image: a woman in a hoodie looks shocked at a laptop screen displaying the word "Deepfake."]

It takes one click to fake a nude. A stranger can upload your photo into an AI app and, within seconds, produce a sexual image that never existed but looks real enough to destroy a reputation.


These “nude deepfakes” are spreading fast — and many Canadians are asking the same question: are nude deepfakes illegal in Canada today?


This article breaks down what the law actually says, what’s missing, and what you can do if it happens to you.



What the Law Says — Are Nude Deepfakes Illegal in Canada?


Under section 162.1 of the Criminal Code, it’s a crime to share or threaten to share an intimate image of someone without consent. But that law was written before AI could create lifelike fakes. It only applies when the image shows a real person and a genuine private act.


That means AI-generated or synthetic nudes might not meet the legal definition of an “intimate image.” Lawyers at Collett Read LLP note that the Criminal Code doesn’t yet address fake or fabricated content that appears real. Victims can still report the incident to police if the deepfake is used to harass, threaten, or extort them — but criminal charges depend on how clearly the image connects to a real likeness.


Other offences, such as harassment or extortion, may apply. But Canada still has no federal law that directly bans the creation of nude deepfakes.



Can Police or Courts Actually Help With Deepfakes?


Police can investigate deepfake abuse, but their ability to act depends on how the image is used. When a deepfake involves threats, blackmail, or harassment, the conduct can fall under existing criminal offences. Still, most police departments have limited training and few digital-forensics tools to confirm whether a photo or video is AI-generated.


The RCMP’s National Cybercrime Coordination Centre (NC3) and local police units can take reports, yet non-criminal cases are often redirected to provincial tribunals or privacy regulators. Even if no immediate charge is laid, filing a report helps create a legal record and strengthens future civil action. Bring evidence: screenshots, URLs, upload times, and any messages that show intent or harm.



How Provinces Are Filling the Gaps


British Columbia’s Intimate Images Protection Act (IIPA), launched in 2024, explicitly includes “real, fake, or altered” content — covering AI-generated nudes. Victims can file a claim through the Civil Resolution Tribunal (CRT) to have content removed and de-indexed, and to seek damages of up to $5,000.


Other provinces such as Alberta, Manitoba, and Prince Edward Island have similar laws, though not all mention AI directly. These acts provide takedowns and compensation, but not criminal penalties.

BC’s IIPA remains one of the few Canadian laws that fully recognizes deepfakes as a form of image-based abuse.



What If the Deepfake Comes From Overseas?


Many deepfakes are uploaded to websites or servers outside Canada, sometimes by anonymous users abroad.


If that happens, Canadian law still applies to the harm suffered here — but enforcing removal or prosecution overseas is difficult. Police can request cooperation through Mutual Legal Assistance Treaties (MLATs), but those processes are slow and rarely effective against anonymous users.


Victims usually get faster results by reporting directly to platforms, filing Google de-indexing requests, and using provincial civil options to document harm. Even if the creator can’t be found, these actions can help restore privacy and strengthen any future legal claim.



How the Rest of the World Handles Nude Deepfakes


In the United States, more than 30 states criminalize non-consensual deepfake pornography. California allows both criminal charges and civil lawsuits against anyone who creates or distributes an AI-generated sexual image without consent. The federal TAKE IT DOWN Act (2025) also requires platforms to remove non-consensual intimate or deepfake content quickly.


Across Europe, the EU’s 2024 Directive on Combating Violence Against Women will force member states to outlaw sexual deepfakes by 2027. France has already amended its Criminal Code (Article 226-8-1) to criminalize pornographic deepfakes.


Globally, governments are treating deepfakes as a form of digital abuse — while Canada continues to rely on older privacy laws that weren’t built for AI technology.



What About AI Voice and Video Deepfakes?


Deepfakes aren’t just visual. AI voice cloning is spreading quickly — and anyone who’s ever posted a TikTok, Reel, podcast, or video with sound can be targeted.


With just a few seconds of recorded speech, free online tools can clone your voice and create entire audio clips of you saying anything — even explicit or sexual content that was never real.


Right now, AI voice cloning law in Canada is a grey area. Most intimate-image laws, like BC’s IIPA, only cover visuals. If an AI video or audio uses your voice instead of your face, it may fall under impersonation, harassment, or fraud statutes instead.


For victims, the impact is the same: humiliation, fear, and loss of control over something as personal as their own voice. Lawmakers are beginning to debate whether “intimate image” should include synthetic voices and videos, but protection currently depends on how the content is used and whether harm or intent can be proven.



What To Do If Someone Makes a Deepfake of You


  1. Collect evidence – take screenshots, copy URLs, note dates and usernames (a simple logging sketch follows this list).

  2. Report the content – use each platform’s “non-consensual image” form.

  3. File a police report – especially if there’s threat, extortion, or harassment.

  4. Seek civil help – in BC, use the Civil Resolution Tribunal; elsewhere, contact a lawyer.

  5. Request removals – submit Google’s “non-consensual explicit imagery” form.

  6. Reach out for support – VictimLink BC, Cyber Civil Rights Initiative, and Ending Violence Canada offer help and guidance.
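
If you’re comfortable with a small script, you can make step 1 easier to verify later. The Python sketch below is a minimal example, not an official tool — the file names and URL shown are placeholders. It appends each saved screenshot’s SHA-256 hash, the source URL, and a UTC timestamp to a CSV log, which helps show your copies haven’t been altered since capture.

    # Minimal evidence-log sketch (placeholder names throughout).
    # Records a UTC timestamp, the source URL, the screenshot file name,
    # and the file's SHA-256 hash so the copy can later be shown unaltered.
    import csv
    import hashlib
    from datetime import datetime, timezone
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def log_evidence(screenshot: Path, source_url: str, log_file: Path) -> None:
        """Append one row: when captured, where it appeared, file, hash."""
        is_new = not log_file.exists()
        with log_file.open("a", newline="") as f:
            writer = csv.writer(f)
            if is_new:
                writer.writerow(["captured_utc", "source_url", "file", "sha256"])
            writer.writerow([
                datetime.now(timezone.utc).isoformat(),
                source_url,
                screenshot.name,
                sha256_of(screenshot),
            ])

    # Hypothetical usage -- substitute your own file and URL:
    # log_evidence(Path("screenshot1.png"), "https://example.com/post/123",
    #              Path("evidence_log.csv"))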




The Bottom Line on Deepfakes in Canada


Are nude deepfakes illegal in Canada today?


Not always. Sharing or threatening to share a real intimate image is clearly a crime, but AI-generated sexual content still falls into legal grey zones. Civil laws like BC’s IIPA offer stronger protection, while federal reform remains overdue.


Until Canada updates its laws, your best defence is awareness, documentation, and swift reporting. Knowing your rights — and acting on them — is the most powerful tool you have against digital exploitation.



FAQ: Deepfake and AI Image Laws in Canada


Are nude deepfakes illegal in Canada under current law?


Not fully. Canada’s Criminal Code section 162.1 bans sharing real intimate images without consent, but it doesn’t name AI-generated content. A deepfake might qualify only if it clearly depicts a real person and causes harm. BC’s Intimate Images Protection Act includes “fake or altered” images, giving victims better civil options.


Can you sue someone for making or sharing a deepfake in Canada?


Yes. In BC, victims can use the Intimate Images Protection Act to order takedowns and seek damages up to $5,000. Other provinces allow privacy or defamation suits. These are civil, not criminal, remedies.


What should you do if a fake nude or deepfake of you appears online?


Move fast. Save screenshots, report the content to each platform, file a police report if there’s threat or extortion, and request removal through Google or a provincial tribunal. Acting early limits exposure.


Are AI-generated voices or videos covered by Canadian law?


Not clearly. Current laws focus on visual images. If someone clones your voice from social media to make explicit audio, it may fall under impersonation or harassment laws, but there’s no national rule yet.


Can you report a deepfake to police in Canada?


Yes. Report harassment or threat-related deepfakes to local police or the RCMP’s National Cybercrime Coordination Centre (NC3). Bring URLs, screenshots, and messages showing intent. A report builds a legal record even if no charges follow.


What if the deepfake was created or shared outside Canada?


You can still act locally, but enforcement abroad is limited. Use platform takedowns, Google removals, and provincial tribunals to document harm while police request international assistance.


Is it illegal to create a sexualized deepfake of a celebrity?


Likely yes. Parody may be protected, but sexual or defamatory deepfakes of any identifiable person can violate privacy and defamation laws. Most platforms now ban this content outright.


Are new deepfake or AI laws coming to Canada?


Yes. The proposed Artificial Intelligence and Data Act (AIDA) aims to regulate harmful AI tools, and legal experts want deepfake offences added. For now, Canada relies on older privacy and harassment laws.


How can Canadians protect themselves from deepfakes?


Keep personal images private, limit voice-heavy posts, reverse-search your name regularly, and report fake profiles quickly. Awareness and prevention remain the best defence.


