Week 35: Scams involving public figures – the dark side of artificial intelligence

02.09.2025 - Last week, we received numerous reports about fake ads and videos featuring President Karin Keller-Sutter. For this reason, this week's review will focus on the use of artificial intelligence to manipulate content. Scammers are using deepfakes to portray the finance minister as a supposed supporter of fraudulent investment platforms. Their goal is to exploit people's trust in public officials to trick them into taking part in a scam – in this instance, making a fake investment. This week's review explains how you can recognise a scam video or ad, and how you can protect yourself.

For some time now, fake adverts have been circulating on social networks such as YouTube and on various websites, featuring President Karin Keller-Sutter. These ads and videos are designed to look like interviews and give the false impression that the head of the Federal Department of Finance is endorsing a lucrative investment platform. It is no coincidence that Ms Keller-Sutter is the subject of these videos: criminals deliberately exploit the trust people place in senior government officials to lend credibility to their fraudulent financial offers. People are generally more inclined to trust videos featuring familiar faces than plain text ads. By linking a well-known public figure with the promise of quick and easy profits, scammers try to cloud their victims' judgement. This is a hallmark of investment fraud.

Faces and voices – the world of deepfakes

These kinds of manipulated images and videos are known as deepfakes. The term is a blend of deep learning (a method of artificial intelligence) and fake. To make a deepfake, an AI is trained using existing photos and videos of the target person. The AI then manipulates video, images or audio recordings to make it appear as if that person said or did something that never actually happened. Thanks to social media and publicly accessible online sources, this type of material is easy to find for celebrities – and for many other people.

The most common techniques include face swapping, where one face in a video is replaced with another, and facial reenactment, where one person's facial expressions and lip movements are transferred onto another person. This is often combined with voice cloning, where someone's voice is artificially recreated. This used to require specialist skills and expensive equipment. Today, however, the technology is intuitive and available to anyone through freely accessible software and apps.
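
To give a rough idea of how accessible the underlying building blocks have become, the following Python sketch uses the freely available OpenCV library to locate faces in a single video frame – the detection step on which face-swapping tools build. It performs no swap itself, and the file name frame.jpg is merely an illustrative placeholder.

    # Sketch: locating faces in one video frame with OpenCV.
    # Face-swapping pipelines start from this kind of detection step
    # before replacing or re-animating the detected region.
    import cv2

    # Haar-cascade face detector bundled with OpenCV
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    frame = cv2.imread("frame.jpg")  # placeholder: one exported video frame
    if frame is None:
        raise SystemExit("frame.jpg not found")

    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # detector expects greyscale
    faces = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        print(f"Face found at x={x}, y={y}, width={w}, height={h}")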

Example of a deepfake video featuring Federal Councillor Karin Keller-Sutter on YouTube.
Example of a fake online investment ad.

How to spot digital fakes

The best defence against this kind of manipulation is a healthy dose of scepticism and critical thinking. Always keep in mind that videos and images are easily altered and uploaded to social networks and other websites. Before focusing on technical details, consider the context and content, and trust your instincts. If an offer seems too good to be true, or a public figure appears to be acting out of character, proceed with caution. Always cross-check information with reliable sources, such as official websites or established news agencies.

There are also other warning signs that may indicate a fake. The following checklist summarises the key points:

  • Face and eyes: unnatural blinking (too much or too little), fixed or empty stare, mismatched skin tone around the edges of the face.
  • Mouth and voice: lip movements not in sync with speech, robotic or flat tone, odd pauses, unusual emphases.
  • Image quality: blurry or distorted edges around the face and hair, inconsistent lighting or flickering in the video.
  • Content and context: is the person saying something completely out of character? Is the offer too good to be true? Does the statement being made actually make sense?
  • Source: is the video on an official channel? Are reputable media outlets reporting on it? A quick online search often helps.

Although there are software tools for detecting deepfakes, there is no single magic tool that works flawlessly. The technology for creating fakes is evolving just as quickly as the technology for detecting them. Staying alert and media-savvy remains the most effective way to spot manipulated content.
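
By way of illustration, the short Python sketch below uses the Pillow library to read an image's EXIF metadata – one of the weak signals automated checks can draw on. Missing or inconsistent metadata proves nothing on its own, since genuine images often lose their metadata when re-encoded by social networks; the file name ad_screenshot.jpg is merely a placeholder.

    # Sketch: reading EXIF metadata from a saved image with Pillow.
    # Absent or odd metadata is at most a weak hint of manipulation.
    from PIL import Image
    from PIL.ExifTags import TAGS

    img = Image.open("ad_screenshot.jpg")  # placeholder: screenshot of a suspect ad
    exif = img.getexif()

    if not exif:
        print("No EXIF metadata found - inconclusive on its own.")
    else:
        for tag_id, value in exif.items():
            # translate numeric tag IDs into readable names where known
            print(f"{TAGS.get(tag_id, tag_id)}: {value}")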

Recommendations

  • Do not click on links in an ad and never enter personal details. Do not forward dodgy videos or adverts.
  • Report any suspicious content to the platform on which it was published (e.g. YouTube or Facebook). This is the quickest way to have it removed.
  • Report scams to the National Cyber Security Centre (NCSC) using the official reporting form.
  • If you have suffered financial loss as a result of this type of scam, file a criminal complaint with the police immediately. You can use the Suisse ePolice website to find nearby police stations and their phone numbers (available in German, French and Italian).

Last modification 02.09.2025

https://www.ncsc.admin.ch/content/ncsc/en/home/aktuell/im-fokus/2025/wochenrueckblick_35.html