Fraud gets personal with social engineering

By Anthony Green
Sep 9, 2024

Fraud is becoming increasingly personal, and can come with a face you know, a voice you recognize, and a knack for witty conversation. It’s all part of social engineering, a type of fraud that uses psychological manipulation to gain the trust of individuals.

Threat actors can use social engineering to extract money and sensitive personal information from victims through methods such as forging supposed romantic relationships or impersonating a family member in an emergency. Due to developments in AI, such as deepfakes and voice cloning, social engineering is becoming more sophisticated and harder to detect. 

Romance scams

Social engineering is making romance scams easier to fall head over heels for. After connecting on dating apps, threat actors research their potential victims via social media. With this information, they form close bonds with their targets by pretending to share common interests and exploiting their vulnerabilities. AI tools like ChatGPT can help threat actors win over their victims with witty, engaging text conversations. 

Video chatting is no longer an issue for threat actors, as they can use deepfake technology to chat with potential victims in real time. In these videos, threat actors swap their faces for faces from stolen and stock photos – the same photos used to create the fake profiles that enticed the potential victims. 

Deepfake technology uses facial recognition to superimpose one person’s face onto another person’s body. To create a deepfake, a creator gathers images of the person whose likeness they want to generate and inputs this material into a deepfake program, such as any number of widely accessible face-swapping apps. 

In romance scams, threat actors are quick to profess their love, and once they’ve gained their victim’s trust and affection, they convince them to send money or make a specific investment, usually in cryptocurrency. 

A group of scammers known as Yahoo Boys have deceived countless individuals worldwide through these methods. Yahoo Boys have openly admitted to their schemes and have posted disturbing content of video chats with unsuspecting women. 

Here are some ways to tell if your romantic match is a fraud:

  • Identify deepfakes: When video chatting, watch for abnormalities such as unnatural eye movements, or the same set of repeated motions or recurring facial gestures. Also keep an eye out for glitches such as noticeable jumps or blurring around the edges of the person’s face. Pay attention to whether the person’s lips match up with the audio.
  • Quick to fall in love, and financial requests: Any time someone professes their love too quickly, it’s a red flag. The same goes for requests to send money or invest in something. 
  • Suspicious links: Threat actors are known to send links that allow them to gain remote access to their victims’ devices.
  • Photo confirmation: Do a reverse image search on their photo to see if the person is really who they say they are. 

Grandparent scam

Another type of social engineering is the “grandparent scam”, where threat actors call seniors, usually late at night, claiming to be a grandchild or other family member in distress. Examples include being involved in a car accident, getting arrested, or being kidnapped. The caller pleads for their “grandparent” to immediately send money to rescue them from the situation.

In these types of scams, threat actors are increasingly using voice cloning to convince their potential victims. Similar to deepfake, voice cloning technology is widely available, and all that’s required to create a replica of someone’s voice is a short sample. Threat actors often extract these voice samples from social media accounts, while also pulling personal information, such as the names of family members. 

Even if the voice clone isn’t perfect, the call can still be convincing as the caller may muffle their voice under the guise of crying or being in distress. The caller will urge the intended victim not to tell anyone by saying things such as, “Please don’t tell mom, grandma. She’s going to be so upset,” or “If you tell anyone, the kidnapper will hurt me.” Another voice will often take over, under the guise of a police officer, lawyer, kidnapper, etc.

The intended victim is then provided with instructions on how to forward the requested funds, often through methods such as e-transfer, wire transfer, gift cards, or sending cash by courier. Threat actors sometimes even show up at the victim’s door to collect payment. 

It’s important to note that this type of scam can be adjusted to target anyone, as the caller can claim to be any type of family member or other close contact. Alternatively, different versions of this scam can be communicated through social media and text messaging.  

If you suspect you are the target of a grandparent or emergency scam:

  • Don’t answer the caller’s questions: Threat actors will ask questions to probe for information that they can use to add legitimacy to their story. Instead, make sure you’re the one asking questions. Dig for details that only the person they’re claiming to be would know. 
  • Don’t rush: While threat actors will try to rush you to send money, don’t give in to this pressure. Remain calm and take your time thinking through the situation. 
  • Contact others: If it’s possible to use another device while staying on the line, try contacting close contacts of the supposed caller, perhaps through social media or text messaging, to verify the caller’s story. You might even contact the person the caller is claiming to be.

As social engineering scams continue to evolve, it is important to safeguard yourself by regulating the information you share on social media. Having this awareness can help you identify potential fraud. For example, does your romantic interest genuinely share a love of country music, or did they see the Instagram photos you posted from a country music festival?

And while it can be alarming to hear the voice of a loved one in distress, remember that you can’t take everything you see or hear at face value these days. Keep in mind that law enforcement and emergency medical services will never demand immediate payment, much less in unusual forms such as wire transfers or gift cards. 

If you have been or suspect you’ve been a victim of fraud, contact the Canadian Anti-Fraud Centre at 1-888-495-8501 or report it online.


Anthony Green is manager, security operations and compliance at CPABC.
