The email from the boss asking to transfer money to a new supplier. The link in an email from a client whose return address is slightly off. The request from somebody in the IT department asking for system login information.
We read about these typical spearphishing attempts, and more often than not we think: Who falls for these? And how could they?
The answer to the first question is simple: You do. We do. Nearly 800,000 people fell victim to cyberscams in 2020, according to the Federal Bureau of Investigation’s Internet Crime Complaint Center. That was an increase of around 69% over 2019, with reported losses of more than $4 billion last year.
As for the second question? Blame it on our brains. Criminals lure smart people into their traps by taking advantage of the unconscious, automatic processes that act as shortcuts to make our decision-making more efficient. These cognitive biases—arising from what’s often referred to as our “lizard brains”—can cause us to misinterpret information and make snap judgments that may be irrational or inaccurate.
“Cybercriminals will do anything they can to trigger the lizard brain,” says Kelly Shortridge, a senior principal at Fastly, a cloud-computing-services provider. They will use corporate logos we’re familiar with, or tell us to act fast or our bank account will be shut down, or hijack personal information from social media to impersonate a friend or an executive—whatever it takes to get users to click on a link, open an attachment, wire money or send compromising information.
Recognizing that we have biases is the first step to overcoming them. It isn’t easy, even if these biases are sometimes painfully obvious. “If we can’t even eat healthy foods and go to the gym regularly, how can we realistically ask users to check every single bias?” asks Ms. Shortridge.
We can start by understanding the big ones.
Loss aversion

The idea is simple: People feel the pain of a loss much more acutely than the pleasure of a gain of equivalent value. That’s why, when people are given $100 and then offered a coin toss with a 50/50 chance of losing that $100 or doubling it to $200, most decline the gamble. The idea of losing $100 is just too hard to swallow.
The result, says Cleotilde “Coty” Gonzalez, a professor of decision sciences at Carnegie Mellon University, is that “if something is presented as a loss, we are more willing to take a risk [to avoid it]; if it’s presented as a gain, we are OK with taking a safe option.”
Prof. Gonzalez says scammers use this insight when sending phishing emails. If an email arrives saying that your alarm service is about to be shut off because you haven’t paid a monthly fee, for example, you may click on the link to prevent losing your security system. If, however, the email had said you can click on the link to lower your monthly alarm payments, you might ignore the request.
Or a scammer might send a message to your work email, claiming that there is a problem with an account at one of your corporate suppliers, and warning that your shipment—one that your boss is counting on—will be delayed unless you verify your account information through a link they provide. The link leads to a counterfeit website that looks like the real thing. By playing on your fear of losing access to your account, the scammer gets your credentials.
Authority bias

As humans, we inherently trust figures with power. It makes sense: How could we get anything done if we doubted everybody equally? But hackers know that if we get an email from a trusted source, we let down our guard.
“The epitome of the exploitation of this bias is ‘business email compromise,’ ” says Kevin Haley, director at Symantec, a division of Broadcom Inc., a global semiconductor and software business. In a typical scam, criminals send an email message that looks like it comes from a known authority figure who is making a recognizable request.
It could be what looks like the boss sending an in-house email, asking you to send a current payment to a new bank-account number. Since you think you’ve seen that email address before, and your job is to fulfill your boss’s requests, you might do what is asked of you. What you didn’t notice is that the email address is slightly different from the real address. Worse, Mr. Haley adds, “the more sophisticated attackers will phish the real email account of the boss.”
According to the FBI, such “business email compromise” scams resulted in about $1.8 billion in reported losses last year.
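The “slightly different” sender addresses described above can often be caught mechanically, by comparing the sender’s domain against the domains a company actually uses. A minimal sketch in Python’s standard library (the domain list and similarity threshold here are illustrative assumptions, not anything from the article):

```python
import difflib

# Hypothetical allow-list of domains this company legitimately uses.
KNOWN_DOMAINS = {"example-corp.com", "example-bank.com"}

def looks_like_spoof(sender: str, threshold: float = 0.85) -> bool:
    """Flag a sender domain that is suspiciously similar to, but not
    exactly, a known domain -- e.g. 'examp1e-corp.com' with a digit 1."""
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain in KNOWN_DOMAINS:
        return False  # exact match: a genuine internal/partner address
    return any(
        difflib.SequenceMatcher(None, domain, known).ratio() >= threshold
        for known in KNOWN_DOMAINS
    )

print(looks_like_spoof("boss@example-corp.com"))   # → False
print(looks_like_spoof("boss@examp1e-corp.com"))   # → True
```

Real mail gateways do something similar with more robust edit-distance and homoglyph checks; the point is that a machine, unlike a rushed employee, never stops noticing the one changed character.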
Urgency

We all tend to put the most urgent tasks first, and that sense of urgency means we may not be as thoughtful as usual.
“If it’s the boss reaching out directly to you, asking you to do something quickly, you jump into action,” says Mr. Haley. “You don’t even realize that the email address isn’t exactly correct, because you are so concerned with pleasing the boss. That is perfect social engineering.”
Mr. Haley offers an example of how a scammer might prime a victim with a few emails that appear to come from the CEO, thus elevating the urgency (as well as triggering the authority bias) and overriding the voice telling you to follow the normal procedures.
The first email may ask: “Are you at your desk?” You wonder: Why is the CEO asking? Did I do something?
The second email asks: “I just sent you a wire request. Did you not get it?” Oh no, I’m going to screw up if I don’t fulfill that request.
The third email says, “I will have the invoice sent as soon as I can access my computer. Email me the wire transfer when complete.” This is so urgent that the boss wants me to ignore protocol. I better act now.
Alana Maurushat, professor of cybersecurity and behavior at Western Sydney University in Australia, says emotions contribute to a sense of urgency. If it appears that the email from your boss is about an irate vendor, the anxiety level goes up. “The more emotions a cybercriminal can bring into context, the more likely someone is to play along,” she says. “When emotions are triggered, human brains go into a different mode.”
The halo effect

We all have positive views of brands, companies and people we like, and bad actors can take advantage of that. If an invitation to join an elite club or speak at an exclusive conference lands in your inbox, especially if the email appears to come from an organization you admire, there’s a good chance you’ll click on a link to sign up, and perhaps provide too much personal or company information.
Rod Simmons, vice president of product strategy at Omada, an identity governance and administration company, offers another example: A scammer impersonating your work credit-card company sends an email saying it has detected fraudulent use on your card, and it directs you to click on a link to verify the most recent transactions. Since you have had favorable experiences with the company, you may not question whether the request is legitimate. You click through and eventually are asked to log into your account.
Present bias

It’s natural to choose smaller, immediate wins over bigger, future rewards. It’s all about instant gratification.
“People care about the present self more than the future self,” says Ms. Shortridge.
Imagine, she says, it’s the last week of the quarter and a team is about to land a big new customer. If the sales leader receives an email attachment that seems related to the deal, she will give priority to her immediate preference of closing the deal, rather than giving equal weight to the concern about avoiding a data breach in the future—a breach that seems unlikely to happen, since most of the emails sales leaders receive with links and attachments are legitimate rather than malicious.
Recency bias

We tend to make judgments based on whatever we’ve most recently experienced. If we haven’t seen it before, our alarm bells don’t go off.
Patrick Murray, chief product officer at Tugboat Logic, a security compliance management platform, says this means that scammers may have more luck with novel social-engineering attacks that employees at a particular company haven’t seen before. Maybe employees have been trained to spot typical email-phishing attacks. But then they receive a call from their seemingly legitimate IT help desk alerting them to some issue. The scammer may ask the victim for their login credentials, potentially handing over the keys to the entire kingdom.
Illusion of unique invulnerability
Sometimes referred to as the “optimism bias,” this occurs when people think a bad thing is very unlikely to happen to them—so they give their credentials to a “colleague” who is actually a scammer.
“There is an adage that there are two types of people: those who have been hacked and know it, and those who have been hacked and don’t know it,” says Mr. Murray.
Prof. Maurushat says the training manuals given to cybercriminals “actually say that the target is a white male over 40.” They target these men, she says, “because they think they could never be scammed.”
Overcoming biases isn’t easy, because these shortcuts are baked into human thinking. But cybersecurity experts say training—especially gamified exercises where potential targets get to respond to attacks that feel real—can help.
There also are technical solutions to counter the effect of biases, such as multifactor authentication, password managers, and changing communications channels if something seems fishy.
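The multifactor authentication mentioned above usually means a second factor such as a time-based one-time password from an authenticator app. As a sketch of how that second factor works, here is a minimal RFC 6238 TOTP computation in Python’s standard library (the secret below is the RFC’s published test key, not anything from the article):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password (SHA-1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((at if at is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: 20-byte ASCII key "12345678901234567890",
# time 59 seconds, 8 digits.
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, at=59, digits=8))  # → 94287082
```

Because the code changes every 30 seconds and is derived from a secret the phisher doesn’t hold, a stolen password alone no longer opens the account—which is exactly why this works as a backstop for the biases described above.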
But potentially the most effective solutions are nontechnical and long term. One is getting employees to slow down when speed isn’t crucial. Ms. Shortridge says that building a culture without unnecessary deadlines, one that doesn’t incentivize constant urgency and allows employees to catch their breath “so they can trigger the philosophy brain, not the lizard brain,” would diminish successful attacks.
Another is to build a company culture that rewards good behavior, like reporting suspicious activity and questioning unusual requests. This, says Mr. Simmons, takes buy-in from the top. “If the CEO doesn’t care about training her employees to be better defenders,” he says, “why should anyone else?”