Cognitive biases are mental shortcuts that our brains use to make decision-making more efficient, but they can also lead to errors in judgment. These biases often arise because our brains are trying to process large amounts of information quickly, using patterns and assumptions to simplify complex situations. While these mental shortcuts can be helpful in everyday life, they can also make us more vulnerable to manipulation, especially in phishing attacks and other social engineering tactics.
Cognitive biases exist because our brains are wired to conserve cognitive energy and make quick decisions. However, this speed sometimes comes at the cost of accuracy. In stressful or high-pressure situations, like receiving an alarming email or phone call, these biases can make us more likely to fall for scams because we rely on instinct rather than careful analysis.
Cognitive biases can be grouped into four main categories:
Here are 10 common cognitive biases that phishing attackers exploit:
See our Cognitive Bias Index for more complete information on cognitive biases.
Cognitive biases are integral to how we process information and make decisions, often allowing us to act quickly and efficiently. While these mental shortcuts can lead to poor judgment or leave us open to manipulation, they can also be harnessed for good. In fact, “social engineering for good” uses the same cognitive biases that attackers exploit to help individuals and organizations make better decisions, especially in high-stakes environments like cybersecurity.
For example, the availability heuristic causes people to focus on recent or easily recalled events. Cybersecurity training programs can use this to their advantage by frequently reminding employees about the dangers of phishing attacks, keeping the threat top of mind and making individuals more vigilant when suspicious emails appear.
Similarly, confirmation bias, which leads people to favor information that supports their beliefs, can be leveraged to reinforce positive behaviors. Regularly exposing employees to stories and examples of strong security practices can make them more likely to seek out and believe in the importance of safe online habits.
Anchoring bias can also be used defensively. By initially framing cybersecurity policies or instructions as critical, businesses can ensure that employees view these guidelines as non-negotiable. This establishes a solid foundation where security is a priority, and any deviation from it feels like a significant departure.
Negativity bias, which makes us focus more on negative outcomes, can be harnessed to drive better behavior as well. For example, cybersecurity training that emphasizes the severe consequences of a data breach—loss of sensitive information, financial penalties, reputational damage—can motivate individuals to be more cautious with their online actions, knowing what’s at stake.
In some cases, social proof, a bias where people look to others for behavioral cues, can also be an ally. Creating a culture of security, where employees see their peers adopting good practices like using strong passwords or reporting phishing emails, can encourage widespread adoption of secure behaviors. If everyone around you is practicing good security hygiene, you’re more likely to follow suit.
By understanding these biases and deliberately designing security programs that align with how people naturally think, organizations can build a stronger defense against cyber threats. Cognitive biases, often seen as vulnerabilities, can become powerful tools for fostering good decision-making and preventing attacks.
In a way, social engineering for good takes the tactics of attackers and turns them around to protect individuals. Rather than manipulating people to fall for scams, these tactics are used to empower them to stay safe online, reinforcing positive behaviors and making the right security decisions second nature.
Cybercriminals know that emotions are a powerful tool for manipulation, and they weaponize them to exploit vulnerabilities in their targets. Phishing attacks, scams, and social engineering tactics often rely on triggering emotional responses—fear, urgency, curiosity, or even empathy—to bypass rational thinking and provoke impulsive decisions.
For example, fear is commonly used to drive action. A phishing email might claim that your account has been compromised or that you’re at risk of losing access unless you act immediately. By inducing panic, attackers can lower your defenses and make you more likely to click on a malicious link or provide personal information without verifying the source.
Urgency is another emotion frequently exploited. Messages designed to create a sense of limited time—such as warnings about missed payments or expiring benefits—push you into making snap decisions. When feeling pressured, it’s easy to overlook potential red flags or forget to double-check the authenticity of the message.
Attackers also play on empathy and trust. Business Email Compromise (BEC) scams, for instance, often involve impersonating someone the target knows and trusts, like a colleague or executive, and requesting urgent help. The natural desire to assist someone in need can override skepticism, leading to dangerous actions like transferring money or sharing sensitive information.
In many cases, attackers combine several emotional triggers to create a potent mix of fear, urgency, and trust. This emotional manipulation bypasses logical decision-making, making victims act before they’ve had time to fully consider the consequences.
By understanding how emotions can be weaponized against you, it becomes easier to recognize and resist these tactics, making you less vulnerable to manipulation and better equipped to protect yourself from phishing and other cyber threats.
Case Studies: Phishing and Psychological Manipulation
Phishing scams often trick people by playing with their emotions and instincts. These scams can be very clever, making it hard to see through them. This chapter looks at real-life examples to show how phishing works and what we can learn from these attacks.
Case Study 1: The Ubiquiti Networks Attack
Background: In 2015, Ubiquiti Networks, a tech company, was tricked by a phishing attack.
Incident: Criminals sent fake emails impersonating the company’s senior executives. These emails were very believable and asked the finance team to send large sums of money to foreign bank accounts for urgent and secret reasons. The finance team followed the instructions and sent about $46.7 million before realizing it was a scam.
Psychological Tricks:
- Authority: the emails appeared to come from top executives, so the finance team complied without questioning the requests.
- Urgency and secrecy: framing the transfers as urgent and confidential discouraged the team from verifying them through normal channels.
Outcome and Lessons: Ubiquiti recovered about $8.1 million, but the incident taught the company to strengthen email security and to double-check important requests.
Case Study 2: The FACC CEO Fraud
Background: In 2016, FACC, an Austrian aerospace company, was hit by a phishing scam targeting its finance team.
Incident: Scammers pretended to be the CEO and asked an employee to transfer €50 million to a foreign bank account. The email was convincing, so the employee didn’t check with the actual CEO and sent the money.
Psychological Tricks:
- Authority: the request appeared to come directly from the CEO, so the employee felt obligated to comply.
- Trust: the email’s convincing tone made checking with the real CEO seem unnecessary.
Outcome and Lessons: Only about €10 million was recovered, and both the CEO and CFO were fired. The company learned the importance of having checks in place for large money transfers.
Case Study 3: The Target Data Breach
Background: In 2013, Target’s customer data was stolen in a breach that began with a phishing email to one of its vendors.
Incident: Scammers sent an email to employees at Fazio Mechanical, a company that worked with Target. The email carried a malicious attachment that, when opened, installed malware. This malware spread to Target’s network and allowed thieves to steal credit card information from millions of customers.
Psychological Tricks:
- Trust: the email looked like routine business correspondence, so the recipient opened the attachment without suspicion.
- Familiarity: targeting a trusted vendor gave the attackers a believable route into Target’s network.
Outcome and Lessons: Target faced costs of over $200 million. This incident highlighted the need for strong security measures with third-party vendors and caution with email attachments.
Case Study 4: The Sony Pictures Hack
Background: In 2014, Sony Pictures was hacked after a phishing email was sent to a high-level employee.
Incident: Hackers, believed to be from North Korea, sent an email with a malicious link to a Sony executive. When the link was clicked, malware was installed, which stole a large amount of sensitive data, including unreleased movies and private employee information.
Psychological Tricks:
- Trust: the email appeared legitimate to its high-level target, lowering suspicion.
- Curiosity: an unexpected link invites a quick click before the risk is considered.
Outcome and Lessons: The hack caused severe damage to Sony, affecting its reputation and finances. This case showed the importance of educating employees about phishing and being careful with any unexpected emails or links.
Conclusion: These cases show how phishing relies on fooling people through trust, urgency, and other psychological tricks. Understanding these tactics can help individuals and organizations spot and stop phishing attempts. Training, awareness, and careful checking are key to staying safe from phishing.