April 9, 2021 at 1:00 PM
“When dealing with people, remember you are not dealing with creatures of logic, but with creatures bristling with prejudice and motivated by pride and vanity” – Dale Carnegie
In this quote, American writer and lecturer Dale Carnegie touches upon the topic of cognitive bias. A cognitive bias is a strong, preconceived notion of someone or something, based on information we have, perceive to have, or lack. These preconceptions are mental shortcuts the human brain produces to expedite information processing; they help the brain quickly make sense of what it is seeing. Some of these biases relate to memory and the way you remember an event. Cognitive biases can often lead to flawed and irrational judgments, as an individual's construction of reality, not the objective input, may dictate their behavior.
At this point, you may be asking yourself, “What does any of this have to do with IT security?” Cognitive biases play a large role in social engineering, whether it be phishing, vishing, smishing, or physical on-site social engineering. This is most commonly exploited through a form of cognitive bias known as the halo effect (sometimes called halo error). The halo effect is the tendency for positive impressions of a person, company, brand, or product in one area to positively influence one's opinions or feelings in other areas. Inversely, the horn effect is the tendency for negative impressions of a person, company, brand, or product in one area to negatively influence one's opinions or feelings in other areas. A good example of the halo effect is an unhealthy food that contains no fat and is marketed as “fat free.” An individual may look at the packaging and judge the item to be healthy, based on the bias that a fat-free food must be good for you, even when that is not the case.
The same goes for the tactics social engineers use. Look at some of their strategies: sending you an email claiming you have won a giveaway from a well-known brand, showing up to your office in a company uniform to gain access, or sending you a text message saying your stimulus check was deposited in your bank account. Social engineers are hoping that your cognitive bias kicks in and that you look only at the positive clues they have laid out, overlooking the red flags: the email came from a fraudulent address (email@example.com), the person dressed as an employee had an access badge that wouldn’t scan, the text message claiming to be from the IRS directed you to a non-.gov website.
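Some of these red flags can even be checked mechanically. The sketch below is a minimal illustration, not a production filter: the allowlist of trusted domains and the function names are hypothetical, and a real mail gateway would rely on verified signals such as SPF and DMARC rather than string comparisons alone. It simply encodes two of the red flags described above: a sender domain that does not belong to the claimed brand, and a message claiming to be from the IRS while linking to a non-.gov site.

```python
from urllib.parse import urlparse

# Hypothetical allowlist for illustration only; a real system would use
# authenticated sender-domain data, not a hard-coded set.
TRUSTED_SENDER_DOMAINS = {"netflix.com", "irs.gov"}


def sender_domain(address: str) -> str:
    """Extract the domain portion of an email address."""
    return address.rsplit("@", 1)[-1].lower()


def find_red_flags(sender: str, links: list[str], claims_irs: bool) -> list[str]:
    """Return a list of simple phishing red flags found in a message."""
    flags = []
    if sender_domain(sender) not in TRUSTED_SENDER_DOMAINS:
        flags.append(f"sender domain '{sender_domain(sender)}' is not a known brand domain")
    for url in links:
        host = urlparse(url).hostname or ""
        # A message claiming to come from the IRS should only link to .gov sites.
        if claims_irs and not host.endswith(".gov"):
            flags.append(f"link host '{host}' is not a .gov site despite the IRS claim")
    return flags
```

Run against the stimulus-check example above, a lookalike sender and a non-.gov link each produce a flag, while a message from a genuine brand domain with no suspicious links produces none. The point is not that such checks replace judgment, but that they mirror the deliberate, System 2 inspection a recipient should perform.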
In Daniel Kahneman’s book Thinking, Fast and Slow, the writer explains the two different decision-making processes that occur in our brains, known as System 1 and System 2. System 1 is the brain’s automatic, intuitive, and unconscious thinking mode. It requires little energy or attention, but it is prone to bias. System 1 is more influential and guides our day-to-day decisions. System 2 is a slow, controlled, and analytical mode of thinking in which reason dominates. Unlike System 1, System 2 requires energy and attention to think through all the choices; it filters out automatic instincts and cognitive biases to make the best choice.
Social engineers also play upon these two systems to carry out their exploits (whether they know it or not). For example, you receive the fraudulent email we previously mentioned claiming that you have won a giveaway from a well-known brand. The sender’s email address is firstname.lastname@example.org and the email has been crafted to look exactly like one of Netflix’s actual emails. The social engineer is hoping your brain will make a System 1 decision and you will unconsciously click the link, rather than examining the email, noticing the red flags, and making a calculated decision.
By understanding the two systems, we can better recognize which situations produce cognitively biased decisions, and by understanding the halo effect, we can better understand why we have cognitive biases in the first place. Cognitive bias creates blind spots in our decision-making processes that reduce accuracy and can result in irrational conclusions.
So, what can an organization do to mitigate the risk of a social engineer (hacker, cyber-criminal, whatever you want to call them) exploiting the cognitive biases of your staff to breach your environment and compromise your data? Security awareness training is a great starting point. Train your staff to better understand cognitive biases, recognize threats, and respond accordingly. Training should be an annual process (at least), not a one-and-done event. Conducting social engineering assessments (phishing, vishing, smishing, onsite) takes it even further, putting your staff in simulated real-world threat situations where they will be forced to put their training to use. Compass IT Compliance has spent the past decade carrying out these trainings and assessments for our clients, providing them with valuable insights into the strengths and gaps within their IT security programs. Contact us today to learn more and discuss your unique situation!