Humans are creatures of habit. Whether we’re picking an afternoon snack or a website password, we tend toward familiar patterns, even when we know we could make better choices. Trading that doughnut for an apple can be tough, and reusing that same password over and over, security aside, can be all too easy. For businesses, that natural human preference for ease and familiarity can lead to major cybersecurity vulnerabilities, from social engineering to information security flaws and more.
Can organizations help reduce cybersecurity errors by influencing employee behavior? Yes, quite possibly — but first, we need to understand the role human psychology plays in our cybersecurity mistakes.
What’s driving our cybersecurity behaviors?
Pioneering social scientist Kurt Lewin mapped out in his force field analysis model how a variety of factors can change human behavior, broadly speaking, by accelerating it or slowing it down. According to Lewin, some forces in our lives restrain us from making a change, and other forces drive us toward it.
- Lewin’s behavior “brakes” can take the form of internal or external obstacles, hindrances, or resistance. Change aversion might slow down a behavior. So might limited resources or organizational constraints.
- His “accelerators” are catalysts that support movement toward a goal. Influential leadership might spur someone onward. So might feeling supported, having adequate resources, or receiving a reward.
In Lewin’s model, the goal is to identify both brakes and accelerators in a given situation, then devise strategies that reduce the impact of brakes and strengthen the influence of accelerators, ultimately increasing the likelihood of successful change.
While this may sound like a theory you’d come across in an article about how to keep New Year’s resolutions, it has a lot to do with cybersecurity as well.
How do we encourage safer behaviors?
Despite our understanding of cybersecurity — plus the flood of large data leaks in recent years — a single factor still contributes to the majority of breaches: human mistakes. In fact, a recent study found that human error was responsible for 74 percent of cybersecurity breaches.
So, why do we make these mistakes again and again? Because we tend to stick with familiar patterns of behavior, even when we know there’s a better way — even when we know there’s a reward for changing!
For businesses, the traditional approach to cybersecurity involves a steady drumbeat of email reminders and required training. Employees do their job to get paid, so isn’t getting paid enough of an incentive to complete the trainings and abide by the policies?
When employees use old passwords or click on phishing emails, don’t just look for accelerators such as proper training or financial incentives. Also watch for a brake that might be stopping them from changing their patterns of behavior. These brakes can take different forms for different individuals.
For example:
- Meeting-packed days and busy work schedules may push cybersecurity to the bottom of the to-do list.
- Secure systems might be slow or hard to use.
- The sheer number of secure passwords an employee needs to remember might be daunting.
- The company’s cybersecurity processes might not be as user-friendly as they could be, discouraging their adoption.
Release the brake.
So, how can IT security professionals “release the brake,” rather than just pressing the accelerator?
Instead of inundating employees with training or requiring increasingly complex processes, many companies are adopting streamlined user processes. For example, some organizations are staying more secure by going “passwordless,” which includes using one-time passcodes (OTPs) as a measure that releases the brake.
By removing barriers to compliance, rather than providing more tools and information, organizations are encouraging safer behaviors that are more likely to be adopted and adhered to.
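To make the idea of one-time passcodes concrete, here is a minimal, illustrative sketch of how a time-based one-time passcode (TOTP, in the spirit of RFC 6238) can be generated with nothing but Python’s standard library. The function name, shared secret, and parameters below are hypothetical placeholders rather than any particular vendor’s implementation; real deployments should rely on hardened, audited authentication libraries and secure secret storage.

```python
# Illustrative only: a time-based one-time passcode (TOTP) generator
# in the spirit of RFC 6238, using just Python's standard library.
import base64
import hashlib
import hmac
import struct
import time


def generate_totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Return the current passcode for a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval                # 30-second time step
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()    # HMAC-SHA1, as in RFC 4226
    offset = digest[-1] & 0x0F                            # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


if __name__ == "__main__":
    # Hypothetical demo secret; never hard-code real secrets in production.
    print(generate_totp("JBSWY3DPEHPK3PXP"))
```

Because the passcode changes every 30 seconds and never needs to be memorized, the “brake” of remembering yet another complex password simply disappears.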
Accelerators can drive cybersecurity.
If releasing the brake will move people toward better choices, then can incentives — accelerators — also play a positive role?
We say yes. As an example, we recall a conversation with a client in which the topic of employee training came up, but with a refreshing twist. Rather than treating employees like the weak link in cybersecurity, our client asked how to nudge them from basic compliance toward being “cybersecurity superheroes.”
This may be a challenge for any organization, but there’s research to back it up. Consider, for instance, Professor Robert Rosenthal’s work with self-fulfilling prophecies. As described in Rutger Bregman’s book, “Humankind: A Hopeful History”, Rosenthal carried out a study of students who believed they were taking an IQ test.1
After the “test,” researchers randomly labeled some students as “high potential” and identified them to their teachers. Subsequently, the teachers gave their high-potential students more attention and more praise, and the students’ self-image improved.
Here’s the interesting part: not only did the students’ real-life test scores improve, but the most dramatic improvements happened with children whose teachers previously had held low expectations of them. Researchers call this the Pygmalion effect, after the Greek myth.
Creating cybersecurity superheroes
If we apply the Pygmalion effect to workplace behaviors, it seems logical that employees who are held to high expectations and treated as capable would adopt safer online behaviors and a more positive attitude toward cybersecurity. Tell them that they can help stop these breaches, treat them as knowledgeable professionals, and we could expect to see improved performance.
While positive thinking is no silver bullet, we can start to change the narrative around cybersecurity and how to engage employees. A logical first step is to ensure company communications don’t make employees feel like they are the problem. Providing relevant, thoughtful training and the right tools is important, but creating an environment where employees are empowered to make positive change is also a key component. How much more effective could your employees be if they felt like cybersecurity superheroes?
To learn more about how Liberty Mutual helps businesses take on the cyber risks of today and tomorrow, visit our cyber liability page.
1. Bregman, Rutger. Humankind: A Hopeful History. New York, NY: Little, Brown and Company, 2020.