Cybersecurity 101 Training: How to build employee habits that prevent cyberattacks
Since the pandemic began, cyberattacks have been on the rise as hackers exploit vulnerabilities in employees’ work-from-home practices. Rather than attacking large organizations or governments, which tend to have large security teams and protective software, hackers have shifted their tactics to target weaknesses in employees’ privacy and security practices.1
Part of the reason for this increase is employee negligence, which has become a more significant problem since the pandemic began: more than half of attacks on organizations in 2021 were a direct result of employee negligence.2
Address Cybersecurity On Day 1
The onboarding process is a pivotal time to set expectations and build a robust cybersecurity foundation. It is also an opportunity to strengthen an employee’s sense of belonging, which greatly reduces the likelihood of careless behavior.
Prepare against negligence with knowledge-building
There is much at stake when it comes to properly training employees: on average, attacks caused by employee or contractor negligence cost companies $484,931.2 While there are effective ways to change behavior once an employee has finished onboarding and joined the company (more on that here), the focus of this article is on the importance of creating resilience from the start.
Research findings suggest that the most effective way to prevent cyberattacks is to create a holistic training program that builds employees’ knowledge of common hacking methods.3 Onboarding provides ample opportunity to improve awareness of cyber risks, given that every new employee must complete the same training protocols.
Form habits from the beginning
Another challenge that arises when assigning cybersecurity training to employees who are already onboarded is that it competes with their existing habits and work obligations.4 Similarly, asking them to complete it after work hours may leave some staff unable to attend, or invite inattentiveness.5
According to one corporate training professional, fewer than 20% of employees change their habits after coaching.6
Behavioral science offers tools for reshaping employee cybersecurity habits and reflexes, and you can find more on that here. Focusing on training at the very beginning of someone’s tenure at a company avoids many of these pitfalls.
Promote belonging to increase security initiative
How well an employee protects information and follows cybersecurity best practices depends in large part on how engaged they feel with the organization. Research suggests that committed employees make choices that benefit the organization and go beyond the minimum cybersecurity recommendations because they want to positively affect organizational outcomes.7 Onboarding is a key moment to build that commitment.8
When a company fosters engagement during onboarding by treating new employees as partners in the organization and boosting their confidence, those employees assimilate more fully and experience less stress, both key factors when thinking about cyber risk.8 More broadly, employees who assimilate successfully report greater job satisfaction, higher retention, and increased productivity, while those who are onboarded poorly show higher turnover, decreased customer satisfaction, and reduced productivity.
Take Advantage of Behavioral Tendencies to Promote Belonging
One reason cyberattacks that exploit our basic nature, such as our trust or curiosity, succeed is that individuals often lack the motivation to undergo continued training.9 What’s more, an estimated 85% of cyberattacks across the globe succeed not because bad actors hack code, but because they hack our core human behaviors.10, 11
Research shows that behavior change rarely happens just because we’ve been exposed to a particular message; that message also has to be delivered at the right moment, one where we have the time, energy, and motivation to properly absorb and reflect on it. This idea is captured by the elaboration likelihood model, which suggests that how easily we are persuaded on a topic depends on how invested we are in it.12 Onboarding is an ideal opportunity to capitalize on a new employee’s excitement for the position and their desire to build a positive reputation within the company. New hires are also not yet bogged down by other work, so they are more likely to have the resources to deeply process the information they receive.
While only you understand the cultural nuances and cybersecurity needs of your organization, there are a few empirically supported suggestions you can consider when discussing changes to onboarding procedures with colleagues.
1. Reinforce cybersecurity expectations from the start
How to do it: Create a high-trust environment from an employee’s first day by clearly defining their role within the organization.13 This helps employees understand the context in which they will work and gives them an idea of whom to turn to if they receive a phishing email or are hacked. During onboarding, walk the employee through expectations, not just job-related but security-related, too. Make it clear that they were chosen for a reason and why they belong at the organization.
Why it works: When trying to reinforce cybersecurity expectations among more tenured employees, leadership has to contend with the role of habits in their employees’ daily lives. Habits can be extremely hard to break,6 even when employees know better.14 By contrast, new employees are more of a blank slate: they are not yet influenced by the social norms that shape how people act (and maintain cybersecurity) within a group.15
The impact: When employees understand how important they and their cyber practices are to the organization, they place a higher value on their privacy choices, which helps prevent the kind of negligent behavior hackers exploit.16 When they feel valued by their team and employer, they are also more likely to report suspicious activity and go beyond minimum cybersecurity recommendations.7
2. Practice effective communication about risk
How to do it: Firms run into various issues when communicating risk. They might communicate too frequently (leading to notification fatigue and employees who tune out the messaging) or not frequently enough (leading to employee perceptions of ineptitude, or the belief that the firm values its own reputation over employee protection).17
One effective communication strategy draws on the Extended Parallel Process Model (EPPM), in which threat messages are balanced with messages about self-efficacy and empowerment.18 Be mindful when presenting statistics, too: while numbers may attract the attention of some employees, others with lower numeracy may dismiss the message.19
Why it works: By pairing fear-inducing text with behavioral recommendations, CISOs and managers can encourage their teams to take steps to protect themselves rather than simply panicking or ignoring the message.18 Specific recommendations also empower employees to help themselves and to act when something goes wrong.18
The impact: Employees receive security communications both during onboarding and throughout their tenure, with varying levels of effectiveness. Poorly communicated messages can lead employees to adopt a careless attitude toward security.19 Through strategic wording and timing, leadership can encourage the behaviors they want to see and ensure better cybersecurity practices among employees.
Even though Human Resources professionals typically run onboarding sessions, an even better approach is to work closely with your company’s Behavioral Insights team and capitalize on their familiarity with the behavioral tendencies outlined in this article. If you’re looking to launch a Behavioral Insights team within your organization but aren’t sure where to start, get in touch with us: we’ve helped some of the world’s largest companies introduce behavioral insights into their work by building “Nudge Units” tailored to their unique culture and goals.
Timing your cybersecurity training for maximum impact
Behavior change isn’t just about what you say to people; it’s also about when you have that conversation. By focusing on initial training, organizations can greatly reduce the likelihood of negligence-related cyberattacks, with the additional benefit of giving employees a greater sense of belonging and responsibility.
There is a broad range of behavioral pitfalls that we encounter in cybersecurity preparedness. In their webinar, Strengthen Your Strategy with Cyber Scenarios, The Decision Lab and Boston Consulting Group discuss two hypothetical scenarios that highlight different levels of readiness and the psychology behind each.
The Decision Lab is a research-oriented consultancy that uses behavioral science to advance social good. We work with some of the largest organizations in the world to spark change and tackle tough societal problems. We’re advised by some of the most innovative minds in cybersecurity, a growing issue in the new WFH landscape. If you'd like to tackle this together, contact us.
References
- Okereafor, K., & Adelaiye, O. (2020). Randomized Cyber Attack Simulation Model: A Cybersecurity Mitigation Proposal for Post COVID-19 Digital Era. 05, 61–72.
- 2022 Cost of Insider Threats Global Report. (2022). Proofpoint. https://www.proofpoint.com/us/resources/threat-reports/cost-of-insider-threats
- Greitzer, F. L., Strozer, J. R., Cohen, S., Moore, A. P., Mundie, D., & Cowley, J. (2014). Analysis of Unintentional Insider Threats Deriving from Social Engineering Exploits. 2014 IEEE Security and Privacy Workshops, 236–250. https://doi.org/10.1109/SPW.2014.39
- Conteh, N., & Schmick, P. (2016). Cybersecurity: risks, vulnerabilities and countermeasures to prevent social engineering attacks. International Journal of Advanced Computer Research, 6, 31–38. https://doi.org/10.19101/IJACR.2016.623006
- Aldawood, H., & Skinner, G. (2019). Reviewing Cyber Security Social Engineering Training and Awareness Programs—Pitfalls and Ongoing Issues. Future Internet, 11(3), 73. https://doi.org/10.3390/fi11030073
- Yakowicz, W. (2015, February 17). 3 Mistakes You’re Making When Coaching Employees. Inc.com. https://www.inc.com/will-yakowicz/3-mistakes-you-make-coaching-employees.html
- Blau, A., Alhadeff, A., Stern, M., Stinson, S., & Wright, J. (2017). Deep Thought: A Cybersecurity Story. ideas42. https://www.ideas42.org/wp-content/uploads/2016/08/Deep-Thought-A-Cybersecurity-Story.pdf
- Caldwell, C., & Peters, R. (2018). New employee onboarding – psychological contracts and ethical perspectives. Journal of Management Development, 37(1), 27–39. https://doi.org/10.1108/JMD-10-2016-0202
- Mann, I. (2017). Hacking the Human: Social Engineering Techniques and Security Countermeasures. Routledge. https://doi.org/10.4324/9781351156882
- Verizon 2021 Data Breach Investigations Report. (2021). Verizon. verizon.com/dbir
- Iny, A., Khanna, S., Coden, M., & Struck, B. (2021). Strengthen Your Strategy with Cyber Scenarios. Boston Consulting Group & The Decision Lab. https://app.hubspot.com/documents/3834397/view/233481126?accessId=f10950
- Hopper, E. (2019, July 3). What Is the Elaboration Likelihood Model in Psychology? ThoughtCo. https://www.thoughtco.com/elaboration-likelihood-model-4686036
- Leana, C. R., & van Buren, H. J. (1999). Organizational Social Capital and Employment Practices. The Academy of Management Review, 24(3), 538–555. https://doi.org/10.2307/259141
- Gundu, T. (2019, May 13). Acknowledging and Reducing the Knowing and Doing Gap in Employee Cybersecurity Compliance.
- Kelman, H. C. (2006). Interests, Relationships, Identities: Three Central Issues for Individuals and Groups in Negotiating Their Social Environment. Annual Review of Psychology, 57(1), 1–26. https://doi.org/10.1146/annurev.psych.57.102904.190156
- Wiederhold, B. (2014). The Role of Psychology in Enhancing Cybersecurity. Cyberpsychology, Behavior and Social Networking, 17, 131–132. https://doi.org/10.1089/cyber.2014.1502
- Cleaveland, A., Newman, J. C., & Weber, S. (2020, September 24). The Art of Communicating Risk. Harvard Business Review. https://hbr.org/2020/09/the-art-of-communicating-risk
- Zhang, X. A., & Borden, J. (2020). How to communicate cyber-risk? An examination of behavioral recommendations in cybersecurity crises. Journal of Risk Research, 23(10), 1336–1352. https://doi.org/10.1080/13669877.2019.1646315
- Nurse, J. (2013, January 1). Effective Communication of Cyber Security Risks. https://www.researchgate.net/publication/274663654_Effective_Communication_of_Cyber_Security_Risks
About the Authors
Lindsey Turk
Lindsey Turk is a Summer Content Associate at The Decision Lab. She holds a Master of Professional Studies in Applied Economics and Management from Cornell University and a Bachelor of Arts in Psychology from Boston University. Over the last few years, she’s gained experience in customer service, consulting, research, and communications in various industries. Before The Decision Lab, Lindsey served as a consultant to the US Department of State, working with its international HIV initiative, PEPFAR. Through Cornell, she also worked with a health food company in Kenya to improve access to clean foods and cites this opportunity as what cemented her interest in using behavioral science for good.
Dr. Brooke Struck
Dr. Brooke Struck is the Research Director at The Decision Lab. He is an internationally recognized voice in applied behavioural science, representing TDL’s work in outlets such as Forbes, Vox, Huffington Post and Bloomberg, as well as Canadian venues such as the Globe & Mail, CBC and Global Media. Dr. Struck hosts TDL’s podcast “The Decision Corner” and speaks regularly to practicing professionals in industries from finance to health & wellbeing to tech & AI.
Dan Pilat
Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention, a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence, lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.