This article will explain why what we call ‘The Compliance Mindset’ does not lead to better organizational security.
Why do compliant companies get breached? Because compliance doesn’t always equal security, especially when it comes to your employees and how they behave.
Pure compliance increases cyber risk
Employees with a compliance mindset, who grudgingly do what they are told, have been found to pose a significantly higher risk to their organizations. This is partly because compliance directives are often generic statements that many employees feel aren’t relevant to their specific organization, which often leads to ‘bare minimum’ effort that is unsustainable in the current threat landscape.
‘Internalized’ security – i.e. getting employees to ‘care’ about the security of their organization and their crucial role in it – is key. But what does this mean, and how do we achieve this in practice?
Getting employees to care about the security of their organization
When security is internalized, employees feel that the benefits of security outweigh the costs, that secure behavior is ‘the done thing’ for people in their position, and that their job and organization are important enough to merit protection. Internalization leads to security even when no one is watching.
However, getting employees to internalize – or ‘care’ – about security is clearly not a simple task.
At phishd, our approach to the psychology of attitude change is grounded in science and theory, distilled into a system that specifies exactly what we need to do to change attitudes. We identified three essential frameworks – Protection Motivation Theory, the Theory of Planned Behavior, and Levels of Conformity – to determine what truly motivates good security behavior. From these, we have determined that:
- Raising threat perceptions is only half of the equation – people must also feel able to take action, which psychologists term ‘self-efficacy’;
- Social perceptions are essential: we observe others and actively adapt our behaviors to fit in;
- Genuine (‘intrinsic’) motivation is much more powerful than the fear of punishment, and more achievable.
Out of these theories emerge eight core components that need to be addressed in order to change security attitudes. These are our eight ‘Behavioral Determinants’ - factors that fundamentally determine attitudes to security:
- Perceived Severity – our feelings and perceptions about the magnitude of the threat
- Perceived Vulnerability – our feelings and perceptions about the likelihood of the problem affecting us
- Self-Efficacy – the belief that we can do something about the problem
- Response Efficacy – the belief that the recommended response can do something about the problem
- Response Cost – our beliefs about the extent to which security costs us
- Personal Responsibility – the extent to which we feel personally responsible for security
- Value Alignment – the extent to which secure behavior is aligned with our personal values
- Social Perceptions – the way in which we consciously understand relevant social norms
Five common cultural problems
We have applied these behavioral determinants to identify five common cultural problems that affect employee security behavior:
Example 1: Security/Productivity Trade-Off
Employees ignore security policies because they feel these policies hinder their ability to do their job
- This creates a culture that sees security as a barrier to success
- Behavioral Determinants: Response Cost, Social Perceptions
Example 2: Ambiguous Responsibilities
An employee receives what may be a sophisticated phishing email, but doesn’t report it because they feel it isn’t part of their job
- This creates a culture where security is seen as something that only IT needs to worry about
- Behavioral Determinants: Personal Responsibility, Beliefs
Example 3: ‘Distant’ Threat Perception
People feel like the threat is far away (‘distal’) as opposed to a pressing concern (‘proximal’)
- This creates a culture that doesn’t really believe there’s a problem, even when they’re told there is
- Behavioral Determinants: Perceived Vulnerability, Perceived Severity
Example 4: The Ostrich Effect
Employees fear they may have downloaded a malicious payload, but ignore it because they don’t feel empowered to react
- This creates a culture where employees ‘put their head in the sand’ because they’re fearful and uncertain of what to do.
- Behavioral Determinants: Perceived Severity and Vulnerability – High, Self-Efficacy and Response Efficacy – Low
Example 5: Moral Disengagement
People feel morally justified in ignoring security concerns, because – for example – awareness training fails to treat them as adults
- This creates a culture where employees feel that it’s justifiable for them to be disengaged from organizational security, often because it’s too hard or too alienating.
- Behavioral Determinants: Social Perceptions, Response Cost, Value Alignment
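As an illustrative sketch (the names are copied directly from the five examples above; this is not any phishd tooling), the problem-to-determinant relationships could be encoded as a small lookup table, making it easy to ask which cultural problems a given determinant contributes to:

```python
# Hypothetical encoding of the five cultural problems and their
# associated Behavioral Determinants, as listed in the article.
CULTURAL_PROBLEMS = {
    "Security/Productivity Trade-Off": ["Response Cost", "Social Perceptions"],
    "Ambiguous Responsibilities": ["Personal Responsibility", "Beliefs"],
    "'Distant' Threat Perception": ["Perceived Vulnerability", "Perceived Severity"],
    "The Ostrich Effect": ["Perceived Severity", "Perceived Vulnerability",
                           "Self-Efficacy", "Response Efficacy"],
    "Moral Disengagement": ["Social Perceptions", "Response Cost", "Value Alignment"],
}

def problems_involving(determinant: str) -> list[str]:
    """Return the cultural problems in which a determinant plays a role."""
    return [name for name, dets in CULTURAL_PROBLEMS.items()
            if determinant in dets]

print(problems_involving("Response Cost"))
# → ['Security/Productivity Trade-Off', 'Moral Disengagement']
```

A table like this could, for instance, help an awareness program prioritize interventions: a determinant that recurs across several cultural problems (such as Response Cost) is a natural first target.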
Why science trumps received ‘wisdom’
Many security awareness programs don’t take an explicitly scientific approach. In the absence of this understanding, organizations often grope in the dark for ways to change attitudes and improve the security behavior of employees – and may even be inadvertently counterproductive. For example, a common pitfall is to raise employees’ fear levels as high as possible without considering their efficacy beliefs, leaving employees scared, disempowered, and ultimately avoidant of the problem.
A rigorously scientific approach will always beat one based solely on received wisdom.

The above is an extract from our upcoming whitepaper, Hacking the Mind, which will be distributed on 18 July 2018 at our breakfast briefing in the ‘Gherkin’. To find out more, or to register for the event and the whitepaper, please register your interest.