Why security is in denial about awareness

Geordie Stewart explains why refusal to acknowledge legitimate criticisms of information security awareness puts users at risk

Denial has two meanings. It can refer to the refuting of an allegation or assertion. It can also refer to a psychological defense mechanism where criticisms are rejected because they are uncomfortable, despite evidence to the contrary. How a professional group responds to criticism tells you a lot about their ability to evolve and improve.

The awareness practitioner response to criticism of security awareness has been fascinating. In Why you shouldn't train employees for security awareness, Immunity Inc.'s Dave Aitel outlines reasons why he thinks money spent on security awareness is money wasted. In response to that article, there have been rebuttals, such as Ira Winkler's Security awareness can be the most cost-effective security measure. There has also been an attempt to explain that bad security awareness techniques are all in the past. However, key points have been missed in the scramble to pick peripheral holes in awareness criticisms.

In his blog, Schneier on Security, Bruce Schneier states that security awareness is generally a waste of time. Since most organizations still treat awareness campaigns as a matter of locking people in a room for an hour and putting up a few posters, Schneier is probably right.


At the heart of this debate is a fundamental question: While many would agree that information security awareness techniques need to improve, are we talking about a few tweaks or a complete overhaul? If security awareness is really about changing behavior, why do its tools and processes look nothing like those of other, more mature industries that take behavioral change seriously?

Compared to other industries, the information security awareness approach to behavioral influence is an embarrassingly amateur affair. In fields such as public health and marketing, there are experts who have spent decades studying behavioral influence, testing their assumptions and making systematic improvements to their methods. The approach in these fields has led to a heavy emphasis on audience research. Why did you buy that particular product and not another? What thought processes were you following when you plugged that in? They go beyond the 'what' of behavior and seek to understand the 'why'. In contrast, information security professionals persist with the delusion that they can manage the what without understanding the why.

Many ways exist to systematically understand the why of an audience. Web designers commonly use personas. Safety risk communicators have mental models. Information security folk models have also been proposed. Ira Winkler was quick in his rebuttal to Schneier to dismiss folk models as 'unworkable' and 'not true'. The reality is that people have rules of thumb that they use to make decisions, such as: Is it growling and showing its teeth? Then I'm not going to pat it. Folk models are just a way of encapsulating these decision-making processes.

Generally, people's rules of thumb are adequate. When they go wrong, the information security tendency is to bombard an audience with facts, which is an extraordinarily inefficient approach. Some facts are more important than others, and we need to identify the specific 'fulcrum facts' on which decisions hinge rather than blindly 'teaching the topic.' Often, problem behaviors can be traced to a single mistaken perception. A good example that leads to a whole range of problematic behaviors is the belief that 'hackers don't target small businesses.' Information security professionals have been guilty of 'naïve realism', assuming that our way of looking at problems is the only correct one. Despite our good intentions, our efforts will be hit and miss if we don't understand our audience's view of the world.

The cost of our mistaken approaches to security awareness should not be underestimated. How much has been spent on the password complexity topic alone? This problem could have been solved by system design, but instead we've set ourselves the goal of trying to teach every last user. The crazy world of information security is such that Schneier was criticized for pointing this out.


Safety professionals would be shocked at our endemic complacency: we allow high-risk functions with no business benefit to exist on our systems, with the potential for catastrophic failure. Why do we allow users and administrators to perform unsafe acts such as selecting passwords like 'Password1'? Next time you get on a plane, consider the effort that's been made to systematically design out risk in areas such as pilot training and cockpit ergonomics. If security professionals designed an aircraft cockpit, they would include a 'crash plane' button on the dashboard and then spend years training people not to press it.
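To make the "design it out" point concrete, here is a minimal sketch (the blocklist and length rule are illustrative assumptions, not any particular product's policy) of a registration check that simply refuses weak passwords, so no amount of user training is needed for this case:

```python
# Illustrative sketch: design out the unsafe act instead of training
# users to avoid it. A real system would use a much larger blocklist
# (e.g. a breached-password list) and rate limiting; this only shows
# the principle.

COMMON_PASSWORDS = {"password1", "123456", "qwerty", "letmein"}
MIN_LENGTH = 12  # assumed policy value for this example

def is_acceptable(password: str) -> bool:
    """Return True only if the password clears basic design-time checks."""
    if len(password) < MIN_LENGTH:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    return True

# The system rejects 'Password1' outright; the user never gets the
# chance to make the mistake an awareness campaign would warn about.
print(is_acceptable("Password1"))                    # False
print(is_acceptable("correct horse battery staple")) # True
```

The point is not the specific rules but where they live: in the system, where the unsafe act becomes impossible, rather than in a training deck that hopes users remember it.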

Is it a good idea to manage human risks? Yes, absolutely. Influencing user security behavior is a very important part of any organization's defense in depth. However, it's about time we dropped the enthusiastic amateur approach. Sure, information security awareness has had its handicaps, not least a mistaken perception that changing behavior is easy. But until we acknowledge that a better understanding of user behavior is needed, and that it's not efficient to use awareness to cover up poor security design, it's the users who will suffer.

It's likely that, given the mix of specialist skills involved, there's an increasing role for information security awareness marketing agencies with experts in communications and behavioral influence. This is very different from today, when security awareness is widely seen as an IT job that requires no particular communication skills.

Is it true that security awareness has allowed inefficiencies by compensating for bad design? Yes. Is there room to improve mainstream awareness techniques? Absolutely. Should security awareness be performed with a much better understanding of the audience? Definitely. Will you hear most awareness professionals admit it? Apparently not.

Geordie Stewart is the resident security awareness columnist for the Information Systems Security Association (ISSA) Journal.
