Making our users unlearn what we taught them

Published: 2015-09-23
Last Updated: 2015-09-23 00:02:24 UTC
by Daniel Wesemann (Version: 1)


Remember back in the ancient days, when macro viruses were rampant and we security geeks instructed our flock of virus-scared users never to click on a .DOC attachment in an email, but assured them that a .PDF was perfectly fine? Fast forward a couple of years, and we had to tell the users that, yes, .DOCs are still somewhat bad, but .PDFs are WAY worse.

It took a while to get that message across. What we call "security education" or "user awareness training" is hard enough, but nothing in it is more difficult than making users unlearn an earlier awareness message once it has finally stuck with them.

Another example is WiFi. It looks like we were successful in getting the point across that "open", unencrypted public WiFi is dangerous, because you have no idea who is listening in. The message stuck, to the extent that the average person today seems to firmly believe that a WiFi access point requiring a WPA2 password is not "open", and hence must be secure. It is, but only if you are the only one who has the password. In a public setting, like an airport lounge or hotel, where the "password" is nicely displayed at the front desk, there is obviously nothing stopping an impostor from running his very own WiFi with the same SSID and the same password. And if that impostor has a strong signal, his is the access point that guests will connect to. Hence, in a public setting, a WPA-protected access point offers only marginally more security than an open one: both are equally exposed to a man-in-the-middle impostor.
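One observable symptom of such an "evil twin" is the same SSID being advertised by more than one radio (BSSID). As a minimal sketch, assuming we already have a list of observed beacon (SSID, BSSID) pairs from some capture tool — the data and function name here are illustrative, not a real capture:

```python
# Hypothetical sketch: flag SSIDs advertised by more than one BSSID,
# one possible hint of an "evil twin" access point.
from collections import defaultdict

def find_duplicate_ssids(beacons):
    """beacons: iterable of (ssid, bssid) pairs observed in the air.
    Returns {ssid: set of bssids} for SSIDs seen from multiple radios."""
    seen = defaultdict(set)
    for ssid, bssid in beacons:
        seen[ssid].add(bssid)
    return {ssid: bssids for ssid, bssids in seen.items() if len(bssids) > 1}

# Illustrative observations (made-up MAC addresses):
observed = [
    ("LoungeWiFi", "aa:bb:cc:11:22:33"),
    ("LoungeWiFi", "de:ad:be:ef:00:01"),  # second radio -- possibly an impostor
    ("HomeNet",    "aa:bb:cc:44:55:66"),
]
print(find_duplicate_ssids(observed))  # flags "LoungeWiFi"
```

Note the caveat: legitimate multi-AP deployments also announce one SSID from many BSSIDs, so a match here is a hint worth investigating, not proof of an impostor.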

It looks, though, like most of us security professionals forgot to tell our users this when we originally instructed them to be careful with public WiFi. Et voilà: we have on our hands another instilled behavior that we need to make them unlearn.

There are plenty of other examples like this, where our earlier "advice" comes back to bite us. Kicking myself in annoyance whenever I notice such a situation, I have started to look at "security awareness campaigns" with a different set of eyes. Awareness campaigns need a "risk analysis" of their own. For every "message" that we, as security professionals, push onto our users, we should ask ourselves:

1. Is this indeed the best (applicable, accurate and useful) lesson that we can teach?
2. What would need to happen, in technology, process or behavior changes, to make this lesson useless or even harmful?
3. How likely is this to occur?
4. What is our mitigation when it happens?

There isn't always a good solution, but we at least need to start asking these questions, lest we simply continue to teach our users the bad security behaviors of the future.

