Human factor derails best-laid security plans

I'd like to believe that for every security problem I face, there's a technical product or suite of products that can solve it. However, some problems are too subtle and entirely too complicated to be solved through a technical approach.

Any computer system today has users, and people never do what the designers expect. Humans are inquisitive and inventive, yet lazy and ignorant. If they're not downright malicious, nearly all are incompetent outside their particular area of skill. They also like to fiddle and play, leading to computer systems that seem bound to fail.

Like everyone else, I enjoy exploring the possibilities of a system. I've sometimes even caused what could genuinely be called security incidents. But in the process, I've learned some valuable lessons. Thanks to the mistakes of others, I've never scored an own goal while testing a virus. I've seen it happen to others often enough that I limit my tests to the relative safety of the EICAR test file.
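For anyone who wants to run the same safe test, the EICAR file is nothing more than a 68-character ASCII string that anti-virus vendors have agreed to flag as though it were malware. Here's a minimal sketch in Python; the filename and the write-it-to-disk approach are my own illustration, not part of any standard tooling:

```python
# Write the standard EICAR anti-virus test file. The 68-character
# string below is published by EICAR (www.eicar.org) and contains
# no actual malicious code; scanners detect it purely by convention.
EICAR_STRING = (
    r"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
    r"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
)

# The .com extension is traditional but not required by the standard.
with open("eicar.com", "w") as f:
    f.write(EICAR_STRING)

# A working on-access scanner should quarantine eicar.com almost
# immediately, confirming that detection is switched on at zero risk.
```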

When I'm tasked with protecting an environment, I consider the human risk. Traditionally, I've done this by trying to make the system foolproof. But every time, a bigger fool than I ever imagined comes along and proves me to be the fool for thinking I could cover all the angles. This has led me to the inescapable conclusion that something must be done to reduce the risk posed by users.

The FUD factor.

Many security teams solve this problem with fear, uncertainty and doubt (FUD). "Do x, or your data will be lost," they say. Fear is a good short-term motivator. But the more we fear something, the stronger our urge becomes to investigate whether the fear is justified. And once we find that someone has misled us, we never trust that person's warnings again.

Humans adapt, so we quickly become accustomed to whatever level of fear originally motivated us. When you drive past a prang on the highway, you slow down for the next few kilometres, but soon afterward, you're speeding again.

In my view, our best hope is to show users the consequences of their actions and educate them about the reasons why we ask them to take precautions.

This is a fairly radical approach in information security circles, where much conventional wisdom comes from military and government environments that require one to follow orders without question. The government mind-set leads to policy-based awareness campaigns. List the rules, inform users of what they must do, and your job is complete; Pavlovian training does the rest. But I've never seen this work in the commercial world.

Instead, I want to coax users into behaving correctly. Using techniques from advertising, I hope to modify their behaviour toward my goal. I've started with one of the softer advertising methods: bribery.

It began with the run-up to Christmas. For many security teams, this is a nightmarish time. As staff hours shorten in other departments, account sharing becomes the norm to allow the skeleton crew to provide the same coverage. The annoyance of having to respond to an off-hours incident is at its worst on a holiday.

Christmas is also a time of giving, and for many users of e-mail systems, files are the gift of choice. Disguised as fairy lights for your screen or joke movies with Santa, wave after wave of malicious code washes up against our defences.

This year, we tried something different: We didn't place obvious notices warning of the dire consequences of opening these attachments. We didn't forbid them. We also didn't rely on our detection to spot and stop them, since they mutate too quickly. Instead, we gave people who left these attachments unopened some real tinsel to decorate their work areas, so they didn't have to use a virtual equivalent.

Did it work? Well, we didn't get a malicious code outbreak. We believe the volume of code circulated was lower. Was it our tinsel? Was it our defences? Was this just a quiet year for virus-laden virtual fairy lights? We don't know.

Taking a different tack.

Bribery isn't always enough, however. For every carrot, we also need a stick. One of our biggest risks is users who open attachments without any thought. To politely punish them, we have designed a mind-bogglingly dull course explaining, repeatedly, how to not open a file. "All you have to do is not click here. Watch, I'm not clicking now; you see?"

At first, we give users a 15-minute course. Then we watch for multiple offenders. If we see a cascade of virus-laden e-mails from the same users again, they're given a longer course on how to not open a file. The tedium should put them back on the right track.

Of course, it's possible to be too security-aware. We change our alert level, based on our understanding of the threats to our company. Once, in preparation for a period of expected high risk, we worked very hard to raise awareness and get our staffers to keep their eyes peeled.

Then, at 8 o'clock one evening, a very senior manager phoned me at home and began the conversation by asking, "Where are we on countering the extortion threat about hacking our systems tomorrow?" Did I panic? Oh, yes. This was the first I'd heard of this threat. How did this bypass normal escalation channels?

After a few hours, I tracked down the source of the threat and boiled it all down to something much less panic-worthy. Like many IT teams, our developers use pagers to pass along messages. One team went to a local bar to relax, while a team member -- call him Bill -- stayed in the office to finish coding.

Bill sent what he must have thought was a humorous pager message to the group. Unfortunately, the combination of a few rounds and the group's heightened security awareness led them to misread the message as a genuine threat.

But instead of contacting the on-duty security staff to check out the threat before escalating it to me, they decided that this was urgent and went directly to top management. Each time the message was passed among managers, it was retold and became more exaggerated until I received the call that an attack was imminent.

As for the message that kicked off this major response, it simply read, "Your ass is on fire."
