The 'cargo cult' comes to information security

Instead of mimicking the competition, organisations that create a culture of security have the best chance of turning users from their weakest link into their best form of protection, says Neil Barrett.

As fits with the archetypal image of the techie, I read a lot of science fiction - particularly the near-future, 'cypherpunk' style. One book in particular caught my attention recently: Snow Crash by Neal Stephenson. It tells of a world run by words - particularly the three-ring binders of instructions for the operation of different franchises, in which the managers know little or nothing of the business beyond the holy writ of operational directions for each and every situation they might face.
It's science fiction but uncomfortably close to the truth for many businesses today, in which poorly understood procedures are nonetheless slavishly followed, giving rise in many cases to a culture of the jobsworth, in which anything out of the ordinary is by definition impossible. The result is companies which follow the received wisdom without clear analysis of whether this is indeed the correct approach to take in any given situation. In terms of information security, this can give rise to the phenomenon known as the 'cargo cult'.
During the Second World War, the South Sea islanders watched in amazement as American cargo planes brought a wealth of unexpected gifts to their islands: Coke, cigarettes, clothes. All the things which the American troops expected, the totems of their home lives, were delivered in flight after flight of big silver birds - until the war ended and with it, the flights.
Stunned at the sudden cessation of these flights, the islanders decided it was because the runways had been allowed to return to the bush and there was no longer anyone in the operations buildings or anyone on the runways, waving the table-tennis bats to draw the silver birds to earth. So, the islanders took it upon themselves to duplicate what they had seen the American troops do. They built wooden radios and gabbled into them; they cleared the runways and walked along them, waving makeshift paddles; they cleared and cleaned the ramshackle hangars - everything to make the silver birds welcome.
Nobel Prize-winning physicist Richard Feynman talks of these cargo cults in the context of science - how we try to understand what is happening in the world around us, and how we might mistake surface effects for a deeper appreciation of the underlying processes.
But the observations are equally applicable to any of the processes that modern businesses undertake. Successful companies have distinctive branding, so a failing company will attempt a re-brand rather than examine and address any underlying problems. Successful companies have non-executive directors, so a failing company will appoint 'godfathers' to give the impression that problems can be, and are being, addressed. Successful companies advertise, so a failing company will take out adverts in the pious hope that this might solve the problem.
And in information security, companies will follow the received wisdom of what secure companies appear to be doing. They will implement the popular firewalls and intrusion detection technologies, following the fashion of the day. They will introduce and maintain antivirus measures; buy an information security policy; recommend lengths and complexity of passwords - and punish those who transgress these copied controls.
All of these things are appropriate, of course. But implementing security measures is not the same as 'being secure'. Truly secure organisations do indeed have these things in place, as one would expect. But they have something deeper underlying those technical and organisational measures: they have a culture of security.
As an example, consider the simple password controls that might be introduced into a commercial organisation. Passwords may have a prescribed length and complexity - say, a minimum of eight characters chosen from upper and lower case, along with at least one number and/or punctuation character - and be changed on a monthly basis. As an added security measure, users will be refused the ability to choose a password that they have used in the last three months.
It sounds sensible but in many organisations users will seek ways to make their task of remembering the password easier. They might choose passwords based on the name of the month or on the current star sign; they might cycle their passwords, using one of four words on a regular basis, to give them confidence that they will remember them.
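The policy described above can be expressed in a few lines of code. The following is a minimal, illustrative sketch - not from any specific product - of a checker that enforces the stated rules (eight characters, mixed case, a number or punctuation character, no recent reuse) and also screens out the month-based choices that users fall back on; the function name and return convention are hypothetical:

```python
import calendar
import string

# Month names users commonly embed in passwords (index 0 of
# calendar.month_name is an empty string, so filter it out).
MONTHS = {m.lower() for m in calendar.month_name if m}

def check_password(candidate: str, recent: list[str]) -> list[str]:
    """Return a list of policy violations; an empty list means acceptable."""
    problems = []
    if len(candidate) < 8:
        problems.append("shorter than eight characters")
    if not (any(c.islower() for c in candidate)
            and any(c.isupper() for c in candidate)):
        problems.append("must mix upper and lower case")
    if not any(c.isdigit() or c in string.punctuation for c in candidate):
        problems.append("needs at least one number or punctuation character")
    if candidate in recent:
        problems.append("reused within the retention window")
    # Reject the month-name workaround described in the text.
    lowered = candidate.lower()
    if any(month in lowered for month in MONTHS):
        problems.append("contains a month name")
    return problems
```

Of course, a screen like this only shifts the battleground: users denied month names will find another memorable pattern, which is precisely why the technical control needs a security culture behind it.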
That is, the users will - and do - take active measures themselves to undermine the security of a perfectly sensible mechanism so as to give themselves an easier life. This is therefore a technical measure, supported by good organisational practices, which nevertheless leads to an insecure infrastructure.
Of course, truly secure organisations also use these measures - maintaining password controls, refresh cycles and management oversight to ensure compliance. And the cargo cult mentality says that if an insecure organisation were to do these same things then it would become secure. But the underlying culture of the company militates against these mechanisms working in practice - unless, that is, the company introduces a security culture alongside the security mechanisms.
This security culture is the underlying reality that makes the surface effect actually work. It is the culture that these companies need to understand and to implement if the technology is to be allowed to do its job and the company is not to fall foul of the cargo cult - copying the effect in the hope of reproducing the cause.
It is one of those well-accepted truisms in information security that people are the weakest link. But this also means that improving security from the people perspective will necessarily result in an improved level of information security - almost irrespective of the technology chosen.
Of course we need antivirus solutions but teaching users to recognise the sources and effects of computer viruses is also essential. Of course we need intrusion detection systems to catch hackers but teaching system managers to recognise the effects of hacking is also essential. Of course we need websites receiving credit cards to be protected by SSL links but teaching database administrators the importance of also protecting stored details is vital.
In other words, we need to teach all of our users how to protect themselves and their companies - and then support their efforts with the appropriate technical and organisational measures.
Humans may well be the weakest link but they can also be the best form of protection. Let's make sure that everyone is aware of the importance of good information security.