Whether you lean towards the left or right in matters of politics, you’d probably agree that government’s obsession with security and data control should make it a blameless role model for our own security regimes.
What a pity that, it turns out, they’re as bad at security as the rest of us.
As became painfully obvious recently, the leak of — ironically — an information security manual from a Defence Signals Directorate (DSD) website showed that even the (supposedly) best security types among us aren’t above totally blowing security where it counts.
In this case, the issue was a minor misconfiguration that caused a directory listing to be displayed for a folder within the website hierarchy; instead of being directed to an index page, visitors could browse the file tree and learn in painstaking detail about the DSD’s own security policies.
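For what it’s worth, this class of mistake is usually a one-line fix. On an Apache server, for example — assuming that’s what was running, which the reports don’t confirm — automatic directory indexes can be switched off with a single directive:

```apache
# Disable automatic directory listings for the whole document root.
# With -Indexes, a request for a folder that has no index page
# returns 403 Forbidden instead of exposing the file tree.
<Directory "/var/www/html">
    Options -Indexes
</Directory>
```

Other web servers have equivalent switches (nginx’s `autoindex off`, for instance, is the default). The point is not the syntax but that the safeguard is trivial — which makes forgetting it all the more embarrassing.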
It’s the kind of embarrassing error that happens all the time — and, in government bodies at least, could have serious repercussions.
Think back to Mission: Impossible, where the entire goal of Tom Cruise’s character is to stop the distribution of a ‘NOC list’ containing the identities of secret agents working in Eastern-bloc countries. Had that list been protected with the same level of security, there’d be more than a few people getting sharpened bamboo sticks under the fingernails at this very moment.
Sure, government bodies are probably better at security than many companies; they have to be, after all. But sometimes you still have to wonder, as in the DSD case, or in reports that the UK’s Ministry of Defence lost 57 computers and 47 USB sticks last year, or the recent discovery that a USB stick containing ‘Australian Eyes Only’ Defence documents was stolen from the backpack of a major-general’s aide.
If even the most security-conscious government bodies can make such silly mistakes, isn’t that a sign that all of us are human and can be forgiven similar inevitable oversights?
Of course not. There are industries where the inadvertent leaking of one or several documents could materially affect the performance of a company, or expose it to massive liabilities for Privacy Act or other breaches. Government bodies face the same obligations — fitting, since many have the most carefully matched databases of our personal information anywhere in the world — but in every case it’s crucial for CSOs to implement the policies and procedures to keep their data secure.
And then fire all their staff.
Well, OK: it’s hard to fire everybody, since there would be nobody left to make coffee and no one to drink it if there were. But if you take as a theoretical ideal the elimination of all humans, and the pesky mistakes they make to which computers are intrinsically immune, then you can work backwards towards your real situation and assess how many different variables could put your company on the front page for all the wrong reasons.
The other nice thing about removing people from the equation would be the reduction in the number of people determined to force their way in from the outside. That outside pressure has caused major problems for government departments since hackers compromised RSA’s SecurID security tokens earlier this year, leading the DSD to urge every Australian government department to replace its tokens.
While government departments scramble like everybody else to shore up their defences, many businesses are too busy or too sceptical to follow the government’s lead even where it’s providing security guidance. A recent Symantec Critical Infrastructure Protection (CIP) survey (PDF), for example, found that just 36 percent of surveyed companies were aware of their government’s CIP programs, down from 55 percent last year. Those companies were also less willing to cooperate with government CIP programs, with 57 percent engaged in 2011 versus 66 percent last year.
Why the cynicism? It’s hard to say. Either companies are too focused on their own problems to take much notice of what the government is doing, or they’ve become so jaded by perceived government incompetence that they simply don’t believe the government’s policies can provide best-practice security models for their own businesses anymore.
Another recent data point strengthens the argument for the latter: speaking at a recent industry conference, former US CIO Vivek Kundra said governments need to reconsider their approach to critical infrastructure, noting that more than 70 percent of that country’s infrastructure isn’t even government-owned. That reliance on privately owned infrastructure, he suggested, is a liability in strategic planning, from both an availability and a security perspective; he recommended a review of the extent of governments’ dependence on commercial companies.
This is another piece in a jigsaw that’s filling out a worrying picture. The DSD carries real moral authority: it is the DSD, after all, that unflinchingly and fiercely assesses, then rates, every piece of security technology connected to government department networks, and we rely on it to set the high-water mark for the corporate world. But if it turns out that government departments themselves are spilling over the top of the dam, well, what do we do then?