OPINION: Diversity Training

IN 1980, the World Health Organisation declared smallpox eradicated. However, by the end of this year, millions of health-care personnel and other first responders will have to be immunised against smallpox. How does an allegedly extinct disease become a national risk 20 years later? Because the lack of vaccination has left a large percentage of the population with the same vulnerability. And once a virus starts spreading, it's hard to stop.

This idea is just as relevant to communities of computers as it is to people, and it illustrates an unappreciated principle of systems in general and networks in particular — diversity. Diversity in computer platforms can prevent viruses from taking over.

But, in truth, a large percentage of the population does use the same computer platform. The antitrust case against Microsoft was meant to protect free trade, but an argument could be made that the government should also take steps to protect technodiversity for security's sake. Even a benevolent monopoly is dangerous because it becomes indispensable. If a virus or worm targets those ubiquitous systems, we are all affected because there is no vaccinated population able to withstand the attack.

Standardisation, for all its benefits, is insidious because it enables virulent attacks to spread everywhere through common communications protocols, faster than an open-mouthed sneeze in Grand Central Station at rush hour.

Exacerbating this problem are convenience features built upon a homogenised computer environment, such as automatic patching. Patching software used to be a low-priority task for administrators; it was common to see different releases of programs running side by side. It might have been a bit of an administrative headache, but it actually benefited a network's immune system — one system might get infected by a virus while another did not.

Unfortunately, today's applications upgrade themselves automatically. Bugs, glitches and holes that would have affected only early adopters or a few computers on a network can now become an epidemic before they're even spotted. The convenience of automation has led to uniformity, and uniformity in turn has enabled mass exposure to viral threats.

Diversity creates a natural firebreak for computers. I have never seen a virus that can infect both Linux and Windows boxes, and only a few can cross between Macs and PCs. In fact, the earliest warning of a network attack is often a log entry caused by one such system rejecting a virus even as the other system is infected.

I'm not advocating that companies create fully redundant hardware and software environments. That, of course, is not economically feasible. On the other hand, it's good practice to be wary, in general, of single points of failure, whether hardware, software or human. Single-vendor solutions will always create such a weakness. What's more, homogeneity encourages sloppy internal practices by "certified" security experts who have been trained to use a specific application and who lack the foundational expertise to adapt to new situations, to diversify.

But introducing even a token number of Unix workstations or servers forces the Windows administrative staff to learn the basics of other systems and reduces the corporate dependence on a single line of technologies.

What does it all mean? The conveniences that homogeneity and features such as patching provide might not be so great after all. It would be telling, for instance, to measure the benefits of homogeneity against one major virus attack like Slammer. Without doing calculations, it's not hard to imagine that one bout with Slammer costs more than the convenience that standard features give you over the course of a year.

Which all leads to some counterintuitive advice for the security conscious: CSOs should make an effort to slow down the rate of standardisation in the enterprise. Use a combination of Linux and Windows, and don't be too quick to apply a patch unless something is already broken. Turn off automatic updates. Buy equipment from more than one manufacturer; a good mix is 70:30. There's also a cost advantage because competition drives better deals.

Hedge your bets — the best containment strategy to avoid catastrophic failure is diversification.

David H Holtzman, former CTO of Network Solutions, also worked as a cryptographic analyst with the US Navy and as an intelligence analyst at DEFSMAC.
