The ongoing spate of high-profile data thefts is spurring companies to action, but much of that action remains reactive and short-lived because of a lack of technical understanding of the security technologies now on the market, according to one security industry architect.
“I spend a lot of time with customers and the one thing that distresses me is that the number of customers actually achieving vulnerability management is very low,” Tenable Network Security principal architect Dick Bussiere told CSO Australia.
“Even in significant and very important financial institutions, we are finding that the vulnerability assessment process is in some cases performed as infrequently as annually,” he continued.
“Performing the vulnerability assessment has been painful, time-consuming and labour-intensive. But even with the best patch management processes in place, this leaves you with a very large threat exposure.”
The response to the Heartbleed OpenSSL vulnerability, which sent thousands of organisations scrambling and redefined corporate discussions about security, was a case in point: many organisations rushed to test their systems for the flaw and patch it in the short term.
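The short-term testing many teams performed often started with simple version triage. The sketch below is illustrative, not any vendor's tool: it flags `openssl version` strings in the affected 1.0.1 through 1.0.1f range (1.0.1g and the 0.9.8/1.0.0 branches were not vulnerable). The function name is an assumption for this example.

```python
# Hedged sketch of a first-pass Heartbleed triage check. OpenSSL 1.0.1
# through 1.0.1f shipped the vulnerable heartbeat code; 1.0.1g and the
# 0.9.8/1.0.0 branches did not. Function name is illustrative only.
import re

VULNERABLE_LETTERS = set("abcdef")  # 1.0.1a .. 1.0.1f

def openssl_version_vulnerable(version: str) -> bool:
    """Return True if an 'openssl version' string falls in the Heartbleed range."""
    m = re.match(r"OpenSSL (\d+)\.(\d+)\.(\d+)([a-z]?)", version)
    if not m:
        return False  # unrecognised string: route to manual review instead
    major, minor, patch, letter = m.groups()
    if (major, minor, patch) != ("1", "0", "1"):
        return False
    # Plain "1.0.1" (no patch letter) and letters a-f are vulnerable.
    return letter == "" or letter in VULNERABLE_LETTERS

print(openssl_version_vulnerable("OpenSSL 1.0.1e 11 Feb 2013"))  # True
print(openssl_version_vulnerable("OpenSSL 1.0.1g 7 Apr 2014"))   # False
```

A check like this is only a starting point: distribution builds frequently backport fixes without bumping the version letter, which is one reason a proper assessment probes the heartbeat behaviour itself rather than trusting the banner.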
Despite the quick and fevered response, however, even such a high-profile vulnerability had failed to effect long-term change in many organisations, Bussiere said: “It was reactionary as opposed to being precautionary,” he explained. “I wouldn't say I have seen any change in the attitude towards adopting techniques such as vulnerability monitoring.”
In many cases, this was due not to a lack of concern but to a lack of awareness of how well modern vulnerability-assessment platforms scale compared with established security tools.
“Traditional technologies such as scanners just don't scale for larger networks,” Bussiere said. “This is perhaps where a lot of the resistance to adopting vulnerability management as a continuous process comes from.”
“In terms of the effort required to get the result you need, there seems to be a lack of awareness that the entire process of vulnerability assessment and management, report generation and alerting on critical vulnerabilities can be automated.”
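The assess-report-alert loop Bussiere describes can be sketched in a few lines. This is a minimal illustration, not any product's workflow: the findings data and helper names are hypothetical, and a real deployment would pull results from a scanner's API on a schedule rather than from a hard-coded list.

```python
# Illustrative sketch of an automated assess -> report -> alert loop.
# Findings and function names are hypothetical; in practice the input
# would come from a scheduled scanner run, not a literal list.
from collections import Counter

findings = [  # hypothetical output of one scan cycle
    {"host": "10.0.0.5", "issue": "OpenSSL Heartbleed", "severity": "critical"},
    {"host": "10.0.0.7", "issue": "Outdated Apache", "severity": "medium"},
    {"host": "10.0.0.9", "issue": "Self-signed certificate", "severity": "low"},
]

def summarise(findings):
    """Roll findings up into severity counts for the management report."""
    return Counter(f["severity"] for f in findings)

def critical_alerts(findings):
    """Select only the findings that warrant an immediate alert."""
    return [f for f in findings if f["severity"] == "critical"]

report = summarise(findings)
alerts = critical_alerts(findings)
print(dict(report))        # {'critical': 1, 'medium': 1, 'low': 1}
print(alerts[0]["host"])   # 10.0.0.5
```

Run on a schedule, the same three steps turn a once-a-year manual exercise into the continuous process the article argues for: every cycle produces a report, and only the critical subset interrupts anyone.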
While individual responses to Heartbleed had varied, the industry's resulting investment in open-source security scanning showed what could be achieved when serious resources were committed to security.
Whereas progressive organisations were beginning to understand how automation can turn vulnerability management from a point-in-time exercise into a continuous process, ongoing change in the operating environment meant that best-practice security remained a moving target.
“Things are improving, but on the other hand software complexity is increasing – and we are moving into the Internet of Things,” he said.
“We're progressing from the traditional computing model to another, where everything is talking to everything else. That's going to produce more risk over time. But with intelligence and automation embedded into [cloud-based] products, organisations no longer have to understand how to build those capabilities themselves.”