The disclosure guidance on cybersecurity issued last month by the Securities and Exchange Commission's Division of Corporation Finance is a "game-changer," says Alan Paller, director of research with SANS Institute, a security research and education organization. But not because the rules are now different. In fact, they're not. Rather, what's significantly new is the way the existing rules are perceived.
For most public companies, in the past "the presumption was that [cyber intrusions] were not material," says Paller. "The guidance says that presumption was wrong."
The new guidance takes the position that companies should disclose cyber risks and cyber incidents if they make an investment in the company speculative or risky. While public companies should avoid "generic risk disclosures," they also are not required to make disclosures that would, in themselves, compromise the company's cybersecurity.
'I'll Be Watching You'
The idea that companies need to report situations or issues that could pose a material risk to the organization's ability to make money is hardly novel, of course. The difference now, as David Navetta sees it, is the far greater weight the SEC is placing on those disclosure requirements. "The SEC put the financial community on notice that these are serious," says Navetta, a founding partner of the Information Law Group and a Certified Information Privacy Professional. "You've got the Robert De Niro moment, 'I will be watching you,'" he says, referring to a line from the movie "Meet the Parents."
While the SEC's new guidance doesn't constitute regulation, CFOs shouldn't underestimate its impact. "When something goes wrong, the voluntary nature of the word 'guidance' becomes a little less relevant," Navetta says. "Think of it as requirement, versus guidance."
What's more, the cybersecurity risks about which the SEC is concerned extend beyond potential leaks of consumer data, things like credit card numbers that a company may transmit or store, says Cynthia J. Larose, a member of the corporate and securities section at Mintz, Levin, Cohn, Ferris, Glovsky and Popeo P.C., and chair of the law firm's privacy and security practice. If, for instance, an intellectual property breach would pose a risk, it needs to be disclosed. "You have to look at cyber risk as holistic," Larose says.
The Cost-of-Security Obstacle
The size of the corporate threat from cyber crime is hard to dispute. The second annual "Cost of Cyber Crime Study," issued by Ponemon Institute LLC in August 2011, found that the median annualized cost of cyber crime for the 50 organizations in the study was $5.9 million, with individual costs ranging from $1.5 million to $36.5 million. The annualized average was up 56% from the previous year's study.
And certainly, most CFOs, along with other executives, want their firms to be secure, Navetta notes. However, a number of obstacles stand in the way of improving security. One is the cost of security projects, which vie for a portion of the corporate checkbook against a range of other initiatives -- many of which promise to generate revenue.
In addition, the threats that organizations are trying to counter are continually shifting. "Most companies sincerely want to be secure, but it's a matter of economy and nimbleness," he says.
Greg Barnum, vice president and chief financial officer of Datalink Corp., a Minneapolis-based provider of data center infrastructure and services with annual revenues of about $350 million, agrees with the need to disclose significant security breaches, which can represent "a big cost to the company."
At the same time, Barnum expresses concern that Datalink -- which has solid security controls in place -- could be negatively affected because of the resources required to implement this and other regulations, given his firm's relatively small size. "With all these regulations, from Sarbanes-Oxley on down, it's tougher for companies like us to put all the people and systems in place."
Changing the Security Audit
It's hard to say how much the disclosures will help investors. Some are bound to be reassured by the accounting companies give of their risks. However, disclosing a risk could lead others to conclude that a company's cybersecurity is weak and that a breach would hurt returns, negatively influencing a decision to buy or sell the stock, Navetta says.
The financial stakes of a breach can be substantial. In January 2010, Heartland Payment Systems Inc., a processor of credit and debit card payments, announced a settlement agreement under which issuers of Visa-branded credit and debit cards could obtain recovery from Heartland for losses they incurred in a 2008 criminal breach of the company's payment system environment. Heartland said it would pay up to $60 million to fund the program.
To truly improve cybersecurity, the manner in which computer security audits are conducted must change, Paller says. Today, many security audits focus too heavily on ancillary measures, such as the degree to which access to the computers is restricted. While important, these shouldn't be the focus of the audit, he says. Instead, the focus should be on the systems themselves. "In [today's] computer audits, you spend 95 percent of the time NOT checking to see if the systems are safe," Paller says.
Equally important, companies need to continuously monitor system vulnerabilities rather than checking periodically, according to Paller. "You might check once a year, and once you're done, the bad guys could be in." Finally, any computer used for online banking transactions should never be used for any other task on the network. "Buy a $200 or $300 computer and don't do anything else on it."
Room for improvement exists at many organizations, Paller adds. "To expect them to be perfect is unreasonable. To expect them to be better is quite reasonable."