$154 or 58 cents -- what's the real cost of a breached data record?

Does a data breach cost an average of 58 cents a record -- or $154?

That's a significant difference for companies preparing incident response plans, as well as for insurance companies, regulators, auditors and others looking to ensure that companies are adequately prepared or covered for such an event.

Ponemon Institute's $154 figure is based on an analysis of 350 companies that suffered breaches in 2014, using an analytical model of real breach costs that the institute has been refining for a decade.

Verizon's 58 cents calculation is based on 191 insurance claims filed in 2014, and this is the first year that Verizon has run these numbers.

Beyond the different data sources, Ponemon's model also includes indirect costs, while Verizon's does not.

But Verizon's estimate seems unreasonably low, said Caleb Barlow, vice president at IBM Security. IBM sponsored this year's Ponemon report.


At a minimum, a company with a data breach has to send out letters notifying customers that they were breached and pay for credit monitoring, he said.

"Normally, Verizon does some great work," he said. "But we had to discount this because 58 cents doesn't even cover the cost of the postage and printing the letter."

How useful are insurance claims?

Companies usually don't have enough insurance coverage to cover the total cost of a breach, said Larry Ponemon, chairman and founder of the Ponemon Institute, and the insurance doesn't cover indirect costs or loss of business.

For example, he said, Target's latest breach is estimated to cost the company over $1 billion, but it was only insured for $100 million.

In general, he said, companies buy enough insurance to cover 50 percent of the value of their fixed assets -- but only 12 percent of the value of their digital assets, according to a study released last month by Ponemon and sponsored by Aon Plc, a global insurance brokerage.
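The coverage gap those ratios imply can be made concrete with a short sketch. The 50 percent and 12 percent figures are from the Ponemon/Aon study cited above; the asset values themselves are hypothetical, chosen only for illustration.

```python
# Illustrative coverage-gap calculation using the ratios from the
# Ponemon/Aon study; the asset values below are hypothetical.
fixed_assets = 800_000_000     # $800M in fixed assets (assumed)
digital_assets = 500_000_000   # $500M in digital assets (assumed)

fixed_coverage = 0.50 * fixed_assets      # ~50% of fixed-asset value insured
digital_coverage = 0.12 * digital_assets  # but only ~12% of digital-asset value

digital_gap = digital_assets - digital_coverage
print(f"Fixed-asset coverage:       ${fixed_coverage:,.0f}")
print(f"Digital-asset coverage:     ${digital_coverage:,.0f}")
print(f"Uninsured digital exposure: ${digital_gap:,.0f}")
```

On these assumed numbers, $440 million of the digital-asset value would be uninsured, even though the digital assets are worth less than the fixed assets.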

In addition, Ponemon said, companies typically have deductibles in place, to lower the costs of their insurance premiums.

"That's enough reason why using insurance payouts as a surrogate for cost is really erroneous," he said.

But Jay Jacobs, a senior analyst on the RISK Team at New York City-based Verizon, argues that there's no evidence that companies are under-insuring their cyber assets.

If companies were under-insuring, he said, then they'd be likely to hit their coverage cap when they file a claim.

"That cap would be a round number, like half a million dollars, or a million," he said. "That round figure would show up as a pattern."

Plus, he said, the cap would only make a difference in the largest breaches, and those are excluded from Ponemon's report.

Ponemon agrees that his institute's report only covers breaches between 5,000 and 100,000 records in size.

But that's because multi-million-record mega-breaches are still relatively rare, he said, and require a different kind of analysis.

For example, since the fixed costs are divided up among more records, the per-record costs would be lower, he said.
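That dilution effect is simple arithmetic, sketched below with a hypothetical cost model: a fixed response cost (forensics, legal, notification setup) plus a small variable cost per record. The dollar figures are assumptions for illustration, not from either report.

```python
# Hypothetical breach-cost model: fixed response costs plus a small
# per-record variable cost. All figures are illustrative assumptions.
def cost_per_record(records, fixed_cost=500_000, variable_cost=0.75):
    """Average per-record cost when fixed costs are spread over all records."""
    return (fixed_cost + variable_cost * records) / records

# A 10,000-record breach vs. an 80-million-record mega-breach:
print(cost_per_record(10_000))       # fixed costs dominate: $50.75/record
print(cost_per_record(80_000_000))   # fixed costs vanish: ~$0.76/record
```

The same $500,000 fixed outlay accounts for $50 of each record's cost in a small breach, but less than a penny of it in a mega-breach.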

"We need more data points to build the model for these data breaches," he said. "And we will, little by little. We're getting there. But so far, we haven't had enough companies with these massive data breaches participate in our benchmarking."

Ponemon has had 11 so far, he said, and at least 30 are needed for meaningful results.

Direct and indirect costs

According to Ponemon, there are many indirect costs that companies have to cover when there's a breach. For example, staffers might be pulled away from their regular jobs to deal with a breach -- but those jobs still have to be covered.

Loss of business is also a significant, and growing, part of the total cost of a data breach. Higher customer turnover, increased customer acquisition costs, and a hit to reputation and goodwill added up to $1.57 million per company, up from $1.33 million the previous year.

"We spent a lot of time building an analytical model based on real costs," he said.

This doesn't show up in the insurance data, he said, and includes some of the biggest costs that companies incur when they suffer a breach.

According to Jacobs, there's a problem with indirect costs, which is why insurance companies don't cover them.

"It's really hard to quantify it," he said.

The reason Verizon decided to look at the breach costs at all, he said, was because of all the attention paid to a number that had very little solid evidence behind it.

For its annual breach report, he said, Verizon gathers hard data from real events, real forensic analysis, and, for the breaches, real claims filed by real people, that are held to a legally binding standard.

"Our entire approach is very high integrity, making sense of data in an open and honest way," he said, "not going to opinions and surveys."

In particular, he said, with the very largest breaches, the actual cost per record is less than a penny.

"So talking about an average cost per record is really misleading," he said. "The reason we showed that 58 cents cost-per-record number was to show how silly it was."

Statistical validity

Finally, Ponemon said, Verizon's regression analysis, while interesting, is based on a very small number of data points which are also not necessarily statistically representative.

"But the main part of the Verizon data breach investigations report is very interesting," he said, referring to the bulk of Verizon's data breach report, which focuses on forensic analyses of the breaches themselves. He advised that the company focus on that in the future.

"They should stick to their knitting," Ponemon said.

"I can't tell you how unbelievably full of crap that statement is," said Jacobs. "It's 191 samples that we looked at, and there is absolutely nothing that says you can't do a regression analysis with that sample size -- 191 is a rather good sample. That statement is just unbelievably false, incorrect, and ignorant."


Stories by Maria Korolov
