Bit9's Chris Lord talks about disclosure, bounty programs, and vulnerability marketing with CSO, in the first of a series of topical discussions with industry leaders and experts.
Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focuses on disclosure and how pending regulation could impact it. In addition, we asked about marketed vulnerabilities, such as Heartbleed, and whether bounty programs make sense.
CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A, or feel free to suggest topics for future consideration.
Where do you stand: Full Disclosure, Responsible Disclosure, or somewhere in the middle?
Chris Lord, Senior Director of R&D, Bit9 + Carbon Black (CL): The middle, but definitely leaning more toward responsible or coordinated disclosure. Overall, I'm more concerned about nondisclosure when the vulnerabilities that get reported through side channels or internally in a company never see the light of day. Responsible disclosure applies even when there aren't other parties to take it public.
If a researcher chooses to follow responsible/coordinated disclosure and the vendor goes silent - or CERT stops responding to them - is Full Disclosure proper at this point? If not, why not?
CL: I'd love to follow my kneejerk reaction and say "yes," but full disclosure of a vulnerability must consider who carries the risk and seek first to mitigate that risk. And when those who should be partners go dark, the responsibility falls on the researcher.
It isn't like disclosures around food safety or health, where the agent of harm is what is revealed. With vulnerabilities, the agent of harm only comes to exist in a significant way following disclosure.
If full disclosure leaves millions of people exposed and opens the practical exploitability to a broad group of actors, then we've not done the research community or the public a service.
When responsible disclosure is no longer an option, it's better to aim for the slow reveal: put the mitigation and remediation out early and share the exploitable details later. Ultimately, it all needs to be out.
Bug Bounty programs are becoming more common, but sometimes the reward being offered is far less than the perceived value of the bug/exploit. What do you think can be done to make it worth the researcher's time and effort to work with a vendor directly?
CL: As a vendor, I work with researchers on reported vulnerabilities. It's really about partnership. Bounties can't match the value of a vulnerability to actors with malicious intent. Nor can they match the potential loss to a company.
What I like about bounties is the engagement they encourage between researchers and vendors. A bounty says "we're serious" and draws attention and interest. Coordinated action and mutual respect keeps it going.
A couple of trends I think will continue to improve things: multi-sourced bounties, such as the Internet Bug Bounty, and crowdsourced bounty hunting, such as the programs run by Bugcrowd and HackerOne.
Do you think vulnerability disclosures with a clear marketing campaign and PR process, such as Heartbleed, POODLE, or Shellshock, have value?
CL: With campaigns that engage the media, we take the conversation beyond the confines of the security community, software vendors, and those affected. This is a good thing despite the scrambling that such visible events cause. It reveals the dependence we have on software, the importance of good hygiene and practices, and the necessity of working together.
If the proposed changes pass, how do you think Wassenaar will impact the disclosure process? Will it kill full disclosure with proof-of-concept code, or move researchers away from the public entirely, preventing serious issues from seeing the light of day? Or, perhaps, could it see a boom in responsible disclosure out of fear of being on the wrong side of the law?
CL: Wassenaar is a Cold War relic and the proposed extension still treats digital information as a physical entity that can be collected and controlled. Free flow of information and decentralization will win out in the end. However, if the changes pass, Wassenaar will quell security research and communities that span borders--communities that never really recognized or cared about those borders before.
We'll eventually test the legal boundaries--keeping another profession busy--and find our way in and around the new regime. (During this time, there will be an uptick in T-shirts bearing 0-day code.)
Commercial security products are excluded from the changes, but they too are dual-use technologies with off-label application in covert surveillance, detection, subversion and control. Will they be next?