Geoff Sanders talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.
Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. Now, this second set of discussions will examine security research, security legislation, and the difficult decision of taking researchers to court.
CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A. The deadline is October 31, 2015. In addition, feel free to suggest topics for future consideration.
What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?
Geoff Sanders, CEO at LaunchKey (GS): One of the biggest misconceptions lawmakers have regarding cybersecurity is their belief that the nation's cybersecurity problems can be fixed through legislation.
The rampant hacking and breaches we see around the world aren't due to a lack of legislation; they're due to the pervasive use of inadequate and antiquated technologies, along with a general public that's poorly equipped to defend itself. Rapidly increasing the adoption of modern technology and improving end-user education is essential to a safe and secure cyber world.
What advice would you give to lawmakers considering legislation that would impact security research or development?
GS: Any legislation that would limit or restrict research into security vulnerabilities merely serves to aid the bad guys and deter the good guys. While such legislation is intended to limit the spread of vulnerabilities in the wild, it rests on a very poor assumption: that the bad guys haven't already found, or won't eventually find, the same vulnerability. In the cyber world, it's typically impossible to know whether knowledge of an exploit exists, and who possesses that knowledge, until it's too late.
Legislation that ultimately limits or discourages security researchers from discovering vulnerabilities, developing fixes, and disseminating that knowledge as quickly as possible makes the irresponsible and short-sighted assumption that criminals don't, or can't, possess such knowledge. If a lone security researcher with modest resources can find a vulnerability, it should be assumed that the bad guys, who often have more resources and aren't deterred by the threat of the law, will surely discover it as well.
If you could add one line to existing or pending legislation, with a focus on research, hacking, or other related security topic, what would it be?
GS: I'm no lawmaker, but something to the effect of: "Security researchers who seek to uncover vulnerabilities in a responsible manner that shields the users and organizations affected by those vulnerabilities, and who report their findings to the appropriate parties in a timely manner, should be protected against the threat of legal recourse, so long as their efforts are made with the sole intent to help improve the security of the users and/or organizations affected by such vulnerabilities."
Now, given what you've said, why is this one line so important to you?
GS: Lawmakers should do everything they can to create a legal environment that encourages responsible security research rather than discouraging it.
Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?
GS: Never. Such tactics neither address the vulnerability at hand nor ensure that others don't already possess knowledge of it.
Instead, companies should be thankful that there are security researchers out there willing to perform the QA work that they failed to do internally, and work alongside those researchers to implement a proper fix. The only rational reaction by an organization to the discovery of a vulnerability should be the immediate desire to patch that vulnerability.
What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?
GS: In general, the more information that is shared among security researchers, organizations, and the government, the better. Defense against a certain attack vector or vulnerability can't happen without the requisite knowledge.
Since we must assume the bad guys will discover the same vulnerabilities the good guys do, it becomes a question of whether vulnerabilities can be patched quickly enough before the bad guys exploit them. Thus, if speed is essential to a proper defense, we should be doing everything possible to reduce the time to discovery of vulnerabilities, and sharing such information among relevant parties is a good place to start.