Matthew Goulet talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.
Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. This week CSO is posting the final submissions for the second set of discussions examining security research, security legislation, and the difficult decision of taking researchers to court.
CSO encourages everyone to take part in the Hacked Opinions series. If you have thoughts or suggestions for the third series of Hacked Opinions topics, or want to be included as a participant, feel free to email Steve Ragan directly.
What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?
Matthew Goulet, Chief Operating Officer, Globalscape (MG): One misconception is that the basic conditions underlying cybersecurity can be changed for the better simply by passing laws, when in reality finding the right legislation to address the appropriate cybersecurity issues is a very delicate (and difficult) balance to achieve.
It’s important to avoid establishing regulatory standards that are too complex for companies to address, or worse, cause them to devote precious resources to maintaining compliance when that investment could have gone into prevention and mitigation.
The wheels of legislation grind slowly, but the efforts and innovations of the cybercriminal element are rapid and relentless. And it’s also important to remember that a great deal of cybercrime originates overseas, well beyond the reach and jurisdiction of U.S. courts.
Consider, too, that in a global economy, our laws and regulations do not exist in a vacuum. One of the biggest regulatory issues affecting U.S. businesses is the uncertainty that follows the European Court of Justice’s invalidation of the EU-U.S. Safe Harbor Framework—the legal guidelines that participating organizations followed to maintain compliance under European privacy laws and ensure the protection of personal data during cross-border transfers from the EU to the U.S.
What advice would you give to lawmakers considering legislation that would impact security research or development?
MG: There are examples of legislation that have established a good framework for prodding companies into a stronger, preventative posture. One is Massachusetts 201 CMR 17, which requires organizations to establish reasonable measures for the protection of data.
Such a model could be applied to the protection of any data or systems that the federal government deems worthy of protecting, such as critical infrastructure or intellectual property that may have strategic military value.
If you could add one line to existing or pending legislation, with a focus on research, hacking, or other related security topic, what would it be?
MG: A review date that would ensure any recently enacted or pending law comes up for amendment, so that it is reevaluated to better mirror the current technology and cybersecurity climate.
Now, given what you've said, why is this one line so important to you?
MG: I would hate to see industry hamstrung in the long term by a law that was passed with good intentions but outpaced by the natural evolution of technology and cybersecurity. Rather, if the hoped-for results were realized, the law under review could be used as a framework and improved upon at the appropriate time.
Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?
MG: There’s a lot of great work being done by the research and white hat communities to better understand threat dynamics, and it would be unwise for companies whose products are found to have vulnerabilities to dismiss or ignore such findings. Rather, the spirit of cooperation that exists in some corners benefits us all when proper disclosure protocols are followed and issues are addressed before risks become widely known and available for exploit.
The disclosure process is not perfect, and there are also cases where companies try to hide behind it rather than make a good faith effort to fix known problems, but there does need to be some recourse for dealing with researchers whose motives are less benevolent and who may choose to pursue paths of disclosure with an aim of generating publicity rather than protecting users.
What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?
MG: Threat intelligence should be shared and, indeed, is already being shared among private companies that see the benefit of recognizing attack profiles and techniques that may be new or not widely known. The concept of sharing this information is not new and it would make sense that information being shared within the cybersecurity community should also be shared with federal agencies that are also working on ways to protect government systems from attack.
Treating threat intelligence as open-source would help advance innovation even in areas beyond strictly IT security. In our own case, Globalscape is not a developer of security products, but of secure products, and our own product development would benefit through more readily available information on emerging threats and how they might impact users of our products.
Likewise, if the government wants the private sector to share threat intelligence, it should be willing to do the same.