The US wants to restrict exports of software vulnerabilities. But is that such a good idea?
Researchers are concerned about a United States Commerce Department plan to treat software vulnerabilities like "dual-use" items that could be weaponized.
Reuters reports that the department's plan, which follows a 2013 agreement to regulate similar software, would make it illegal to share software vulnerabilities outside the specific nations that signed on to the Wassenaar Arrangement two years ago. (41 nations, including the US and its intelligence allies, are party to that arrangement.)
Many agree that sharing knowledge about software vulnerabilities with the wrong people could be problematic. Oppressive regimes could use them to surveil their citizens, hackers could exploit them to commit crimes, and others might do both, spying on targets for their own gain.
The problem is that the existing proposal, which seeks to define "intrusion software" as a new category worth regulating, is too broad. Threatpost reports:
'This was perhaps a way to stop companies like FinFisher and Hacking Team from being able to export targeted surveillance software to governments like Bahrain, which does not seem unreasonable to me,' said Electronic Frontier Foundation global policy analyst Eva Galperin. 'But one of the things they did was write the language messily and broadly, and open to troublesome interpretation. It’s important to tread carefully.'
'One of the biggest problems is that people who are writing this language are not security researchers and likely have a limited understanding of how security research is conducted and how threats and exploits are shared,' Galperin said. 'This is why they have a comment period. What I would like the security community to understand is that this is the junction to step in and set them straight.'

It is a little funny that news about the US wanting to limit the disclosure of zero-day exploits comes right as the National Security Agency and its partners have been criticized for keeping many exploits to themselves. As I wrote this morning, after others reported on efforts to spy on Alibaba's UC Browser:
[These] efforts highlight the risks posed by intelligence agencies keeping information about vulnerabilities in popular products to themselves. Allowing these issues to remain in the products might benefit surveillance operations, but it also undermines the security and privacy of many innocent people.
Alibaba told CBC News that it was unaware of the problems with UC Browser and that the app didn’t leak this information on purpose. Others could easily have exploited this vulnerability and learned all kinds of things about the half-billion people who have the Web browser installed on their smartphones.
The reaction to this plan shows that there are no absolutes. Of course zero-day exploits should be revealed, if not to the public, then at least to the companies able to fix the problem. Yet at the same time, sharing this information too widely carries the risk of it being abused.
The software affected by these policies might run binary code, but the policies themselves aren't as neat as the "1s" and "0s" they're attempting to regulate.