Thursday 23 July 2015


Google Fights Export Controls For 'Intrusion Software'

Proposed export rules could hobble cybersecurity research, Google claims.


Google on Monday asked the US Commerce Department to alter proposed rules that would restrict cybersecurity research.
The rules reflect US participation in the Wassenaar Arrangement, a multilateral export-control agreement that includes 41 countries. Because it is not a formal treaty, each participating state must implement its own interpretation of the Arrangement.
Google's objection to the rules being considered in the US reflects unease over the addition of "intrusion software" to the list of goods subject to export limitations.
Intrusion software is defined as software designed or modified "to avoid detection by 'monitoring tools,' or to defeat 'protective countermeasures,' of a computer or network-capable device, and performing: a) the extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or b) the modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions."
It specifically excludes hypervisors, debuggers, and software reverse engineering (SRE) tools; digital rights management (DRM) software; asset-tracking software; and network-capable devices such as mobile phones and smart meters.
In a July 20 blog post, Neil Martin, export compliance counsel at Google, and Tim Willis, "hacker philanthropist" on the Chrome security team, argue that the proposed rules, if adopted as written, would hinder open security research and limit organizations' ability to find and fix security vulnerabilities in software.
"It would be a disastrous outcome if an export regulation intended to make people more secure resulted in billions of users across the globe becoming persistently less secure," Martin and Willis write.
In a letter sent to the US Commerce Department's Bureau of Industry and Security (BIS), Google argues that the proposed rules are too broad and vague, requiring potential export licenses for email, code review systems, instant messages, and perhaps even in-person conversation, despite assurances to the contrary.
The rules, suggest Martin and Willis, could require an export license to report a bug and could limit the ability of companies to share information about intrusion software.
Jeffrey L. Vagle, executive director of the Center for Technology, Innovation, and Competition at the University of Pennsylvania Law School, said in a blog post earlier this month that the government's impulse to limit the flow of potentially dangerous software, while understandable, is fraught with difficulties.
Governments naturally want to control potentially dangerous technologies, Vagle contends, yet they also want to use those same technologies for intelligence and surveillance. The problem with this approach is that offensive and defensive cybersecurity research often depend on each other.
The US government's proposed cure might just make its own networks, already compromised too often, less secure.
"Regulating offensive research through limits on international collaboration could very well make impotent an important component in our ongoing struggle to fix buggy code," Vagel wrote. "If the true goal is to maximize information security in our everted cyberspace, the better solution is one that incentivizes defense rather than arbitrarily punishes offense."
Vagle suggests that liability for vulnerabilities would offer an incentive for greater defensive investment in software.
Google has requested that the Commerce Department address the problems with its rules at the annual meeting of Wassenaar Arrangement members in December.