University of Melbourne researchers Dr Vanessa Teague and Dr Chris Culnane have warned that a lack of clarity around a key protection in the government’s surveillance bill poses a serious threat to the security of online communications services.
The duo appeared on Friday at an inquiry into the Telecommunications and Other Legislation Amendment (Assistance and Access) Bill 2018. The bill introduces a range of new powers for police agencies that the government claims are necessary to combat the increased use of encryption by criminal groups.
“My main concern about this legislation is that by trying to break the security of a certain number of bad actors, we might accidentally greatly undermine the security of the wider Internet community,” said Teague, who has helped unearth a number of high-profile security vulnerabilities.
She told the hearing that international experience shows the possibility of such unintentional undermining of security is not merely theoretical. “It’s a historical fact, and a number of very well-intentioned efforts by the United States intelligence and security services to weaken or restrict encryption in the hope of improving surveillance and policing have already been shown to cause substantial security problems that have affected large parts of the Internet,” she said.
Although the Australian government says it doesn’t seek to weaken cyber security, “none of the other governments that accidentally had catastrophic impacts on cyber security intended to do it either”, Teague said.
Much of the controversy over the bill has focused on the introduction of a system of Technical Assistance Requests (TARs), Technical Assistance Notices (TANs) and Technical Capability Notices (TCNs).
The notices can be issued by authorised police and intelligence organisations to a wide range of communications service providers. A TAR is a request that an organisation carry out an action or actions to assist an enforcement agency. A TAN is a legal instruction that the recipient render assistance to an agency, while a TCN requires that a company build a whole new capability for an enforcement agency to exploit.
The types of assistance that can be sought using all three kinds of notices are virtually limitless. A key restriction for TCNs, however, is that a notice must not have the effect of introducing a “systemic weakness” into a communications service (for example, the bill specifically bans an agency from requesting that encryption be removed from a service as a whole).
A major concern of the bill’s opponents has been that it does not define the term “systemic weakness”.
Officials from government agencies have offered differing definitions of the term, Teague said.
At a previous hearing of the inquiry, which is being conducted by the Parliamentary Joint Committee on Intelligence and Security, Australian Signals Directorate head Mike Burgess offered his explanation of the term: “It would be one thing to ask for assistance to get access to something, but the action undertaken to provide that in that targeted case might actually jeopardise the information of other people as a result of that action being taken. That's not what's being asked.”
Teague said that Burgess’ explanation of the term is a “really good definition”. However, she added: “But it’s not the only definition that’s in circulation here.”
At the same hearing that Burgess gave his definition, Department of Home Affairs secretary Mike Pezzullo said that “’systemic’ intrinsically means ‘pertaining to the whole system’.”
Teague said that Home Affairs has a “strong and restrictive definition” of what counts as a systemic weakness.
In the department’s major submission to the inquiry, it said that the systemic weakness prohibition (contained in Section 317ZG of the bill) “prevents a weakness or vulnerability from being built into a single item (like a target service or device) if it would undermine the security of other, interconnected items”.
The submission adds: “That is, where the weakness in one part of the system would compromise other parts of the system or the system itself."
The department said the purpose of the provision “is to protect the fundamental security of software and devices and not expose the communications of Australians to hacking”.
The submission states that this “would capture actions that impact a broader range of devices and services utilised by third parties with no connection to an investigation and for whom law enforcement have no underlying lawful authority by which to access their personal data. Accordingly, weaknesses that impact a range of devices across the market, requirements that force a provider to adopt a less secure means of encryption for its users, or capabilities that introduce a material ‘hole’ in the devices or services of innocent, third-party users, that could be exploited by malicious actors are covered by the prohibition.”
Teague said that Home Affairs has defined systemic weakness as something that “affects all the other users of the system”.
“And I would contrast that with this notion of jeopardising the information of others,” she added. “The key differences are, first of all, whether we’re talking about ‘jeopardising’ or immediately inserting a weakness; jeopardising is a weaker concept of increasing risk to others, as opposed to definitely making a break.

“Second thing is whether it affects every other part of a system, or whether it just affects other users. And the third thing is whether it’s a probabilistic thing that could happen in the future versus a certain thing that happens right now.”
Teague advocates for the “broader definition”, adding that she was “highly suspicious” of things that increase the risk of a problem, even if they don’t definitely create an immediate one.
She said the government needed to seriously consider the future consequences of the definition of systemic weakness “in the context of ever-escalating cyber attacks and complex systems — because something that doesn’t cause a problem right now might nevertheless cause a serious problem down the track”.
Part of the hearing focused on the 2016 legal stoush between Apple and the FBI over the tech company’s resistance to breaking the security on an iPhone used by one of the San Bernardino shooters.
Apple argued that doing so would “jeopardise the information of other users even if it wasn’t directly targeted at the other users,” Teague said.
The iPhone-maker argued that the “existence of that software update signed with Apple’s trusted software signing key increased the likelihood that a similar exploit could be exposed and then targeted against other users of the system”.
In that case, the FBI’s request would likely have met the definition of “jeopardising the information” of people who weren’t the targets of the investigation, but not the more restrictive definition of “systemic weakness” pushed by Home Affairs, the University of Melbourne researcher argued.
Culnane told the hearing that he was concerned over the lack of limitations on what can be requested from a tech company via a TAR.
“Home Affairs appears to be justifying this on the grounds that the TARs are voluntary and therefore a communications provider can refuse to comply,” he told the hearing.
“However, this fails to acknowledge a number of issues… firstly the government wields enormous ‘soft power’ which could be brought to bear on a communications provider which refuses to voluntarily comply. Secondly, we cannot expect communications providers, many of whom are multinational tech companies, to be guardians of the privacy and digital rights of the public. That is a role that belongs to parliament.
“Thirdly many of those communication providers have business models which are based on surveillance capitalism.”
“How can we expect a tech company to act in a manner that is counter to their own business model?” he said.
TARs “risk creating a conflict of interest for government,” he added. “At a time when individual privacy is increasingly under threat from large multinationals, how can we expect governments to hold those organisations to account if doing so could reduce the government’s interception capability?”
Culnane told the hearing that some of the largest tech companies had participated in the US National Security Agency’s PRISM surveillance program.
“We should not forget that one of the biggest drivers of what has been termed ubiquitous encryption was the revelation of the scale and the invasiveness of that surveillance,” he added.
“We are where we are in part because those same tech companies did not push back enough in the first place. To entrust the protection of privacy of the Australian public to those same companies again in our opinion would be a grave error.”
The inquiry’s next public hearing is scheduled for 27 November.