Australia is now the first Western nation to effectively ban secure encryption, following a decision by its parliament to pass a bill forcing companies to give police access to encrypted data on demand. The government will be allowed to make these demands without judicial review or oversight of any kind, beyond the requirement to obtain a warrant in the first place.
Furthermore, the law requires corporations to build tools capable of intercepting data sought by police when such tools do not already exist. While the bill has so far passed only Australia's lower chamber, the upper chamber has indicated it will approve the legislation provided there are later votes on unspecified amendments to the current bill.
Australia has become the first nation to enact into legislation what both the UK and US governments very much want: government-mandated backdoors into encryption systems that require corporations to hand over data on demand. The tech industry's response has been straightforward: there is no way to perform this task without fundamentally weakening security. And for all that journalism is often the process of laying out multiple sides of an argument, there is no actual debate to be had here, at least as far as the security principles are concerned.
We can certainly debate whether people should be entitled to privacy, or if the governments of nominally free countries should have access to this information in the first place. But as to whether it’s actually possible to build secret backdoors into security systems without fundamentally weakening them, the evidence is simple: No.
As Cindy Cohn wrote in a recent post on Lawfare Blog:
Even without compromising the cryptography, there is no way to allow access for only the good guys (for instance, law enforcement with a Title III warrant) and not for the bad guys (hostile governments, commercial spies, thieves, harassers, bad cops and more). The NSA has had several incidents in just the past few years where it lost control of its bag of tricks, so the old government idea called NOBUS—that “nobody but us” could use these attacks—isn’t grounded in reality.
Putting the keys in the hands of technology companies instead of governments just moves the target for hostile actors. And it’s unrealistic to expect companies to both protect the keys and get it right each time in their responses to hundreds of thousands of law enforcement and national security requests per year from local, state, federal and foreign jurisdictions. History has shown that it’s only a matter of time before bad actors figure out how to co-opt the same mechanisms that good guys use—whether corporate or governmental—and become “stalkers” themselves.
There simply is no debate within the security community on this topic. Whether the mechanism is creating keys to an encryption system, or keeping the encryption intact while forcing companies to build tools that attach an invisible "stalker" to the system to monitor communications (the UK is proposing this method of surveillance, and the aforementioned Lawfare Blog post has more on it), the result is the same: an enormous incentive for anyone aware of such tools to either steal them (if they're black hats) or leverage them for their own use (if they're governments).
Once companies are forced to create these tools to operate in the Australian market, they’ll be pressured to bring them to other countries.
The idea that corporations can be trusted to safeguard these vital tools, or to hold vital data in escrow, doesn't survive contact with reality. Even without government-mandated backdoors, companies regularly suffer breaches and attacks, often leaking the personal details of tens to hundreds of millions of people. The need for better data security is enormous, and the solution is not to build tools designed to undermine it.
Products from Facebook, Google, Apple, Microsoft, and similar companies will now be required to include systemic weaknesses, while open source products will not be affected for now. In case you're wondering: of the 343 comments submitted on the bill while it was under discussion, only one (and not from an Australian citizen at that) was in support. The Australian Parliament simply didn't care.