The Australian government is going to force Big Tech to scan your photos and emails for illegal content
Developed by key service providers and delivered to eSafety for review in February, the proposed Designated Internet Services (DIS) and Relevant Electronic Services (RES) codes – both of which the regulator has found wanting – are just two of eight sectoral codes established after the passage of the Online Safety Act 2021.
The DIS code covers providers of apps, websites, and file and photo storage services like Apple iCloud, Google Drive, and Microsoft OneDrive, while the RES code relates to dating sites, online games, and instant messaging services.
Despite industry making “significant amendments” in response to her September demands and feedback on the February drafts, eSafety Commissioner Julie Inman Grant said the revised draft DIS and RES codes – two of eight codes set to be finalised – “still don’t meet our minimum expectations”.
Shortfalls, she explained, include the failure of the DIS code to “detect and flag known child sexual abuse material” (CSAM) in file and photo storage services, as well as the failure of RES providers to detect and flag “horrendous” material in email and in partially encrypted messaging services.
“We know there are proactive steps they can take to stem the already rampant sharing of illegal content,” Inman Grant said, adding that her office saw a 285 per cent year-on-year increase in reports of child sexual exploitation and abuse material during the first quarter of this year.
Now that tech companies have failed to meet the requirements of the Act, Inman Grant will exercise her powers under section 145(1)(a)(ii) of the Act – which empowers her to “determine a standard” if a draft code “does not contain appropriate community safeguards”.
That will see her office develop standards for DIS and RES service providers that will be mandatory and enforceable – meaning that Australians will be able to lodge complaints about breaches with eSafety, which can investigate and impose injunctions, enforceable undertakings, and financial penalties of nearly $700,000 per day.
Five other codes – covering social media services, Internet carriage services, app distribution services, hosting services, and equipment – were accepted and will take effect six months from the day they are officially registered.
Inman Grant also deferred a ruling on the eighth code – pertaining to search engine operators – by giving companies four additional weeks to address the implications of rapidly growing generative AI services that are increasingly being enmeshed with search engines and browsers like Google, Bing, Opera, and Brave.
Picking a fight with the world
Inman Grant may be talking tough on filtering, but the new regulations – which will apply both to Australian service providers and to overseas vendors providing services to Australians – will put her on a collision course with tech companies that have already been there and done that.
In mid-2021, Apple announced plans to automatically warn users when they send or receive nude images, and to scan user content stored in iCloud by comparing the ‘hashes’ of stored files against those of known CSAM; users whose accounts accumulated too many flagged matches would be referred to authorities.
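For a sense of the mechanics, the sketch below shows the bare-bones version of this kind of matching: hash every stored file, compare the result against a database of known-bad digests, and escalate only once a threshold of matches is reached. It uses an ordinary cryptographic hash for simplicity; production systems such as Apple’s NeuralHash or Microsoft’s PhotoDNA rely on perceptual hashes that survive resizing and re-encoding. The file paths, names, and threshold here are illustrative assumptions, not any vendor’s actual implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list; real deployments load digests of known CSAM
# from a clearinghouse database (e.g. NCMEC's) rather than hard-coding them.
KNOWN_BAD_HASHES: set[str] = set()

# Number of matches before an account is escalated for human review;
# Apple's reported threshold was around 30 matching images.
MATCH_THRESHOLD = 30

def digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(paths: list[Path]) -> list[Path]:
    """Return the files whose digests appear in the known-bad set."""
    return [p for p in paths if digest(p) in KNOWN_BAD_HASHES]

if __name__ == "__main__":
    matches = scan(sorted(Path("photos").glob("*.jpg")))  # hypothetical library
    if len(matches) >= MATCH_THRESHOLD:
        print(f"{len(matches)} matches found: escalate for human review")
```

The threshold exists because perceptual hashes can occasionally collide on innocent images, so a single match is treated as noise rather than evidence – one of the design points Apple cited when defending the system’s false-positive rate.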
It’s an approach already in regular use by Google – sometimes with unintended consequences – but by late last year Apple had paused the feature after a massive backlash from researchers, civil liberties advocates, and others who argued that content scanning amounted to a gross invasion of users’ privacy and a slippery slope to mass surveillance.
Whether using legal instruments to mandate compliance will prove more effective is yet to be seen. But by drawing a new line on CSAM filtering – and forcing Apple, Google, Microsoft, Meta, Twitter and others to toe it – eSafety risks rekindling a vitriolic international privacy and civil rights debate.
Increasing government scrutiny of their operations may have helped tech giants work more closely than ever with Australian regulators – but this latest mandate could, if past experiences are any indication, see those companies restrict Australians’ access to core services in the name of user privacy.
Filtering of “horrendous” illegal content “and other basic requirements are non-negotiable,” an undeterred Inman Grant said.
“While we don’t take this decision lightly, we feel that moving to industry standards is the right one to protect the Australian community.”