Knee-deep in biometric oversight, Australia’s OAIC juggles age assurance, digital ID, FRT

Australia is continuing its push to beef up online privacy and safety laws with a strategy focused on biometrics, having put several major milestones on the books in 2024, with more on the way in 2025. The second “tranche” of reforms to the Australian Privacy Act is forthcoming. The Office of the Australian Information Commissioner (OAIC) is working to deliver an Online Children’s Code by the end of 2026, while also managing Australia’s digital ID system and document verification scheme. And the Age Assurance Technology Trial, run by the Age Check Certification Scheme (ACCS), is set to release its findings by September.
The OAIC has just received a $14 million cash injection in the government’s latest budget. Yet according to InnovationAus.com, it has also been asked to cut more staff, having already endured a major restructure and job losses amounting to a 30 percent reduction in staffing levels over the past year.
The move is ostensibly to make the OAIC leaner and more agile – and thus better positioned to crack down on invasive tactics such as data harvesting and identity theft. The new budget includes $8.7 million over three years to support enforcement. The remaining $5.3 million will go toward continuing regulatory oversight of the national Digital ID system and the identity document verification service.
However, with cases piling up in backlog and appropriations set to drop significantly in the coming years, the question is whether the Australian regulator can shoulder the growing weight with fewer people.
New Prism taskforce to manage social media age restrictions
A post from consulting firm Privacy 108 runs down key developments in Australian privacy to monitor in 2025. On the question of the children’s code, it looks to the UK’s Age Appropriate Design Code as a potential model for transparency, data minimization principles and age assurance regulations – the latter of which have a new overseer.
Government task forces increasingly love acronyms that give them cool names, and online safety regulators are no slouches in the matter. Witness the new Privacy Reform Implementation and Social Media task force – to be known as Prism – which the OAIC has announced will oversee the implementation of age assurance laws for social media.
Last year, Australia infamously passed a law banning social media platforms for users under 16 years of age, putting Facebook, TikTok and their ilk alongside pornography in the category of age restricted content.
Bunnings decision looms over future of biometric privacy laws
Fallout from the highly publicized Bunnings case on facial recognition in retail has the OAIC shifting to a more “harm-focused approach,” and the case increasingly looks to have significant implications for laws and regulation. A report from MLex looks at what’s at stake, noting that the clash is shaping up as a “significant test” for the OAIC’s ability to enforce the 1988 Privacy Act.
In October, the Administrative Review Tribunal will hear the case, which sees retail giant Bunnings defend itself against allegations that its use of CCTV cameras equipped with facial recognition capabilities in 63 stores “failed to take reasonable steps to implement practices, procedures and systems required to comply with the Privacy Act.” The system captured face biometrics and compared the biometric data against a database of flagged customers.
Bunnings’ argument hinges on staff safety. It says the OAIC interprets the law in a way that would bar “legitimate attempts to thwart shoplifting and violent attacks against staff,” and it points to legal exemptions designed for that purpose.
The retailer also argues that because it stored customers’ face biometrics for just 4.17 milliseconds, the collection was too fleeting to count.
The OAIC has dismissed the retention-time argument, saying its concerns are broader. For one, it objects to the database of problematic customers, which it says shows shortcomings in image management and quality that may have led to false positives. Its determination notes that “matched individuals, who were the subject of a ‘false positive’ match, were likely treated in the same manner as enrolled individuals, with those consequences aggravated by the fact that they had done nothing to warrant suspicion.”
For its part, Bunnings has agreed to stop using facial recognition technology. But its appeal with the new Administrative Review Tribunal, which in October 2024 replaced the “troubled and politicized” Administrative Appeals Tribunal, asserts that it “reasonably believes” that the collection, use or disclosure of biometric data is “necessary to lessen or prevent a serious threat to the life, health or safety of any individual, or to public health or safety.”
Per MLex, the argument hinges on section 16A of the Privacy Act, which lists the situations in which companies are exempt from the provisions of the law.
In effect, Bunnings’ case amounts to a plea of self-defense. Whether or not it can sell that to the tribunal will be a test of the Privacy Act and the OAIC’s ability to enforce it with limited resources.
Article Topics
Australia | biometrics | data privacy | digital ID | facial recognition | Office of the Information Commissioner (OAIC) | regulation