Consumer group Choice has referred retailers Kmart, Bunnings and The Good Guys to the Office of the Australian Information Commissioner (OAIC) after its investigation found they were using facial recognition technology to capture the “faceprint” of customers who entered selected stores in a possible breach of the Privacy Act.
The investigation asked 25 leading Australian retailers whether they used facial recognition technology, and analysed their privacy policies. It found that only Kmart, Bunnings and The Good Guys were capturing biometric data of customers.
Choice noted 76% of customers were unaware retailers used video cameras to capture their facial features, and described the use of facial recognition by those retailers as “a completely inappropriate and unnecessary use of the technology”.
According to Choice, the way Kmart, Bunnings and The Good Guys are using facial recognition technology is akin to “collecting your fingerprints or DNA every time you shop”. Choice added that it was also unethical and would erode customer trust.
But Bunnings chief operating officer Simon McDowell told SmartCompany that Choice’s characterisation was “inaccurate” and that Bunnings was “disappointed.”
“This technology is used solely to keep team and customers safe and prevent unlawful activity in our stores, which is consistent with the Privacy Act,” McDowell said, adding that it was an important tool in handling a “number of challenging interactions” the Bunnings team had faced, and that it helped “prevent repeat abuse and threatening behaviour towards our team and customers”.
“There are strict controls around the use of the technology which can only be accessed by specially trained team. This technology is not used for marketing, consumer behaviour tracking, and images of children are never enrolled.
“We let customers know if the technology is in use through signage at our store entrances and also in our privacy policy, which is available via the homepage of our website.”
The Good Guys echoed similar sentiments, saying in a statement that it was trialling face and feature recognition technology in two stores using a new CCTV system. Both stores have signage alerting customers, and the company’s privacy policy, available on its website, also discloses the use of the technology.
“This technology is used solely for the purposes of loss prevention and the safety of our store team members and customers,” a spokesperson told SmartCompany.
In a statement to SmartCompany, OAIC director of strategic communications Andrew Stokes said the OAIC “will consider the information from Choice in line with our regulatory action policy”.
“Biometric information, as collected by facial recognition technology, is sensitive personal information under the Privacy Act,” Stokes said.
“In most situations, an organisation that is covered by the Privacy Act must not collect sensitive information unless the individual consents to the collection and the information is reasonably necessary for its functions or activities.”
The OAIC lists four key elements of consent: the individual must be adequately informed before giving consent; they must give it voluntarily; the consent must be current and specific; and the individual must have the capacity to understand and communicate it.
“When asking for your consent, an organisation shouldn’t ask for a broader consent than is necessary,” the OAIC said.
In October 2021, the OAIC determined that retailer 7-Eleven interfered with customers’ privacy by “collecting sensitive biometric information that was not reasonably necessary for its functions and without adequate notice or consent”.
The information was collected through 7-Eleven’s customer feedback mechanism, which Australian Information Commissioner and Privacy Commissioner Angelene Falk said was “not reasonably necessary for the purpose of understanding and improving customers’ in-store experience”.
Customers “in the dark” about “creepy and invasive” technology
As part of its investigation, Choice also approached more than 1000 Australian customers in a nationally representative survey to understand whether they were aware of the use of facial recognition technology. Choice found that three in four (or 76%) of respondents did not know retailers were using such technology.
Most of the respondents (83%) said that stores should be required to inform customers, while 78% expressed concern about data security. Nearly two-thirds (or 65%) said they were concerned the technology would be used to create profiles of customers, and that such profiles could be used to cause them harm.
Some respondents described the use of this technology as “creepy and invasive”, while others called it “unnecessary and dangerous” and said it discouraged them from entering stores that used it.
Are conditions of entry enough?
At the heart of the privacy concerns are the conditions of entry some stores display at their entrances. According to Jeannie Paterson, professor of law and co-director of the Centre for AI and Digital Ethics at the University of Melbourne, signage setting out conditions of entry is “insufficient to adequately inform” customers.
Paterson told SmartCompany that while stores can impose conditions or restrictions of entry (for instance asking customers to wear shoes), using facial recognition was “extremely intrusive in terms of personal privacy”.
“Under the Privacy Act, you have to tell people if you’re going to collect that kind of sensitive personal information,” Paterson said.
“Strictly speaking, stores are doing that (by stating conditions of entry) but it’s not complying in any substantive way, because we know that people don’t read conditions of entry, as they’re usually focused on other things. The sign is not sufficient to genuinely inform people about what is happening,” she said.
Paterson added that it was not fair to make customers choose between having their faces surveilled and getting the products they need.
“The sign is not enough to protect people’s interest,” she said.
Choice consumer data advocate Kate Bower says that while some stores have signs at the entrance informing customers about the use of the technology, these signs are inconspicuous and small, and could easily be missed by most shoppers. Such collection of biometric data could therefore be a breach of the Privacy Act, she says.
She added that most companies publish their privacy policies online, “but because we’re talking about in-person retail shops, it’s likely that no one is reading a privacy policy before they go into a store”.
Reform needed
Tim Miller, co-director of the Centre for AI and Digital Ethics at the University of Melbourne, says there is nothing inherently wrong with the technology itself, which is used in many ordinary applications such as unlocking smartphones, but it becomes problematic the moment it is used for surveillance.
“It’s primarily a privacy issue, and people would probably find it invasive if they knew they could be tracked,” he said.
Paterson and Miller said that even shoplifting or unlawful behaviour in stores was a problematic justification for using the technology.
Paterson says such a response is completely disproportionate given the potential for error: facial recognition technology is not completely accurate, and the systems carry existing bias because they were built on incomplete data sets. She also points out that the underlying models are often trained on a narrow, frequently white, demographic and as a result can misidentify and discriminate against others.
Miller added that this data was being stored somewhere.
“There’s not much transparency on it. It’s not clear if they’re storing your face, if they know the days you’re coming in, are they going to track you around the store and see what you like,” Miller said.
“What is the use of it, there’s no detail on how they’re using your data, where they’re storing it and what they’re storing it for?”
Edward Santow, an industry professor of responsible technology at UTS and a former Australian human rights commissioner, told SmartCompany that the use and storage of such data raised human rights concerns.
“This report by Choice highlights how our laws are inadequate in protecting against harmful forms of facial recognition technology.
“Reform is needed in this area,” Santow said. “Australian law should encourage positive innovation, while protecting against exploitative uses of technology.
“[Reform] should set ‘red lines’ to prohibit harmful facial recognition; it should set stronger human rights protections for higher-risk uses of facial recognition; [and] it should enable facial recognition that has a clear public benefit.”
A five-year review of the Privacy Act is currently being conducted by the Attorney-General’s Department, with Choice consumer data advocate Kate Bower stating that it is an opportunity “to strengthen measures around the capture and use of consumer data, including biometric data.”
Kmart did not respond to SmartCompany’s requests for comment.