A security guard working at a government building has won a fight against his employer over the use of facial recognition technology, in a case that raises privacy, cyber security and industrial relations concerns.
Robert Beverly is a security guard stationed at the Department of Prime Minister and Cabinet building in Barton, employed by Certis Security, a Singaporean-owned private subcontractor that provides security guards to a number of government buildings.
Earlier this year the company introduced new software that required employees to take a photo of themselves on an app that used facial recognition software to monitor their attendance at work, a development Mr Beverly said raised concerns for him, both as someone who closely guards his privacy and online presence and with his work guarding an Australian government building.
"My first inclination was 'this is unnecessary, this is overreach,'" Mr Beverly told The Canberra Times.
Despite being told using the app was "voluntary," Mr Beverly was given three written performance notices for refusing to use the app.
This meant the company never genuinely gave employees the opportunity to consent to the system, he said.
"They're requiring consent because the Privacy Principles require consent for any kind of biometric data, that's why they put a consent button in the system, but it's not real consent anyway because they proceeded with disciplinary action."
His concerns were heightened by the nature of his work, which requires a government security clearance, and by the lack of assurance the company gave him that the data would be kept secure.
He was particularly worried because Certis Security's Singaporean parent company is owned by Temasek Holdings, which is owned by the Singaporean government.
After involving the United Workers Union, the company agreed to allow Mr Beverly to sign in using other methods, but he says other staff who also had concerns but didn't feel able to speak up are still using it.
"It is highly alarming to know these practices are taking place," said United Workers Union director of property Lyndal Ryan.
"Any direction to provide biometric data, which includes facial recognition software, breaches the Privacy Act 1988 and is unlawful. Every security officer has the right to opt out without being threatened with disciplinary action."
A spokeswoman for Certis Security said the new software had been introduced to ensure the highest-possible level of security at key sites across Australia.
"The system uses facial recognition technology in order to verify that the right security personnel are performing the duties assigned to them," the spokeswoman said.
"Facial recognition technology is one of the most effective methods of minimising the risk and likelihood of identity fraud, creating a safer environment for our team members, our customers and the wider community."
The company maintains the system is fully compliant under Australian privacy laws.
"All physical infrastructure used to store the data is located within Australia in a highly secured data centre, using infrastructure which complies with global industry standards. All image data is encrypted when stored using the federal standard encryption."
Since publication the company has provided further comments, saying the system was only for operational use.
"Data is stored in servers in Australia, housed in highly secured data centres, and no data is transmitted from the system to outside of Australia. The Singapore government does not have access to the system nor its associated data."
The company didn't comment on Mr Beverly's case specifically, but said all clients - including the government departments - had been consulted before the new software was introduced on their premises, and that it was deployed only after being approved.
Use of biometric data by employers is increasing, but it isn't easy for employees to voice their concerns.
An employee at a sawmill in Queensland was fired after refusing to use a sign-in system that relied on fingerprint scans.
He eventually won an unfair dismissal claim against the company after the Fair Work Commission found the company had failed to uphold the Privacy Act by not giving employees sufficient notification or allowing for a process of informed consent.
Professor Lyria Bennett Moses, Director of the Allens Hub for Technology, Law and Innovation at the University of New South Wales, said it was important people were told where their data was being stored, and assured their photos wouldn't be used for any other purposes, like in software development.
"The whole argument is data is a toxic asset, and in fact the more data you have the more risk you create," Professor Bennett Moses said.
The use of biometric data raises privacy concerns due to the wealth of extra information about a person it can provide, says Dr Monique Mann, Senior Lecturer in Criminology at Deakin University and board member of the Australian Privacy Foundation.
"It's a very serious biometric identifier because it does act as a conduit between an individual's presence in a physical space," she said.
"There's a potential for tracking through public places through CCTV, and also the connection of that information to other information that is out there about them. That could be in an AGSVA database or it could be an individual's social media profile, which is what we saw with Clearview AI.
"It's serious information; indeed, it's defined under the Privacy Act as sensitive information."