That selfie you posted can be used to train machines – which might not be a bad thing

Did you hear the one from the person who shared a steamy selfie on social media?

It’s had a lot of likes, but the biggest one came from tech companies that used it to train their artificial intelligence systems.

This week, consumer group CHOICE revealed that retail giants Kmart, Bunnings and The Good Guys were using facial recognition technology to capture consumer biometrics in some of their stores.

Bunnings told CHOICE it was used to “identify persons of interest who have previously been involved in concerning incidents at our stores.”

Kmart and The Good Guys did not respond to ABC requests for comment.

As part of its survey, CHOICE asked over 1,000 people what they thought of the technology – 65% said it was concerning and some described it as “scary and invasive”.

You might agree, but it’s worth considering how this kind of surveillance compares to the decisions we make every day to voluntarily disclose our personal data and images.

Sharing selfies on social media platforms, using a streaming service or using a loyalty card can all disclose more personal information than the facial recognition technology CHOICE investigated.

Facial recognition technology in use as airline passengers arrive at a terminal. (ABC)

Although Meta, the global giant that owns Facebook and Instagram, stopped using the technology last year, that doesn’t mean your photos aren’t still being harvested by companies that create databases of searchable faces.

This might be new to many people, and it may even prompt some to delete their accounts.

Most won’t.

“Algorithms will interpret these photos and use these results to better identify the individual captured in the surveillance image,” said University of the Sunshine Coast cyberintelligence expert Dennis B Desmond.

And it doesn’t matter whether the images are high or low quality.

“Bad or blurry images are also useful for training the algorithm, because surveillance images are not always front-on, clear and free of pixelation,” he said.

Are the benefits misunderstood?

Dr Desmond said many people don’t understand what they can get in return for giving up some level of privacy.

In Osaka, Japan, facial recognition technology is being used at some train stations to allow people to pass through turnstiles without having to pull out their travel card.

Retailers rely on it to reduce shoplifting: they can be alerted when someone who has previously stolen from a store enters it again.

Law enforcement across Australia uses it to disrupt serious and violent crime, as well as identity theft.

Dr Desmond said if people heard more about these tangible benefits, they might have a different attitude.

“People see a lot about the collection, but they rarely hear about the use of this data for interdiction or prevention,” he said.

Former FBI agent Dennis Desmond has seen how facial recognition can be used to identify terrorists. (Supplied)

But our feelings about this technology, which was first conceived in the 1960s in the United States, are unlikely to change until there is more transparency.

Online, users can limit the data stored somewhat by changing their cookie preferences.

But, unless we’re wearing a disguise, we can’t turn facial recognition on or off when used in environments like retail.

In fact, Australia has no specific law governing how facial recognition data can be collected or used, unless you live in the ACT.

Sarah Moulds, a lecturer in law at the University of South Australia, doesn’t think facial recognition is inherently bad, but said three crucial elements need to be clarified: consent, quality and use by third parties.

If people haven’t consented to the surveillance, if they can’t trust that the images are of high quality, and if they don’t know who the data is being shared with, they have every right to be concerned, Dr Moulds said.

Sarah Moulds said she was optimistic about facial recognition as long as vulnerable people can be protected. (UniSA)

High-quality imaging is especially important for reducing racial bias, as the technology is most accurate at identifying white men.

For this reason, the Australian Human Rights Commission has warned against the use of facial recognition images in “high risk circumstances”.

“We shouldn’t rely on this technology when it comes to proving someone’s innocence or determining someone’s refugee status or anything like that,” Dr Moulds said.

Technology will keep making guesses about how we live

For many people, it doesn’t matter if their face is scanned in certain settings.

Things get more complicated if this biometric information is combined with other data, such as their financial or medical records.

Experts say that from this combination, someone could deduce things like how much you earn, what level of education you have and what you do on weekends.

It could also help companies make decisions about whether to give you credit, insurance, employment. The list continues.

“It becomes a bit like Minority Report … people are concerned that companies are making decisions about their health care based on the fact that they buy chocolate every week,” Dr Desmond said.

Steven Spielberg’s Minority Report centred on fictional technology that could predict crimes before they happen. (Supplied: 20th Century Fox)

But the reality is that this kind of data is already available through our online activity, which has been harvested and sold for years.

Despite this, Dr Desmond said lawmakers need to take a more proactive approach to regulating facial recognition.

“I think there absolutely needs to be more government regulation and control over technologies used in the commercial sector when they involve privacy and the rights of individuals,” he said.

Dr Moulds said tougher laws are also imperative if vulnerable people – such as those without official identity papers – are to be protected from the potential misuse of artificial intelligence.

And, at the end of the day, there will always be people who just want to enjoy the freedom of being forgotten when they walk out of a supermarket.

“[Because] as we have seen, neither the government nor the private sector has a very good track record of protecting our data from improper loss,” Dr Desmond said.

James G. Williams