Privacy watchdog drops investigation into Clearview AI over use of Australians’ images in facial recognition technology

Australia’s privacy watchdog has dropped its investigation into Clearview AI over the use of images of Australians’ faces in its facial recognition service, despite there being no indication the company has complied with an order requiring it to remove the images.

Clearview AI is a facial recognition service that has been used by law enforcement agencies around the world, including in limited trials in Australia. It claims to have a database of more than 50 billion faces scraped from the internet, including social media.

In 2021, the Office of the Australian Information Commissioner (OAIC) found that Clearview AI had breached the privacy of Australians by collecting these images without consent, and ordered the company to stop collecting the images and to destroy those it had already collected within 90 days. Clearview initially appealed the decision to the Administrative Appeals Tribunal (AAT), but abandoned its appeal in August last year before the AAT could make a ruling, meaning the original decision stands.

There is no indication whether Clearview has since complied with the order, and the company did not respond to a request for comment.

On Wednesday, a year after Clearview dropped its appeal, Privacy Commissioner Carly Kind announced that the OAIC would no longer pursue Clearview to enforce the ruling.

“I have thought extensively about whether the OAIC should invest more resources in investigating the actions of Clearview AI, a company that has already been investigated by the OAIC and has become the subject of regulatory investigations in at least three jurisdictions around the world, as well as a class action in the United States,” she said.

“Given all the relevant factors, I am not convinced that further action is warranted at this time in the specific case of Clearview AI.”

In June, Clearview AI agreed to settle, for an undisclosed amount and without admitting wrongdoing, a class action lawsuit brought against the company over alleged privacy violations of Americans whose images were included in its system. The settlement is subject to court approval.

A 2022 settlement with the American Civil Liberties Union (ACLU) also barred Clearview AI from selling its database to most companies in the United States, and to any entity in Illinois, including law enforcement agencies, for five years.

Kind said on Wednesday that practices like Clearview’s had become increasingly common, and increasingly problematic, in the years since, as part of the shift toward generative artificial intelligence models.

The OAIC and 11 other regulators issued a statement in August last year calling on the operators of publicly accessible sites to take reasonable steps to prevent personal information on their sites from being scraped unlawfully.

Kind said all regulated entities in Australia that use AI to collect, use or disclose personal information must comply with the Privacy Act.

“The OAIC will soon issue guidance for entities wishing to develop and train generative AI models, including how the APPs (Australian Privacy Principles) apply to the collection and use of personal information. We will also issue guidance for entities using commercially available AI products, including chatbots.”

Correspondence between Clearview AI and the OAIC from the abandoned AAT appeal, released last year following a freedom of information request, shows that Clearview argued it did not fall within Australian jurisdiction because it had decided not to do business in Australia. It had also blocked its web crawler from collecting images from servers located in Australia.

Australian and British users were able to opt out of the system, but when the company began scraping the internet again in January last year, it took no steps to ensure that the scraped faces were not those of Australians whose images were hosted on servers outside Australia, such as those used by social media sites like Facebook.