Photos of Australian kids have been found in a massive AI training data set. What can we do?

Written by Katharine Kemp, Associate Professor, Faculty of Law & Justice, UNSW Sydney

Photos of Australian children have been used without consent to train artificial intelligence (AI) models that generate images.

A new report from the non-governmental organisation Human Rights Watch has found the personal information, including photos, of Australian children in a large data set called LAION-5B. This data set was created by accessing content from the publicly available internet. It contains links to some 5.85 billion images paired with captions.

Companies use data sets like LAION-5B to “teach” their generative AI tools what visual content looks like. A generative AI tool like Midjourney or Stable Diffusion will then assemble images from the thousands of data points in its training materials.
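
To make this concrete, here is a minimal sketch, in Python, of the kind of record such a data set holds: each entry pairs a link to an image hosted somewhere on the public web with a caption scraped alongside it. The field names and example entries below are illustrative assumptions, not the data set's actual schema.

```python
# Minimal sketch of LAION-style training records (illustrative only):
# each record pairs an image URL on the public web with a scraped caption.
from dataclasses import dataclass


@dataclass
class ImageCaptionRecord:
    url: str      # link to an image hosted somewhere on the open internet
    caption: str  # text found alongside the image when it was scraped


# Hypothetical examples of the kind of entries such a data set can contain.
records = [
    ImageCaptionRecord(
        url="https://example.com/school-carnival/photo123.jpg",
        caption="Children competing at a school sports carnival",
    ),
    ImageCaptionRecord(
        url="https://example.com/family-blog/birthday.jpg",
        caption="A child's fifth birthday party",
    ),
]

# A training pipeline iterates over billions of such pairs, downloading each
# image and using the (image, caption) pair to teach the model what the
# caption describes.
for record in records:
    print(f"{record.caption} -> {record.url}")
```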

In many cases, the developers of AI models – and of the data sets used to train them – appear to be riding roughshod over data protection and consumer protection laws. They seem to believe that if they build and deploy the model first, they can achieve their business goals before the law, or its enforcement, catches up.

The data set analysed by Human Rights Watch is maintained by German nonprofit organisation LAION. Stanford researchers have previously found child sexual abuse imagery in this same data set.

LAION has now pledged to remove the Australian kids’ photos found by Human Rights Watch. However, AI developers that have already used this data can’t make their AI models “unlearn” it. And the broader issue of privacy breaches also remains.

If it’s on the internet, is it fair game?

It’s a misconception to say that because something is publicly available, privacy laws don’t apply to it. Publicly available information can be personal information under the Australian Privacy Act.

In fact, there is a directly relevant precedent: in 2021, facial recognition platform Clearview AI was found to have breached Australians’ privacy. The company had scraped people’s images from websites across the internet to use in a facial recognition tool.

The Office of the Australian Information Commissioner (OAIC) ruled that even though those photographs were already on websites, they were still personal information. More than that, they were sensitive information.

The OAIC held that Clearview AI had contravened the Privacy Act by failing to meet its obligations around the collection of personal information. So, in Australia, personal information includes publicly available information.

AI developers need to be very careful about the provenance of the data sets they’re using.

Can we enforce the privacy law?

This is where the Clearview AI case is relevant. There are potentially strong arguments that LAION has breached current Australian privacy laws.

One such argument involves the collection of biometric information in the form of facial images without the consent of the individual.

Australia’s information commissioner ruled Clearview AI had collected sensitive information without consent. Additionally, this was done by “unfair means”: scraping people’s facial information from various websites for use in a facial recognition tool.

Under Australian privacy laws, the organisation gathering the data also has to provide a collection notice to the individuals. When you have these kinds of practices – broadly scraping images from across the internet – the likelihood of a company giving appropriate notice to everybody concerned is vanishingly small.

If Australian privacy law is found to have been breached in this case, we need strong enforcement action from the privacy commissioner. For example, where there is a serious interference with privacy, the commissioner may be able to seek a very large fine: the greatest of A$50 million, 30% of turnover, or three times the benefit received.

The federal government is expected to release an amendment bill for the Privacy Act in August. It follows a major review of privacy law conducted over the last couple of years.

As part of those reforms, there have been proposals for a children’s privacy code, recognising that children are in an even more vulnerable position than adults when it comes to the potential misuse of their personal information. They often lack agency over what’s being collected and used, and how that will affect them throughout their lives.

What can parents do?

There are many good reasons not to publish pictures of your children on the internet, including unwanted surveillance, the risk that children will be identified by people with criminal intentions, and the use of their images in deepfakes – including child pornography. These AI data sets provide yet another reason. For parents, this is an ongoing battle.

Human Rights Watch found photos in the LAION-5B data set that had been scraped from unlisted, unsearchable YouTube videos. In its response, LAION has argued the most effective protection against misuse is to remove children’s personal photos from the internet.

But even if you decide not to publish photos of your children, there are many situations where your child can be photographed by other people and have their images available on the internet. This can include daycare centres, schools or sporting clubs.

If, as individual parents, we don’t publish our children’s photos, that’s great. But avoiding this problem wholesale is difficult – and we should not put all the blame on parents if these images end up in AI training data. Instead, we must hold the tech companies accountable.

Authors: Katharine Kemp, Associate Professor, Faculty of Law & Justice, UNSW Sydney

Read more https://theconversation.com/photos-of-australian-kids-have-been-found-in-a-massive-ai-training-data-set-what-can-we-do-233868
