Daily Bulletin

The Conversation

Artificial intelligence in Australia needs to get ethical, so we have a plan

  • Written by Emma Schleiger, Research Scientist, CSIRO

The question of whether technology is good or bad depends on how it’s developed and used. Nowhere is that more topical than in technologies using artificial intelligence.

When developed and used appropriately, artificial intelligence (AI) has the potential to transform the way we live, work, communicate and travel.

New AI-enabled medical technologies are being developed to improve patient care. There are persuasive indications that autonomous vehicles will improve safety and reduce the road toll. Machine learning and automation are streamlining workflows and allowing us to work smarter.

Read more: To protect us from the risks of advanced artificial intelligence, we need to act now

Around the world, AI-enabled technology is increasingly being adopted by individuals, governments, organisations and institutions. But along with the vast potential to improve our quality of life comes a risk to our basic human rights and freedoms.

Appropriate oversight, guidance and understanding of the way AI is used and developed in Australia must be prioritised.

AI gone wild may conjure images of films like The Terminator and Ex Machina, but the issues that need to be addressed at present are much simpler and more fundamental, such as:

  • how data is used to develop AI
  • whether an AI system is being used fairly
  • in which situations we should continue to rely on human decision-making.

We have an AI ethics plan

That’s why, in partnership with government and industry, we’ve developed an ethics framework for AI in Australia. The aim is to catalyse the discussion around how AI should be used and developed in Australia.

The ethical framework looks at various case studies from around the world to discuss how AI has been used in the past and the impacts that it has had. The case studies help us understand where things went wrong and how to avoid repeating past mistakes.

We also looked at what was being done around the world to address ethical concerns about AI development and use.

Based on the core issues and impacts of AI, eight principles were identified to support the ethical use and development of AI in Australia.

  1. Generates net benefits: The AI system must generate benefits for people that are greater than the costs.

  2. Do no harm: Civilian AI systems must not be designed to harm or deceive people and should be implemented in ways that minimise any negative outcomes.

  3. Regulatory and legal compliance: The AI system must comply with all relevant international, Australian local, state/territory and federal government obligations, regulations and laws.

  4. Privacy protection: Any system, including AI systems, must ensure people’s private data is protected and kept confidential and prevent data breaches that could cause reputational, psychological, financial, professional or other types of harm.

  5. Fairness: The development or use of the AI system must not result in unfair discrimination against individuals, communities or groups. This requires particular attention to ensure the “training data” is free from bias or characteristics which may cause the algorithm to behave unfairly (a simple illustration of one such check appears below).

  6. Transparency and explainability: People must be informed when an algorithm that affects them is being used, and they should be told what information the algorithm uses to make its decisions.

  7. Contestability: When an algorithm impacts a person there must be an efficient process to allow that person to challenge the use or output of the algorithm.

  8. Accountability: People and organisations responsible for the creation and implementation of AI algorithms should be identifiable and accountable for the impacts of that algorithm, even if the impacts are unintended.

In addition to the core principles, the framework identifies various toolkit items that could help support them. These include impact assessments, ongoing monitoring and public consultation.
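To make principle 5 slightly more concrete: one basic check is to compare the rate of favourable outcomes across demographic groups in the data an AI system learns from. The sketch below is purely illustrative and uses made-up column names, toy data and an arbitrary 10% threshold of our own; it is not part of the framework or its toolkit.

```python
# Illustrative sketch only: a simple demographic-parity check of the kind
# principle 5 (fairness) points towards. Column names, toy data and the
# 10% threshold are assumptions for this example, not framework requirements.
import pandas as pd

def approval_rate_by_group(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
    """Rate of positive outcomes for each group in the data."""
    return df.groupby(group_col)[outcome_col].mean()

def flag_disparity(rates: pd.Series, max_gap: float = 0.1) -> bool:
    """True if the gap between the best- and worst-treated groups exceeds max_gap."""
    return bool(rates.max() - rates.min() > max_gap)

if __name__ == "__main__":
    # Toy records standing in for the historical decisions a system might be trained on.
    data = pd.DataFrame({
        "group":    ["A", "A", "A", "B", "B", "B"],
        "approved": [1,   1,   0,   0,   0,   1],
    })
    rates = approval_rate_by_group(data, "group", "approved")
    print(rates)                                    # approval rate per group
    print("Disparity flagged:", flag_disparity(rates))
```

A real impact assessment would pair simple checks like this with the toolkit items above, such as ongoing monitoring and public consultation, rather than replace them.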

A plan: what about action?

But principles and ethical goals can only go so far. At some point we need to decide how we are going to implement and achieve them.

There are various complexities to consider when discussing the ethical use and development of AI. The vast reach of the technology has potential to impact every facet of our lives.

AI applications are already in use across households, businesses and governments, and most Australians are already affected by them.

There is a pressing need to examine the effects that AI has on the vulnerable and on minority groups, making sure we protect these individuals and communities from bias, discrimination and exploitation. (Remember Tay, the racist chatbot?)

There is also the fact that AI used in Australia will often be developed in other countries, so how do we ensure it adheres to Australian standards and expectations?

Your say

The framework explores these issues and forms some of Australia’s first steps on the journey towards the positive development and use of AI. But true progress needs input from stakeholders across government, business, academia and broader society.

Read more: Careful how you treat today's AI: it might take revenge in the future

That’s why the ethics framework discussion paper is now open to public comment. You have until May 31, 2019, to have your say in Australia’s digital future.

With a proactive approach to the ethical development of AI, Australia can do more than just mitigate the risks. If we can build AI for a fairer go, we can secure a competitive advantage as well as safeguard the rights of Australians.

Author: Emma Schleiger, Research Scientist, CSIRO

Read more: http://theconversation.com/artificial-intelligence-in-australia-needs-to-get-ethical-so-we-have-a-plan-114438
