Passionately curious about Data, Databases and Systems Complexity. Data is ubiquitous, the database universe is dichotomous (structured and unstructured), expanding and complex. Find my Database Research at SQLToolkit.co.uk . Microsoft Data Platform MVP

"The important thing is not to stop questioning. Curiosity has its own reason for existing" Einstein

Thursday 2 January 2020

Asilomar AI Principles

Data ethics has been brought to the fore by AI algorithms showing bias, and various insightful articles discuss the topic. The Asilomar Conference on Beneficial AI, organized by the Future of Life Institute, was held 5-8 January 2017 at the Asilomar Conference Grounds in California. The conference aimed to formulate principles for beneficial AI. With more than 100 thought leaders and researchers in economics, law, ethics and philosophy attending, it resulted in a set of guidelines for AI research. There are 23 Asilomar AI Principles, many of which relate to ethics and values.

This is a significant enhancement of Isaac Asimov's "Three Laws of Robotics", which he introduced in his 1942 short story "Runaround". The Three Laws he listed were:

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

In 2016 Satya Nadella shared a vision for a more relevant set of AI rules:

  • AI must be designed to assist humanity.
  • AI must be transparent. 
  • AI must maximize efficiencies without destroying the dignity of people. 
  • AI must be designed for intelligent privacy. 
  • AI must have algorithmic accountability. 
  • AI must guard against bias. 

These developments have led to data ethics becoming a branch of ethics in its own right.
