
Are Asimov’s 3 laws really about digital transformation? by Owen Tribe

How Asimov’s 3 laws of robotics could hold the answer to one of the biggest digital transformations ever: mass redundancy.

AI, robots, automation – something I think we can all agree is dawning on the current workforce. Depending upon your viewpoint, there is a real probability that large segments of the workforce will suddenly become redundant, having been replaced by robots.

What happens to all those displaced workers and how does this relate to digital transformation?

With mass unemployment and poverty, there are those who argue the time has come for a Universal Basic Income.

Citizens would be entitled to a guaranteed minimum income that provides enough to live on – no strings attached; work if you want to. Taxes would apparently need to be higher to compensate.

On its own, that’s a fairly significant slice of digital transformation.

It’s more complex than that, but as I have no qualification at all in politics or economics, I won’t offer an opinion. However, I do understand digital economies.

As digitisation (isn’t automation part of the same thing?) becomes ubiquitous, a new form of governance will need to be applied: with a high degree of autonomous (AI-based) automation, there will need to be laws to govern the use of this technology. It seems highly likely that we will need to implement some form of safety, moral and ethical code to prevent misuse.

It is a profoundly dangerous idea that all the power to control this vast new intelligence should reside solely with those that own it.

This got me thinking of Asimov’s three laws of robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

So, on the one hand we have a load of new autonomous technology and on the other, the people whose jobs these machines have replaced. If robots need laws to govern their use and behaviour, surely they should be taxed accordingly to pay for the Universal Basic Income to support the people they have replaced.

I would expect robots to be more efficient, less error-prone and more cost-effective: cheaper and more profitable, in other words.

But there is a steep human cost associated with this redundancy, and corporate social responsibility would suggest that the organisation replacing workers balances that cost against the savings made and pays accordingly.

Once again, when we talk about digital transformation, the focus is often on the technology, not the people. But here is an example where the technology is driving the transformation and the digital part relates to the human cost. I would argue it should be the other way around: figure out how to transform the workforce in a way that enables the automation.

Going back to Asimov’s first law: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” If the worker a robot replaces cannot fend for themselves, that would break the first law. I would argue, therefore, that the laws of robotics and the role government plays in regulating and enforcing them need to be aligned. Ergo, the robot, or its owner, should provide restitution to the former employee to prevent them coming to harm.

Would this be sufficient?

There are those who argue that the laws do not work in practice and that a form of empowerment would be more suitable: there is too much context to consider, which could lead to inaction, and Asimov’s stories were, after all, about the breakdown of the three laws.

The rule of empowerment adds context to the laws of robotics. For example, instead of always following the rule “don’t push humans”, a robot would generally avoid pushing them but still be able to push them out of the way of a falling object.
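
To make that contrast concrete, here is a minimal sketch in Python, purely illustrative and with invented names and numbers, of the difference between a fixed rule and an empowerment-style decision that weighs how many options the human is left with:

```python
# Toy illustration only: "empowerment" is approximated as the number of options
# the human retains after the robot acts. All names and numbers are invented
# for the example; this is not a real robotics framework.

def hard_rule_decision(falling_object_incoming: bool) -> str:
    """A fixed rule: never push a human, regardless of context."""
    return "do not push"

def empowerment_decision(falling_object_incoming: bool) -> str:
    """Pick the action that leaves the human with the most options."""
    # If an object is falling, being pushed clear preserves far more options
    # (the human stays alive and mobile) than being left in its path.
    options_if_pushed = 9 if falling_object_incoming else 7
    options_if_not_pushed = 1 if falling_object_incoming else 10
    return "push clear" if options_if_pushed > options_if_not_pushed else "do not push"

if __name__ == "__main__":
    for danger in (False, True):
        print(f"danger={danger}: rule says '{hard_rule_decision(danger)}', "
              f"empowerment says '{empowerment_decision(danger)}'")
```

The only point of the sketch is that weighing context can flip the decision in the falling-object case, which a fixed “don’t push humans” rule never can.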

I think that empowerment is probably where we want to end up, but I also think we are light years away in terms of the technology and power required to make it a functioning reality. Whilst this technology is in its infancy, a form of the three laws will be required.

However, given the human costs involved, maybe we should call this the three laws of humanity?

———————————-

Owen Tribe has spent the last 25 years, since the dawn of the Internet, building and leading teams to harness the power of digital and the web. In that time, he has held many senior roles and done some amazing things for his clients. He gets results, fast, whilst hitting very challenging targets in difficult situations. He can help you get something impossible done.
