
The Dilemma of AI: The Lady or the Tiger?

A lot has been said about Microsoft's recent debacle with Tay, its AI bot on Twitter, which in less than 24 hours began ranting anti-Semitic, misogynistic tweets, embarrassing Microsoft and leading to its swift disappearance from social media.

Microsoft and its publicity team concluded that Tay had been influenced, or rather "hacked", by people on 4Chan who trained it to tweet misogynistic and racist remarks. But let's face it: algorithms are not designed to make ethical judgements, because meta-ethics often derives from a priori learning or from societal conditioning, something non-biological entities are not capable of.

Recently, however, Azeem Azhar made an interesting post on LinkedIn about ethical decision-making in AI which I myself could not answer. Azeem curates a rather interesting selection of AI publications in his newsletter, The Exponential View, which I have been reading since September 2015, after finding it recommended by Nic Brisbourne on his equitykicker blog.

Azeem raises the famous trolley problem, a simple moral dilemma that has probably opened many a contemporary philosophy course at university: you see a train running down a track. Five people are on the track and will surely die if the train continues on its path. You happen to be standing by a switch that can change the path of the train. If you pull the switch, the train will change tracks and hit and kill a single innocent person standing on the other path. Do you pull it?


He explains that this simple dilemma illustrates an ethical consideration that many AI systems will need to handle, but that the trouble is we as humans don't agree on what to do in the trolley problem:

A recent paper looks at cultural differences and variations of the Trolley problem. It finds that ordinary British people will pull the switch, and sacrifice the one to save the five, between 63% and 91% of the time. Chinese people faced with the same quandary were more inclined to let nature take its course: they would pull the switch about 20-30% less often, or between 33% and 71% of the time.

To be honest, I am not sure which I would have chosen either. I do know that in Asian cultures, such as South Korea's, people tend not to interfere in the business of people they do not know well, much like the norms of British society. However, if there is a wrongdoing, I tend to think Americans and Brits will make a point of interfering, even with people they do not know, whereas Asians might not be inclined to do so. Still, I am not sure how I would have reacted, because it would mean placing a higher value on one life than on another.


Hear no evil, see no evil, speak no evil. Does society teach us to look the other way when we see something that offends our sense of morality?

So whose ethical standards should a future system follow: the British or the Chinese? Azeem further asks: "How do we prevent unelected, unaccountable product managers and AI programmers determining personal or social outcomes through veiled black boxes? What if those programmers are untrained in ethics, philosophy or anthropology?"
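To make that worry concrete, here is a purely hypothetical sketch in Python of how such an ethical trade-off can end up as a single hard-coded parameter. Every name and number below is invented for illustration; it is not taken from any real system.

    # Hypothetical illustration only: how an ethical trade-off becomes a
    # hard-coded parameter chosen by whoever wrote the code.
    def should_pull_switch(lives_saved, lives_lost, sacrifice_threshold=0.75):
        # `sacrifice_threshold` encodes how strongly the system favours
        # saving the many at the cost of the few. A default of 0.75 loosely
        # mirrors the British responses cited earlier; something nearer 0.5
        # would sit closer to the Chinese responses.
        if lives_lost == 0:
            return True  # no trade-off: diverting costs nothing
        benefit = lives_saved / (lives_saved + lives_lost)
        return benefit >= sacrifice_threshold

    print(should_pull_switch(5, 1))                           # True: 5/6 is about 0.83, above 0.75
    print(should_pull_switch(5, 1, sacrifice_threshold=0.9))  # False under a stricter "culture"

The arithmetic is trivial; the point is that whoever sets that default threshold is quietly legislating ethics for everyone downstream of the black box.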

This ethical dilemma also reminds me of one of my favourite childhood stories, The Lady or the Tiger? I also recall that in 5th grade one of the books I was required to read was Mark Twain's Huckleberry Finn. In the book, Huckleberry has to make a moral choice between his friend Jim, an escaped slave, and the mores of the time, which deemed a friendship outside his own race "unethical". In the end, he followed his heart.


The Lady or the Tiger? Are we taught to be selfish or generous?

We as a society look to "logic" to solve so many of our problems, and the answers often end up being wrong. We disregard our instincts and feelings as "irrational", but I do not think our biology is wrong. Demis Hassabis of DeepMind thinks our learning is driven by the release of dopamine: dopamine is released to reward positive behaviours, and that reward fuels our ability to learn. But is it as simple as that?
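For a picture of what "learning from reward" means computationally, here is a minimal, generic sketch of the textbook reward-prediction-error update from reinforcement learning, the idea that dopamine signalling is often compared to. The numbers are made up for illustration; this is not DeepMind's work.

    # Minimal reward-prediction-error update: a loose analogy for dopamine
    # acting as a "surprise" signal, with invented example rewards.
    value = 0.0             # current estimate of how rewarding an action is
    learning_rate = 0.1
    for reward in [1, 0, 1, 1, 0]:          # made-up outcomes of repeating the action
        prediction_error = reward - value   # surprise: what happened minus what we expected
        value += learning_rate * prediction_error
        print(round(value, 3))

Whether a single scalar "reward" can carry something as layered as ethics is, of course, the open question.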

Steve Jobs urged us never to settle and to follow our hearts, and I think he was right; in ethical decisions, we should follow our heart. Our intuition abides by a non-linear logic that encompasses our entire unconscious knowledge base, and when we go against what we believe or feel to be right, we often feel the most disconnected from our own selves. Bertrand Russell observed that children are often "smarter" than adults because they have not yet been desensitised by institutionalised education. Our heart understands the bigger picture: our long-term investments, what matters to us in the long run, what we know to be true, apart from the illusions of our ambitions. Too often we become derailed by the lure of short-term profit, by the tyranny of social mores and peer pressure.

We feel this disconnect most when we don't trust our own intuition. Intuition follows a higher form of logic than rationality or societal mores, and as Einstein said, "Imagination is more important than knowledge." We need to trust ourselves [and not an algorithm or other people] to make our most important decisions.

"Be hungry, be foolish." Steve Jobs.