Artificial Intelligence: a nuclear reactor or an atom bomb?

The following article is an account of a dialogue between two different schools of thought. In order to preserve the reader’s discretion and to eliminate any biases associated with proper nouns, we shall call them Zero and One. I trust the reader to make up their own mind about whom they relate to and therefore choose to believe.
This is how the conversation took place:

Zero: I’ve figured out the algorithm for AI.

One: No way? Artificial Intelligence?

Zero: Yes!

One: You? Of all the people? I mean, you’re not even a computer programmer.

Zero: It’s easier to find something when you know exactly where to look. In fact, I’m quite surprised that no one else has.

One: You mean of all the research labs across the world, with all the advanced technology and the most brilliant programmers, you are the one to figure it out?

Zero: That’s the problem with all those companies: the people working on it are mostly computer programmers. They can’t think beyond coding and programming.

One: What about all the neuroscientists?

Zero: Again! Wrong door!

One: And how did you figure it out?

Zero: Haha! That’s a little too much information, isn’t it?

One: Have you told anyone about it? Or are you working on it?

Zero: Well, I have a programmer from my office working on the coding. Poor thing doesn’t even know what he’s been working on; he thinks it’s an application.

One: How much time will it take you? Are you creating an artificially intelligent robot like Sophia?

Zero: Nope! Just an artificially intelligent computer program at the moment. Robots are just the hardware; anyone can create that. And I’ve asked the person to spend an hour a day because he has other things to work on, so I’m not really sure how much time it will take. Maybe a month, a year, who knows?

One: You’re saying that you can develop an artificially intelligent program in a year? Like Google DeepMind’s AlphaGo Zero or Hanson Robotics’ Sophia?

Zero: Less than that! And it will be far superior to the ones already in place. It will be actually intelligent, so much so that no human can prove otherwise.

One: And you’re saying that your program will be self-conscious?

Zero: Not self-conscious; or maybe it will be. In fact, I’m not even sure whether you’re self-conscious. No one but you yourself can prove that you are. I’m actually not sure where the program will lead us, but I can guarantee that it’ll be intelligent.

One: Do you even know what you’re saying? The anticipated problem with artificial intelligence may not necessarily be self-consciousness or self-awareness; I am not sure that will ever happen, and it’s quite irrelevant at this moment to dwell on such a thought. Like you said, human self-consciousness is itself quite debatable. Intelligence, on the other hand, is quite viable and can be explained to an extent. But it will only be your or my understanding of intelligence. What good is intelligence without values?
Computers only understand a binary set of commands. Our values aren’t defined in black and white. It’s not a yes or a no! I mean, we often neglect, change or redefine our values based on the situation. Sometimes we go against our best instincts and yet arrive at a result. It’s the imperfections that make us human. A result can be quite different from what we anticipate, and we usually don’t have a “defined win”. A computer, however, doesn’t understand that. It’s a yes or a no! We can go deeper into defining a “win” for a computer, but with infinite possibilities, can we be absolutely sure that the result will be as desired? With just four ground rules, a simple algorithm like Conway’s Game of Life can create infinite possibilities (see the sketch after this exchange). And you think you can control the outcome of AI?
Anyway! My problem is not with artificial intelligence per se; my problem is with the singularity!
Are we, or will we ever be, willing to surrender complete control of all electronic equipment to one person or one set of laws?
How can we even think of surrendering that control to one program?
It’s the diversity of power that has let us survive for millions of years. Imagine if you had absolute control over all electronic equipment.
Do you really think you’d be able to handle it? Rather, would you be able to surrender that amount of control to someone else?
Cleisthenes, known as the father of Athenian democracy, is said to have believed that governance was too great a power to rest with one person; it should rest with the masses.
Throughout the evolution of mankind, democracies survived where autocracies did not, because as humans we couldn’t surrender absolute power to one person.
With our lives becoming more and more dependent on machines and electronics, can we imagine surrendering all control to one set of algorithms?
Power, as they say, corrupts, and absolute power corrupts absolutely; that’s as true for you as it is for me. Unless you can tell me that you’ve never made a mistake, you have no right to hold absolute power. Neither does the programmer who writes the code for an artificially intelligent program.
Again, I don’t fear artificial intelligence; I would love to see a future with enormous prospects, where artificially intelligent programs eliminate the risks associated with mankind! The problem, however, is singularity. Our species is far from perfect. Our values differ from person to person. It’s our imperfections that make us what we are: human!
The loss of memory, the ignorance of rules, the reasons beyond our intellect, not to mention abstractions like love, compassion, pain, empathy, fear, mortality and so on.
Can you define love? Or empathy, compassion or pain?
For the sake of argument, let’s say you are able to define it for yourself, but can you define it for every human?
How about defining it in a set of yes or no answers?
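
An aside for the curious reader: below is a minimal sketch of the Game of Life that One refers to, written in Python purely for illustration; it is a standard toy implementation and nothing from Zero’s program. It encodes nothing more than the four ground rules, yet patterns such as the glider simply emerge from them.

```python
# A minimal sketch of Conway's Game of Life: four ground rules, endless possibilities.
# The grid is represented as a set of (x, y) coordinates of live cells; all others are dead.

def neighbours(cell):
    """The eight cells surrounding a given cell."""
    x, y = cell
    return {(x + dx, y + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)}

def step(live_cells):
    """Apply the four rules once across the (unbounded) grid."""
    candidates = live_cells | {n for c in live_cells for n in neighbours(c)}
    new_cells = set()
    for cell in candidates:
        alive = cell in live_cells
        count = len(neighbours(cell) & live_cells)
        # Rule 1: a live cell with fewer than two live neighbours dies (underpopulation).
        # Rule 2: a live cell with two or three live neighbours survives.
        # Rule 3: a live cell with more than three live neighbours dies (overpopulation).
        # Rule 4: a dead cell with exactly three live neighbours becomes alive (reproduction).
        if (alive and count in (2, 3)) or (not alive and count == 3):
            new_cells.add(cell)
    return new_cells

# A "glider": a five-cell pattern that travels across the grid, generation after generation.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for generation in range(5):
    print(generation, sorted(glider))
    glider = step(glider)
```

No cell in that program is told to travel anywhere; the glider’s motion is an outcome of the rules, not something the programmer specified, which is precisely One’s point about controlling outcomes.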

Zero: AI is inevitable, my friend. If I don’t do it, someone else will.

One: I’ve heard that the scientists working on the Manhattan Project said they were solving a problem, a challenge that was right in front of them. They wouldn’t think of the repercussions; that would be silly, not to mention a waste of time! AI is the beginning of the singularity, and the day you write that code, you surrender all control to the machines.

Zero: You surrendered control to the machines the moment you let your machines do your work for you. You accepted AI the day you bought your first piece of electronic equipment.

One: That’s different. Without AI, we still have control over our machines. Don’t we?

Zero: Do we? Okay! I’ll believe you if you switch off your mobile phone right now and never turn it back on again!


