Tuesday, May 21, 2013


I was recently discussing with a friend the possibility of humans creating machines that are smarter than humans; something that fiction has warned us about in agonizingly repetitive detail. She was absolutely sure that it can and will be done (quoting various technological-singularity advocates), once there is enough understanding of neurobiology and the molecular basis of intellect and sentience. I differ. While I am sure that humans have already created many beautiful machines that are perfectly capable of ending all life on earth, or of subjugating human life quite adequately, that is no indication that these machines are necessarily smarter.

The fact that humans could create a smarter sentient machine would mean that this smarter machine could create something smarter still, setting up a possibly infinite series of ever-smarter entities. This has a few logical issues. If any of these entities were smart enough, it would simply not allow a series of this nature. A series of this nature would be both super-intelligent and mindless at the same time, introducing a paradox of sorts. And what really defines intellect? Is it the ability to make faster decisions? Does it fuel the drive to make ever more perfect decisions, to the point that super-intelligent beings would live perfect lives? What defines perfection? Each of these questions creates its own paradoxes and anomalies. A truly perfect decision-making intelligence should be able to predict the future.

My primitive perspective suggests that quality decision making depends on IQ, experience, sanity and emotional intellect. These factors are mostly independent of each other, so defining a "smarter" intelligence really depends on which of these factors is required (and at what level) to make the optimal decision. Sometimes ignorance would be better suited! And what really is an optimal decision? In what time frame? You get the idea.

We would like to think that we have a good grasp on the ideas of IQ, experience and even sanity. It is emotional intelligence that seems most elusive: the ability to make quality decisions despite thinking from the heart (figuratively speaking). The very idea that you are thinking from the heart (riskier but more emotionally fulfilling) seems to stack the odds against good decision making. This is highlighted by the fact that plenty of people who are conventionally accepted as intelligent are poor at making emotional decisions. Think of your friends who have made obviously bad personal choices, despite having the IQ, the experience and the sanity to do otherwise. However, making unemotional decisions at all times is a bad solution because it is devoid of empathy. This makes emotional intellect a very critical element in ensuring our well-being. It is this emotional intellect that determines how much of your knowledge translates to realization. Beyond that, there are no good ways of measuring emotional intellect yet; it is not really a Markovian thing.

This complicated mesh forms an unstructured network - your generalized decision tree. This unstructuredness is probably the most definitive aspect of sentience, a hallmark of which is the ability to make mistakes and have accidents that turn out to be good ideas in an intangible future. It allows us to strike an effective balance between a gratifying aimlessness and a functional logical order. We require this unstructuredness for sustenance, and we are meant to treat it as a boon and embrace it if we wish to be content and happy.

The point is that our own programming involves functional ordered chaos; three contrasting and mostly conflicting ideas. Our predicted super-intellect would have to be programmed to strike that perfect balance between the three. To have the "right kind" of randomness and, more importantly, to have it pop up at the right times. To be perfectly imperfect. Lolz.