Artificial Intelligence and Humans

In March 1987, almost three years after its world premiere, The Terminator opened in Polish cinemas under the title Elektroniczny morderca (Electronic Murderer). The artistic value of the film is debatable, but surely nobody at the time could have seriously entertained the possibility that machines would one day be smarter than humans. In 1984, the ZX Spectrum+ premiered. It was the dream computer of its day, with a 3.54 MHz processor and (in the best available configuration) 48 KB of RAM. It was definitely not a machine Skynet could ever run on.
 
The premiere of another "Terminator" movie is now behind us. Thirty-odd years have passed and the world is a very different place. Boston Dynamics' robots jump and run on two legs, autonomous cars are nothing unusual, and popular smartphones have CPUs clocked at gigahertz frequencies.

It might be time we took another look at machines: could they be smarter than us, humans?

This brings us to the concept of the technological singularity. Simplifying somewhat, it is a point (whether hypothetical or not is a matter of controversy) in the development of civilization at which technological progress becomes so fast that all human foresight is rendered obsolete. When that point is reached, machines may emerge whose intelligence exceeds human intelligence and which are able to produce even more capable artificial intelligences, causing gigantic changes in technology.

Will there be a Skynet?

Let's consider the likelihood of that happening. Natural skepticism leads us to believe that the current rate of progress will remain the same in the future, and that there is therefore a long way to go before AI is more intelligent than we are. The problem is that humans naturally tend to extrapolate the current rate of advancement onto the next 10 or 100 years. What if growth follows an exponential trend, only we, seeing just a fragment of it, assume it to be linear? My grandfather was born when the Wright brothers were merely discussing the possibility of flying (de facto, jumping) in a machine heavier than air. During his lifetime the airplane became commonplace, and he watched with his own eyes as a man landed on the Moon. Looking back, you can see how enormous a technological transformation took place in that time.
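The linear-versus-exponential pitfall is easy to show numerically. The sketch below uses a purely hypothetical "capability index" that doubles every two years (all numbers are illustrative, not real measurements): an observer who fits a straight line to the first ten years of the curve ends up underestimating later values by orders of magnitude.

```python
# Hypothetical capability index that doubles every 2 years
# (illustrative numbers only, not real measurements).
def exponential(year, doubling_period=2.0):
    return 2 ** (year / doubling_period)

# A linear extrapolation based only on the first 10 years of observations:
# the slope is estimated from the segment between year 0 and year 10.
slope = (exponential(10) - exponential(0)) / 10

def linear_forecast(year):
    return exponential(0) + slope * year

for year in (10, 20, 40):
    print(f"year {year:>2}: actual {exponential(year):>10.0f}, "
          f"linear forecast {linear_forecast(year):>6.0f}")
```

At year 10 both agree, but by year 40 the true exponential value dwarfs the linear forecast by a factor of thousands, which is exactly the mistake the paragraph above describes.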

Moore's Law (somewhat like a self-fulfilling prophecy) still holds for successive generations of processors.
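The arithmetic behind that law compounds dramatically. A minimal sketch, assuming the commonly quoted doubling period of about two years and starting from the Intel 4004 (1971), which had roughly 2,300 transistors:

```python
def moore_transistors(start_count, years, doubling_period=2.0):
    """Project a transistor count forward, assuming a doubling
    every `doubling_period` years (Moore's law)."""
    return start_count * 2 ** (years / doubling_period)

# Intel 4004 (1971): roughly 2,300 transistors.
projected = moore_transistors(2300, years=2021 - 1971)
print(f"Projected transistor count for 2021: {projected:,.0f}")
```

Fifty years of doubling every two years is 2^25, taking 2,300 transistors to tens of billions, which is indeed the order of magnitude of today's flagship chips.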

The time needed for an innovation to catch on is also decreasing exponentially. Just think how long it took for the classic wired telephone to spread across households compared with GSM. Of course, the comparison pertains mainly to Western countries; the development of telephony in the People's Republic of Poland is a completely separate story. We may be in the thick of changes whose rapidity we cannot fully grasp. Critics, however, argue that expectations of intelligent machines arriving in the near future are premature, because computers have yet to develop intelligence higher than that of an insect.

The theory of an imminent technological singularity fires up scientists' imaginations and has been criticized from a number of viewpoints. Some researchers, including Hubert Dreyfus, Stuart Russell and Peter Norvig, predict that no machine will ever reach the level of human intelligence. The assumptions about accelerating technological change made by Ray Kurzweil, the main proponent of the technological singularity, have also been questioned.

Theodore Modis and Jonathan Huebner have even claimed that technical progress is slowing down.

This is evidenced by the slowdown in the growth of CPU clock frequencies, although the complexity of integrated circuits still increases at the rate predicted by Moore's law. Human creativity is difficult to gauge, but the number of patents per thousand people may be a useful proxy. Interestingly, in the case of the USA there has been a gradual, slow decline from the peak of 1850-1900. The ever-increasing complexity of technology is also becoming problematic. Nowadays, not even an outstanding scientist can comprehend the totality of his or her field the way scientists at the turn of the 19th and 20th centuries could. One can hardly imagine, for example, anyone today building a high-tech jet in their garage while fully grasping all of the issues of advanced avionics and on-board electronics.

Other researchers argue that the trend towards "singularity" can be observed in other areas such as global population and GDP. For each of these trends, however, there are fundamental physical limitations, and after a period of rapid growth, there is always a slowdown or even a decline.

Therefore, it remains an open question whether machines will ever be able to dethrone us intellectually. Another important question is whether such machines would pose a threat to us. It seems, however, that humans will retain control of the switch and will be able to shut the machine down. How reliable that arrangement really is, however, is a topic for a different post.