THE BRAIN is wider than the sky,
For, put them side by side,
The one the other will include,
With ease, and you beside.

The brain is deeper than the sea,
For, hold them, blue to blue,
The one the other will absorb,
As sponges, buckets do.

The brain is just the weight of God,
For, lift them, pound for pound,
And they will differ, if they do,
As syllable from sound.
— Emily Dickinson
The thing about connecting dots, as Steve Jobs observed, is that they can only be connected backwards. If we went back in time and asked, "Where is computing leading us?", we would have received only a partial answer, because Artificial Intelligence (AI) was still the stuff of science fiction. One could argue that the first computer was the abacus, a device used for relatively simple calculations. Between 1833 and 1871, Charles Babbage designed the Analytical Engine, arguably the first computer. Since then, mainframes, minicomputers, microcomputers and finally the personal computer have made computing ubiquitous. Then came the shift to mobile computing. Throughout this journey, the human brain has been the model guiding future developments.
If scientists are artists, then the human body is their muse. From some of the most complex theories in Artificial Intelligence (AI), such as Karl Friston's 'Free Energy Principle', to Sigmund Freud's 'Interpretation of Dreams', the richness and complexity of the human brain has guided computing since its early origins. If one were to extrapolate the current trends in computing, it is clear that we are trying to understand human consciousness at a fundamental level. For now, developments in the field of Machine Learning and AI are leading us to a future where human intelligence can be replicated in machines. The nature of these machines could morph from digital assistants to full-blown humanoids or androids. This article outlines the only inescapable conclusion: computing will free us to study human consciousness. Until now, an understanding of consciousness has eluded us all. AI will free up our capacity to pursue that dream.
The history and timeline of computing is very rich. As a simplification, computer hardware has progressed along two dimensions: speed and miniaturization. These dimensions have pulled in opposite directions. While the chips in our mobile phones are both incredibly small and more powerful than old computers, supercomputers have grown bigger and bigger. The most powerful supercomputer in the world, the IBM Summit, consumes enough electricity to power a small town. Dave Turek, VP of high performance computing and cognitive systems, said in an article on CNET.com: "The marketplace is beginning to recognize that AI and high-performance computing are not separate domains but things that need be viewed as integrated. The incorporation of machine learning dramatically reduces the amount of simulation that needs to be done to get to optimal results."
Second, there is a divergent approach to computing because Moore's law is not likely to operate indefinitely. Computing will therefore have to go subatomic, in a classic case of quantum technology eating technology based on classical physics. On January 8, 2019, IBM unveiled the first quantum computer for commercial use. As per its website, IBM unveiled IBM Q System One™, "the world's first integrated universal approximate quantum computing system designed for scientific and commercial use." IBM also announced plans to open its first IBM Q Quantum Computation Center for commercial clients in Poughkeepsie, New York in 2019. This landmark development is just a precursor to quantum computing at scale. Thus, there will be two divergent approaches to the hardware problem: supercomputing and quantum computing.
The world of video games with lifelike graphics gave rise to the Graphics Processing Unit (GPU). In recent times, GPUs have been the chips powering most cryptocurrency mining and AI algorithms. Later, Field Programmable Gate Arrays (FPGAs), chips which users can program for specific purposes after manufacture, gained traction. On May 1, 2019, the MIT Technology Review published a story on a special-purpose chip designed to boost AI computing. The chip was unveiled at Amazon's MARS event. If that chip is any indication, special-purpose chips designed specifically for imbuing computers with AI will be the next hardware boom.
Lastly, in the future, AI chips could run in computers as small as a mobile phone or wearable device. The other possible evolution of computing hardware would be a fully functional android, with a human-like body on the outside and an AI-enabled computer inside. Further out, whole brain emulation, i.e. brain uploads, could allow people to live on in a mechanical body, creating digital avatars of people long after they are gone. This 'creepy' possibility is already being researched.
First, AI algorithms today offer only a peek at things to come. They can beat the human brain in games such as chess and Go, and perform computations at an incredibly fast pace. They can crunch massive amounts of data, identify patterns and perform limited functions. However, they are yet to advance to a higher state. Language and speech recognition have many more hurdles to clear. AI algorithms also learn from datasets which are neither diverse nor large enough to begin to develop human intelligence. Moreover, we are modelling neural nets on our limited understanding of the human brain. Thus, the search for truly intelligent AI has some inherent limitations. A good book for understanding Machine Learning (ML) is Andriy Burkov's 'The Hundred-Page Machine Learning Book'.
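To make the point about learning from data concrete, here is a minimal toy sketch (not any production AI system) of the oldest neural building block, a single perceptron, "learning" the logical AND function from just four examples. The function and variable names are illustrative choices, not from any particular library:

```python
# Toy perceptron: a single artificial neuron learns the AND function
# from four labelled examples. Real AI systems train networks of
# millions of such units on vastly larger, more diverse datasets.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a single neuron on (inputs, label) pairs via the perceptron rule."""
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), label in samples:
            # Step activation: the neuron "fires" if the weighted sum is positive.
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = label - pred
            # Nudge the weights in the direction that reduces the error.
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# The entire "dataset": enough to learn AND, but nowhere near enough
# to learn anything resembling human intelligence.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

The sketch illustrates the article's point in miniature: the model is only as good as the data it sees, and four examples suffice only because AND is a trivially simple pattern.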
Second, a lot of the components of the 'Internet of Everything' are being built as I write this story. Blockchains have just become mainstream and will mark the beginnings of the 'Internet of Value'. Cryptocurrencies have just begun to impact cross-border payments. The 'Internet of Things' is not yet complete, as fully autonomous vehicles are not yet in place. The software and plumbing for the highway of the future are still being built. Everything from banking to payments is likely to recede into the background, with an extremely intelligent front-end interface hooked into the 'Internet of Everything'. Of course, we also have to solve the privacy and ethics issues.
Lastly, in the future, the software will have to mimic the human brain. In other words, it will take a better understanding of the roughly 100 billion neurons firing in our brains. We have only just begun to understand terms such as the 'connectome', the complete map of neural connections in the brain.
Naturally, the problem then becomes multi-dimensional. We will need to rely on disciplines such as quantum computing, advanced genetics and psychology to begin to develop human-level intelligence. Of course, today, harnessing a human brain remains a pipe dream. It is also one of the most powerful reasons not to sound the death knell of humans as masters of machines.
Add to this the fact that there exists a movement called 'New Mysterianism'. The members of this movement (which is also a philosophical position), including some of the smartest scientists in the world, believe that the hard problem of consciousness cannot be resolved by humans.
The Next Level
Perhaps, we may see many more AI winters before we realize computing’s true destiny.
First, the promise that Artificial Intelligence (AI) holds for us is liberation from lower-level tasks. Think about all the error-prone MS Excel models, the monthly accounting reconciliation process and all the data analytics that can be completely automated. In the future, management can simply ask the AI hiding behind digital assistants to perform analyses. The digital assistant would then present the results in a more visual manner, thus freeing up labor for higher-order decision making. However, we are many years away from the promise of true liberation. As is obvious from the above, the hardware and software needed to mimic the human brain have not even begun to take shape.
The advent of AI will not come without exponential disruption. Even Prometheus had to face the wrath of the gods when he stole fire for humanity's sake. Similarly, unless there are well-designed contingency plans, AI will cause massive disruption in employment by eliminating millions of jobs across the planet. As per AI expert Kai-Fu Lee, about 40% of all jobs will be lost to AI. Civil strife would thus be a very heavy price for liberation.
Second, AI has developed in spurts. It is therefore hard to predict whether there will be a third AI winter, a general decline in research and investment in AI. Today, both research and investment in AI are growing exponentially. According to a new report from consulting major PwC, Venture Capital (VC) investment in AI increased by 72% to hit an all-time high of $9.3 billion. There is also an exponential increase in students' and researchers' interest in AI. There are sound economic, business and scientific reasons behind the growth of AI as an industry. Even if these trends slow down for a while, AI as a technology holds more promise than other pervasive technologies such as electricity. In other words, AI is here to stay and to accelerate our efforts to make computers as intelligent as human beings.
Lastly, if you ask most wise men what we know about our brain, they will say 'not much'. We don't know enough about diseases such as Alzheimer's to treat them completely. Reducing consciousness to mere awareness does a disservice to the complexity of the human brain. However, think about it: robots or computers are not 'aware' that they are serving us. Logically speaking, Artificial General Intelligence (AGI), or AI beyond the singularity, would be aware of its role alongside humans. Robots would then resemble the human race: some would utilize their newfound intelligence to pursue 'evil' objectives, others would live in harmony with humans, and yet others would be content with humans in the driver's seat. AI will free humans of everything they do in the material world. This would spark a newfound appreciation for spirituality, the quality of being concerned with the human spirit or soul as opposed to material or physical things. Thus, the promise of AI is to herd more people toward a search for meaning beyond material pursuits.
The Enlightenment of the late 17th and 18th centuries was a European intellectual movement emphasizing reason and individualism over tradition. The enlightenment that follows AGI will be a movement emphasizing the human spirit over material pursuits. It will be an era where we finally begin to understand that the destiny of computing was to put us on a journey to understand and unlock the vastness of the human mind. We will not need to ask HAL 9000 about the purpose of life. We would have discovered it already. However, there are still many more miles to go before we realize the true destiny and promise of computing.