|"How the mind works is still a mystery. We understand the hardware, but we don't have a clue about the operating system."|
|James Watson, Nobel laureate|
A head full of chips? Artificial Intelligence? Computers that think?
Machines smarter than people? These grandiose proclamations were spawned from the hubris and techno-optimism of computer experts and futurists that ran rampant early in the 21st century.
Perhaps the most intellectually arrogant of the bunch was Raymond Kurzweil: researcher, entrepreneur and artificial intelligence pioneer. He found popular success selling books predicting computer breakthroughs, and became a media darling peddling a scenario in which the human brain would be surpassed by the computer, whether chip- or molecular-based, within the 21st century. "We'll see 20,000 years of progress in the next 100 years," he averred in 2001.
Here's a sample of the "progress" that, according to Kurzweil, will actually happen:
"We'll have very powerful little computers that can travel through our bloodstream, that will be the size of blood cells and they'll actually communicate wirelessly with our neurons, so we'll be actually able to enhance our own thinking capacity, speed up our thinking, increase human memory, increase our cognitive abilities and pattern recognition by combining our biological intelligence with these new forms of non-biological intelligence."Phew!
In 2002, Kurzweil adopted a "new" phrase, "The Singularity," which unfortunately inspired more staccato gibberish:
"We are entering a new era. I call it 'The Singularity.' It's a merger between human intelligence and machine intelligence that is going to create something bigger than itself. It's the cutting edge of evolution on our planet... It is part of our destiny and part of the destiny of evolution to continue to progress ever faster, and to grow the power of intelligence exponentially. To contemplate stopping that—to think human beings are fine the way they are—is a misplaced fond remembrance of what human beings used to be. What human beings are is a species that has undergone a cultural and technological evolution, and it's the nature of evolution that it accelerates, and that its powers grow exponentially, and that's what we're talking about. The next stage of this will be to amplify our own intellectual powers with the results of our technology."Phew! "The cutting edge of evolution"? "The destiny of evolution"? "What human beings used to be"? "[Evolution's] powers grow exponentially"? Oh yeah! On what planet?
Then in September 2003, Scientific American released a single-topic issue titled "Better Brains" with a leading editorial acknowledging Kurzweil's artificial intelligence poster boy status — but with blunt skepticism of his predictions.
Predictably, he jumped on his soapbox with a letter to the editor, spouting and touting a "doubling [of] the paradigm shift rate" [ugh!] and brushing aside the magazine's skepticism with his observation that "scientists are trained to be conservative in their outlook and expectations..."
Then again (ugh!) in July 2006, Scientific American perhaps committed editorial heresy by lending Kurzweil their "Forum" column, letting him climb onto his slippery soapbox yet again to declare, "As an information technology, biology is subject to 'the law of accelerating returns.'" Our exasperated founder just had to respond with a letter to Scientific American's editor (November 2006 issue) stating that "Kurzweil's tendency to analogize biology to information technology is arrogant and naïve..." and that "...we'd be wise to take Kurzweil's grandiose predictions with a grain of salt."
Almost a year later Kurzweil popped up yet again (ugh!) in Fortune magazine (May 14, 2007 issue), claiming, among other far-out things, that he had "reprogrammed" his 59-year-old body's chemistry to that of a 40-year-old. Our still-exasperated founder had to respond again with a letter to Fortune.
Kurzweil and others, lopsided in their expertise, did not have the training in biology and biotechnology to appreciate the grand complexity of the human mind, the result of hundreds of thousands of years of evolution.
Dazzled by hubris and by real accomplishment, particularly the emergence of quantum computing utilizing subatomic particles that made computers billions of times more powerful, they dared to predict that man's own creations would surpass the power of Nature. Their optimism, however, was a product of ignorance, similar to the Futurama-like technology highs of the 1960s and to the euphoria after the atom was split, when it was envisioned that atomic energy would be the cheap, limitless answer to the world's energy needs.
Granted, our knowledge of the brain is rapidly expanding. But merely understanding how a biological entity functions does not imply that it can be artificially created. Human blood is a liquid with a comparatively simple function — to supply tissues with nutrients and remove waste products. Yet, we are incapable of producing synthetic blood and still critically rely on blood donations.
Man has done a pretty good job creating complexities of his own here on Earth, but they are still nowhere near what Nature has accomplished on her own, working over an incomprehensible span of time and continually (re)establishing states of equilibrium.
Per the Phrenicea scenario, by midcentury we give up trying to duplicate the brain's function using man-made hardware and software. Instead, we acknowledge our limitations and harness the real thing: donated human brains.
Finding brain donors — surprisingly — was not a problem. The majority of the world's citizens agreed to donate their brains upon death, with the understanding that their consciousness would be preserved forever and that their life experience would add value to the "vault."
The accumulation of knowledge and facts was facetiously described as "mind boggling," and it was impressive. Just as impressive was the human interface, which was invoked by mere thought.