"How the mind works is still a mystery. We understand the hardware, but we don't have a clue about the operating system."
James Watson, Nobel laureate
A head full of chips? Artificial intelligence? Computers that think? Machines smarter than people? These grandiose proclamations were spawned by the hubris and techno-optimism of computer experts and futurists that ran rampant early in the 21st century.

Perhaps the most intellectually arrogant of the bunch, Raymond Kurzweil, researcher, entrepreneur and artificial intelligence pioneer, found popular success selling books predicting computer breakthroughs and then became a media darling peddling a scenario where the human brain would be surpassed by the computer within the 21st century, whether it be chip- or molecular-based. "We'll see 20,000 years of progress in the next 100 years," he averred in 2001.

Here's a sample of the "progress" that will actually happen according to Kurzweil:

"We'll have very powerful little computers that can travel through our bloodstream, that will be the size of blood cells and they'll actually communicate wirelessly with our neurons, so we'll be actually able to enhance our own thinking capacity, speed up our thinking, increase human memory, increase our cognitive abilities and pattern recognition by combining our biological intelligence with these new forms of non-biological intelligence."

In 2002, Kurzweil adopted a "new" phrase, "The Singularity," which unfortunately inspired more staccato gibberish:
"We are entering a new era. I call it 'The Singularity.' It's a merger between human intelligence and machine intelligence that is going to create something bigger than itself. It's the cutting edge of evolution on our planet... It is part of our destiny and part of the destiny of evolution to continue to progress ever faster, and to grow the power of intelligence exponentially. To contemplate stopping that—to think human beings are fine the way they are—is a misplaced fond remembrance of what human beings used to be. What human beings are is a species that has undergone a cultural and technological evolution, and it's the nature of evolution that it accelerates, and that its powers grow exponentially, and that's what we're talking about. The next stage of this will be to amplify our own intellectual powers with the results of our technology."
"The cutting edge of evolution"? "The destiny of evolution"? "What human beings used to be"? "[Evolution's] powers grow exponentially"? Oh yeah! On what planet?

Then in September 2003, Scientific American released a single-topic issue titled "Better Brains" with a leading editorial acknowledging Kurzweil's artificial intelligence poster boy status — but with blunt skepticism of his predictions.

Predictably, he jumped on his soapbox with a letter to the editor, spouting and touting a "doubling [of] the paradigm shift rate" [ugh!] and acknowledging the magazine's skepticism with his observation that "scientists are trained to be conservative in their outlook and expectations, which translates into an understandable reluctance to think beyond the next step of capability." Phew! (Never mind [sorry!] that the neuroscientists he dismisses have spent lifetimes learning how the brain functions and realize just how little is understood.)

Then again (ugh!) in July 2006, Scientific American perhaps committed editorial heresy by lending Kurzweil their "Forum" column for him to get up on his slippery soapbox yet again declaring, "As an information technology, biology is subject to 'the law of accelerating returns.'" Our exasperated founder just had to respond with a letter to Scientific American's editor (November 2006 issue) stating that, "Kurzweil's tendency to analogize biology to information technology is arrogant and naïve..." and that "...we'd be wise to take Kurzweil's grandiose predictions with a grain of salt."

Almost a year later Kurzweil popped up yet again (ugh!) in Fortune magazine (May 14, 2007 issue), claiming, among other far-out things, that he had "reprogrammed" his 59-year-old body's chemistry to that of a 40-year-old. Our still-exasperated founder had to respond again with a letter to Fortune.

After several years of relative quiet, Google in 2012 figuratively injected adrenalin into Kurzweil by hiring him to be their chief of engineering. Predictably he leveraged the opportunity in June of 2013 by reiterating, "Somewhere between 10 and 20 years, there is going to be a tremendous transformation of health and medicine. By treating biology as software [ugh!], and reprogramming cells to treat diseases and other ailments, humans have already made tremendous progress in medicine. These will be 1,000 times more powerful by the end of the decade, and a million times more powerful in 20 years."

Kurzweil and others, lopsided in their expertise, did not have the training in biology and biotechnology to appreciate the grand complexity of the human mind, the result of hundreds of thousands of years of evolution.

Dazzled by hubris and real accomplishment — particularly the emergence of quantum computing utilizing subatomic particles that made computers billions of times more powerful — they dared to predict that man's own creations would surpass the power of Nature. Their optimism, however, was a product of ignorance, similar to the Futurama-like technology highs of the 1960s and to what occurred after the atom was split, when it was envisioned that atomic energy would be applied safely to almost everything — from heating homes, powering cars and planes, to manufacturing. History is littered with unfulfilled predictions and erroneous theories by so-called experts steeped in dogma. Unfortunately, the glare of naïveté can be blinding when the veil of nescience is at first removed.

Granted, our knowledge of the brain is rapidly expanding. But merely understanding how a biological entity functions does not imply that it can be artificially created. Human blood is a liquid with a comparatively simple function — to supply tissues with nutrients and remove waste products. Yet, we are incapable of producing synthetic blood and still critically rely on blood donations.


Man has done a pretty good job creating his own complexities here on Earth, but they're still nowhere near what Nature has accomplished on her own — over an incomprehensible span of time, all while (re)establishing states of equilibrium. We've dramatically increased our life expectancy from just 37 years in 1800, but not via rocket-science-like achievement. It's been mainly by improving sanitation and nutrition, rudimentary surgical techniques, and the use of drugs discovered through serendipity and empiricism. We're fooling ourselves if we believe it is much more than that.

Per the Phrenicea scenario, by midcentury we give up trying to duplicate the brain's function using man-made hardware and software. Instead, we acknowledge our limitations and actually harness the brain's power via a brain bank housing thousands of human brains, each enclosed in a honeycomb-like chamber providing oxygen and nutrients.

Finding brain donors — surprisingly — was not a problem. The majority of the world's citizens agreed to donate their brains upon death, with the understanding that their consciousness would be preserved forever and that their life experience would add value to the "vault."

The accumulation of knowledge and facts was facetiously described as "mind-boggling," and it was impressive. Just as impressive was the human interface, which was invoked by mere thought.


The Future — It's All In Your Head! ®



Entire site ©2000-2014 John Herman. All rights reserved.