"The future is the worst thing about the present."
Gustave Flaubert (1821-1880), French novelist
Home

Tuesday April 1, 2014

Phrenicea — The ultimate goal of evolution?

Given current world tensions that hark back to the pre-WWII simmerings that led to widespread death and destruction, we'll visit a Phrenicea page uploaded way back in 2000 depicting a scenario that might finally put an end to history repeating itself.
*****
There's no crime with Phrenicea!

Artificial genes collectively called "brainerama" manipulate gene expression to produce — in simplistic terms — hormonal parameters and biological switches that could be set or reset by Phrenicea to monitor or control an individual's behavior and actions "officially" defined as antisocial, deleterious, criminal, etc.

All forms of behavior and thinking were pigeonholed into just two categories: "right" and "wrong." This compilation of binaries became known as the "Black and White Dictum," and was the root of Phrenicea's judgmental construct.

Initially, there was a small concern that George Orwell's Big Brother had finally arrived. But the reality was that Phrenicea's controlling nature was welcomed with open arms (and minds!).

Early in the 21st century, it became evident that technology would empower the individual with the potential to change or destroy on a scale formerly limited to mighty superpower nations. Distributed knowledge and capability threatened the very existence of civilization and humankind. It was déjà vu for those who could remember the Cold War fears of worldwide obliteration. Only this time the source of fear could be counted in the billions, and not on one hand.

A few sci-fi fanatics among the early proponents of Phrenicea — members of the aging population of baby-boomers hoping to preserve their memories forever within the Phrenicea braincomb — melodramatically recalled the epilogue by the "Control Voice" from an early TV episode of The Outer Limits way back in 1963:

"May we not still hope to discover a method by which within one generation the whole human race could be rendered intelligent, beyond hatred or revenge, or the desire for power? Is that not — after all — the ultimate goal of evolution?"

The boomers' last gasp of activism fueled further worldwide concerns about safety and security, finally leading — many decades after that "Golden Age of TV" — to a method to transform the whole human race.

"Implemented" just in time, Phrenicea was given the ability to monitor every person's thinking and behavior, constantly and simultaneously.

That is not to say that freethinking was abolished. Those generally recognized as "law-abiding and normal," whose moral or intellectual beliefs precluded what was considered antisocial or criminal, had little to be concerned about. Phrenicea could on occasion intervene to bolster behavioral weaknesses, preventing one from actually acting out thoughts deemed unacceptable or dangerous to oneself or society.

On the other hand, individuals not irenically inclined — with mostly violent or destructive thoughts — were subjected to constant behavioral manipulation for the sake of everyone's safety.



NON SEQUITUR ©2004 Wiley Miller. Dist. By UNIVERSAL PRESS SYNDICATE. Reprinted with permission. All rights reserved.

So, to paraphrase The Outer Limits' Control Voice in the here and now:

"Is Phrenicea not — after all — the ultimate goal of evolution?"

Time will tell (for us all)...

posted by John Herman 4:23 AM

Saturday March 1, 2014

Memories: Firsthand & First Person

Our memories, and the memories of us, are precious. Are they not? Well, at least to some authors they appear to be.

Cliff Pickover — author and columnist in the fields of science and mathematics, and a prominent research staff member at IBM's Thomas J. Watson Research Center — admits that much of his motivation to publish comes from his desire to exit this world with something to leave behind for future generations. He laments:

After you die, will the world remember anything you did? Most of us rarely leave marks, except on our immediate family or a few friends. We'll never have our lives illuminated in a New York Times obituary or uttered by a TV news anchorperson. Even your immediate family will know nothing of you within four generations. Your great-grandchildren may carry some vestigial memory of you, but that will fade like a burning ember when they die — and you will be extinguished and forgotten.

That's pretty depressing. As we try to live each day to the fullest — being productive, learning, and ultimately creating memories for ourselves and others — we rarely ponder the ephemeralness of it all. (Although the often-heard dispassionate phrase, "Who will care a hundred years from now?" stems from the sad reality of our short time here on earth.)

But does it have to be this way?

The Phrenicea scenario envisions a time when all our memories and experiences will be stored forever, within our own brain as well as within others' — firsthand memories that are deemed rich enough to be bought and sold at auction — like memorabilia traded today on eBay.

Imagine sharing the actual memory of accepting a Nobel Prize or an Academy Award, winning a marathon, falling head-over-heels for your favorite actor, starving in the third world, learning of a terminal disease; even of dying!

Most are tempted to say, "Yeah right! No way."


It seems impossible now, but there will come a day when the mechanism for memory assembly and storage within the human brain is elucidated. A next logical step would be to try to save or replicate these memories, perhaps for recollection by others.

When it does come to fruition, imagine the regret for the many first-person memories already — or soon to be — lost forever:
- the excitement of witnessing man's first flight
- the despair of the 1929 Stock Market Crash
- the horror of the Holocaust and Hiroshima
- the excitement of purchasing a new color TV in the 1950s
- the anticipation of finding out "Who Shot JR?"
- the relief of learning of a cure for polio
- the nostalgia of catching the last feature at the local drive-in movie theatre just before its closing forever
- the shock of President John F. Kennedy's death
- the thrill of setting foot on the moon
- the marvel at the first "horseless carriage," telephone, "talkie," ballpoint pen, transistor radio, Polaroid camera, VCR

Of course you could read or see videos about these events. But nothing can approach an actual memory — just ask Neil Armstrong.

Gee, when you think about it, memories really are precious.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:39 AM

Saturday February 1, 2014

Looking Back on the Future

There was big news in the auto world recently when Cadillac announced a major change to its iconic crest-and-wreath logo, which since 1982 has represented General Motors' flagship brand.

Messing with logos is no small matter. According to the Houston Chronicle, corporate logos are intended to be the "face" of a company: They are graphical displays of a company's unique identity, and through colors and fonts and images they provide essential information about a company that allows customers to identify with the company's core brand. Logos are the chief visual component of a company's overall brand identity.



Cadillac Logo History to 2014

Cadillac's logo design evolved over many years — changing with the times, tastes, and sophistication of its target customers.


New Cadillac Logo 2015

The new logo is designed to attract a younger demographic and to project a bolder look to compete more effectively with BMW, Mercedes and Audi. GM describes it as sleeker and more streamlined. As with many other things, logo design can be cyclic: this new edition harks back to 1959, when Cadillacs were streamlined with huge tail fins. Once again, the crest is wider and the wreath is gone. It is to make its debut on the 2015 ATS coupe, and photos reveal the logo to be a very prominent component of the front grille. It is significantly larger, approaching the size of Mercedes' three-pointed star.

*****

So, why is this important?

It's significant for Cadillac — but more important for us, it follows Phrenicea's logo evolution that transpired through the first half of the 21st century. Confusing, yes — but that's the consequence of looking back on the future!

Like Cadillac, the goal of Phrenicea's founders was to create a striking identity for Phrenicea. They were optimistic that it would come to represent riveting change for the entire world — the 21st century's most influential and eventually best-known brand. Seeking that elusive quality that world-class brands possess did not come easily, however.

The design of the Phrenicea logo went through many iterations. Various marketing agencies were secured through the years, and the drastic changes in direction reflect the uncertainty with regard to how to present an image for Phrenicea. The following shows the struggle to reach Phrenicea's final identity, which by mid-21st century conveys its comfortable and mature place in our world's future.


Infinity circa 2014

This first attempt to create a timeless image for Phrenicea makes use of the already tired "infinity" graphic. It was a flop and was replaced within just two years.



Thought Circle circa 2016

This concept presented a circle and a line. The circle was supposed to suggest the collective consciousness aspect of Phrenicea. The marketing agency touted, "This design has a futuristic feel yet includes ancient and simple icons." It was questionable whether the general population interpreted that much from the design.


Thought Circle II circa 2019

The same marketing firm that produced the original Thought Circle fulfilled its contractual agreement by sprucing up its first concept. This was touted as more ambitious, with more emphasis on the word Phrenicea. Again, would anyone get it?


Transcendental Wisdom circa 2023

A newly hired marketing firm completely changed direction with something from the psychedelic 1960s! What were they thinking? The firm's description of the new identity: "The use of the eye represents a window to the mind. The circular objects may resemble a galaxy. The image as a whole is quite colorful and is intended to represent the future." What were they smoking? (They were quickly fired.)


The Mind's Eye circa 2025

The third marketing firm unbelievably wanted to build upon the previous firm's effort. Their concern was that each logo iteration was obliterating whatever identity might have been garnered, since there was absolutely no continuity. They therefore recommended keeping the eye theme. Intellectually it's a sound approach, but that color!


The Mind's Eye II circa 2027

It became obvious that the red color was a big mistake. There were many red-eye-themed jokes, which did help awareness but certainly did not gain the respect thought crucial to enabling Phrenicea to fulfill its mission to fully integrate with each individual's mind. With this iteration they finally got it right: the logo was warmly and universally accepted as representing the capabilities of Phrenicea.


The Eye's Mind circa 2027

The same marketing firm could not help themselves when they volunteered this second, contrasting concept to The Mind's Eye above. Their spiel: "The Eye's Mind is suggestive of the brain or mind but is non-specific. The dot within the 'P' subconsciously carries over the 'eye' theme. And we find the reversal of the title's wording irresistibly clever." Phrenicea's founders liked the idea as well. But they could not reach an agreement on which was superior, so they utilized both.


Phrenicea the Entity circa 2051

The Eye's Mind and The Mind's Eye served Phrenicea well and were used for many years. Although there was no intention of replacing them, a simple and elegant representation of Phrenicea was desired.

As Phrenicea gained world-wide acceptance, there was less of a need to force itself upon the masses with bold logo identities. The simple elegance of this final version would take Phrenicea well past the turn of the 22nd century.


The evolution of the Phrenicea logo reflects Phrenicea's history in society — from a brash newcomer with then-unbelievable aspirations to a respected entity with universal acceptance and usage after just several decades.

Time will tell...

posted by John Herman 7:19 AM

Wednesday January 1, 2014

The Twelve Blogs of '13

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last seven years' cop-outs to lethargically review the "Twelve Blogs of 2013" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant.

December's blog observed that history is littered with unfulfilled predictions by so-called experts:

Computers that think? Machines smarter than people? These grandiose proclamations are spawned from hubris and techno optimism by many computer experts/futurists. Perhaps the most intellectually arrogant of the bunch, Ray Kurzweil — researcher, entrepreneur and artificial intelligence pioneer — finds popular success peddling a scenario where the human brain will be surpassed by the computer within the 21st century. (Read more)

• In November we acknowledged that Yogi Berra did not anticipate Phrenicea with his famous quote, "This is like déjà vu all over again."

Nevertheless, he was humorously astute enough to observe that change is inevitable; and that it can ironically come "full circle." The Phrenicea scenario of the future analogously envisions that by 2050 life will be so simple as to resemble 18th and 19th century living. (Read more)

October's blog tried to ensure that an adequate air of skepticism is maintained when viewing this site:

Envisioning the future is not an easy endeavor. For one brave (or crazy!) enough to attempt it, he or she might end up looking like a genius if correct (hopefully during their lifetime) — or a colossal fool (preferably posthumously) if proven wrong. (Read more)

• In September true Phreniceanados admitted that we donned our Phrenicea T-shirts the entire summer to help celebrate the now-universal symbol of leisure's centennial.

As reported in Forbes, "The iconic T-shirt was born in 1913 when the U.S. Navy issued crewnecks for soldiers to wear under their uniforms." We were feeling jubilant too, given the unusual symbolic nature of our shirt — representing a prophecy or vision of the 21st century. (Read more)

• In August we identified six distinct ages that depict humankind's drive and desire to achieve and command our natural world:

We propounded that our arrogance can be categorized into different "ages," each defined by our new capabilities at the time. The last two will lead to a scenario that we call Phrenicea. (Read more)

• In July we suffered the recurring affliction of "water on the brain":

Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." We all tend to waste water and take its current abundance for granted.

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations? (Read more)

June's blog celebrated one student's epiphany:

Many of today's high school and college students often wonder (vociferously!) why they need to memorize boring equations, formulae and other seemingly trivial or useless information. What many times is not emphasized by their teachers is the origin and significance of man-made expressions of what is essentially describing the workings of nature. They're not taught that many of these discoveries required lifetimes of effort — often by iconoclasts, eccentrics, heretics and recluses...(Read more)

• In May we compiled "Then and Now" images to acknowledge the passage of time and visibly observe how things have changed through the years:

Time is fascinating when you take the time to think about it. We tend to consider time a man-made commodity, as if we created it — when in fact all we really do is measure it. Fooling ourselves in the process, we parse time into nanoseconds, microseconds, seconds, minutes, days, weeks, months, years, decades, centuries, millenniums, periods, ages, eras, eons, epochs... (Read more)

April's blog braved the waves of offshoring crashing upon these shores:

Will there be new faddish, jargon-laced management-speak still to come whose consequence will reduce jobs and salaries even more — and further level the highest standards of living with the poorest on earth? How many more employees with years of dedicated service, along with new job seekers with expensive academic degrees, will yet find that their career path is ultimately a horizontal plank? (Read more)

March's blog reassessed whether the world is catching up with the Phrenicea scenario of the future:

If the current rate and course of technological change continues, which indeed it will, our cubicle scenario of the future will be minimally a metaphorical interpretation of the social isolation that will ultimately result...(Read more)

February's entry was up-front about rear-end vanity:

Have we come that far in auto technology that it's laughable now to think that Ford would deem the now mundane TC [Traction Control] worthy of rear-end vanity? But Ford is not alone in silly trunk bunk. Through the years I've studied many a car's derrière and have seen banality forged in chrome... (Read more)

January's entry was tinged with nostalgia and torpor — and guilt for not writing a new essay:

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last six years' cop-outs to lethargically review the "Twelve Blogs of 2012" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant... (Read more).

You may or may not agree, but after reviewing the "Twelve Blogs of 2013" it appears most are still relevant as we roll into the New Year. And somehow after recalling the energy that instigated the blogs to begin with, we don't feel quite as lethargic.

Maybe that's the real benefit of looking backward, reminiscing and singing "Auld Lang Syne" — not for nostalgia per se, but to recharge our batteries and begin another year with renewed energy and enthusiasm.

Time will tell...

Happy New Year!

posted by John Herman 12:01 AM

Sunday December 1, 2013

A Head Full of Chips?

A head full of chips? Artificial Intelligence? Computers that think? Machines smarter than people? These grandiose proclamations are spawned from hubris and techno optimism by many computer experts/futurists.

Perhaps the most intellectually arrogant of the bunch, Raymond Kurzweil — researcher, entrepreneur and artificial intelligence pioneer — finds popular success selling books predicting computer breakthroughs while peddling a scenario where the human brain will be surpassed by the computer within the 21st century, whether it be chip- or molecular-based. "We'll see 20,000 years of progress in the next 100 years," he averred in 2001.

Here's a sample of the "progress" that will actually happen according to Kurzweil:

"We'll have very powerful little computers that can travel through our bloodstream, that will be the size of blood cells and they'll actually communicate wirelessly with our neurons, so we'll be actually able to enhance our own thinking capacity, speed up our thinking, increase human memory, increase our cognitive abilities and pattern recognition by combining our biological intelligence with these new forms of non-biological intelligence."
Phew!

In 2002, Kurzweil adopted a "new" phrase, "The Singularity," which unfortunately inspired more staccato gibberish:
"We are entering a new era. I call it 'The Singularity.' It's a merger between human intelligence and machine intelligence that is going to create something bigger than itself. It's the cutting edge of evolution on our planet... It is part of our destiny and part of the destiny of evolution to continue to progress ever faster, and to grow the power of intelligence exponentially. To contemplate stopping that—to think human beings are fine the way they are—is a misplaced fond remembrance of what human beings used to be. What human beings are is a species that has undergone a cultural and technological evolution, and it's the nature of evolution that it accelerates, and that its powers grow exponentially, and that's what we're talking about. The next stage of this will be to amplify our own intellectual powers with the results of our technology."
Phew! "The cutting edge of evolution"? "The destiny of evolution"? "What human beings used to be"? "[Evolution's] powers grow exponentially"? Oh yeah! On what planet?

Then in September 2003, Scientific American released a single-topic issue titled "Better Brains" with a leading editorial acknowledging Kurzweil's artificial intelligence poster boy status — but with blunt skepticism of his predictions.

Predictably, he jumped on his soapbox with a letter to the editor, spouting and touting a "doubling [of] the paradigm shift rate" [ugh!] and acknowledging the magazine's skepticism with his observation that "scientists are trained to be conservative in their outlook and expectations, which translates into an understandable reluctance to think beyond the next step of capability." (Never mind [sorry!] that the neuroscientists he dismisses have spent lifetimes learning how the brain functions and realize just how little is understood.)

Then again (ugh!) in July 2006, Scientific American perhaps committed editorial heresy by lending Kurzweil their "Forum" column for him to get up on his slippery soapbox yet again declaring, "As an information technology, biology is subject to 'the law of accelerating returns.'" This is a naïve statement that arrogantly compares man-made technology to the infinitely more complex machinations of our very own biology.

Almost a year later Kurzweil popped up yet again (ugh!) in Fortune magazine (May 14, 2007 issue), claiming, among other far-out things, that he "reprogrammed" his 59-year-old body's chemistry to that of a 40-year-old. By now exasperated, I had to respond with an editorial letter to Fortune.

After several years of relative quiet, Google in 2012 figuratively injected adrenaline into Kurzweil by hiring him as a director of engineering. Predictably he leveraged the opportunity in June of 2013 by reiterating, "Somewhere between 10 and 20 years, there is going to be a tremendous transformation of health and medicine. By treating biology as software [ugh!], and reprogramming cells to treat diseases and other ailments, humans have already made tremendous progress in medicine. These will be 1,000 times more powerful by the end of the decade, and a million times more powerful in 20 years."

Kurzweil and others — lopsided in their expertise — do not have the training in biology and biotechnology to appreciate the grand complexity of the human mind and body, the result of hundreds of thousands of years of evolution.

Dazzled by hubris and real accomplishment, they dare to predict that man's own creations will surpass the power of Nature. Their optimism is a product of ignorance, however — similar to the Futurama-like technology highs of the 1960s, and to what occurred after the atom was split, when it was envisioned that atomic energy would be applied safely to almost everything: from heating homes, powering cars and planes, to manufacturing. History is littered with unfulfilled predictions and erroneous theories by so-called experts steeped in dogma. Unfortunately, the glare of naïveté can be blinding when the veil of nescience is at first removed.

Granted, our knowledge is rapidly expanding. But merely understanding how a biological entity functions does not imply that it can be artificially created. Human blood is a liquid with a comparatively simple function — to supply tissues with nutrients and remove waste products. Yet, we are incapable of producing synthetic blood and still critically rely on blood donations.

*****

Man has done a pretty good job in creating his own complexities here on Earth, but they're still nowhere near what Nature has done on her own — over an incomprehensible span of time, all while (re)establishing states of equilibrium. We've dramatically increased our life expectancy from just 37 years in 1800, but not via rocket-science-like achievement. It's been mainly by improving sanitation and nutrition, rudimentary surgical techniques, and the use of drugs based on serendipity and empirics. We're fooling ourselves if we believe it is much more than that.

Time will tell...

posted by John Herman 9:18 AM

Friday November 1, 2013

Déjà vu All Over Again!

full circle (adverb)
Through a series of developments that lead back to the original position or situation — usually used in the phrase come full circle.
(Merriam-Webster)
*****

Although Yogi Berra did not anticipate Phrenicea with his famous quote, "This is like déjà vu all over again," he was humorously astute enough to observe that change is inevitable; and that it can ironically come "full circle," or "back to square one."

The Phrenicea scenario of the future envisions that by 2050 life in many ways will be less complicated, so simple in fact as to resemble 18th and 19th century living, if not before.

Witness:
????  - Reading & writing is yet to evolve.
2050 - Reading & writing is extinct (made moot with Phrenicea).

35,000 years ago - Teeth become symbolic for communicating kinship and status.
Early modern humans sew extracted teeth of their dead ancestors on their clothing to show off a sense of community and distinction among their budding population.
2050 - Teeth become symbolic for communicating kinship and status.
As Phrenicea renders teeth vestigial with the consumption of Polynutriment custard, early adopters show off a sense of community and distinction by extracting and stringing their teeth upon their necks. Edentulous — and proud of it! (Colloquially referred to as "vestige prestige"!)

1895 - Radio, TV, VCRs, cell & wired telephone, fax, records, tapes, CDs, pagers, PCs and the Web/Internet do not exist (not yet invented).
2050 - Radio, TV, VCRs, cell & wired telephone, fax, records, tapes, CDs, pagers, PCs and the Web/Internet no longer exist (not needed with Phrenicea).

1860 - Automobiles, planes, jets, powered ships do not exist (not yet invented).
2050 - Automobiles, planes, jets, powered ships no longer exist (made moot with Phrenicea, as well as other factors).

200,000 years ago - One language — the first — exists in the world; Proto-Human language that becomes the common ancestor to all the world's languages.
2050 - One language — the last — exists in the world; tacit, non-verbal communication facilitated by Phrenicea, that becomes the common successor to all the world's languages.

1882 - Centralized energy-generating power plants not yet utilized.
2050 - Centralized energy-generating power plants no longer feasible.

1450 - No news is good news. Well, not quite. Newspapers — or forerunners thereof — have yet to be invented.
2050 - No news is good news. Well, not really. Newspapers — and derivatives thereof — are moot. There's nothing to report in a world facilitated by Phrenicea.

1700 - Food is grown locally and is consumed locally. Long-distance shipping not possible.
2050 - Food is grown locally and is consumed locally. Long-distance shipping economically infeasible.

2500 (B.C.!) - A person's financial worth is on their person. Coin, banks and financial institutions were yet to be invented.
2050 - A person's financial worth is on their person (DNA!). Coin, banks and financial institutions not needed with Phrenicea.

3,500,000,000 years ago - The most significant form of life on Earth at the time, arguably the simplest life form, prokaryotes, reproduce asexually.
2050 - The most significant form of life on Earth at the time, arguably the most complex life form, humans, reproduce asexually.

1837 - One computer — the first — exists in the world; the mechanical Analytical Engine invented by Charles Babbage.
2050 - One computer — the last — exists in the world; the biological brain-based "braincomb," aka Phrenicea.

*****

You are invited to explore the Phrenicea site to better understand the scenario touched upon above. Many of us assume that the tremendous change witnessed in the 20th century, and especially in the 1990s with the Internet/Web phenomenon, will continue throughout the 21st century to further complicate and speed up our lives.

It will get worse, and it will become unbearable. The loss of privacy, proliferation of electronic gadgets as well as technology-enabled violence and environmental factors will give rise to conditions facilitating the infiltration of Phrenicea into our lives and culture.

Full Circle

Time will tell...

posted by John Herman 5:27 AM

Tuesday October 1, 2013

Oops!

Envisioning the future is not an easy endeavor. For one brave (or crazy!) enough to attempt it, he or she might end up looking like a genius if correct (hopefully during their lifetime) — or a colossal fool (preferably posthumously) if proven wrong.

Of course we at Phrenicea hope for the former, but we are not afraid to stick our necks out and put ourselves at risk for the latter.

To ensure an adequate air of skepticism is maintained when viewing this site, we will present here the first of what we hope will be many predictions gone wrong — very wrong.



THE BIG CAR: End of the Affair
"Lately, there have been multiplying signs that the long American romance with the big car may finally be ending.

"A small, unostentatious car will be the workhorse for commuting and shopping. Car pooling will have to increase, despite massive psychological resistance to it. The one-occupant-per-car habit is simply too expensive to be continued. The heavy car will linger as a limited-purpose, special-use auto, but not again become the basic American vehicle."
TIME magazine, December 31, 1973



Aerocar! Flying Cars are Here!
"You can buy an Aerocar today and become a pioneer in one of tomorrow's most exciting forms of travel. Cost of the first few vehicles is high, but watch cost drop as production climbs!"

Motor Trend magazine, December 1951

Ford's EDSEL! "The EDSEL LOOK is here to stay —
— and the 1959 cars will prove it!"

- EDSEL sales brochure

On the Beatles:
"The only thing different is the hair, as far as I can see. I give them a year."

- Ray Bloch, musical director, "The Ed Sullivan Show"

In the comics:

Non Sequitur © Wiley Miller. Dist. by uclick. Reprinted with permission. All rights reserved.

GM's Futurama! Welcome to Futurama!
"Welcome to a journey into the future — a journey for everyone today, into the everywhere of tomorrow..."

General Motors' futuristic pavilion — Futurama — at the 1939 New York World's Fair, was a huge hit. So when New York again hosted the fair in 1964, GM went all out with the fabulous Futurama II exhibit, arguably the best attraction at the fair.

For those lucky enough to experience the ride, it showcased a utopian-like vision of the future using incredibly detailed, miniature working models. The vision was based not on Man's good will or worldwide political harmony, but on omnipotent technology.

Here's a sampling of fanciful prose from GM's vivid brochure shown here:

  • "A dynamic new spirit enlivens the "City of Tomorrow." People, vehicles, goods — even some of the sidewalks — are moving. The city, cured of its traffic ills is reborn..."
  • "Man turns ice-clad Antarctica into a global weather center and probes its frozen wastes for resources needed by a growing world population."
  • "...[T]urbine-driven environmental cars, promise new heights of comfort and convenience in automatic motoring."
  • "On earth, automatic highways pierce the mountain barriers and turn the highlands into holiday hideaways. Resort hotels beneath the sea serve a new vacation playground."
Now that your appetite for "all things future" has been technologically whetted, take a vicarious trip through the Futurama exhibit by clicking here! Read the entire transcript of the audio narration accompanying the ride. Listen to a recording of an actual ride through the exhibition hall. Let your imagination take you to a world that was supposed to be here by now!



Atomic Cars! Atomic Power! — in your car...
"Some day, the atom, or one of its currently little known by-products, may even drive your own personal car! Does the idea sound fantastic?

"Many scientists predict that nuclear fission will be the source of power which drives all productive tools in the auto factory of the future. There is no doubt in the minds of those engaged in atomic research today...

"A professor of mechanical engineering at the University of Michigan, who asks to remain nameless, believes that some day atomic cars will be commonplace. In fact the illustration on the cover of this issue of Motor Trend is based upon his vision of such vehicles. To him, the fact that horses and buggies gave way to gasoline-powered cars during his lifetime is proof enough that atomic cars are in America's future.

"The professor dreams of a day when each automobile customer will receive a 'bound box of fissionable stuff together with pills that will create enough energy to last the life of the car.'

"So, on this April day of 1951, who is to say whether an atom car is desirable, practical, or feasible? A few decades ago who would have said that the automobile was feasible?"
Motor Trend magazine, April 1951


HAL Computer! 2001's HAL Computer!
"I am a HAL 9000 computer, production number three. I became operational at the HAL plant in Urbana, Illinois, on January 12th, 1997"

Arthur C. Clarke, 2001: A Space Odyssey

Way back in 1968, all sorts of futuristic scenarios were painted with a broad brush of technological optimism. 2001: A Space Odyssey was sophisticated science fiction based on the predictions of respected scientists. Computers with human-like abilities did not seem out of reach.

Today's scientists have become humbled, still unable to create a computer that can do anything more than process preprogrammed instructions at lightning speed. Declares North Carolina State University's Robert Rodman:

"HAL's level of achievement has not been attained. To make a computer intelligent enough to pick up the real complexities of human speech... that's going to be pretty hard to do. I'm not sure it'll ever be possible."


Flying Cars! Here Come Cars Without Wheels!
"You'll ride low and fast on a bubble of compressed air, in fantastic new 'sleds.' They whoosh across fields, swamp, water — anywhere — at speeds that could match airplanes."

"Imagine a car without wheels, skimming four feet off the ground across roadless terrain. They slide on compressed air blasted down by a propeller. Freed from friction with the ground underneath, they zoom forward at airplane-like speeds.

"They threaten to turn transportation inside-out, giving you a sports car, speedboat, half-ton truck and back-pack helicopter all rolled into one. There is no restriction on speed. The designers are thinking about 150 mph over water, 500 mph over land.

"And the big thing: It should be cheap. A breakthrough like this could shake up living patterns here and everywhere else. Any stream or stretch of open country becomes a highway, any beach a harbor. Lonely farm families will get to town, winter and summer, as easily as if they lived alongside the freeway. Commercially, the deep-water harbors that are great port cities' richest assets would be obsolete."
Popular Science magazine, July 1959


Deep Doo-Doo!
"Some apocryphal Victorian, so the story goes, looked at the rate at which the number of horses on city streets was increasing and assured his peers that their capital would soon be knee-deep in horse manure. He got it wrong, largely because he failed to predict the imminent rise of the automobile. That brought its own problems, of course, but the point was that Victorians were blindsided by the future — which, as any would-be Cassandra soon learns, is seldom what it appears to be."
TIME magazine, April 2000

A Cure for Cancer!
James D. Watson

Even a famous biology legend and Nobel laureate, often referred to as the father of modern genetics, can become the subject of controversy upon trying to predict the future:
"Potential therapies called angiogenesis-inhibitors hit the news in a big way [in May of 1998] when DNA-codiscoverer James D. Watson predicted that their inventor, Judah Folkman, would 'cure cancer in two years.'
Alas, cancer is still with us."
Scientific American magazine, December 2001

So why is everyone still working so hard?
Nobel laureate and artificial intelligence expert Herbert A. Simon (1916-2001) of Carnegie Mellon University, winner of the A.M. Turing Award, the National Medal of Science and many other honors for research ranging from computer science and cognitive psychology to administration and economics, and considered a founder of the field of artificial intelligence, declared in 1965:
"[By 1985], machines will be capable of doing any work Man can do."
*****

Gee, this kind of arrogant optimism sounds familiar. Today's inflated predictions center around computer technology. To learn about one of Simon's most galling contemporaries, Ray Kurzweil, visit "Chiphead." History does repeat itself!

Time will tell...

posted by John Herman 8:23 AM

Sunday September 1, 2013

The T-shirt and the Future!

Summer is the season for T-shirts and we Phreniceanados have donned ours almost daily. Particularly significant this year — the now-universal symbol of leisure celebrated its centennial. As reported in Forbes, "The iconic T-shirt was born in 1913 when the U.S. Navy issued crewnecks for soldiers to wear under their uniforms." We're feeling jubilant too, given the unusual symbolic nature of our shirt — representing a prophecy or vision of the 21st century. Unlike most T-shirts with a message, ours suggests something that does not yet exist!

The Phrenicea T's screen-printed logo depicts a fictionalized account of what will occur in the first century of the new millennium. The name Phrenicea is coined from the words phrenic (of the mind) and panacea (cure all). Pronounced fren-EEE-shuh — our shirts instigate grimaces and quizzical, under-the-breath phonetic attempts at pronunciation — including fren-I-kee-ya, fren-i-SEE-ya, and more. Then the inevitable question typically follows, "So what's a Phrenicea?"

Phrenicea is a scenario — an extrapolation based on current trends including advances in biotechnology, the Internet/web phenomenon, and the consequent acceleration of social evolution.
'In many ways life ends up being, both socially and technologically, more like the primitive societies of yore.'

Today, many continue to cling to the stereotypical future as depicted in the Jetsons cartoon, where technology reigns supreme, especially in the form of gadgets galore. Phrenicea turns that upside down. Gone are all the gadgets. Even the Internet and web are gone — replaced with Phrenicea. Thus, in many ways life ends up being, both socially
'Phrenicea became the central source of knowledge for the human race, accessible by mere thought.'
and technologically, more like the primitive societies of yore.

Phrenicea is (initially) a consortium imagined to be formed in the year 2014. It was the ultimate consequence of the explosion in brain research, led at first by the world's pharmaceutical companies in pursuit of new drugs to ameliorate or cure brain disorders, particularly Alzheimer's and the like. The promise of enormous profits then ensnared tech companies (search engine firms and chip manufacturers in particular), biotech, nanotech and even entertainment companies, and ultimately just about any entity with enough resources to pursue conquering the (practically speaking) final frontier: the workings of the human brain.

The optimism was contagious, as was the belief that the secrets of the human mind were at last going to be unveiled. But even with the vast financial and technical resources expended, the brain's inner machinations proved too elusive to elucidate.

What was successfully developed however was a working interface with the human brain. This soon was followed by the ability of individuals — triggered by mere thought — to connect and communicate with ("engage") a complex of interlinked brains that were "donated" by gullibles seeking eternal consciousness.

These in vitro brains were kept alive artificially within hexagonal-shaped capsules — each becoming a contributing member (knowledge, experience, perspective, etc.) in a gigantic beehive-like chamber fittingly called the "braincomb."

Surprisingly there wasn't a shortage of volunteers. But who would donate their brains upon death? Baby Boomers of course! As the weight of years shattered their illusion of youth despite their famous moniker, they at last accepted the reality of mortality, consoled by the promise of foreverness.

The confluence of the various technologies synergistically enabled the transmission and storage of massive amounts of data related to human discourse. Just as the early 21st-century Internet made the physical collection of books, music, and other "hard copy" moot — Phrenicea rendered "life's hard copy," the physical and tactile components of daily living, irrelevant.

The capabilities of Phrenicea rapidly became so powerful that there was no longer a need to produce most of the gadgets upon which modern society depended. Suddenly the rage was to "engage," the popular term used to describe a mental hook-up with Phrenicea.
'Phrenicea rendered the physical and tactile components of daily living irrelevant.'
Phrenicea became the central source of knowledge for the human race, accessible not via computer or other device, but by mere thought — indistinguishable from what was once simply "thinking." The popular business management lingo "thinking outside the box," was prescient beyond anyone's wildest dreams!

As with most trends, which often reach a peak before entering a period of decline, the distributed nature of the Internet (where information reached its most dispersed form, spanning millions of server computers throughout the world) fostered redundancy, inaccuracy, and unseemly (in the literal sense) waste.

In an about-face, the final phase of the "Information Revolution" entailed the
'The final phase of the "Information Revolution" is Phrenicea.'
replacement of the information hodgepodge that grew willy-nilly for decades, with a centralized data entity none other than Phrenicea. Now data validity and wholesomeness could be ensured.

Phrenicea eventually became more important and powerful than government and corporations combined. Those two powerful entities, like the goods or services they had once provided — were now irrelevant.

*****

One hundred years ago it was never fathomed that the T-shirt would become a global force in personal communication, or that the one with Phrenicea emblazoned on its front would come to represent a vision of the future. Now, to celebrate the centennial, we're giving away a quality Phrenicea T-shirt!

Enter today and you may be a winner!

Time will tell...

posted by John Herman 7:16 AM

Thursday August 1, 2013

Our Chromosomal Blueprint

Modern history suggests (confirms?) that we humans have inherently huge egos and repeatedly have strived to outdo or outsmart Nature. The "benefit" of our endeavors is usually apparent near term; however, it is the unforeseen consequences that unfortunately impact us in the long term.

Our arrogance can be categorized into different "ages," each defined by our new capabilities at the time. We've defined the beginning of a particular age by explosive, exponential accomplishment and impact affecting socio-economic behavior accompanied by a feeling of wonderment. We define the end of an age when its influence becomes transparent — leading to nonchalance or apathy.

The following will identify six distinct ages, all of which depict a fraction of humankind's drive and desire to achieve and to command our natural world. The last two, we purport, will lead to a scenario that we call Phrenicea.


The Age of Engineering defines the era of frenzied building and construction of incredible structures planned or in progress in the late 19th through the first half of the 20th century — using advanced engineering skill and labor saving tools and machines that magnified man's physical capabilities to incredible heights.

The most magnificent evidence marking this period includes:
  • bridges — Brooklyn, George Washington, Golden Gate, Quebec Railway, Sydney Harbor, Firth of Forth
  • dams — Hoover, Aswan
  • skyscrapers — Empire State, Chrysler and Woolworth buildings
  • canals — Panama, Albert
  • tunnels — Simplon, Apennine
  • super-highways — U.S.'s Interstate, Germany's Autobahn
  • the consequences — crowded cities, suburbs, bedroom communities, traffic jams, unnatural landscapes, social isolation, pollution


The Chemical Age was at its peak mid-20th-century. Organic and inorganic chemists were brewing all sorts of chemical combinations not found in nature. There was much optimism that humankind would improve upon nature's compounds without consequence.

Significant results of this period include:
  • synthetics — DDT, nylon, PVC, synthetic rubber, PCBs, cyclamate, Velveeta®, trans fatty acids, Tang®
  • "miracle" drugs — thalidomide, DES
  • the consequences — "progress," cancer, atherosclerosis, pollution, short hemlines, indigestion


The Atomic Age cracked (literally) the secrets of the atom to unleash energy from matter. Atomic power was envisioned to power cars, homes and rockets.

Obvious results of this period include:
  • the atomic bomb — the end of WWII
  • the Cold War — four decades of massive weapons build-up and proliferation
  • nuclear reactors — cheap, "safe" electricity generation; stealthy submarines
  • the consequences — gruesome deaths, awesome fears, national machismo, duck and cover, bomb shelters, radioactive waste, yet-to-be-seen horrors


The Space Age, which began in the 1950s and quickly peaked in the '60s, was perhaps the next logical frontier to tackle, given the knowledge garnered after ostensibly conquering earth.

Dubious results of this period include:
  • footprints on the moon — "One small step for [a] man..."
  • satellites — instant communications, surveillance, navigation, weather tracking
  • ballistic missiles — long-range destructive potential
  • the consequences — anti-ballistic missiles, NASA, zillions of TV channels, cell phone mania, pagers, weather "forecasting," space junk, Tang®


The Computer/Digital Age will soon begin its inevitable decline. The 1990s saw an explosive growth with the proliferation of PCs, the Internet, the web, email usage — and dot-com mania. Although technical innovation will continue at breakneck speed, its impact will be merely evolutionary. Just as the light bulb, telephone, radio and TV became part of the fabric of daily living after the initial pulse of massive change, computer technology will be taken for granted too.

Evident results of this period include:
  • PC infiltration into homes, schools and businesses — and laps
  • Internet/web phenomenon
  • high-tech gadgets; computer control of just about everything
  • the dot-com bubble and crash
  • the consequences — individual empowerment, leveling of the global playing field, 24/7, unemployment, loss of privacy, eBay, Amazon, Google, spam, and...
    Phrenicea - the website.


Although the structure of DNA was discovered by Watson and Crick in 1953, and followed by decades of significant research and progress, the DNA Age is only now emerging. There will soon be an explosion of knowledge with practical application that will change our lives with all of the magnitude of past ages.

Predicted results of this period include:
  • proliferation of human cloning
  • man-made DNA for non-biological functions — such as portfolio management and money transfer
  • species tinkering for behavioral modification. Sample consequences: litter boxes for dogs, non-flip-up toilet seats
  • species tinkering for aesthetic enhancement — accelerated "breeding" of just about any species
  • creation of totally new biological life forms. At first just to do it; next to serve specific functions.
  • the consequences — moral and ethical chaos, near-extinction of the human race, and...
    Phrenicea - the entity.


Epilogue:

Our chromosomal blueprint evolved to create a primitive footprint, then capable handprints and finally dexterous fingerprints. Our accomplishments are impressive, arguably improving "the standard of living" for many. Still — one is hopeful that we'll all not ultimately end up out-of-print.

Time will tell...

posted by John Herman 6:03 AM

Monday July 1, 2013

Water On the Brain Syndrome

Now that the lazy, hazy, crazy days of summer are upon us, it's a good time to use the sultry weather as an opportunity to revisit yet again our feverish condition of "water on the brain." Way back in 2001 our "H2Ouch!" page began recommending the following:
"Pretend you were to pay $1/gallon the next time you take a shower or bath, brush your teeth, flush a toilet, wash the dishes, or God forbid — water the lawn! Begin to use less water than the average person. Set an example. Prevent H2Ouch!"

We still believe this is good advice, but perhaps the "hyperhydro" proposal was and still is naïve. The problem is that there is little incentive to conserve fresh water from the tap — given its ridiculously low price. For example, I recently received my "Annual Water Supply Report" from my local water company and was dismayed at how little fresh water costs. Here's the breakdown:

Quarterly Water Rates — Residential

Consumption (gallons) and charges:
  • Up to 8,000: $10.00 minimum
  • 8,001 - 58,000: $0.90 / thousand gallons
  • 58,001 - 100,000: $1.15 / thousand gallons
  • Over 100,000: $1.40 / thousand gallons

A dollar for 1000 gallons of clean, fresh tap water? That's insane! By comparison, bottled water by the gallon costs about $1.99. Not bad, but that's $1990 for 1000 gallons. Why is there such a cost disparity with tap water? How can anyone be motivated to conserve water at these low rates,
Wouldn't it be prudent to build "aqua equity" for future generations?
other than via a guilty conscience? And let's face it; there aren't many turning on their taps laden with guilt. (If the water companies got savvy they'd upmarket their image with exotic brand names, pricing and refillable bottles with fancy labels adding cachet to their product. Imagine having bragging rights to elite-sounding potable water! It's not that silly a suggestion, since that is essentially what Coca-Cola did with Dasani and PepsiCo with Aquafina. They're both filtered municipal tap water.)
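To make the disparity concrete, the quoted residential tiers can be sketched in a few lines of Python. This is a minimal illustration, not the water company's actual billing logic; in particular, the assumption that the $10.00 minimum covers the first 8,000 gallons and that each higher tier is billed marginally per thousand gallons is my reading of the rate card.

```python
# Hypothetical quarterly bill implied by the quoted residential rates.
# Assumption: the $10.00 minimum covers the first 8,000 gallons, and each
# higher tier is charged marginally, per thousand gallons, above that.
TIERS = [  # (upper bound of tier in gallons, rate per thousand gallons)
    (58_000, 0.90),
    (100_000, 1.15),
    (float("inf"), 1.40),
]

def quarterly_bill(gallons: float) -> float:
    total = 10.00          # minimum charge, covering the first 8,000 gallons
    lower = 8_000
    for upper, rate in TIERS:
        if gallons <= lower:
            break
        billable = min(gallons, upper) - lower  # gallons falling in this tier
        total += billable / 1_000 * rate
        lower = upper
    return round(total, 2)

print(quarterly_bill(8_000))    # prints 10.0 (the minimum)
print(quarterly_bill(100_000))  # prints 103.3
```

Even a profligate 100,000 gallons a quarter costs only about $103 under these rates, versus roughly $199,000 for the same volume of bottled water at $1.99 a gallon. Hard to see a guilty conscience competing with that.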

Actually what we really need is a "watershed moment": a trickle-down epiphany to appreciate how finite and precious our water supply is. The first step should be to make users conscious of their water consumption — and that can be accomplished handily by raising the price per gallon and using a more dramatic cost gradient for excessive use. It sounds crazy, but those concerned about conservation should lobby for pricing increases.

Another way to raise awareness might be to move our water meters out from their usual obscure locations into full view in kitchens and bathrooms — fitted with big, red digital read-outs displaying gallons used in real time. Education on where our water comes from and how it's treated,
We should instill among ourselves an idiom-cum-mantra to "spend water like it's money."
stored, delivered and renewed would also serve to engender an appreciation of what is the major constituent of all living things.

There's an old saying attributed to spendthrifts that says they "spend money like it's water." Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." Since we all tend to waste water and take its abundance for granted — it even unintentionally spills over into the comics:



© 2007 Baby Blues Partnership. Reprinted with Special Permission of King Features Syndicate.

Did Zoe and Hammie's tub really have to be filled to the brim?

*****

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 8:13 AM

Friday June 7, 2013

One Student's Epiphany

Many of today's high school and college students often wonder (vociferously!) why they need to memorize boring equations, formulae and other seemingly trivial or useless information.

What is often not emphasized by their teachers is the origin and significance of these man-made expressions, which essentially describe the workings of nature. They're not taught that many of these discoveries required lifetimes of effort — often by iconoclasts, eccentrics, heretics and recluses willing to shed lots of sweat and probably tears in order to solve nature's mysteries.

Still, many students past and present have stumbled upon these truths on their own, often with epiphanic delight.

Below is one finally-getting-serious college student's "Epiphany" written way back in 1971, stripped bare with numerous misspellings illustrating a misspent youth, yet with genuine astonishment that this seemingly simple realization took so long to gel. It was handwritten, pen to paper, and found in a musty old box after four decades. (Today's student might blog such a personal thought — with little chance of rediscovery years hence.)

A message to those who are in the same plight as I:

If you question the ways of the sciences — concepts — rediculous [sic] equations — symbols etc and become completely fatalistic toward them — think back a moment to your forefathers who devised these methods.

These are just building blocks to understanding. Just as you need tools to produce a manual task — tools are essential in building knowledge.

Nature does what is does without any influences (until recently however). Man has not and will never harness nature by merely understanding its processies [sic]. This form of study makes use of abstract concepts to make understanding less tedius [sic] and to standardize the methods of expressing our understanding of them possible and eliminate a caotic [sic] consequence.

This must be remembered if excelence [sic] in any science is achieved.

Written by J. Herman circa 1971

Perhaps someday education will go beyond mere memorization and copying teachers by rote to include a real appreciation of our current state of knowledge — knowledge that allows us to not only understand the workings of nature, but to leverage and alter them for our benefit, as well as our peril.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 12:39 PM

Wednesday May 1, 2013

Then and Now

Time is fascinating when you take the time to think about it. We tend to consider time a man-made commodity as if we created it — when in fact all we really do is measure it. Fooling ourselves in the process, we parse time into nanoseconds, microseconds, seconds, minutes, days, weeks, months, years, decades, centuries, millennia, periods, ages, eras, eons, epochs.

As we're measuring we can have a great time; a sad time; a boring time; a wonderful time; even an exceptional time. We can also waste time, spend time, give up time, give back time, and squander time.

Our perception of time can vary. It can go fast; go slow; even stand still — like watching a DVD with a remote. And at times it may seem to repeat itself as Yogi Berra famously observed, "This is like déjà vu all over again." Unfortunately time cannot be rewound or made to go backwards. (Why is that?)

The musically gifted write songs about time: Turn Back the Hands of Time; Time Is On My Side; Time Passages; Does Anybody Really Know What Time It Is?; Till the End of Time; and As Time Goes By are just a few.

We also have colloquial phrases like "It's about time," "Time is precious," and "Time for change." Some are heard more often than others. (The rhetorical promises of the last U.S. presidential campaign come to mind.)

We can use time to describe ourselves and others. We can be Big-time. They are only Small-time.

We can even use time to punish: "You're in time-out!" and "25 years to life!"

We can live our lives according to time, as in "Early to bed and early to rise." Most strive to live for the moment. Many yearn to go back to the old days. Others impatiently anticipate the future.

Time can be ahead of us or behind us, which becomes more important for each of us as time goes by.

The age of time is very old. Physicists tell us it's been around since the birth of the universe about 15 billion years ago. (This used to sound like a big number before all of the recent financial bailouts.) During our relatively brief time here on earth we learn that some things disappear, new things arise, and some things stay relatively the same.

An interesting activity is compiling "Then and Now" images to visibly see how things have changed — noticeably and in some cases so much so as to almost revert back to the way they were.

Here are some examples:

The more things change...
Zenith Console Bolla Console
TV in a Console (1964) TV on a Console (2009)
...the more they stay the same.
The more things change...
TDK ad iPod ad
Johnny in 1973 Johnni today
...the more they DON'T stay the same.
Gripping advertising...
Armstrong ad Pirelli ad
Hawking tires (1965) Hawking tires (2009)
...never loses traction.
Some things thankfully never change...
Old Coke New Coke
Then and then (1968) Now
...at least on the inside.
In-flight departures from bias and stereotyping...
United ad Qatar ad
"How come our girls are so capable?"
(1951)
"Best cabin crew in the Middle East."
...thankfully arrived.
More great hits and stars! By snail mail!
Columbia ad iTunes
Wow! A record club! (1968) Wow! No record club!
Even more great hits and stars! Instantly!
Envisioning push buttons in 1959...
Western Electric ad iPhone ad
Then, then and 1959's future Now
...remembering push buttons today.
Recycling sheet metal...
Toyota 2000GT Pontiac GXP
1968 Toyota 2000GT 2009 Pontiac GXP
... in the figurative sense.

When you take the time to think about it, time really is fascinating. After seeing Johnny with his TDK tape cassettes, the stewardesses enduring insipid questions, and push-button phones once defining our future — would anyone really want to go backwards in time?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 12:04 PM

Monday April 1, 2013

Offshore = On the Beach?

A recent piece in the New York Times addressed the dismal state of the U.S. job market:
...The jobs gap, the number of jobs necessary to return to the 2007 peak and cover the growth in the labor force since then, is stuck around 11 million. The labor market is still far from full recovery, with a tremendous waste of human talent and a personal toll on unemployed workers and their families.

This year is likely to be more of the same, as the deal on the fiscal cliff — the American Taxpayer Relief Act — will take about 0.4 to 0.6 percent off the economy’s growth rate. Additional cuts in government spending later this year, above those already emanating from the cap on discretionary spending, would further restrain job creation. Proven policies to increase aggregate spending and near-term job growth, like the continuation of payroll tax relief and infrastructure investment, appear to be off the table. That’s a mistake, because weak demand and slow growth of gross domestic product are the primary factors behind the tepid pace of job creation.

Despite anecdotes about how employers cannot find workers with the skills they need, there is little evidence that the unemployment rate remains elevated because of mismatches between the skill requirements of available jobs and the skills of the unemployed.

Meanwhile Anne Michaud, editor for New York's Newsday observes:
The U.S. unemployment rate remains stubbornly stuck just below 8 percent... Normally, you'd think that more workers would be required to fuel companies' growth. Instead, a lot of employers are "doing more with less" — a phrase for our time. In many jobs, technological advances allow people to produce more in the same amount of time, so fewer people are needed.

But another factor is that those fortunate enough to have jobs are working longer hours. Everyone seems to have a story about someone who's taken over the jobs of three people, or is answering work emails from 6 a.m. to 11 p.m. In Europe, the average employee works 1,625 hours annually; in the United States, it's 1,797.

These articles recall an editorial letter I submitted to Fortune magazine that was published almost 20 years ago (October 17, 1994) with a pull-quote that now appears timeless:

The corporate ladder has already taken the form of a horizontal plank.
The trends back then that instigated my letter included downsizing, reengineering and outsourcing. Today we have the latest buzzword "offshoring" and its deceptively benign-sounding synonym "globalization" added to the mix.

The pursuit of these strategies industry by industry is almost inevitable given the perception of a competitive disadvantage otherwise. Downsizing eliminates jobs usually with the known consequence of harsher working conditions for the survivors and often with a concomitant reduction in the quality of product or service — a reality which unfortunately we've grown accustomed to. Reengineering eliminates jobs primarily
How many more employees and job seekers will find their career path is a horizontal plank?
with the use of (or more efficient use of) technology. (The frustration of menu-driven automated voice answering systems immediately comes to mind.) Outsourcing is the process of transferring job functions to external entities more specialized and usually more efficient, again reducing jobs. Now offshoring is the newest darling of corporate management. It is similar to outsourcing, only the jobs being replaced end up outside national borders in order to pay well below market — yielding huge cost savings for ever more companies. Regardless of the latest buzzword that's in vogue however, the overall impact is the permanent loss of jobs.

A question that should be asked now after more than two decades of job cuts is, Where will this all end?

Will there be new faddish, jargon-laced management-speak still to come whose consequence will reduce jobs and salaries even more — and further level the highest standards of living with the poorest on earth? How many more employees with years of dedicated service, along with new job seekers with expensive academic degrees, will yet find that their career path is ultimately a horizontal plank?

The Phrenicea vision of the future predicts that there will not be enough work to go around for a worldwide population that is essentially overqualified. Perhaps this grim scenario is not too farfetched after all.

Time will tell...

Do you believe the corporate ladder has already taken the form of a horizontal plank?

Presented as a "Timely Yet Timeless" post by John Herman 8:17 AM

Friday March 1, 2013

Cubicle Dwelling? Not Yet...

Now is as good a time as any to assess whether the world is catching up with the now fourteen-year-old Phrenicea scenario.

So, is it?

Well, not really. We're not yet donating our brains to the Phrenicea braincomb. Money, pets, and newspapers are still around. Cars have not been banned, although they're getting very expensive to operate. Human cloning hasn't replaced procreation, but at this point we probably wouldn't be too surprised to hear of a successful attempt. And no, we're not living in cubicles — yet.

Nevertheless, it probably could be said that the Phrenicea scenario today is perceived as a bit less bizarre than when it was unveiled way back in May, 1999. Of course this conclusion is based on objective data — that being the volume of visitor ranting via email. The hysterical rants have been on a steady decline, although the critical feedback we get still centers predominantly on the "ridiculousness" of the Phrenicea scenario as presented on the website.

A very (un)popular idea continues to be that of people living in cubicles. Granted, compared with today's mobile population, it does seem unbelievable. But we are becoming increasingly isolated from fellow human beings in an imperceptibly incremental fashion.

Right under our noses, there's been a gradual reduction of human interaction fostered by technology. It began way back in the 1950s with TV and the sprouting of the first "couch potatoes." After TV's novelty wore off, many found it preferable to just stay indoors than to socialize or pursue physical activity.

As technology marched on, the need to interface with real people declined even further. Some examples:

  • the elimination of personalized attention at self-service gas stations, supermarkets, home centers, etc.
  • answering machines and voicemail replacing conversation; business conducted via telephone tag
  • live customer service reps replaced by impersonal automated systems with annoying nested menus and digitized voices
  • DJ radio personalities supplanted by computer-programmed "Jack" and similar formats; essentially mechanized music shuffling
  • impersonal email, instant messaging and phone-based texting replacing the spoken word
  • iPod zombies existing in their own little worlds.

Imagine: strangers a century ago would actually greet each other with a "Hello!" and then strike up a conversation. Today, with near-total detachment from fellow humans the norm, that "strike" is ever more likely to be a violent perpetration.

If the rate and course of technological change continue (and indeed they will), our cubicle scenario of the future will prove to be, at minimum, a metaphor for the social isolation that will ultimately result.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:56 AM

Friday February 1, 2013

Stoplight HindSight

While stopped at a traffic light behind an old Ford Windstar minivan, I snickered at the chrome appliqué announcing in bold letters that it was equipped with "Traction Control." Wow, imagine that! [Photo: Windstar rear]

Have we come that far in auto technology that it's laughable now to think that Ford would deem the now mundane TC worthy of rear-end vanity? But Ford is not alone in silly trunk bunk. Through the years I've studied many a car's derrière and have seen banality forged in chrome about:

  • engines — V-6, V-8, Hemi, Rotary, Turbo, Fuel Injection, Tri-Carb, Quadrajet, OHC
  • transmissions — Powerglide, Hydramatic, Automatic, 5-Speed
  • drivetrains — FWD, AWD, 4wd, 4 Matic, QuadraDrive, ABS
Then there are meaningless tags like GT, Touring, Limited, Unlimited and the dated Deluxe, Super and Custom.

In the good ol' days, most vehicles were simply named and branded eponymously (Chrysler, Ford, Toyota (Toyoda), Studebaker, Olds(mobile), Cord, Ghia, Dodge, Chevrolet, Buick, Mercedes-Benz, Hudson, Duesenberg, Tucker, Kaiser-Fraser, etc.). Specific models were typically distinguished with letters or numbers (Model T, Model 55, Series D, etc.).

Eventually, with the proliferation of models within brands, cars were named after:

  • animals — Impala, Mustang, Falcon, Tiburon, Stag, Bronco, Ram, Cobra, Barracuda, Lark, Rabbit, Jaguar, Colt, Eagle, Cougar, Stingray, Charger, Hornet, Beetle, Pinto, Hawk
  • gods and mythology — Mercury, Dragon, Titan, Fury, Demon
  • wind — Zephyr, Scirocco, Tempest
  • geographic places — Monte Carlo, Ventura, Sedona, Eldorado, Tucson, Lucerne, Bonneville, Plymouth, Capri, Windsor, Seville, LeMans, Manhattan, Monterey, Belvedere, Continental, Fleetwood, Monaco, Calais, Riviera, Biscayne, Park Avenue, Malibu, Bel Air, Catalina
  • macho types and legends — Cadillac, Chieftain, DeSoto, Ranger, Matador, Rebel, Lancer, Commodore, Valiant, Champion, Maverick, Challenger, Intrepid
  • royalty — Ambassador, Regal, Royal, Tudor, Signet, President, Victoria, Crown, Imperial, Coronet
  • weapons — Cutlass, Dart, Torpedo, Javelin, Armada, Excalibur, Arrow, Laser, Magnum
As brands and models continued to proliferate — and necessity being the mother of invention — newly coined words would, with clever marketing, define the vehicle's image (Corvette, Galaxie, Electra, Chevelle, Polara, Camaro, Futura, Altima, Celica, Toronado, Jetta, Camry, Forenza, Corvair, Invicta, Sportage, Mystere, Impreza, Sentra, Boxster, Acura, Lexus, etc.).

Mercedes-Benz resisted this appellation temptation and continued with letters and numbers to distinguish its models. In the mid-1950s, its SL two-seater sports car debuted, the initials designating "sport light." A few years later came the SEL, denoting an S-Class car with fuel injection (Einspritzung) and a long wheelbase. (And just to add to the confusion, the "S" in SEL was not the same as the "S" in SL.)


American cars toyed with the idea in the 1960s with the Ford XL and LTD, Chevy SS, Javelin SST and AMX, Dodge GTS, Plymouth GTX, Barracuda S, Cougar XR7 and Pontiac GTO.

As Mercedes' status and prestige spread worldwide, other manufacturers dove into the alphabet soup. (In 1996 Acura incredibly renamed its flagship Legend to RL, discarding a decade's worth of valuable brand equity — and sales have been a disappointment since.)

Now it's all the rage to name vehicles with meaningless (guess the brands) letter combinations like DTS, CTS, STS, XLR, SRX, ESV, EXT, TL, TSX, RSX, MDX, FCX, FX, QX, LS, ES, RX, IS, GS, SC, LX, LT, MX, RX, CX, CLK, CLS, CRX, SLR, SLK, SVT, GL, HHR, NSX, XC, XJ, XK, TT, C, A, E, G, Q, M, G, H, R, S... Zzzzzzzzzzzz. In an attempt to stay awake and in the game, Lincoln joined the chaos in 2007 by rebadging its year-old Zephyr to MKZ, followed by the introduction of MKX, MKS and MKT models. And on top of all that, the MKC was just revealed at the Detroit Auto Show and is expected to be added to the model line in 2014. Got all that?
[With acceptable letter combinations going so fast, soon only PU and BO will be left!]

Even more prestigious today is to be able to brandish on a car's hindquarters "Hybrid" or a chrome "H" indicating "this car is electric, gets great mileage and averts global warming." [Photo: Prius rear]

But we'll just have to wait a decade or so to find out — when stopped at some traffic light in the future staring at these no-longer-shiny chrome badges of status — whether today's state-of-the-art green technology will have become ubiquitous and familiar enough to elicit a snicker.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 6:17 AM

Tuesday January 1, 2013

The Twelve Blogs of '12

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last six years' cop-outs to lethargically review the "Twelve Blogs of 2012" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant.

• December's blog warned of the ominous "Attack of the RFIDs!"

The bad scenarios are Big Brother scary. Governments could become Tinkerbells sprinkling RFID dust on unsuspecting individuals for tracking purposes. Inhaled and trapped in the lungs, you'd become an ambulatory transmitter for life... (Read more)

• In November small thoughts about survival on a large scale prevailed:

Miraculous too is to be able to wonder where these almost infinite brain processes will take us next. With the many problems and issues facing us today — worldwide trends pointing in directions both good and ominous — it's really up to us all to focus and channel our thoughts and energies responsibly, so we too can be the Survivors of Everyday Reality... (Read more)

• October's blog exposed the significance of treating angina with Viagra:

The unexpected rise out of Viagra illuminates an important point: Drug synthesis and discovery is not really by design, although most chemists in the industry will not readily admit such. The chemical compounds are created with as much art as science by unnatural means... (Read more)

• In September big thoughts about small-minded behavior prevailed:

And after thousands of years we're still at it. We can make just about anything into a symbol of status. But one person's object of distinction might be another's folly. So we have to be among like-minded people to make an impression. And you can't even take it with you! (Read more)

• In August it was too hot to be serious so we played some games:

When the Phrenicea scenario of the future was first presented on this website twelve years ago, its pages stated that all the world's knowledge was immediately available via engagement with Phrenicea, and that boredom had ensued without the challenge to learn the traditional (aka hard) way with study. Could we already be approaching this point with access to the amazing capabilities of today's search engines? (Read more)

• In July we suffered the recurring affliction of "water on the brain":

Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." We all tend to waste water and take its current abundance for granted.

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations? (Read more)

• June's entry declared, "What we really need now is Common Sense 2.0!"

This is more than silly. To ascribe version numbers to the earth with the overblown illusion that it is now our creation because of our innate arrogance and egocentricity is laughable... (Read more)

• May's blog revisited Dow Chemical's once weird advertising campaign, propagated by a self-indulgent marketing profession bent on impressing peers, that seemed to elevate the marketing effort above the product:

I feel delusionally vindicated looking at their newest ad and wonder if my Two-Cents entry over the ensuing years had some influence on DOW to come back to their senses and roots — acknowledging that they are indeed a "chemical" company... (Read more)

• In April we became more observant of our dependence on the baggage in what appears to be a burgeoning Bag Age:

Does convenience beget stupidity? Does stupidity beget convenience? Or do they complement each other synergistically? I contemplate this now as I find myself inundated with plastic bags carried home from virtually every type of retail store. And the problem seems to be getting worse day-by-day as behavior accommodates their omnipresence and so-called convenience... (Read more)

• March's blog lamented a disturbing trend among chiropractors tempted by franchising to boost profits:

Chiropractors traditionally have spent lifetimes developing successful practices of respectable size, which reflect their character and values. They're now tempted with visions of big profits by wooing masses of clients less sophisticated and discerning — and more receptive to being dazzled with faux technologies and procedures. Secondary is keeping existing patients that may be sharp enough to see through the hype and perceptive enough to sense the tawdry goal of profit...(Read more)

• February's blog lamented how America's Founding Fathers would shudder at how their idealistic vision has blurred:

How did it come to this? Technology! As in sophisticated systems for marketing and polling. As in TV and satellites; (ab)used by both political camps to capture the attention of the harried and hurried, 24/7 connected lifestyles that afford little time to consume more substantive data even if it was presented... (Read more)

• January's entry was tinged with nostalgia and torpor — and guilt for not writing a new essay:

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last five years' cop-outs to lethargically review the "Twelve Blogs of 2011" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant... (Read more).

You may or may not agree, but after reviewing the "Twelve Blogs of 2012" it appears most are still relevant as we roll into the New Year. And somehow after recalling the energy that instigated the blogs to begin with, we don't feel quite as lethargic.

Maybe that's the real benefit of looking backward, reminiscing and singing "Auld Lang Syne" — not for nostalgia per se, but to recharge our batteries and begin another year with renewed energy and enthusiasm.

Time will tell...

Happy New Year!

posted by John Herman 12:05 AM

Saturday December 1, 2012

Attack of the RFIDs!

As you shop 'til you drop this holiday season be on the alert for RFIDs! Pronounced almost like Triffid or aphid, they're neither fictitious sci-fi creatures nor common insects. Yet, you may soon be endangered by or infested with them. Perhaps you have one on your person already, or in your car for electronic toll collection.

RFID is an acronym for Radio Frequency Identification. RFID tags are tiny microchips with antennas that transmit data to receivers: account numbers, physical location, product information, price, color, size, purchase date, etc. (To be technically correct, an RFID tag actually transmits an identifier that links into a database on a computer somewhere; the database stores the information being reported, such as the account number.)

Some of the larger RFIDs are battery-powered ("active" tags), while the tiniest are not; these "passive" tags draw their power from the reader's own radio signal as it requests their information.
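That tag-to-database indirection can be pictured with a toy sketch. Everything here is hypothetical and purely for illustration; the tag IDs, field names and records do not reflect any real RFID system:

```python
# Toy model of the lookup flow: the tag itself carries only a short
# identifier; the reader resolves it against a back-end database.
TAG_DATABASE = {
    "0xA1B2C3": {"item": "sweater", "color": "blue", "purchased": "2012-11-23"},
    "0xD4E5F6": {"item": "toll transponder", "account": "EZ-1234"},
}

def read_tag(tag_id: str) -> dict:
    """Simulate a reader energizing a passive tag and resolving its ID."""
    record = TAG_DATABASE.get(tag_id)
    if record is None:
        return {"tag": tag_id, "status": "unknown"}
    return {"tag": tag_id, "status": "resolved", **record}

# A reader sweeping past a shopper could resolve every tag in range:
for tag in ("0xA1B2C3", "0xFFFFFF"):
    print(read_tag(tag))
```

The design point worth noting: the tag itself stays dumb and cheap, while whoever controls the database controls what the identifier reveals, which is exactly what makes the tracking scenarios below plausible.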

In addition to being on a key chain or stuck on a windshield, radio tags are to be attached or embedded in credit cards, clothing, grocery items, drug bottles, books, magazines, cell phones, computers, currency, tires, passports and even you — beneath your skin for access to buildings and for storing your ID and medical records.

The scary part is that you will become a mobile device beaming all sorts of data to who knows who — through barriers and from as far away as 700 feet. Imagine if your charge cards, clothing and shoes were all transmitting data — even the age and color of your underwear! Inquiring types could identify or profile you and track you everywhere. You could even get hassled to buy replacements when your stuff reaches its expected lifespan. And because RFIDs last about ten years, they can potentially transmit information for purposes well beyond their original intention.

RFID infiltration is already underway. Besides Mobil with its SpeedPass, banks are issuing key chains and credit cards with MasterCard's PayPass that can be used at subway stations, 7-Eleven, McDonald's and movie theaters. Their marketing literature hypes the benefits:

• No need for cash!
• Amazingly quick and easy way to pay!
• Feels like magic!
• Now you can fly through the checkout!
• Be the first to get what's next! [Be the first on your block!]

This sounds great, but because these tags are linked to one of your accounts, the issuers will have the means to eavesdrop and monitor your life — recording where you go and what you buy. There is a concern, too, that thieves will be able to hack your radio tags to make unauthorized purchases.

Not to pick on banks per se; retailers also want RFIDs to replace traditional barcodes to better track their inventory. They'll also be able to scan you coming and going, and perhaps direct you to aisles based on their perception of your needs. [Your underwear is how old?]


And as is typical when technology is involved, things will get even more complicated. Hitachi has developed RFIDs that are so tiny they could fit between ridges of a fingerprint. They're small enough to be referred to as "powder RFIDs."

These powder tags have the potential to identify "trillions and trillions" of items. They'll be incorporated in just about anything you may purchase, as they can be embedded in the packaging. A positive scenario to envision: bagging your groceries as you navigate the aisles of your supermarket. When you've finished, a scanner computes the total of your entire cart as you approach the cashier station.

The bad scenarios are Big Brother scary. Governments could become Tinkerbells sprinkling RFID dust on unsuspecting individuals for tracking purposes. Inhaled and trapped in the lungs, you'd become an ambulatory transmitter for life — a life which might become shortened by this high-tech asbestos.

The Phrenicea scenario of the future envisions the total loss of privacy. Perhaps this is the way it will begin.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:19 AM

Thursday November 1, 2012

Survivors of Everyday Reality

While sitting in my parked car at a neighborhood mega mall parking lot, situated beside a busy intersection leading to a circuitous exit path towards Suburbia Major, it occurred to me that every passing automobile was a survivor — literally — of thousands of split-second decisions of its driver over the course of its operating life.

I could see this firsthand and in real-time as cars approached, stopped, waited (some confident, some sheepish, some blowing-horn impatient), and finally darted into the fray of streaming vehicles with occupants anxious to get to their next destination.

Most of the newer autos had the benefit of being free of the cosmetic consequences of careless or absent-minded human decisions, still resplendent with shiny and dent free flanks and bumpers. Older models reflected their years of life with amazing precision — analogous to the wrinkled skin of an elderly human. Hard lives of stop and go, left and right, high and low, in and out, and forward and reverse were manifested with dents, chips, dings, touch-ups and weathered paint.

Transfixed now watching all this activity, I wondered about the many older cars passing by: Was it dumb luck or the good judgment of the driver that each was still on the road after perhaps a decade or two? (Probably both.) And for a select few it appeared to be that, plus the application of good old-fashioned elbow grease. Fortunate to be the object of a Car Nut’s desire, these pristine, eat-off-the-paint beauties paraded proudly through the queue — recipients of lovingly applied plastic surgery and makeup — turning heads all the way in their battle against entropy.

And to think, all of this orderly chaos is the result of minuscule electrochemical signals traveling within our brains — that amazing gray, convoluted organ sans any visible moving parts. But then looking around at the mall itself — the big box stores, gaudy signage, sculptured pavilion artwork, and the occasional plane flying overhead — all of it was designed, constructed and maintained as a result of countless neurochemical operations. It is truly incredible when you take time to ponder it.

Miraculous too is to be able to wonder where these almost infinite brain processes will take us next. With the many problems and issues facing us today — worldwide trends pointing in directions both good and ominous — it's really up to us all to focus and channel our thoughts and energies responsibly, so we too can be the Survivors of Everyday Reality.

But, will we?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 7:09 AM

Monday October 1, 2012

Workin' the Pharm:
"Ask your doctor about Vesilev, Vesiesta, or Vesi [choose a suffix] • vix • rum • ami • cele • max • gel • ser • iza • vet • avo • cal • zet • via • ara • nex • xol • quel • xyz • luc • lis • tis • tor • itra • ast • tia • ica • gra • iva...!"

We're bombarded nowadays with drug ads in print and on TV, called "direct-to-consumer advertising." Their proliferation is the result of a 1997 FDA rule change allowing pharmaceutical companies to promote drugs without having to elaborate on the negative side effects. A great sales model immediately emerged: recruit consumers into believers, who then cajole their doctors into prescribing the drugs.

Marketing firms are paid big money to create arbitrary (meaningless) names that are then registered as unique trademarks. The guidelines appear to be short names, five to eight letters, constructed from a common set of syllables — perhaps explaining why they all have a "drug-sounding" resonance.

Here's a "short" hawking list compiled from TV and a few magazines:

Celebrex
Amitiza
Viagra
Vytorin
Nexium
Crestor
Plavix
Actonel
Lunesta
Evista
Nasonex
Asmanex
Boniva
Symbicort
Vesicare
Rozerem
Flomax
Caduet
Lipitor
Avodart
Singulair
Januvia
Zetia
Reclast
Levitra
Xyzal
Lucentis
Cialis
Seroquel
Xolegel

Reading the list above for the first time, you'd probably guess that they were drugs. The challenge in creating new names, then, is to find syllable combinations not yet coined. Choosing six syllables from the heading above, born here are Nexquel, Estavix and Vexizet. They sure sound like drugs, and per Google they're not in use. (The futuristic entity Phrenicea was coined analogously, albeit not as arbitrarily, thirteen years ago by combining phrenic and panacea!)
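The syllable-combination game is easy to automate. Here's a minimal sketch; the syllable pools are hypothetical ones modeled on the coinages above, not any naming firm's actual method:

```python
import random

# Hypothetical syllable pools modeled on the coinages in this post;
# real naming firms guard their actual methods.
prefixes = ["nex", "esta", "vexi", "lun", "sero", "plav"]
suffixes = ["quel", "vix", "zet", "ium", "ara", "max"]

def coin_name(rng: random.Random) -> str:
    # Glue one prefix to one suffix and capitalize; the pool sizes keep
    # every result within the five-to-eight-letter guideline noted above.
    return (rng.choice(prefixes) + rng.choice(suffixes)).capitalize()

rng = random.Random(2012)  # seeded for repeatability
candidates = sorted({coin_name(rng) for _ in range(12)})
print(candidates)  # a dozen or fewer "drug-sounding" coinages
```

Any real process would then screen the candidates against trademark registries and existing drug names, much as the post describes doing by hand with Google.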

The intent of pharmaceutical companies is to make their names familiar with unrelenting advertising using memorable jingles like Viva Viagra!, cryptic hints on par with "When the moment is right, you can be ready" and citing identifiable conditions such as acid reflux (Nexium), osteoporosis (Boniva), allergies (Singulair), heart risks (Lipitor), diabetes (Januvia), bladder urges (Vesicare or Flomax), enlarged prostate (Avodart), high cholesterol (Crestor or Zetia), or high blood pressure (Caduet). And of course everyone knows what Viagra purports to cure.

To see how well the drug companies have been able to brand their names into your brain, scan the list above and check off those that are familiar.

How many did you check? Perhaps more than you would have guessed. That's the power of advertising.

Very few new drugs make it to a list like this however. Only one out of ten earns FDA approval after three phases of exacting clinical trials. Most new drugs either have no effect or are harmful.
Another outcome is the unexpected. Many are not aware that Viagra was originally tested to treat angina and by serendipity became famous with a surprising side effect. After the (probable) chuckles during trial testing subsided, the once meaningless v-i-a-g-r-a would become an official entry in Webster's dictionary. Perhaps one day it will even become a genericized trademark like zipper, kleenex, velcro, scotch tape, band-aid, coke and Q-tip.

The unexpected rise out of Viagra illuminates an important point: Drug synthesis and discovery is not really by design, although most chemists in the industry will not readily admit such. The chemical compounds are created with as much art as science by unnatural means, with elaborate apparatus to control physical conditions of temperature, pressure, etc. They're then evaluated via empirics with animals and humans, which is a fancy way of saying they watch for indications (good effects) and reactions (bad effects) and contraindications (bad combinations) with other drugs, chemicals, and nutrients. The entire process can take years or decades.

Because so many of the intricacies of human body chemistry are yet to be learned or explained, oftentimes how a drug works (its pharmacodynamics) is a mystery. The pamphlet insert (which hardly anyone bothers to read) for Aldara cream states, "The mechanism of action is unknown." In layman's terms: "We have no idea how this stuff works."

That explains too why so little is known about the long-term impact of these synthetic concoctions, and why in some cases they have to be pulled from the market due to unforeseen negative complications (Vioxx).

Not ironically, the benign-sounding drug names drummed into us belie all this complexity. The trade name Plavix has the clunky generic name clopidogrel, which pales next to its chemical name:

(+)-(S)-methyl 2-(2-chlorophenyl)-
2-(6,7-dihydrothieno[3,2-c]pyridin-5(4H)-yl)acetate

Imagine putting music and images to clopidogrel, or worse — methyl dihydrothieno pyridin acetate. Would you be as easily swayed to "ask your doctor" as the Plavix advertisements implore?

Here's more:

Trade name (generic name): chemical name

  • Celebrex (celecoxib): 4-[5-(4-methylphenyl)-3-(trifluoromethyl)pyrazol-1-yl]benzenesulfonamide
  • Lunesta (eszopiclone): (5S)-6-(5-chloro-2-pyridinyl)-7-oxo-6,7-dihydro-5H-pyrrolo[3,4-b]pyrazin-5-yl 4-methyl-1-piperazinecarboxylate
  • Levitra (vardenafil): 4-[2-ethoxy-5-(4-ethylpiperazin-1-yl)sulfonyl-phenyl]-9-methyl-7-propyl-3,5,6,8-tetrazabicyclo[4.3.0]nona-3,7,9-trien-2-one
  • Viagra (sildenafil): 1-[4-ethoxy-3-(6,7-dihydro-1-methyl-7-oxo-3-propyl-1H-pyrazolo[4,3-d]pyrimidin-5-yl)phenylsulfonyl]-4-methylpiperazine citrate
  • Boniva (ibandronic acid): [1-hydroxy-3-(methyl-pentyl-amino)-1-phosphono-propyl]phosphonic acid

If you've taken a course in organic chemistry you might comprehend more of this. But then you would better appreciate the complexity of the human body, and the precarious chances taken when ingesting these novel chemicals never before seen in nature; created in laboratories with equipment that defies evolutionary rules and the eons of time needed to be in harmony with biological systems.

Nevertheless, the next time you find yourself "workin' the pharm" by asking your doctor about a drug you saw on TV or in a magazine, don't be afraid to get your hands dirty beforehand by digging up your own proverbial dirt via the Web or elsewhere to learn "what's in a name" — and then muster the vigor to plow a furrow of skepticism in your brow as harvests past have tended to yield unexpected crops.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:23 AM

Saturday September 1, 2012

The Evolution of Status

After many years of blood and sweat, I was fortunate enough in the early 1980s to have the opportunity and means to purchase a house constructed from the ground up. The one-acre-zoned development would supplant what was in the 1950s a working dairy farm, with real cows that I had observed as a young boy. It was an exciting experience watching the rich farmland evolve into livable structures; the genesis of yet another suburban dream — or nightmare, depending on your vantage point.

As the homes were completed and occupied in almost perfect sequential order, the display of owner status in various degrees of ostentatiousness became evident and resembled a stadium crowd performing "the wave" at a sporting event.

After the wave subsided, what followed was a hushed assessment of each owner's financial means and personal taste. Judgment was based on observations such as: Who installed the most, and most expensive, exterior lighting fixtures? Who erected the largest or most original custom-made curbside mailbox? Who had resplendent landscape designs immediately realized into manicured mini-arboretums?

Incredibly, without cable TV capability, even the lowly rooftop TV antenna became symbolic by revealing who was watching broadcast TV without a fuzzy picture. Then eight months later, when cable hook-ups finally became available, a remarkable aerial flip-flop ensued. Those who'd brandished clear reception with antennas quickly removed their one-time status symbols, since it was now embarrassing to be perceived as one not paying the premium.

As the years passed it became more difficult to recognize changes that might be discerned as enhanced cachet, but not for long. Without warning, a new wave swept through swelling heads ever higher — leaving in its wake huge curbside dumpsters signaling interior renovations or extensions. Paving stone became the rage, supplanting plebeian-yet-functional concrete walks and blacktop driveways. And not long after, expensive foreign cars graced the spiffy new driveways.

Finally, after almost three decades with all visible forms of home status exhausted, the ultimate bragging right is to erect a "For Sale" sign, host a garage sale and move to a carefree condo or leisure village. And yes, I’ve already moved into one!

*****

So you have to wonder — where does this pettiness and vainglory stem from? It's apparently embedded in our designer genes, going back thousands of years. Anthropologists actually consider this to be advanced behavior — when compared to our more ape-like ancestors that is.

New York University's Randall White explains:
"One of the things that we know from studying modern humans is that personal adornment and the symbolic communication of a social identity is involved in maintaining differences within a society. By studying artifacts we imagine that what was going on 40,000 years ago was the first time in human evolution that we have the internal subdivision of human societies into different categories of social persons."

And after thousands of years we're still at it. We can make just about anything into a symbol of status — even a TV antenna! But one person's object of distinction might be another's folly. So we have to be among like-minded people to make an impression. And upon death — or downsizing — you can't even take it with you!

So you have to ask then, after all these millennia, isn't it about time we evolved beyond this small-minded behavior? [Nah!]

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:12 AM

Feedback: Your Two-Cents!

I believe what you are calling Evolution is what we understood as trying to keep up with the Joneses — or the grass always looks greener on the other side of the fence. This is a relatively new phenomenon; I know as a kid in the 50's I did not see this with any of my relatives. I also spent close to 4 years in Great Britain in the early 60's, and they were basically under a socialist government at the time (Labour Party), and there was no such thing as competition or keeping up with the Joneses.

Now my children, who are close to 40 years old, spend a lot of time upgrading their abodes to keep up with their friends. Needless to say they have some debt that they are now concerned about. So the moral of this reply is yes, we have the option to behave like this (such as my children and you), but in countries that have socialism everybody is dragged down to the same common denominator — and they really don't have this option to much of any extent. The way this country (USA) is moving you may get your wish — we will all be forced away from this small-minded behavior, but it's definitely not Evolution — it's a choice!!!

Regards,
John S

Wednesday August 1, 2012

Summertime Fun

With the lazy, hazy, crazy days of summer upon us, it's time to chill out for a change. For those yearning to read more serious matter, just scroll down to previous blogs. Unfortunately most of them are still relevant.

So how about a "connecting the dots" puzzle!

In your mind's eye, or on paper if you draw the dots as shown, starting from any point, draw four continuous lines (without lifting your pen or pencil) so that each of the nine dots has at least one line running through it.

Although the solution will be alluded to below, it will not be posted here so feel free to enjoy this blog to the end without your fun being ruined.

.     .     .

.     .     .

.     .     .


When you've finished (or have given up!) with the Nine Dots Puzzle or want to multitask, here's another one. Try to determine the next sequence of numbers based on the following pattern:

1
11
21
1211
111221
312211
13112221
1113213211

What's the next sequence?

Did you get it? Did you give up?

When these puzzles were posed to me (at a Six Sigma seminar break no less), I, like you, was in front of a PC. Rather than racking my brain for an answer like many of my colleagues, I cleverly googled on Yahoo! (I crave contrarianism) "9 dots puzzle" and immediately had pages and pages of links to the correct answer. Try it!

For the second puzzle I again used a search engine, entering the last number in the sequence in quotes, as in "1113213211" — and boom, pages and pages of links to the answer and its simple derivation. I don't think I would have ever figured out the logic for this one on my own. Give it a try!
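(A spoiler-adjacent aside, not part of the original post: the rule behind the second puzzle is the "look-and-say" sequence, where each term reads off the digit runs of the one before it. For readers who have already searched out the answer, the whole list above can be regenerated in a few lines of Python; skip this if you'd rather puzzle it out yourself.)

```python
from itertools import groupby

def look_and_say(term: str) -> str:
    # Read off each run of identical digits as "<count><digit>":
    # "1211" has one "1", one "2", then two "1"s, so it becomes "111221".
    return "".join(f"{len(list(run))}{digit}" for digit, run in groupby(term))

# Regenerate the eight terms shown above, starting from "1".
seq = ["1"]
for _ in range(7):
    seq.append(look_and_say(seq[-1]))
print(seq)  # ['1', '11', '21', '1211', ..., '1113213211']
```

One more application of `look_and_say` to the last term yields the answer to the puzzle.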

Later on at a more serious point in the seminar it hit me that for both of these puzzles I did not even have to think — and I'm not sure this is a good thing. (I did not experience any sense of accomplishment either, although I was first to arrive at the answers to the puzzles.)

When the Phrenicea scenario of the future was first presented on this website twelve years ago, the engage and somnam pages stated that all the world's knowledge was immediately available via engagement with Phrenicea, and that boredom had ensued without the challenge to learn the traditional (aka hard) way with study.

Could we already be approaching this point with access to the amazing capabilities of today's search engines?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 6:28 AM

Sunday July 1, 2012

Water On the Brain Syndrome

Now that the lazy, hazy, crazy days of summer are upon us, it's a good time to use the sultry weather as an opportunity to revisit yet again our feverish condition of "water on the brain." Way back in 2001 our "H2Ouch!" page began recommending the following:
"Pretend you were to pay $1/gallon the next time you take a shower or bath, brush your teeth, flush a toilet, wash the dishes, or God forbid — water the lawn! Begin to use less water than the average person. Set an example. Prevent H2Ouch!"

We still believe this is good advice, but perhaps the "hyperhydro" proposal was and still is naïve. The problem is that there is little incentive to conserve fresh water from the tap — given its ridiculously low price. For example, I recently received my "Annual Water Supply Report" from my local water company and was dismayed at how little fresh water costs. Here's the breakdown:

Quarterly Water Rates — Residential

Consumption (gallons) and charges:
  • Up to 8,000: $10.00 minimum
  • 8,001 - 58,000: $0.90 / thousand gallons
  • 58,001 - 100,000: $1.15 / thousand gallons
  • Over 100,000: $1.40 / thousand gallons

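(An illustrative aside, not from the water company's report: the tiered schedule above can be turned into a small bill calculator. This sketch assumes each rate applies only to the gallons falling within its own tier, a marginal-rate reading that the published schedule doesn't spell out.)

```python
# Tier upper bounds and rates per thousand gallons, from the table above.
TIERS = [
    (58_000, 0.90),        # 8,001 - 58,000 gallons
    (100_000, 1.15),       # 58,001 - 100,000 gallons
    (float("inf"), 1.40),  # over 100,000 gallons
]

def quarterly_bill(gallons: float) -> float:
    """Quarterly residential charge for the given consumption."""
    total = 10.00  # the $10.00 minimum covers the first 8,000 gallons
    lower = 8_000
    for upper, rate_per_thousand in TIERS:
        if gallons > lower:
            # Charge only the gallons that fall within this tier.
            total += rate_per_thousand * (min(gallons, upper) - lower) / 1000
        lower = upper
    return round(total, 2)

print(quarterly_bill(30_000))  # 29.8
```

Under that reading, a hefty 30,000 gallons per quarter costs all of $29.80 — which rather underscores the point that conservation carries almost no price incentive.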
A dollar for 1000 gallons of clean, fresh tap water? That's insane! By comparison, bottled water by the gallon costs about $1.99. Not bad, but that's $1990 for 1000 gallons. Why is there such a cost disparity with tap water? How can anyone be motivated to conserve water at these low rates,
other than via a guilty conscience? And let's face it: there aren't many turning on their taps laden with guilt. (If the water companies got savvy they'd upmarket their image with exotic brand names, pricing and refillable bottles with fancy labels adding cachet to their product. Imagine having bragging rights to elite-sounding potable water! It's not that silly a suggestion, since that is essentially what Coca-Cola did with Dasani and PepsiCo with Aquafina. They're both filtered municipal tap water.)

Actually what we really need is a "watershed moment": a trickle-down epiphany to appreciate how finite and precious our water supply is. The first step should be to make users conscious of their water consumption — and that can be accomplished handily by raising the price per gallon and using a more dramatic cost gradient for excessive use. It sounds crazy, but those concerned about conservation should lobby for pricing increases.

Another way to raise awareness might be to move our water meters out from their usual obscure locations into full view in kitchens and bathrooms — fitted with big, red digital read-outs displaying gallons used in real time. Education on where our water comes from and how it's treated,
stored, delivered and renewed would also serve to engender an appreciation of what is the major constituent of all living things.

There's an old saying attributed to spendthrifts that says they "spend money like it's water." Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." Since we all tend to waste water and take its abundance for granted — it even unintentionally spills over into the comics:



© 2007 Baby Blues Partnership. Reprinted with Special Permission of King Features Syndicate.

Did Zoe and Hammie's tub really have to be filled to the brim?

*****

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 9:10 AM

Friday June 1, 2012

Two Point Uh-Oh

The term "Web 2.0" is a popular expression referring to the second generation of the World Wide Web. The media has created a "2.0" craze by slapping the trendy suffix (implying next or new) onto just about anything. We addressed this originally way back in May 2007 — and it seems there is no end in sight.

Version numbers like 2.0 are adopted from the software industry, from what's commonly referred to as the "development life cycle." The confounding terminology is being unabashedly usurped by advertisers making it part of our vernacular. Although their target market may not truly grasp the pretentious and technical jargon being exploited, what is undoubtedly implied is "new and improved."

To imagine where we would be today if this silly jargon began early in the 20th century, let's for fun try to retroactively apply (loosely) "something-point-something" version numbers to several ground-breaking advances from the past — innovations unleashed upon the masses at a time before software and its perpetual upgrades began controlling yet-to-be-developed computer hardware.

For example, "Radio 1.0" would enable millions of regular folks to acquire console-sized, beautifully crafted, wood-encased receivers to experience entertainment programming geared towards a wide audience. Breaking news would propagate across the land almost instantaneously. A new sense of mass identity and community was established using technology. For many, the radio became a focal point in the home — a place to gather each night.

So what innovation could we ascribe to Radio 2.0? The car radio perhaps. Radio 3.0? That might be the tiny, tinny, plastic and portable transistor radio. The pocket marvel that enabled millions of baby boomer teens to revel in their music — rock and roll — defining a generation while driving parents crazy. For many, the transistor radio became a focal point of the self — a personal gadget to hideaway with. Taking radio's progression further:
Radio 4.0 - FM
Radio 4.5 - FM stereo
Radio 5.0 - XM/Sirius satellite
Radio 6.0 - ?

Now let's try to apply version numbers to pre-recorded music:
Music 1.0 - phonograph - 78rpm
Music 1.2 - phonograph - 45rpm
Music 1.5 - phonograph - 33 1/3rpm "Long Playing"
Music 2.0 - hi-fi and stereo
Music 3.0 - 8-track tape
Music 3.5 - cassette tape
Music 4.0 - CDs
Music 5.0 - iPod
Music 6.0 - ?

How about culture-transforming electronic point-to-point communication:
Point-to-Point 1.0 - telegraph
Point-to-Point 2.0 - telephone rotary
Point-to-Point 2.5 - telephone pushbutton (touchtone)
Point-to-Point 3.0 - wireless cell phone
Point-to-Point 4.0 - BlackBerry
Point-to-Point 5.0 - ?

And finally the evolution of locomotion or travel:
Travel 0.1 - flagellum
Travel 0.9 - four-legged ambulation
Travel 1.0 - two-legged ambulation
Travel 2.0 - tamed horse
Travel 3.0 - covered wagon
Travel 4.0 - trains
Travel 5.0 - horseless carriage
Travel 5.5 - automobile
Travel 6.0 - prop(eller) planes
Travel 6.5 - jets
Travel 7.0 - transporter?

If applying version numbers retroactively seems silly — it probably is.

So, maybe we should view applying them willy-nilly nowadays as just as silly. It's apparently not too silly though for Scientific American magazine. Perhaps they took a cue from our original Phrenicea tongue-in-cheek analogies to seriously name a new magazine — Earth3.0 — based on the pretentious premise that humankind in a scant century or so has already outpaced 4.5 billion years of natural entropy.

Here's their justification (edited for brevity):

This planet is no longer simply the home of our species: it is also our creation.[sic] And as with any product, sometimes it is prudent to upgrade its quality.[sic]

Earth 1.0 was the world that persisted and evolved for billions of years, up until very recently. Even after we humans developed agriculture, which considerably enlarged our footprint on the environment, our overall influence was fairly small and localized. That changed two centuries ago with the arrival of Earth 2.0, when the industrial revolution gave the human race the leverage to achieve unprecedented health and prosperity but at the price of wanton consumption of natural resources.

Today we have unwittingly become the major drivers of potentially disastrous climate change. We have extinguished species at a rate not seen since the end of the dinosaurs. We have depleted ocean fisheries so severely they could collapse by midcentury.

Earth 3.0 is thus the new way forward that we need to establish, one with all the prosperity of 2.0 but also the sustainability of 1.0. And it is in that spirit that we present Earth3.0.

Click to read more


This is more than silly. To ascribe version numbers to the earth with the overblown illusion that it is now our creation, born of our innate arrogance and egocentricity, is laughable. Planet Earth will inevitably do just fine without us — whenever that turns out to be.

Nevertheless, maybe the idea for this new magazine was a good one, capitalizing on Scientific American's good name and our present day concerns, albeit with an unabashed goal to generate additional revenue.

Scientific American has been in existence for over a century. It's a blue-chip publishing brand built slowly since 1845 that has garnered a reputation for quality. In contrast, how long will the faddish Earth3.0 last? Unfortunately, everyone seems to think short term today — and that is why we are in a world of crisis on more than one front.

Like General Motors et al from the past, the publisher is leveraging the time-honored and respected name Scientific American for a quick profit. You'd think by now that lessons would have been learned given the state of companies like GM, who cheapened their once powerhouse names and reputations to virtually valueless trash.

What we really need now is Common Sense 2.0.

Time will tell...


Click here for more sightings of Point Uh-Oh terminology in the media.

Presented as a "Timely Yet Timeless" post by John Herman 5:33 AM

Tuesday May 1, 2012

How Now DOW Redux

Way back in 2006 I posed the question:
Will there come a time when most advertising supplants the mundane indignity of pushing product with haughty subliminal messages, propagated by a self-indulgent marketing profession bent on impressing peers while attempting to make us feel good about them?

This was provoked by the DOW Chemical Company's then bizarre advertising campaign referenced below. I feel delusionally vindicated looking at their newest ad and wonder if my Two-Cents entry over the ensuing years had some influence on DOW to come back to their senses and roots — acknowledging that they are indeed a "chemical" company.

Perhaps one day too we will sense the optimism of a booming economy portrayed by DOW's vintage ads from the '50s and '60s.


Sunday October 1, 2006

How Now DOW?

Isn't marketing amazing? It can come at you subtly or be right in your face. It can turn you off or lure you into its lair.

Or if you're like me, it can trigger a small effort investigating a company's marketing history.

What motivated this month's Two-Cents weblog is an unusual advertising campaign by Dow Chemical.

For what seemed like the entire summer, I kept opening the front covers of Newsweek, Business Week, Scientific American and several others to find a prominent ad spanning two pages. Impossible to miss on the left-hand page was an in-your-face facial portrait overlaid with the large letters "HU," followed by the word "HUMAN" and an unusual equation "7E+09."

For weeks I would glance at the face, the letters, the equation — and while not really focusing on the right-hand page, subliminally recognize the company Dow by the red diamond logo. Then I would just turn the page.

After perhaps the tenth iteration of this scenario I finally read the accompanying copy. What hit me was that this was an advertisement for a chemical company, yet there was no mention of a product or service. Instead there was flowery prose like "Bonds are formed between aspirations and commitments..." and "...energy released from reactions fuels a boundless spirit..." What? This is a company that manufactures chemicals!

Curious now how this advertising approach differed from years' past, I searched through some musty boxes of old Scientific American magazines. (Crazy, yes?) Sure enough I was able to find a few Dow advertisements.

In the booming post-war 1950s and 1960s, the function of marketing was simple and direct — to sell.

  • In 1954 Dow bragged about their permanent silicone lubricant. Imagine that it gave "consistent performance over a wide temperature span" and "lasted 30 times longer" and was "semi-organic and inherently stable." (Today that might make great copy for a prophylactic!)
  • In 1959 Dow boasted, "Today, the imposing list of high quality pharmaceutical chemicals supplied by Dow in abundance includes bromine, medicinal salicylates, epsom salt, chloroform, analgesic drugs and elemental iodine..." (Great, especially if you got a bad cut!)
  • And in 1962 Dow broached the sticky subject of resins, "Consider first, an extremely pure epoxy resin with a viscosity of 4,000 - 6,400 cps, and color 1 max. This is essentially the pure diglyceride ether of bisphenol..." (Yes, now that's a real macho chemical company ad!)

These days it seems the importance of marketing is elevated beyond the product. In this example it has taken on a life of its own, surpassing any practical objective. It seems to function mainly to mold subjective feelings.

Why is that?

Is it because we're beyond being impressed with lubricants, epoxies, and analgesics? How else should a chemical company advertise in an age where the behind-the-scenes, nuts-and-bolts of society are deemed boring — or worse, beyond general comprehension?

Is it the negative connotations associated with the word "chemical," given past incidents of pollution? Is it the unexpected consequences and deleterious effects that have been associated with man-made concoctions? Or is it that we are so detached nowadays from the "brick and mortar" of what keeps the complex infrastructure running?

*****

So what do "HU" and "7E+09" mean in the Dow ad? For that I had to visit the Dow website and read the "Around Dow" company newsletter. Sure enough, my observations were confirmed regarding the absence of product in order to concentrate on building Dow's reputation.

I can just imagine the Dow executives drinking up their marketing firm's mumbo-jumbo proposal to adopt chemistry's periodic table as the basis for the "Human Element." And that the contrived atomic weight of the human population 7E+09 would be symbolic of Dow's commitment to humanity's well-being.

But is it smart advertising when it's incomprehensible to the casual observer? Or even to those intrigued enough to try to figure it all out? Should one have to search a company's website in order to garner the intent of their ad campaign?

Worse still, will there come a time when most advertising supplants the mundane indignity of pushing product with haughty subliminal messages, propagated by a self-indulgent marketing profession bent on impressing peers while attempting to make us feel good about them?

Time will tell...

posted by John Herman 8:53 AM


Time will tell...

posted by John Herman May 1, 2012 5:19 AM

Sunday April 1, 2012

Bag Age Baggage

con•ve•nience
Freedom from difficulty, discomfort, or trouble.
(Merriam-Webster)

stu•pid•i•ty
The quality or state of being stupid; given to unintelligent decisions or acts.
(Merriam-Webster)

*****

Does convenience beget stupidity? Does stupidity beget convenience? Or do they complement each other synergistically?

I contemplate this now as I find myself inundated with plastic bags carried home from virtually every type of retail store. And the problem seems to be getting worse day-by-day as behavior accommodates their omnipresence and so-called convenience.

The phenomenon has made me more observant of our dependence on the baggage in what appears to be a burgeoning Bag Age:

  • When buying six replacement glass chimney light shades at a local big-box home center, the clerk first wrapped each individually with unopened plastic bags before placing them in bags for carrying, two per bag. That's nine bags! (In times past they would have been wrapped in newspaper and placed in an old cardboard box.)
  • While fumbling for cash to pay for a watch battery, the clerk placed the tiny two-inch square cardboard package and a three-foot-long receipt inside a large two-foot square bag. Unsuccessful in giving it back, I self-consciously carried the nearly empty bag out and almost lost it to a gust of wind. (What happened to small paper bags? This is convenience?)
  • A local beer outlet insisted that I not walk the five feet to the exit carrying a six-pack naked of a plastic bag. "It's the law," chided the proprietor. (Government mandated stupidity or a manufactured "convenience"?)
  • Suddenly this year, curbside Christmas trees that filled homes with holiday cheer were discarded in huge white plastic bags — large enough to cocoon a small car. (I cannot fathom any convenience here.)
  • Taking advantage of the annual supermarket "can can" sale, the clerk did me a "favor" by packing only two cans per plastic bag. No amount of coaxing would convince otherwise. (Convenience or stupidity?)
  • As I was entering a supermarket checkout line, the elderly shopper before me said to the cashier, "You can put more items in those bags." The cashier replied, "I don't want you to have to work too hard loading them into your car!" I tried to count the mountain of bags in her cart but was unsuccessful. (I left the store in a state of dismay.)
  • Upon exiting a supermarket checkout, the cashier asked the shopper behind me, "Plastic or paper, sir?" He chose plastic. (Which definition above applies here?)
  • Waiting for a train I watched as a garbage bin maintenance worker mindlessly emptied bin after bin replacing the mostly empty plastic liners with not one, but two new bags. That's waste times two. (I could only shake my head at the moronic stupidity of a management that most likely mandated this policy.)

Pathetically, these scenarios play out, day in and day out. Unfortunately, it's now become business as usual.

I've tempered some of my own guilt by reusing used bags as garbage liners. A decade ago I purchased a half dozen "eBaskets." They're simply but ingeniously designed to accept standard issue bags enabling them to complete
their usable life cycle more responsibly. It's been that long since I purchased bags that are Glad just to carry trash. Particularly annoying then are odd-shaped bags that are too tall and narrow or too short and wide — and good for absolutely nothing afterwards.

Still, having so many more standard-sized bags than trash needing a liner, I end up annually with bags stuffed full of themselves from supermarkets, department stores, stationery stores, etc. Fortunately my municipality recycles them. Even so, how many just thoughtlessly toss them out?

Evidently too many — since the company that produced the eBaskets, Green Earth Products, went out of business in 2007. All the politically correct rhetoric nowadays about going "green" apparently is just that. Many talk the talk. But when it comes to walking the walk, the difficult part that's just too much trouble, that's when we fall short.

Now I have to ask:

  • What happened to good old biodegradable paper bags? (Plastic is now cheaper.)
  • How did it come to this? (Wasting ubiquitous resources easily becomes habit forming.)
  • Are reusable cloth bags a viable solution? (Probably not without an outright ban on plastic. Beyond the perception of being a hassle, with repeated use they could become unsanitary, since it's doubtful many would bother laundering them.)
  • What will this (now worldwide) plastic waste eventually do to the environment, given that the bags are non-degradable polyethylene that will last at least 1000 years? (Time will eventually tell.)

So, in final analysis:

  • Does convenience beget stupidity?
  • Does stupidity beget convenience?

I deem yes on both counts here — unless we one day have the wherewithal to sack today's Bag Age baggage.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:37 AM

Thursday March 1, 2012

Chiropractic Tactic?

The Phrenicea scenario of the future depicts chiropractic supplanting what is today considered traditional medicine — a profit-driven medical/insurance industry treating various states of disease with drugs and/or surgery. Unfortunately, this idealistic vision appears to be in jeopardy with the emergence of a new and ominous trend among chiropractors.

To compete with today's established "big business" of medicine, many chiropractors have been participating in a trend to further legitimize their profession and increase revenue by expanding their application of physical manipulation of the spine and other body structures to include the pursuit of "wellness." This may include the addition of other alternative, non-traditional modalities such as acupuncture, naturopathy, massage therapy, yoga and more. The wellness approach attempts to proactively prevent disease holistically with proper lifestyle, rather than treating symptoms of disease already in progress.

The proliferation of wellness centers is a trend that is expected to continue, since chiropractors have found receptive, health conscious, well-heeled clients willing to pay out of their own pockets for treatments that appear to be effective, as well as for expensive natural supplements.

A more recent trend, and an ominous one in my opinion, is the growth of franchised centers of wellness with the chiropractor as the hub. The franchise concept is not new to the service industry — certainly not with fast-food like burgers. Over the years it has expanded to include niche restaurants, haircutting, package shipping, travel, eyeglass fitting, chimney cleaning, home inspections, lawn maintenance, maid service and more. It is quite new to the business of chiropractic however.

It's apparent that the goal of franchising in this profession is to "McDonald's-ize" the chiropractic experience in terms of creating recognizable brands to a broader population, while bringing in extra cash for programs of questionable benefit — for the patient at least.

Chiropractors traditionally have spent lifetimes developing successful practices of respectable size, which reflect their character and values. They're now tempted with visions of big profits by wooing masses of clients who are less sophisticated and discerning — and more receptive to being dazzled by faux technologies and procedures. Secondary is keeping existing patients who may be sharp enough to see through the hype and perceptive enough to sense the tawdry goal of profit; the same patients who are at risk of being alienated by slick, sterile, generic branding, flashy placards and gee-whiz computer-facilitated tests of dubious value.

It's not clear, at least to me, whether the cookie-cutter success associated with the formulaic, sanitized operation of a franchise is transferable to the practice of chiropractic. And even if it is, should it be? (A moot question if more profit is the primary motive.)

The inevitable result of franchising in general is the lowering of service quality to merely acceptable or tolerable, vis-à-vis stand-alone businesses. (How many four-star restaurants are members of a franchise? How many top-notch hair cutters?) This settling consequence might be acceptable for a meal or a haircut — but for healthcare?

If this scenario of greed plays out as it did with traditional medicine, there may have to be a new alternative to today's alternative healthcare.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 6:11 AM

Wednesday February 1, 2012

In Sound Bites We Trust

Through all of the 2012 U.S. presidential campaigning hoopla, there's been little substantive discussion of important issues like energy, education, boomer retirement, immigration, conservation, environment, health care, outsourcing, lingering security threats — as well as the fragile global economy.

Instead, we are bombarded daily with cursory, carefully crafted sound bites, incessant accusations, tough talk, innuendo, decades-old misdeeds, and of course poll results. Both political parties have employed character-manipulating handlers and SWAT teams to verbally attack and counterattack each other with clever pyrotechnics that can instantly shoot around the world for anyone still interested.

How did it come to this?

Technology! As in sophisticated systems for marketing and polling. As in TV, satellites, cable, the Internet/web, PCs and hand-held devices. The unfortunate result is a desperate attempt by both political camps to capture the attention of harried, hurried voters whose 24/7 connected lifestyles afford little time to consume more substantive information even if it were presented.

America's Founding Fathers would shudder at how their idealistic vision has blurred with the worst that technology has to offer.

Perhaps the most powerful influence is TV, although the Internet/web is closing in. Even before the fateful debate between Kennedy and Nixon, Eisenhower employed the marketing powerhouse Rosser Reeves (known for M&Ms' "Melts in your mouth, not in your hand" as well as other famous commercial tag lines) to subtly portray him in the TV ads as authoritative yet likable.

It was the first use of a political "ad campaign" with short 30-second "spots" to manipulate the viewer's perception of the candidate. Its success set the stage (so to speak) to employ ever more sophisticated principles and techniques for legal brainwashing en masse.

Wayne Stayskal
© Tribune Media Services, Inc. All rights reserved.

At about the same time, the science of polling "Galluped" ahead and provided the so-called pulse of the nation — which can itself influence opinions exponentially.

Finally, the ubiquitous electronic medium facilitates the infamous "debates," which have become anything but. The intense negotiations that precede them strive to minimize spontaneity and maximize the probability that there'll be no major gaffes that would be aired ad infinitum — leading to the destruction of the unfortunate perpetrator. Former news anchor Jim Lehrer explains, "It is the only opportunity to evaluate candidates side by side. For the candidates, it's their last chance to close the deal with voters." The candidates echo Lehrer:

  • "You're on guard; you don't want to make a mistake; you don't want to say anything that's going to offend." - George H.W. Bush
  • "It was intense and confrontational from beginning to end." - John Edwards
  • "The stakes are very, very high." - John Kerry
  • "Now, was I glad that the damn thing [1992 debate] was over? Yeah, maybe that's why I was looking at [my watch]. Only ten more minutes of this crap." - George H.W. Bush

*****

So when will this end?

Short term: After the 2012 U.S. election, when there will be a collective sigh of relief among its citizens that it will be at least a year (hopefully!) before the 2016 election is mentioned.

Long term: When the institutionalization of Phrenicea occurs. "Phrenicea ensures that no voter is uninformed. Every eligible voter has complete access to the entire unbiased history of each candidate — objective factual and biographical information resident within the Phrenicea braincomb — to decide which candidate is the best choice. Any vote even partly based on subjective criteria is summarily rejected."

Sounds too good to be true.

Time will tell...

© 2008 by King Features Syndicate, Inc. World rights reserved.

Presented as a "Timely Yet Timeless" post by John Herman 5:17 AM

Sunday January 1, 2012

The Twelve Blogs of '11

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last five years' cop-outs to lethargically review the "Twelve Blogs of 2011" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant.

• In December we pondered the ephemeralness of our personal memories:

It seems impossible now, but there will come a day when the mechanism for memory assembly and storage within the human brain is elucidated. A next logical step would be to try to save or replicate these memories, perhaps for recollection by others... (Read more)

• In November we were preoccupied with losing our Inno"Since":

Keep an ear out and an eye open for the next time you hear or see an advertisement touting "In business since....," "founded in...," or "established..." Contrary to its intent or implication, the practical value of the boast is questionable. Perhaps this newfound awareness can be termed The End of Inno"Since"... (Read more)

• In October we wrung our hands over being "all thumbs":

Looking at the impact of our silicon-infused, technological society it is obvious that very little is done "by hand" nowadays — by young and old alike... (Read more)

• The September blog attempts to explain why epigenetics is downright frightening:

Epigenetics is a new sub-field of biology that studies how chemicals in our environment can play havoc "on top of" genes — acting like switches to turn them on or off abnormally. (The "epi" prefix means "on top of," as in epidermis — the outer layer of skin "on top of" the dermis.) So even though the gene appears normal and has not changed (or mutated), it does not function properly... (Read more)

• In August we suffered the recurring affliction of "water on the brain":

Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." We all tend to waste water and take its current abundance for granted.

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations? (Read more)

• In July one smart cookie crumbled:

I can't wait to order Chinese takeout again. Will I crack open another smart cookie to find an abstruse, recondite maxim with a scientific theme? Or will I have the misfortune of reading just another silly platitude? I'm really hoping for the former.

Forever puerile, I'll imagine my next postprandial surprise to be the work of some underemployed subatomic particle physicist, shedding an optimistic photon beam on my two left feet so I can finally show off some dance floor prowess...(Read more)

• In June we compiled "Then and Now" images to acknowledge the passage of time and visibly observe how things have changed through the years:

Time is fascinating when you take the time to think about it. We tend to consider time a man-made commodity as if we created it — when in fact all we really do is measure it. Fooling ourselves in the process, we parse time into nanoseconds, microseconds, seconds, minutes, days, weeks, months, years, decades, centuries, millenniums, periods, ages, eras, eons, epochs... (Read more)

• The May blog pitted brainy scientists against nerdy computer whizzes:

A while back I emailed Dr. Terrence Deacon, professor of Biological Anthropology and Neuroscience at University of California-Berkeley. He's an honest-to-goodness, genuine brain scientist — as opposed to certain computer scientists who are brain-expert wannabees, typically proselytizing in the media their tenuous computer-brain analogies and artificial intelligence predictions... (Read more)

• In April we worried about recycling dangerous technology:

There's little doubt that rising fuel prices and global temperatures are pressing issues requiring action. We owe it to ourselves and to posterity to become sufficiently educated to intelligently evaluate potential options. It's popular now to pander to those outraged over $60 fill-ups and brand global warming "bad" because that's been the predominant message... (Read more)

• In March we suffered the first wave of the "Baby" Boomer generation turning 60 years of age. (Ugh!):

Now turning 60, this famously spoiled (sans "baby"?) boomer demographic anomaly will be a force to be reckoned with. We'll be retired, but not retiring gray-haired activists with plenty of time on our hands to leverage our generational clout to lobby for our interests. By sheer demographic heft we'll again effect change as in years past. We'll read cover stories at the 70- and 80-year milestones chronicling our deterioration and the burden our cohort will be inflicting upon younger generations... (Read more)

• In February we intended to address unintended consequences resulting from continued advancement of scientific knowledge:

Nevertheless, as our quest for scientific knowledge marches on (making our lives ever more complex) — particularly related to biology — we are going to have to face the fact that we are not all created equal.

Just as our disruption of the macro-environment has led to pollution, species extinction and global warming — analogous unintended consequences await our intrusion into the ancient inner workings of cells... (Read more)

January's entry was tinged with nostalgia and torpor — and guilt for not writing a new essay:

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last four years' cop-outs to lethargically review the "Twelve Blogs of 2010" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant... (Read more).

You may or may not agree, but after reviewing the "Twelve Blogs of 2011" it appears most are still relevant as we roll into the New Year. And somehow after recalling the energy that instigated the blogs to begin with, we don't feel quite as lethargic.

Maybe that's the real benefit of looking backward, reminiscing and singing "Auld Lang Syne" — not for nostalgia per se, but to recharge our batteries and begin another year with renewed energy and enthusiasm.

Time will tell...

Happy New Year!

posted by John Herman 12:01 AM

Thursday December 1, 2011

Memories: Firsthand & First Person

Our memories, and the memories of us, are precious. Are they not? Well, at least to some authors they appear to be.

Cliff Pickover, prolific writer and research staff member at IBM's T. J. Watson Research Center, admits that much of his motivation to publish comes from his desire to exit this world with something to leave behind for future generations. He laments:

After you die, will the world remember anything you did? Most of us rarely leave marks, except on our immediate family or a few friends. We'll never have our lives illuminated in a New York Times obituary or uttered by a TV news anchorperson. Even your immediate family will know nothing of you within four generations. Your great-grandchildren may carry some vestigial memory of you, but that will fade like a burning ember when they die — and you will be extinguished and forgotten.

That's pretty depressing. As we try to live each day to the fullest — being productive, learning, and ultimately creating memories for ourselves and others — we rarely ponder the ephemeralness of it all. (Although the often-heard dispassionate phrase, "Who will care a hundred years from now?" stems from the sad reality of our short time here on earth.)

But does it have to be this way?

The Phrenicea scenario envisions a time when all our memories and experiences will be stored forever, within our own brain as well as within others' — firsthand memories that are deemed rich enough to be bought and sold at auction — like memorabilia traded today on eBay.

Imagine sharing the actual memory of one accepting a Nobel Prize or an Academy Award, winning a marathon, falling head-over-heels for your favorite actor, starving in the third world, learning of a terminal disease; even of dying!

Most are tempted to say, "Yeah right! No way."

Memories bought and sold at auction — like memorabilia on eBay.

It seems impossible now, but there will come a day when the mechanism for memory assembly and storage within the human brain is elucidated. A next logical step would be to try to save or replicate these memories, perhaps for recollection by others.

When it does come to fruition, imagine the regret for the many first-person memories already — or soon to be — lost forever:
- the excitement of witnessing man's first flight
- the despair of the 1929 Stock Market Crash
- the horror of the Holocaust and Hiroshima
- the excitement of purchasing a new color TV in the 1950s
- the anticipation of finding out "Who Shot JR?"
- the relief of learning of a cure for polio
- the nostalgia of catching the last feature at the local drive-in movie theatre just before its closing forever
- the shock of President John F. Kennedy's death
- the thrill of setting foot on the moon
- the marvel at the first "horseless carriage," telephone, "talkie" (talking movie), ballpoint pen, transistor radio, Polaroid camera, VCR

Of course you could read or see videos about these events. But nothing can approach an actual memory — just ask Neil Armstrong.

Gee, when you think about it, memories really are precious.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 6:27 AM

Tuesday November 1, 2011

The End of Inno"Since"

Recently I spotted an ad for "Heating and Air Conditioning Service, in business since 1954." It occurred to me to ask, "What does that really signify?" All I came up with is that all those who made the business successful back then are gone. It had to be an okay service to survive initially when the original principals ran things. But the fact that they're still in business does not necessarily translate into anything superior today. They could be in a downturn. We see it all around us today:
• General Motors was founded in 1908. Honda did not begin producing automobiles until the 1960s. Which builds better cars?

• The U.S. Congress has been around since 1789. Enough said about that.

Nevertheless, advertising a business' birth is still very popular. Googling "in business since" returns 1.5 billion results. Similarly, googling "founded in" yields 551 million results. Several are very well known:
• Heinz (best known for ketchup) was established in 1869 and this is proudly declared on their label. Its longevity probably has no bearing on sales today — its products happen to be very good. Remarkably, they've managed to retain smart management with the common sense to not mess up the original recipes with new processing technologies.

• Ditto for Gulden's Mustard established 1862, first made by Charles Gulden, now manufactured by the agricultural giant ConAgra, in keeping with the original secret recipe.

• Coke almost blew it with New Coke in 1985, 99 years after the original was introduced. Its management back-stepped only three months later with old-new Coke Classic and eventually dropped the Coke that failed. A close call averted by facing failure boldly and quickly.

• Perhaps the most ridiculous business type to claim value in longevity is a restaurant. It could be on the downslide with the original owners replaced by their spoiled or less-driven progeny, or not-as-committed outsiders. Even with consistent ownership, chefs tend to migrate from one establishment to another like bees collecting pollen, and wait staff can be as ephemeral as cumulus clouds. Maybe if they could brag, "Serving our clientele with the same owner, same chef and same support staff since...." Now that would be meaningful.
Then there are those that can advertise longevity but choose — perhaps wisely — not to:
• Microsoft was founded in 1975. In this case boasting "in business since" probably would be a negative, as newer technology companies like Google are perceived to be more nimble and innovative. Microsoft is smart not to advertise their relative longevity since for them even not-so-old is too old.

• KODAK has been in business since 1892, which means nothing given the technology of photography is primarily electronic today — and not chemical processing as it was for almost a century. The company that once touted film (remember Kodachrome?) and paper quality survives tenuously because its management has struggled to maintain its brand "image" while photo technology marched digitally forward.

• Ford Motor Company does not advertise that it's been in business since 1903. Perhaps because the company has had its back to the wall so many times throughout its history. Fortunately its products mechanically have become excellent and almost rival the best of the Japanese and Europeans. But they blew it yet again — shooting themselves in the foot by letting the geeks run amok replacing simple dashboard buttons with complicated MyFord Touch technology — technology for technology's sake. Until the company can have more than a decade of consistent quality, it's probably better for them to stress the present rather than the past.

• The famous (infamous?) Yankee baseball team was established in the early 1920s. So what? What distinguishes one team from the other nowadays? With player turnover, team rosters can completely change within several years. Diehard fans brandish their enthusiasm with logoed shirts and caps. But what are they really fans of — the pinstripe uniform? The phrase "old is new again" is apropos.
(Ditto for your favorite team?)

*****

So keep an ear out and an eye open for the next time you hear or see an advertisement touting "In business since....," "founded in...," or "established..." Contrary to its intent or implication, the practical value of the boast is questionable.

Perhaps this newfound awareness can be termed The End of Inno"Since."

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:56 AM

Saturday October 1, 2011

Look, No Hands!

Given that most websites are maintained using powerfully sophisticated software, I usually elicit shock (well, perhaps astonishment) when explaining that I update the Phrenicea site "by hand," utilizing MS Notepad and basic HTML coding. (Being somewhat fastidious, at least I'm content knowing that every HTML character and tag has been hand-keyed, sans extraneous source code that's automatically generated.)

Beyond the tedious development of Phrenicea however, the terms "by hand" and "handmade" seem to be on the wane — and it stems from our growing disconnect from the physical world that unfortunately begins today early in childhood.

A case in point, this is from the December 1960 issue of Popular Mechanics:

Model making has replaced stamp collecting as the nation's number one hobby. A week after the U.S. Air Force announced its Starfighter jet set a new altitude record, miniature construction kits of the plane were sold out in stores from coast to coast.

It's hard to fathom today with YouTube, Facebook, cell phones and iPods that model making back then was the favored pastime of America's youth — and that stamp collecting ever was!

Baby Blues
© 2011 Baby Blues Partnership. Reprinted with Special Permission of King Features Syndicate.

One wonders whether kids today, so detached and desensitized by all things computer, would even get excited about a manned Mars landing. But that's another topic for the future.

Looking at the impact of our silicon-infused, technological society it is obvious that very little is done "by hand" nowadays — by young and old alike. For example:

  • What happened to the popularity of Erector Sets with their nuts, bolts, screws and mechanical parts? What about paint-by-number kits, Lincoln Logs and chemistry sets? These required tactile finesse while nourishing portions of the brain that no computer could possibly stimulate.
  • What happened to mending and darning of clothes? Most likely the blemished garment is just tossed out or given away.
  • What about home-cooked meals? This activity is diminishing with prepackaged and fast food becoming ever more popular — expanding waistlines the world over.
  • Who maintains their own cars? Who would even have a clue how?
  • Who attempts to fix... anything? A throwaway mentality prescribes buying new, as "good with their hands" skills eventually become extinct.

Those that are savvy have taken advantage of our evolution towards being "all thumbs." Given their rarity in today's mechanized world, the terms "made by hand" and "handcrafted" are implied synonyms for quality and usually an excuse to command excessively high prices. Other adjectives often associated with pricey handmades are "fine" and "exclusive." Watch out for them.

The comparatively few who have bucked the highbrow trend to attend college, pursuing instead one of the hands-on trades, are now enjoying very successful and lucrative
The savvy have taken advantage of our evolution towards being "all thumbs."
careers — especially if their work is top quality. They are in the minority vis-à-vis today's dime-a-dozen, white-collar, so-called professionals stuck in cubicles with few practical physical skills. The blue-collars today conceivably are living better than they ever dreamed and doubtless have more work than they can possibly handle.

Since 1999, the Phrenicea scenario of the future has predicted a priority bestowed upon hands-on work. In fact, what are termed "blue collar" jobs today are projected to be the highest paid professions in the future. We may be seeing signs of that already today.

*****

A lot was done "by hand" in the old days. Will it be done as well — or at all — in the future?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:17 AM

Thursday September 1, 2011

Epigenetics and You...

The study of biology is difficult. The study of physics is difficult. The study of chemistry is difficult. Once learned, we tend to view these and other scientific disciplines as our inventions — often flaunting erudition with advanced-degree titles — when essentially all they are is our explanation of the complexities of nature — or worse — imagined competencies that can beget arrogance to disturb natural equilibriums.

In February 1953 James Watson and Francis Crick, after several years of leveraging their (and others') understanding of nature's complexity, announced to the world the chemical structure of DNA. Over the years much has been learned about the molecule that is common to all life on earth — but only recently has it become a colloquial term related primarily to forensics to identify perpetrators of crime. For most of the lay public, that is about as much as they know about the acronym.

But what is DNA?

'Imagined competencies can beget arrogance to disturb natural equilibriums.'

DeoxyriboseNucleicAcid is a long-chained chemical molecule that functions as life's blueprint, and can be visualized simplistically as a ladder twisted like a spiral staircase — with each step (for illustration purposes) being one of four colors: black, white, red, or green.

Long stretches of DNA with many steps become genes. Many genes strung together become chromosomes. Every human has, in each of their trillions of cells, 23 different chromosomes, each containing hundreds or thousands of genes. There are actually two sets of 23 chromosomes, one set from each parent. And each of the 23 has specific functions assigned — like growing fingernails or hair, or shaping the nose, etc.

The order or sequence of the steps' colors is important and unique to each person — and can reveal identity conclusively just like fingerprints. In other words, a person's uniqueness is determined by the sequence of the steps as one would imagine when climbing their DNA ladder. (For example: red, green, green, green, black, red, red, white, ..., red = Jane Doe.)

The study of DNA, genes and chromosomes and how they function is called genetics. Like biology, physics and chemistry — genetics is difficult too.
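The ladder analogy translates directly into how sequences are handled in practice: the four "step colors" correspond to the four DNA bases (A, C, G, T), and a sequence is just an ordered string over that four-letter alphabet. A minimal sketch — the sample names and sequences are made up for illustration:

```python
# The four DNA bases play the role of the four "step colors" above.
BASES = set("ACGT")

def is_valid_dna(seq):
    """A DNA sequence is an ordered string over the four-letter alphabet."""
    return len(seq) > 0 and all(base in BASES for base in seq.upper())

# Identity works like fingerprint matching: two samples point to the
# same person only when their sequences match step for step.
jane_doe = "ACGGGTAAC"   # hypothetical reference sample
evidence = "ACGGGTAAC"   # hypothetical second sample

print(is_valid_dna(jane_doe))   # True
print(jane_doe == evidence)     # True: the sequences match
```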

And what about epigenetics?

Epigenetics is a new sub-field of biology that studies how chemicals in our environment can play havoc "on top of" genes — acting like switches to turn them on or off abnormally. (The "epi" prefix means "on top of," as in epidermis — the outer layer of skin "on top of" the dermis.) So even though the gene appears normal and has not changed (or mutated), it does not function properly. Imagine those colored steps mentioned above blocked at various points to prevent them from being useful, even though they are still there undamaged.

Why is Epigenetics Significant?

Epigenetics is becoming important because it appears that many of the 80,000 man-made chemicals and pollutants in our environment can inadvertently create these epi-switches — and incredibly these can be passed on to future generations. The scary part then is that what your parents or grandparents were exposed to, before you were born, might be cumulatively more detrimental than anything you are exposed to directly in your lifetime; and the new switches you accumulate with exposure to today's many new toxins can also be passed on to your children and grandchildren.
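That cumulative, cross-generation picture can be sketched the same way. This toy model assumes, for simplicity, that inherited marks are never erased (in reality some epigenetic marks can be removed), and the toxin names are hypothetical:

```python
# Each generation inherits its parent's epigenetic marks and may add
# new ones from its own exposures (simplified: no marks are ever erased).
def next_generation(inherited, new_exposures):
    return inherited | new_exposures   # set union: the burden accumulates

grandparent = next_generation(set(), {"toxin_A"})
parent      = next_generation(grandparent, {"toxin_B"})
child       = next_generation(parent, set())   # no new exposure, still burdened

print(sorted(child))   # ['toxin_A', 'toxin_B']
```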

So what does this mean?

It means that with each generation the impact is potentially cumulative — and many scientists in the field are convinced that Alzheimer's, Parkinson's, cancer, autism, obesity, diabetes, asthma and perhaps many more maladies may be caused by the effects of epigenetics.

Biology, physics, chemistry and genetics may be difficult — but epigenetics is downright frightening!

For the optimists reading this, we can alternatively view epigenetics from a positive viewpoint:
     Way back in 1999, the short story The Engagement of Phrenicea prophesied, "artificial genes producing hormonal parameters and switches — in simplistic terms — could be set or reset by Phrenicea to control or monitor behavior 'officially' defined as antisocial, criminal, etc." [No crime!]
     And in 2004 we teased, "What if you could achieve and maintain the health that physical exercise promised, without any effort? If you could stay in so-called perfect physical condition without exercise, would you still work out? By mid-century this fantasy becomes reality. It was discovered that embedded within our DNA are gene expression-controlling switches tailored to 'sculpt' your preferred body shape. This epigenetic information can help to regulate the quantity and quality of the various chemical and structural components that make up our bodies. Ironically, it is the so-called 'junk' DNA, ignored for 50 years, that harbors these growth-controlling switches." [No exercise!]

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:57 AM

Monday August 1, 2011

Water On the Brain Syndrome

Now that the lazy, hazy, crazy days of summer are upon us, it's a good time to use the sultry weather as an opportunity to revisit yet again our feverish condition of "water on the brain." Way back in 2001 our "H2Ouch!" page began recommending the following:
"Pretend you were to pay $1/gallon the next time you take a shower or bath, brush your teeth, flush a toilet, wash the dishes, or God forbid — water the lawn! Begin to use less water than the average person. Set an example. Prevent H2Ouch!"

We still believe this is good advice, but perhaps the "hyperhydro" proposal was and still is naïve. The problem is that there is little incentive to conserve fresh water from the tap — given its ridiculously low price. For example, I recently received my "Annual Water Supply Report" from my local water company and was dismayed at how little fresh water costs. Here's the breakdown:

Quarterly Water Rates — Residential

Consumption (gallons)      Charges
Up to 8,000                $10.00 minimum
8,001 - 58,000             $0.90 / thousand gallons
58,001 - 100,000           $1.15 / thousand gallons
Over 100,000               $1.40 / thousand gallons
A dollar for 1000 gallons of clean, fresh tap water? That's insane! By comparison, bottled water by the gallon costs about $1.99. Not bad, but that's $1990 for 1000 gallons. Why is there such a cost disparity with tap water? How can anyone be motivated to conserve water at these low rates,
Wouldn't it be prudent to build "aqua equity" for future generations?
other than via a guilty conscience? And let's face it; there aren't many turning on their taps laden with guilt. (If the water companies got savvy they'd move their image upmarket with exotic brand names, premium pricing and refillable bottles with fancy labels adding cachet to their product. Imagine having bragging rights to elite-sounding potable water! It's not that silly a suggestion, since that is essentially what Coca-Cola did with Dasani and PepsiCo with Aquafina. They're both filtered municipal tap water.)
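To make the cheapness concrete, here is a sketch of a quarterly bill under those published rates. It assumes the tiers are marginal, i.e. each rate applies only to the gallons within its own bracket; the report doesn't actually specify this:

```python
def quarterly_water_bill(gallons):
    """Quarterly residential bill under the tiered rates quoted above.
    Assumes marginal tiers: each rate applies only to usage inside its
    bracket (an assumption; the rate report doesn't say)."""
    bill = 10.00                 # minimum charge covers the first 8,000 gallons
    brackets = [                 # (tier upper bound, $ per thousand gallons)
        (58_000, 0.90),
        (100_000, 1.15),
        (float("inf"), 1.40),
    ]
    lower = 8_000
    for upper, rate in brackets:
        if gallons <= lower:
            break
        bill += (min(gallons, upper) - lower) / 1000 * rate
        lower = upper
    return round(bill, 2)

# An entire quarter at 58,000 gallons costs about what 28 gallons of
# $1.99-per-gallon bottled water would:
print(quarterly_water_bill(58_000))   # 55.0
```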

Actually what we really need is a "watershed moment"; a trickle-down epiphany to appreciate how finite and precious our water supply is. The first step should be to make users conscious of their water consumption — and that can be accomplished handily by raising the price per gallon and using a more dramatic cost gradient for excessive use. It sounds crazy, but those concerned about conservation should lobby for pricing increases.

Another way to raise awareness might be to move our water meters out from their usual obscure locations into full view in kitchens and bathrooms — fitted with big, red digital read-outs displaying gallons used in real time. Education on where our water comes from and how it's treated,
We should instill among ourselves an idiom-cum-mantra to "spend water like it's money."
stored, delivered and renewed would also serve to engender an appreciation of what is the major constituent of all living things.

There's an old saying attributed to spendthrifts that they "spend money like it's water." Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." We all tend to waste water and take its abundance for granted — it even unintentionally spills over into the comics:



© 2007 Baby Blues Partnership. Reprinted with Special Permission of King Features Syndicate.

Did Zoe and Hammie's tub really have to be filled to the brim?

*****

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 7:18 AM

Friday July 1, 2011

One Smart Cookie...

Help Wanted

FORTUNE COOKIE WRITER:
PhD in biology required. Sociology BA helpful. Brevity and a belief in the supernatural a plus. Only smart cookies will become fortune(ate) candidates.

The above want ad sounds ridiculous of course. But that is what I envisioned recently after opening a Chinese cookie to find the following fortune:

"Even as the cell is the unit of the organic body, so the family is the unit of society."

A socio-bio-based aphorism in a fortune cookie? Could this have been created by an underemployed graduate of biology or sociology?

Perhaps this is early evidence bolstering the Phrenicea scenario of the future that envisions not enough jobs to employ most of the world's population. As higher education becomes the norm globally, college graduates will have to assume lower scale jobs formerly taken by those with high school diplomas or less.

Already, more and more sales associates, telemarketers, customer service reps, bank tellers, bookkeepers, etc. are bringing to the job the benefit of a four-year degree.

But who would have thought about a "fortune cookie writer"?

With the popularity of Chinese food on the rise, there may be a strong demand for offbeat fortune writers! Fortune cookies are a lot like horoscopes. Our intellectual side tells us it's all
As higher education spreads globally, graduates will have to assume lower scale jobs.
not to be believed. Yet, just as many feel compelled to read what the day may bring for their zodiac sign, opening a fortune cookie for its bit of wisdom or prediction can be irresistible. And if the theme just happens to coincide with a life situation, that provides reinforcement to look forward to the next time.

Ever the skeptic, I saved a few choice fortunes through the years that had some relevance to see if their message would ever be realized. So here goes:

"Good fortune is just around the corner."
Unfortunately, I haven't turned this corner yet.
"You will soon gain something you have always wanted."
This might have come true;
but it was probably so trivial I didn't realize it.
"Financial hardship in your life is coming to an end. Enjoy!"
All I know is that I'm still waiting for this end to come.
"Two small jumps are sometimes better than one big leap."
I have no idea why I kept this one, although I'm wondering now if Neil Armstrong took that fortune cookie job.
(What does one do after landing on the moon?)

Even though it's evident I've not had much luck with fortune cookies, I still can't wait to order Chinese again. Will I crack open another smart cookie to find an abstruse, recondite maxim with a scientific theme? Or will I have the misfortune of reading just another silly platitude? I'm really hoping for the former.

Forever puerile, I'll imagine my next postprandial surprise to be the work of some underemployed subatomic particle physicist, shedding an optimistic photon beam on my two left feet so I can finally show off some dance floor prowess:

"You will learn to be like a meson — a strongly interacting boson — a hadron with integral spin!"
I'll certainly save this one. "Dancing with the Stars" look out!
*****

Imagine — physics graduates crafting cookie fortunes. An inane exaggeration? Perhaps, but let us hope that's not how the cookie crumbles for many other fields of study the world over.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 6:43 AM

Wednesday June 1, 2011

Then and Now

Time is fascinating when you take the time to think about it. We tend to consider time a man-made commodity as if we created it — when in fact all we really do is measure it. Fooling ourselves in the process, we parse time into nanoseconds, microseconds, seconds, minutes, days, weeks, months, years, decades, centuries, millenniums, periods, ages, eras, eons, epochs.

As we're measuring we can have a great time; a sad time; a boring time; a wonderful time; even an exceptional time. We can also waste time, spend time, give up time, give back time, and squander time.

Our perception of time can vary. It can go fast; go slow; even stand still — like watching a DVD with a remote. And at times it may seem to repeat itself as Yogi Berra famously observed, "This is like déjà vu all over again." Unfortunately time cannot be rewound or made to go backwards. (Why is that?)

The musically gifted write songs about time: Turn Back the Hands of Time; Time Is On My Side; Time Passages; Does Anybody Really Know What Time It Is?; Till the End of Time; and As Time Goes By are just a few.

We also have colloquial phrases like "It's about time," "Time is precious," and "Time for change." Some are heard more often than others. (The rhetorical promises of the last U.S. presidential campaign come to mind.)

We can use time to describe ourselves and others. We can be Big-time. They are only Small-time.

We can even use time to punish: "You're in time-out!" and "25 years to life!"

We can live our lives according to time, as in "Early to bed and early to rise." Most strive to live for the moment. Many yearn to go back the old days. Others impatiently anticipate the future.

Time can be ahead of us or behind us, which becomes more important for each of us as time goes by.

The age of time is very old. Physicists tell us it's been around since the birth of the universe nearly 14 billion years ago. (This used to sound like a big number before all of the recent financial bailouts.) During our relatively brief time here on earth we learn that some things disappear, new things arise, and some things stay relatively the same.

An interesting activity is compiling "Then and Now" images to see how visibly things have changed — in some cases so much so as to almost revert to the way they were.

Here are some examples:

• TV in a Console (Zenith, 1964) vs. TV on a Console (Bolla, 2009): the more things change... the more they stay the same.
• TDK cassette ad (Johnny in 1973) vs. iPod ad (Johnni today): the more things change... the more they DON'T stay the same.
• Armstrong tire ad (1965) vs. Pirelli tire ad (2009): gripping advertising never loses traction.
• Coke then (1968) vs. Coke now: some things thankfully never change... at least on the inside.
• United ad, "How come our girls are so capable?" (1951) vs. Qatar ad, "Best cabin crew in the Middle East.": in-flight departures from bias and stereotyping thankfully arrived.
• Columbia record club ad, "Wow! A record club!" (1968) vs. iTunes, "Wow! No record club!": great hits and stars by snail mail then, instantly now.
• Western Electric ad envisioning push buttons (1959) vs. iPhone ad: remembering push buttons today.
• Toyota 2000GT (1968) vs. Pontiac GXP (2009): recycling sheet metal... in the figurative sense.

When you take the time to think about it, time really is fascinating. After seeing Johnny with his TDK tape cassettes, the stewardesses enduring insipid questions, and push-button phones once defining our future — would anyone really want to go backwards in time?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:27 AM

Sunday May 1, 2011

Brains & Neurons & Computers! Oh, my!

The premise of the Phrenicea scenario of the future is that we give up trying to copy the human brain with computer technology and artificial intelligence, and instead appreciate its amazing complexity by immuring our brains upon death, keeping them alive within a "braincomb" to function and assist those still living their ambulatory lives.

A while back I emailed Dr. Terrence Deacon, professor of Biological Anthropology and Neuroscience at the University of California-Berkeley. He's an honest-to-goodness, genuine brain scientist — as opposed to certain computer scientists who are brain-expert wannabes, typically proselytizing their tenuous computer-brain analogies and artificial intelligence predictions in the media. (We at Phrenicea like to call them "chipheads.")

I asked Prof. Deacon for his views on computer technology and artificial intelligence, and even the feasibility of keeping brains alive a la the Phrenicea scenario. I also asked, "Given that 'brains are not designed the way we would design any machine' [a direct quote by Dr. Deacon], do you agree or disagree with my major assumption that we will not (within the next 100 years) be able to mimic the brain's function utilizing non-biological technology?"

Given Dr. Deacon's lofty stature and busy schedule, I was pleasantly surprised and appreciative to receive his prompt and cogent reply as follows:

1. Brains are quite unlike computers. The closest thing to computation in the brain is probably the performance of highly over-learned rapid ballistic movements and systems that regulate visceral systems. And even these are in effect the production of "virtual computations" — not true computations but simulations of very simple computation by stochastic approximation. Most of cognition is much more analogous to chaotic attractor dynamics and evolutionary processes.
[My interpretation: Brains are not like computers. (Yes!) Learning physical movements to the point where they become "automatic" (becoming proficient in a sport, a musical instrument or even driving a car) might, from our perspective, make the brain appear to compute — and not an exact form of computing at that; more like a controlled anarchy of processes.]

2. Despite the incredible network size of brains and the additional fact that even the subcellular level of intraneuronal information processing rivals current "neural net" simulations, neither complexity nor unattainable dynamical organization offer an unsurpassable threshold.
[My interpretation: We'll eventually unravel the mysteries of how the brain works.]

3. Once we get past our fascination with the brain-as-computer dead-end metaphor it should not be impossible to begin to build (or more likely grow) devices that utilize this kind of information-generation logic (notice I didn't say "information processing") and at least some of the adaptive and agentive features of brains will be possible to achieve (and I don't mean simulate) this way.
[My interpretation: The brain is not analogous to, and does not work like, a computer. However, we may one day be able to grow biological structures that mimic the brain in some limited ways.]

4. Neurons don't live forever and are subject to spontaneous functional degradation even in perfect metabolic support conditions.
[My interpretation: The envisioned Phrenicea "braincomb" will need more research and development to sustain human brains forever. Back to work guys and gals!]

5. Also, contra sci-fi stories, the idiosyncrasy of representational encoding in different brains and the incompatibility of cognition and computation processes will make transfer of a person's memories, thoughts, personality traits, and experiences to different "media" (e.g. brain to machine) impossible.
[My interpretation: Our knowledge, memories, etc. will never be transferred to silicon or any other inorganic computer-memory substrate.]

Terry

*****

In a very short space, Prof. Deacon provided authoritative views on computer technology and artificial intelligence vis-à-vis the human brain. He is pessimistic about our ever mimicking the brain's function utilizing non-biological technology. He even addressed the feasibility of keeping brains alive a la the Phrenicea scenario.

You may or may not agree with all of his assessments (or my interpretations!). Regardless, this perhaps is the most exciting field of study today — and probably will be for most of the 21st century.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 9:08 AM

Friday April 1, 2011

Nuclear Redux Redux

I feel sadly vindicated. The nuclear-power memories that were fading way back in 2005 are suddenly all over the news again.

Unfortunately it took a disaster, and not logical concern, to wake us up. We humans are really pathetic....


Saturday August 20, 2005

Nuclear Redux — or Déjà vu?

Surprisingly, our current QuikPoll so far indicates more concern with fossil-fueled global warming than with the long-lived and deadly radioactive waste (and weapons-grade fuel) produced by nuclear power plants. This is astonishing, especially given the threat of terrorism and the third world's desire for nukes.

It's amazing how attitudes can change over the course of time — about 60 years in this case. At the dawn of the Atomic Age, there was optimism that not only would "going nuclear" fuel power plants to generate electricity — it would also power our planes, cars and rockets — and cook our food! (Even Walt Disney was convinced, producing the movie and book, "Our Friend the Atom.")

Initial optimism was eventually supplanted by concern about bomb proliferation; paranoia associated with the loss of U.S. technological supremacy (on this date in 1953 the Soviet Union acknowledged it had tested a hydrogen bomb); and uncertainty as to the long-term effects of the radiation from weapons testing. The many science fiction movies from the era depicting awakening dinosaurs, giant insects, and incredible shrinking men attest to the uneasiness associated with radiation. Chernobyl and Three Mile Island seemingly were the final straws that broke nuclear's back. But, memories fade.

Politicians and the nuclear industry are capitalizing on today's radioactive blaséness with talk of new reactors helping to solve the global warming predicament. Cameco, the
Recycling dangerous technology may not be the answer to global warming.
"world's largest uranium producer," crows the slogan "NUCLEAR. The Clean Air Energy." Embedded in Fortune magazine's August 8 issue is a nuclear-industry-funded feature posing as objective content, proclaiming a "Nuclear Redux." The piece, cleverly written by freelancer Robert McGarvey (who might have sold his professional soul here), subtly conveys an environmentally friendly green theme with an innocuous graphic, a green pull-quote highlighting nuclear's return to center stage, and a green text box incredibly proclaiming that "Radiation is good for you"! How subtle. How frightening!

There's little doubt that global warming is a pressing issue requiring action. However, the answer may not be to recycle dangerous technology. We owe it to ourselves and to posterity to become sufficiently educated to intelligently evaluate potential options. It's popular now to brand Global Warming "bad" because that's been the predominant message; and nuclear energy as comparatively "good" because: (1) it's been out of the tabloid news for twenty years; and (2) that's what some politicians and the industry would now have us believe.

Don't listen passively to either view. The world and its inhabitants are at stake and the clock is ticking.

Time will tell...

posted by John Herman August 20, 2005 7:22 AM


Feedback: Your Two-Cents!

Ships sink, planes crash, cars still kill thousands and thousands of people a year, oil platforms explode. Technology is never perfect. Sometimes we fail. Do we stop trying to make these safer and ban them forever? We learn and move on making them safer. Right now nuclear power is part of our energy production. Someday it will be replaced but until then it has a role to play in producing our energy. The world will learn from Japan's misfortune and make power plants safer.
-Steve H.

Steve you are 100% right. There are no guarantees in life. Crossing the street is a challenge. The people in Pennsylvania are exposed to more radiation because of radon in the rock formation in that state, all homes have monitors in their basement because of the dangers of radon.
I am going to play golf today. Hopefully some crazy golfer won’t hit me with his drive or club.
-John S.

I'll respectfully disagree with you both. I'll take Pennsylvania over Japan. And yes, nuclear plants should be banned.

Time will tell...

Only two things are infinite, the universe and human stupidity, and I'm not sure of the former.
                               - Albert Einstein

Walt Handelsman
Walt Handelsman
Walt Handelsman
© Walt Handelsman. Dist. by Universal Uclick. Reprinted with permission. All rights reserved.

posted by John Herman April 1, 2011 6:12 AM

Tuesday March 1, 2011

"Baby" Boomers Turn 65 — Ugh!

Tick...tick...tick. The "baby boomer" population bulge is approaching 65. I'm a leading-edge boomer and frankly am having trouble accepting that fact. I seem to have a lot of company too — most of my friends, coworkers, and fellow (grand?) parents are boomers and many of them feel similarly distressed. Why are we having so much trouble accepting this aging thing?

Through the years I've developed a belief that it's because baby boomers were raised to think of themselves as privileged, special and young, as the moniker implies. We are the product of an unprecedented birthrate and as youngsters were bathed in attention — regardless of whether it was for our potential in those promising and optimistic postwar years or for profit by opportunistic marketeers. We were the center of attention and the target market virtually from birth. Consider the following:

Pediatrician Dr. Benjamin Spock wrote a baby and child care book just for us — to ensure that our parents would provide a nurturing environment to help us grow to our full potential, both mentally and physically. It was a best seller for years.

Our parents, who experienced deprivation during the Great Depression and World War II, were determined to give us what they did not have: a comfortable home, plenty of food, new clothes and a good education which included a college degree. Everyone would be "white collar." Yes, we were really special.

Whole neighborhoods were built just for us, a phenomenon that became known as "suburban sprawl." Virtually everything was new. We had new houses, roads, stores, schools, buses, playgrounds and pools. We were handed new books and new tests designed to measure us, provided we simply darkened the proper circles with #2 pencils. Eventually we would even have "new math."

The Golden Age of TV catered to us with Captains Video, Midnight and Kangaroo; Space Cadet and Space Patrol; Kukla, Fran & Ollie; Davy Crockett, Superman, Howdy Doody, Mickey Mouse Club, Roy Rogers, Lassie, Mr. Wizard et al. We watched and matured with wholesome situation comedies depicting the idealized kid-centered family like Ozzie & Harriet, Father Knows Best, Donna Reed and Leave it to Beaver. TV would later contribute to and reflect our loss of naïveté with such ground-breaking shows as Smothers Brothers, Laugh-In, All in the Family and Saturday Night Live. Television's impact cannot be overstated — we were the first to grow up with it and it helped to reinforce our preoccupation with youth and Self.

A plethora of toys and gadgets were invented just for us: Slinky, Mr. Potato Head, Barbie, Etch A Sketch, Silly Putty, Hula-Hoop, Frisbee, a watch with a Mickey Mouse face and the world's biggest toy — Disneyland. Naïvely optimistic slogans professed things like "Progress is Our Most Important Product." Planned obsolescence brought us the "new and improved" designed for quick and easy disposal. We had transistor radios and record changers playing 45s. Stereo LPs, 8-track and cassettes would soon follow. So would calculators, computer games, and digital watches. Yes, we were spoiled and literally growing up like no generation before us.

We were comforted to know that we would always have all the food we would ever need — with farm mechanization and the liberal use of pesticides. The threat of serious disease was virtually eradicated for us with "miracle drugs" such as antibiotics and vaccines. Advanced surgical techniques would enable us to change our features, our gender and even replace internal organs when they wore out. We learned that we could take drugs to cure infertility or The Pill to ensure it. What power was brought forth for us!

We were dazzled with jets, rockets and space flight spurred by Sputnik. One day we would all travel through space. Impatient families would ride in automobiles sporting bullet-nosed bumpers and huge tail fins. I can remember being lectured at school that we should feel privileged growing up in the Modern Age; and that our eyes would eventually see what no others saw before us.

But we didn't have to wait. We saw tremendous progress during those prosperous 1950s and '60s and were often reminded that we would be the prime beneficiaries because "our whole lives were ahead of us." Y-O-U-N-G was etched into our psyche. We felt catered to and it made us feel very important — and that the world should revolve around us.

This attitude combined with our critical mass gave us leverage to have an impact on society beginning in our late teens and twenties. We began to question everything and inadvertently turned into a political force when our implanted optimism was replaced with disillusionment. We became demonstrators, hippies, college drop-ins and establishment drop-outs — and we didn't trust anyone over 30. We affected the style of dress, hair, and music; and hedonistically espoused "free love." We marched to end the arms race, the Vietnam War, and to give Peace a chance. The unintended effects of the so-called miracle drugs were beginning to show up with cancer and birth defects, and we read Rachel Carson's Silent Spring. We became concerned about the impact "progress" was having on the environment. And many tried but failed to escape this reality explosion with mind-altering drugs.

Then we turned 30. To remain trustworthy we selfishly redefined Young vs. Old. Our idealism was replaced with good old-fashioned materialism. These would be the years of the Me Generation. "Free love" was replaced with free spending. We sold our ideals for the good life that selfishly included expensive vacations, cars and houses. While immersed in our indulgence, we reached "the big 4-O." We were now a massive group of "yuppies" and although we read cover stories about how we were losing our hair and gaining weight, we still were unconvinced that we were getting old.

Then in the blink of an eye we hit 50 — that (gasp) half-century milestone. Although chronologically defined as hard-core middle-agers, we still refused to succumb to the stereotype of age that we held for our parents. We continued to rock'n'roll, workout to stay fit and tried to eat healthy — or at least followed the fads from oat bran to pasta to tofu. It was a good thing too; we needed all the energy and stamina we could muster to survive the waves of downsizing, rightsizing, outsourcing and reengineering. Although many succumbed to early retirement, that did not stop us from learning new tricks; from embracing the Internet to becoming entrepreneurs. We'll survive, we told ourselves — we're resilient and Y-O-U-N-G.

Now we'll begin turning 65 and this famously spoiled (sans "baby"?) boomer demographic anomaly will be a force to be reckoned with. We'll be retired but not retiring; a horde of 79 million gray-haired activists that will leverage our generational clout to lobby for our interests. By sheer demographic heft weighing in at 25% of the population, we'll again effect change as in years past. We'll read cover stories at the 70- and 80-year milestones chronicling our deterioration and the burden our cohort will be inflicting upon younger generations. And the marketeers will be there pushing the latest pharma miracles, adult diapers, adjustable mattresses, hearing aids and life insurance. They'll capitalize on our penchant for nostalgia that will inevitably precipitate an onslaught of hyped retro-fads. We'll call ourselves rockers even when we're sitting in them nodding off with Modern Maturity. And still we won't believe we're O-L-D.

Time will tell...

Comics Just for Us "Baby" Boomers!

Mallard Fillmore
Mallard Fillmore
Mallard Fillmore

Mallard Fillmore
©2011 Reprinted with Special Permission of King Features Syndicate.

Presented as a "Timely Yet Timeless" post by John Herman 11:07 AM

Tuesday February 1, 2011

Created equal?
      Free Will?

Are we fooling ourselves?

The famous statement "...all men [and women!] are created equal..." appears in the opening of the American Declaration of Independence, written by Thomas Jefferson in 1776.

Depending on your beliefs, you may view your "Creator" as a higher being, or perhaps the forces of Evolution and Natural Selection.

Regardless, as our quest for scientific knowledge marches on (making our lives ever more complex) — particularly related to biology — we are going to have to accept that we are not all created equal.

As gene mapping continues to identify what gene is responsible for what trait or function,
An entire industry will be born for splicing favorable genes to replace those deemed undesirable.
we'll come face-to-face with our "good" and "bad" genes, and inevitably realize the reality that every aspect of our lives is in some way affected by them.

Comparative genomics is now in progress, with DNA sequencing of humans, chimpanzees, gorillas, orangutans, ancient hominids like the Neanderthals, and even more primitive organisms down to lowly yeast. When completed, we'll be able to identify the genes we have in common with our hereditary ancestors going back millions of years.

As Dr. Terrence Deacon, professor of Biological Anthropology and Linguistics at University of California-Berkeley states:
"Certainly studying human evolution makes you think about the future. It makes you realize how much baggage we carry forward with us at all times."

Many experts acknowledge that we still harbor instinctive traits from our primitive past. And we have to confront the stark reality that we'll soon have the capability to pinpoint who has what gene and its associated characteristic. Perhaps it will explain much in terms of our oftentimes uncivilized, barbaric behavior.

Today DNA tests are used as "fingerprints" to prove culpability in crime. In the near future potential perpetrators will be tagged virtually at birth when "crime" genes are identified.

It is our take, however, that just about everything we do — every facet of "humanness" — is controlled by our genes. Of course the nurture aspect plays a part too, but even our reaction to our unique environment is determined by our genes.

As genes are mapped to specific traits, it will become evident at conception who has an advantage/disadvantage innately in terms of:

  • Musical talent
  • Athletic ability
  • "Photographic" memory
  • Reasoning ability
  • Phobias
  • Mathematical ability
  • Verbal ability and propensity to learn languages
  • Hand-eye coordination
  • Aggressiveness/passivity
  • Bravery/cowardice
  • You name it...

Of course along with this newfound knowledge will come genetic profiling — and an entire industry
Potential perpetrators will be tagged at birth when "crime" genes are identified.
geared towards the splicing of perceived favorable genes in place of those deemed undesirable, for those with the inclination and wherewithal. And this is where the real danger lies.

Just as our disruption of the macro-environment has led to pollution, species extinction and global climate change — analogous unintended consequences await our intrusion into the ancient inner workings of cells — due to our ignorance of yet-to-be-discovered functions of non-coding (junk) DNA and convoluted interrelationships between epigenetic switches and seemingly unrelated genes.

Worrisome? You decide.

*****

Finally, here's a suggestion: The next time you attend an event with people of similar interest — a concert, lecture, museum, ball game, etc. — ponder how you found yourself in the company of strangers with the same tastes, enthusiasm, excitement or passion — as you. Was it free will? Or could it be you all have the same genes that led you there?

Are we fooling ourselves?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:51 AM

Saturday January 1, 2011

The Twelve Blogs of '10

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last four years' cop-outs to lethargically review the "Twelve Blogs of 2010" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant.

December's blog questioned whether knowledge is beneficial or baneful:

Looking ahead to the future, too much knowledge might even be mentally debilitating. Imagine an over-the-top scenario where we had to know from birth an important aspect of our future — the date and time of our demise... (Read more)

• In November we did the math to sum up whether we've experienced exponential progress since entering this century:

A lot happened in the 20th century making its end appear quite different from its beginning. So, let's evaluate the most significant achievements/events that occurred in those incredible ten decades vs. the 21st century to date... (Read more)

October's blog contemplated why many of today's richest, most influential and creative individuals broke from the mold and charted their own paths to success:

The solution to increased competition from an ever-growing global workforce will be attaining two important qualities sooner rather than later: maturity and passion. Unfortunately many of today's youth are lacking in both, and that quest seems to elongate with each succeeding generation... (Read more)

• In September we pondered whether too much time is being squandered on skill-less gratification:
While watching an old Star Trek episode I realized that I and fellow Trekkies have been watching Kirk and crew beaming up via the transporter for over forty years. I wonder now after watching the series for so long, whether I'd be thrilled if the old-fashioned transporter (as well as warp drive, phasers, and photon torpedoes) finally became reality... (Read more)

• In August we suffered the recurring affliction of "water on the brain":

Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." We all tend to waste water and take its current abundance for granted.

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations? (Read more)

July's blog exposed the significance of treating angina with Viagra:

The unexpected rise out of Viagra illuminates an important point: Drug synthesis and discovery is not really by design, although most chemists in the industry will not readily admit such. The chemical compounds are created with as much art as science by unnatural means... (Read more)

• In June we lamented the metaphorical switching of gears:

Shanthi Gears is not located in California, Ohio, Maryland, Massachusetts, Indiana, Connecticut or New York. It's manufacturing those gears in Tamilnadu, India. And while it is turning the wheels of industries worldwide, the U.S. has shifted into high gear towards a service economy. The once ubiquitous "Made in U.S.A." is rarely seen today... (Read more)

May's entry presents proof that having an interest in predicting the future is by no means qualification to do so:

Who would have predicted that magazines would fade away into obscurity after decades of influence and significance? Certainly not me... (Read more)

• In April we subscribed to the idea that the periodicals industry is pursuing a self-fulfilling prophecy by warning of its own demise:
The obvious weakening of substantive content engenders a self-imposed fait accompli discouraging sales, which in turn brings in less advertiser revenue, thus forcing staff cutbacks which lead to further decreases in content, and a spiraling sales decline... (Read more)

• The March blog attempts to explain why epigenetics is downright frightening:

Epigenetics is a new sub-field of biology that studies how chemicals in our environment can play havoc "on top of" genes — acting like switches to turn them on or off abnormally. (The "epi" prefix means "on top of," as in epidermis — the outer layer of skin "on top of" the dermis.) So even though the gene appears normal and has not changed (or mutated), it does not function properly... (Read more)

• In February big thoughts about small-minded behavior prevailed:

And after thousands of years we're still at it. We can make just about anything into a symbol of status. But one person's object of distinction might be another's folly. So we have to be among like-minded people to make an impression. And you can't even take it with you! (Read more)

January's entry was tinged with nostalgia and torpor — and guilt for not writing a new essay:

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last three years' cop-outs to lethargically review the "Twelve Blogs of 2009" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant... (Read more).

You may or may not agree, but after reviewing the "Twelve Blogs of 2010" it appears most are still relevant as we roll into the New Year. And somehow after recalling the energy that instigated the blogs to begin with, we don't feel quite as lethargic.

Maybe that's the real benefit of looking backward, reminiscing and singing "Auld Lang Syne" — not for nostalgia per se, but to recharge our batteries and begin another year with renewed energy and enthusiasm.

Time will tell...

Happy New Year!

posted by John Herman 12:05 AM

Wednesday December 1, 2010

Is Knowledge Power — Or is it Burden?

"Knowledge is Power" is a well-known quotation, first uttered by Sir Francis Bacon in 1597. Unfortunately in today's world knowledge is also a burden; a cause of worry.

Example: Global Warming.
Much worry is attributed to our knowledge of carbon dioxide emissions and the theoretical impact of high concentrations in the atmosphere. Endless cerebral energy is spent debating whether the weather is becoming more severe and erratic, and if/when coastal cities will submerge. Yet the theory may still be proven false. [We don't think so.] But might we be better off dumb and happy?

Example: The mapping of the human genome.
As knowledge of genetics becomes more sophisticated, susceptibility to ever more diseases will be determined by markers (gene sequences) within a person's genome. Imagine the agony knowing that a terminal or debilitating disease is inevitable, well before any symptoms and before an effective cure becomes available. Many will find themselves cheated out of a joyful life knowing that someday they are likely to succumb to an inherited genetic disorder.

Example: The anticipation of a deadly pandemic.
Many around the world are involved in envisioning horror scenarios should a virus mutate into a form that can be transmitted among humans. Just enough is known about viruses and past pandemics to ponder the consequences. Several years ago it was feared the H5N1 bird-borne flu bug would evolve to make humans its primary host. Much energy is expended discussing "What if?", brewing worldwide fear. It may never happen. But then again, it might.

Example: Events calculated to be "overdue" based on mathematical laws of probability.
Anxiety can be engendered in a locale or region susceptible to devastating hurricanes or earthquakes if, based on historical data, it is deemed at risk "any year now." It makes for great TV news on a slow day, and is a cause of (needless?) apprehension.
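For what it's worth, that "overdue" intuition can itself be put to the test. Here's a minimal simulation (our illustration, not part of the original post) under one simplifying assumption: disasters strike as a memoryless process averaging one event per 50 years. In such a model, decades without an event do nothing to raise next year's odds — the "any year now" framing is the gambler's fallacy:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible
ANNUAL_PROBABILITY = 1 / 50  # assumed: a devastating event once per 50 years, on average

def years_until_event() -> int:
    """Simulate how many years pass until the next event (a geometric wait)."""
    years = 1
    while random.random() >= ANNUAL_PROBABILITY:
        years += 1
    return years

waits = [years_until_event() for _ in range(100_000)]
avg_wait = sum(waits) / len(waits)

# Now condition on a locale already 60 years "overdue": the *additional*
# wait has the same ~50-year average, because the process has no memory.
extra_waits = [w - 60 for w in waits if w > 60]
avg_extra = sum(extra_waits) / len(extra_waits)

print(round(avg_wait, 1), round(avg_extra, 1))  # both hover near 50
```

Of course real hurricanes and earthquakes aren't perfectly memoryless (strain does accumulate on faults), but the sketch shows why "statistically overdue" makes for better television than risk assessment.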

Could our sophisticated state of knowledge be burden enough to instigate proactive worry — and contribute to the reasons why a great many people smoke, drink and engage in other activities that help them take leave of their senses?

*****

Looking ahead to the future, too much knowledge might even be mentally debilitating. Imagine an over-the-top scenario [again?!] where we had to know from birth an important aspect of our future — the date and time of our demise. Once cognizant, would we not count each day in anticipation of "The End," effectively zapping our lust for life? Who knows, conditions related to population or some other critical parameter may eventually warrant a predetermined appointment with death... To donate a brain perhaps?     ;-)

So, is knowledge beneficial or baneful?


Dilbert 10/28/2007

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 8:37 AM

Monday November 1, 2010

A Tough Act to Follow

It's been a while since we took to task Ray Kurzweil, the self-proclaimed "visionary futurist" whom we fondly call Chiphead — our respectfully disrespectful appellation for the champion of chip-based intelligence. He predicts: "We won't experience 100 years of progress in the 21st century — it will be more like 20,000 years of progress (at today's rate)."

We relish lambasting Kurzweil because of his unabashed arrogance. A lot happened in the 20th century, making its end appear quite different from its beginning. So, let's evaluate the most significant achievements/events of those incredible ten decades vs. the 21st century to date, which is already approaching one-tenth of its 100 years. You can then adjudge the veracity of Kurzweil's claim, since by now we should have progressed ten percent of 20,000 years. (Do the math — that's 2,000 years of progress since entering this new century.)
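The back-of-the-envelope arithmetic can be sketched in a few lines of Python. (A caveat of our own: this straight-line pro-rating is a simplification for illustration — Kurzweil's model is exponential, which would front-load less progress into the century's first decade.)

```python
# Straight-line sanity check of the claim quoted above (our illustration).
# Assumption: if the 21st century delivers 20,000 "years" of progress and
# progress accrues evenly, one-tenth of the century owes one-tenth of it.

CLAIMED_PROGRESS_YEARS = 20_000   # Kurzweil: 100 calendar years of 21st-century progress
CENTURY_FRACTION_ELAPSED = 0.10   # roughly one-tenth of the century has passed

implied_progress = CLAIMED_PROGRESS_YEARS * CENTURY_FRACTION_ELAPSED
print(implied_progress)  # 2000.0 -- the "2,000 years of progress" owed so far
```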

Significant achievements/events in the 20th Century:

  • broadcast radio — cultural homogenization
  • airplane — shrinking of the globe
  • telephony — connecting the world
  • tungsten light bulb — consumption of electricity, 24 X 7
  • automobile — suburbia, superhighways, petroleum economy, pollution
  • TV — cultural homogenization, couch potatoes
  • synthetic chemistry — Nylon, Dacron, PVC, Big Pharma and more — the end of "all natural"
  • Pearl Harbor — U.S world predominance
  • drive-in theater — submarine races, sci-fi B-movies*
  • atomic bomb — fear of non-fungal mushrooms, MAD
  • transistor — gadgets galore
  • antibiotics — (perceived) microbial domination
  • DNA double helix — life built from lifeless molecules
  • rock 'n' roll — generational segregation 1.0
  • credit cards — living beyond our means
  • Sputnik — Space Race, spin-off technology
  • The Pill — equality of the sexes
  • cable TV — cultural splintering
  • microprocessor — PCs and computerized everything
  • Internet/Web — digital revolution, outsourcing and offshoring

Wow! It could be debated though whether the list is the most significant. Perhaps there should be more, maybe less. And which is the most influential? Depending upon your perspective, it might be the automobile, The Pill, TV or the microprocessor. Then again, the influence of an event then might not be perceived as significant in this new century — like the Space Race perhaps.

An objective approach is to use some measuring criteria. In keeping with the offbeat spirit of this website, we hereby propose the "B-movie* plot test," using the fodder of drive-in theaters from days gone by. (From Wikipedia: The term B-movie originally referred to a motion picture made on a modest budget and intended for distribution as the less-publicized, bottom half of a double feature. U.S. production of movies intended as second features largely ceased by the end of the 1950s.)

Based on the following plethora of radiation-based B-movie plots, we submit that the atomic bomb had the biggest overall impact during the 20th century. Just feast on the beasts below:

  • The Beast From 20,000 Fathoms — an atomic test in the arctic thaws a dinosaur so it can migrate back to New York to cause havoc.
  • Godzilla — American hydrogen bomb tests awaken and mutate the monster Godzilla. You know the rest.
  • Them! — the first U.S. nuclear test causes ants to mutate into giants.
  • The Beast of Yucca Flats — a defecting Russian scientist is chased by the KGB and winds up amidst a nuclear mushroom cloud. The radiation turns him into a killing beast.
  • The Amazing Colossal Man — a U.S. soldier suffers serious burns following exposure to plutonium from a bomb blast. He survives but the radiation causes him to grow into a giant.
  • The Crawling Eye — a radioactive cloud sitting atop a mountain has its climbers winding up decapitated without explanation.
  • Hideous Sun Demon — decades before sunblock and SPF, a scientist exposed to a radioactive isotope devolves into a scaly reptilian when caught in the rays of the sun.
  • The Giant Behemoth — at a science conference it is noted that atomic tests have contaminated plankton, fish, and birds in a "biological chain reaction" of radiation culminating with a monster that burns flesh with radioactive waves.
  • The Incredible Shrinking Man — a man is subjected to a radioactive mist that causes him to shrink beyond detection.

You could dismiss all this as Phrenicea-phoolery. But the 1950s were indeed scary and the atomic bomb is right up there in terms of centurial impact.

Now for a comparison let's list the significant aspects of the 21st Century to date:

  • cell phone mania — piercing ringtones, texting, rude cacophony
  • 9/11 — global terrorism
  • Google — domination of "search" and...?
  • iPod — ubiquitous earbuds, cultural isolation
  • human genome mapping — Too Much Information, personal revelations that we'll not want to cope with
  • Facebook — virtual socialization (seeds of Phrenicea?)
  • Internet dependence — on par with electricity and fresh water
  • Financial Meltdown — bank losses, credit contraction, loss of jobs, eroding property values, mortgage defaults.

What will be the most significant? 9/11? The iPod? Facebook? Dependence on the Internet? It's hard to infer from the narrow perspective of the present day (and sans the B-movie test!). But one thing's certain — we'll not have 2000 years of progress in this decade.

*****

In sensible and realistic terms, the sweeping change of the 20th century is going to be a tough act to follow.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:39 AM

Friday October 1, 2010

A Model Shattered?

The world has changed. Jobs are being lost to technology, outsourcing and the "flattening" of organizations. As intellectual skills rise globally, many of the jobs left are lower paying, heeding the old law of supply and demand.

This new reality shatters the half-century old model for graduating high school, attending college, and then getting a good "white collar" job.

You may have heard stories like these:

  • A fine arts graduate working as a "sales associate" at an electronics chain
  • A perpetual full-time student serially changing majors, pursuing a bachelor's degree well beyond the expected four years
  • A mathematics graduate stocking shelves at a health food store
  • Add one or more of your own here

Typically these are students who aren't sure of their purpose in life, and thus are funneled into a college course of study for the wrong reasons: pleasing parents; it's what's expected; anticipation of good money after graduation; a prestigious title; or perhaps the lure of landing a job with summers off and generous benefits.

The solution to increased competition from an ever-growing global workforce will be attaining two important qualities sooner rather than later: maturity and passion. Unfortunately many of today's youth are lacking in both,
The flood of intellectual talent morphs the corporate ladder into a horizontal plank.
and the quest to attain them seems to elongate with each succeeding generation. Today's extended reliance on parents precludes the development of independence and can encourage a preoccupation with proving adulthood through self-destructive means like binge drinking and worse. If young adults truly had adult responsibilities, such conspicuous indulgences would be pointless.

This is not folly. U.S. News reported on a film, "Two Million Minutes: A Global Examination" comparing American teenagers' attitudes to those in India and China. The conclusion: U.S. students are preoccupied with "having fun," and are less focused and motivated.

It's not necessarily the kids' fault. They're pushed, prodded and subjected to structured programs in academics and sports that rob them of extemporaneous life experiences and a sense of personal accountability. With little time to experiment and make self-inflicted mistakes, they miss the lessons those mistakes would teach. (Ironically, it's often the accumulated wisdom from mistakes that proves most beneficial years hence.)

Here's an excerpt from one student's college essay:

The competition is fierce. I've heard the infamous question, "What do you want to be when you grow up?" ever since I first started school. The expectation of teachers, guidance counselors and even family members contributes to the constant pressure for young adults to distinguish their future. When I was younger, I promised myself that I would end up doing something I truly loved. Feeling overwhelmed makes it harder for students to discover who they really are as people. Before all the stress, I used to have a long list of potential professions in mind, but now it all seems faint.

Not to paint an overly bleak picture, in reality many are ready to attend college right after high school;
Alternate pursuits not congruent with expectations should not be deemed inferior.
they're the fortunate ones. Too many are not, however. For them, this is not a recommendation against attending college, but a suggestion that alternate pursuits not be deemed inferior simply because they aren't congruent with entrenched expectations. Entering college prematurely, before some self-initiated responsibility and maturity have been garnered, is an invitation to disappointment.

It's not obvious today that the expectation of attending college is a relatively new phenomenon, first fueled by the post WWII GI Bill and mature, serious-minded veterans determined to climb the corporate ladder. But given the flood of intellectual talent nowadays, that ladder oftentimes morphs into a horizontal plank.

Perhaps it's time to go back to the model that existed prior to WWII. Very few went to college; most found jobs right out of high school. The benefit, however, was that the responsibilities of adulthood came very fast.

Just imagine a parent encouraging their college-aged son or daughter to:

Go out and experience life. Get a job. McDonald's, Home Depot, Starbucks and a zillion retail stores are starving for competent help. Then strive to earn some responsibility. Become a supervisor or assistant manager. Learn to effectively deal with customers, and with employees both more senior and more junior than you. These jobs are hard work. They are at times monotonous and can even be degrading. Become aware of what it's like to begin work without a specific skill set. Still, there are invaluable lessons to be learned. Wisdom to be acquired. The attainment of compassion and empathy. The opportunity to earn respect from others as well as for yourself.

Then after this experience perhaps you'll be better prepared for college. And you won't need your parents' hounding to earn good grades. You'll want them all by yourself.

Heresy? Of course this goes against the grain of today's expectations. But you may have heard of several well-known and successful individuals who followed unorthodox career paths, including Bill Gates (co-founder of Microsoft), Richard Branson (entrepreneur, Virgin Records & Airways), Arnold Schwarzenegger (bodybuilder/actor/governor), Steve Jobs (co-founder of Apple), Michael Dell (founder of Dell Computer), Craig Venter (iconoclastic genomic pioneer, first to decipher human genome), Billy Joel (pop musician), Ian Anderson (self-taught musician, salmon farm entrepreneur).

Is it a coincidence that many of today's richest, most influential and creative individuals broke from the mold and charted their own paths to success?

With the ever-increasing worldwide competition for jobs, the key to success may be discovering very soon after high school innate talents and a passion for something; anything. (As Google's super-successful executive chef Josef Desimone imparts, "I never wanted to be an astronaut... I only wanted to cook.")

What accompanies passion is an inexhaustible reservoir of energy for accomplishment. To quote an old proverb: "Do work you love and you'll never work a day in your life."



© Zits Partnership. Reprinted with Special Permission of King Features Syndicate.

*****

Maturity and Passion — a fortuitous combination — perhaps the 21st-century passport to success. Luckiest are the ones that have it at an early age. Fortunate are the ones that acquire it eventually. Sorry will be the ones unable to achieve it in time, or at all.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:16 AM

Wednesday September 1, 2010

Imitation Lives Imitating Art?

While watching an old Star Trek episode entitled "Return to Tomorrow," I realized that my fellow Trekkies and I have been watching Kirk and crew beam up via the transporter for over forty years. I wonder now, after watching the series for so long, whether I'd be thrilled if the old-fashioned transporter (as well as warp drive, phasers, and photon torpedoes) finally became reality. With the ever-increasing sophistication of special effects that can sometimes seem more real than for real: Are we becoming desensitized not only to the present — but also to the yet-to-be?

Looking back, the goal of audiovisual technologies in the early days was to make the artificial seem real. Phonographic recording media began as scratchy cylinders and clay platters and progressed to crystal clear
Reality show popularity reflects an ironic yearning to experience reality, via TV!
multi-channel discs that can sound better than live. Films evolved to do the same, from the primitive "silents" to the first "talkies" to Technicolor and widescreen leading to today's mind-blowing digital IMAXs. With all this amazing technology, the real — when it is remarkable in some fashion — now evokes the artificial.

For example, living through a tornado might remind one of the movie Twister. Seeing a dangerous car chase could elicit Gone in 60 Seconds. If and when we're invaded by aliens, it'll probably recall Alien, ET, Close Encounters or War of the Worlds — depending on their demeanor. If the experts warning of global warming are correct, one day reality will summon The Day After Tomorrow. If biologists succeed in cloning extinct species, we'll all be able to visit Jurassic Park. Maybe life imitating art will be the norm and not the exception.

Perhaps. But how can we continue to consume escapist entertainment when the unreal and the real become so blurred inside our heads? It used to be that we lived the movie only while watching it. Today movies are etched in our minds for life, just waiting to be educed by circumstance. And as special effects continue to dazzle beyond physical reality, we'll have trouble living in the relatively staid present without suffering mental spasms of withdrawal.

The Phrenicea scenario envisions future inhabitants swapping their present with the past to escape their boring existence — reconstituting moments lived by others by bidding on bits of memories stored within Phrenicea's braincomb. Will they just be experiencing old TV and movie clips?

Today, many young adults watch Sex and the City, Friends and Seinfeld et al until they can recite every line from every episode, fantasizing that their real-life relationships will be as interesting as idealized fiction. Looking at it from a different perspective: Would they watch a Friends episode portraying the characters watching a Friends
Maybe life imitating art will be the norm and not the exception.
episode? Sounds awfully boring. And that's the point — Are they wasting their corporeal lives watching TV, DVDs and other vicariousnesses like YouTube? Ironically, the popularity of reality shows might reflect a yearning to experience reality, albeit via TV! Has real living become that tedious?

More worrisome, the fake is overtaking "for real" in other aspects of life as well. The music video game Guitar Hero simulates the playing of guitar using just five colored fret buttons. More than 20 million have been sold. Tournaments and competitions are the rage — all to electronically mimic the real thing without learning music terminology, theory, notes, or chords. Mastery of hand-eye coordination is the only requisite. Is too much time being squandered on skill-less gratification rather than on authentic knowledge acquisition and appreciation? Are there too many imitation lives imitating art?

Star Trek's "to boldly go..." this isn't.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 7:26 AM

Sunday August 1, 2010

Water On the Brain Syndrome

Now that the lazy, hazy, crazy days of summer are upon us, it's a good time to use the sultry weather as an opportunity to revisit yet again our feverish condition of "water on the brain." Way back in 2001 our "H2Ouch!" page began recommending the following:
"Pretend you were to pay $1/gallon the next time you take a shower or bath, brush your teeth, flush a toilet, wash the dishes, or God forbid — water the lawn! Begin to use less water than the average person. Set an example. Prevent H2Ouch!"

We still believe this is good advice, but perhaps the "hyperhydro" proposal was and still is naïve. The problem is that there is little incentive to conserve fresh water from the tap — given its ridiculously low price. For example, I recently received my "Annual Water Supply Report" from my local water company and was dismayed at how little fresh water costs. Here's the breakdown:

Quarterly Water Rates — Residential

Consumption (gallons) and charges:

  • Up to 8,000: $10.00 minimum
  • 8,001 – 58,000: $0.90 / thousand gallons
  • 58,001 – 100,000: $1.15 / thousand gallons
  • Over 100,000: $1.40 / thousand gallons
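
Out of curiosity, the rate schedule above can be sanity-checked with a few lines of Python. This is a minimal sketch that assumes the tiers are marginal, i.e. the $10.00 minimum covers the first 8,000 gallons and each rate applies only to gallons within its own bracket; the report doesn't say how the utility actually brackets usage, so that's an assumption:

```python
def quarterly_bill(gallons: float) -> float:
    """Estimate a quarterly residential water bill from the tiered
    rates above, assuming marginal (per-bracket) pricing."""
    bill = 10.00  # minimum charge; assumed to cover the first 8,000 gallons
    tiers = [
        (8_000, 58_000, 0.90),          # $0.90 per thousand gallons
        (58_000, 100_000, 1.15),        # $1.15 per thousand gallons
        (100_000, float("inf"), 1.40),  # $1.40 per thousand gallons
    ]
    for lo, hi, rate in tiers:
        if gallons > lo:
            billable = min(gallons, hi) - lo
            bill += billable / 1000 * rate
    return round(bill, 2)

print(quarterly_bill(120_000))  # a quarter of heavy lawn-watering
```

Even a profligate 120,000-gallon quarter comes to roughly $131 under this reading, which is the post's point: at these prices there's no financial nudge to conserve.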

A dollar for 1,000 gallons of clean, fresh tap water? That's insane! By comparison, bottled water costs about $1.99 per gallon. Not bad, but that's $1,990 for 1,000 gallons. Why is there such a cost disparity with tap water? How can anyone be motivated to conserve water at these low rates,
Wouldn't it be prudent to build "aqua equity" for future generations?
other than via a guilty conscience? And let's face it; there aren't many turning on their taps laden with guilt. (If the water companies got savvy they'd upmarket their image with exotic brand names, pricing and refillable bottles with fancy labels adding cachet to their product. Imagine having bragging rights to elite-sounding potable water! It's not that silly a suggestion, since that is essentially what Coca-Cola did with Dasani and PepsiCo with Aquafina. They're both filtered municipal tap water.)

Actually what we really need is a "watershed moment"; a trickle-down epiphany to appreciate how finite and precious our water supply is. The first step should be to make users conscious of their water consumption — and that can be accomplished handily by raising the price per gallon and using a more dramatic cost gradient for excessive use. It sounds crazy, but those concerned about conservation should lobby for pricing increases.

Another way to raise awareness might be to move our water meters out from their usual obscure locations into full view in kitchens and bathrooms — fitted with big, red digital read-outs displaying gallons used in real time. Education on where our water comes from and how it's treated,
We should instill among ourselves an idiom-cum-mantra to "spend water like it's money."
stored, delivered and renewed would also serve to engender an appreciation of what is the major constituent of all living things.

There's an old saying attributed to spendthrifts that says they "spend money like it's water." Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." Since we all tend to waste water and take its abundance for granted — it even unintentionally spills over into the comics:



© 2007 Baby Blues Partnership. Reprinted with Special Permission of King Features Syndicate.

Did Zoe and Hammie's tub really have to be filled to the brim?

*****

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 7:39 AM

Thursday July 1, 2010

Workin' the Pharm:
"Ask your doctor about Vesi • lev • esta • vix • rum • vix • ami • cele • max • gel • ser • iza • vet • avo • cal • zet • via • ara • nex • xol • quel • xyz • luc • lis • tis • tor • itra • ast • tia • ica • gra • iva...!"

We're bombarded nowadays with drug ads in print and on TV, known as "direct-to-consumer advertising." Their proliferation is the result of a 1997 FDA change allowing pharmaceutical companies to promote drugs without having to elaborate on the negative side effects. A great sales model immediately emerged: recruit consumers into believers, who then cajole their doctors into prescribing the drugs.

Here's a "short" hawking list compiled from TV and a few magazines:

Celebrex
Amitiza
Viagra
Vytorin
Nexium
Crestor
Plavix
Actonel
Lunesta
Evista
Nasonex
Asmanex
Boniva
Symbicort
Vesicare
Rozerem
Flomax
Caduet
Lipitor
Avodart
Singulair
Januvia
Zetia
Reclast
Levitra
Xyzal
Lucentis
Cialis
Seroquel
Xolegel
Marketing firms are paid big money to create these arbitrary (meaningless) names that are then registered as unique trademarks. The guidelines appear to be short names, five to eight letters, constructed from a common set of syllables — perhaps explaining why they all have a "drug-sounding" resonance.

Reading the list above for the first time, you'd probably guess that they were drugs. The challenge then for creating new names is to find syllable combinations not yet coined. Choosing six syllables from the heading above — born here are Nexquel, Estavix and Vexizet. They sure sound like drugs, and per Google they're not in use. (The futuristic entity Phrenicea was coined analogously eleven years ago by combining phrenic and panacea!)
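The coinage recipe just described (pick a few stock syllables, glue them together, and stay in the five-to-eight-letter sweet spot) is simple enough to sketch. Here's a hypothetical illustration in Python; the syllable list is cribbed from the mock "ask your doctor" litany at the top of this post, and real naming firms surely use more elaborate methods:

```python
import random

# Syllables cribbed from the post's mock "ask your doctor" litany.
SYLLABLES = ["vesi", "lev", "esta", "vix", "rum", "ami", "cele",
             "max", "gel", "ser", "iza", "vet", "avo", "cal",
             "zet", "via", "ara", "nex", "xol", "quel"]

def coin_name(rng: random.Random) -> str:
    """Glue two distinct stock syllables into a drug-sounding
    trademark candidate. Since every syllable is 3-4 letters,
    results land in the 6-8 letter range by construction."""
    return "".join(rng.sample(SYLLABLES, 2)).capitalize()

rng = random.Random(42)  # seeded so runs are repeatable
candidates = [coin_name(rng) for _ in range(5)]
print(candidates)  # five fresh, vaguely pharmaceutical coinages
```

Of course, whether a coinage is actually unclaimed would still require a trademark (or at least a search-engine) check, as the author did for Nexquel, Estavix and Vexizet.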

The intent of pharmaceutical companies is to make their names familiar with unrelenting advertising using memorable jingles like Viva Viagra!, cryptic hints on par with "When the moment is right, you can be ready" and citing identifiable conditions such as acid reflux (Nexium), osteoporosis (Boniva), allergies (Singulair), heart risks (Lipitor), diabetes (Januvia), bladder urges (Vesicare or Flomax), enlarged prostate (Avodart), high cholesterol (Crestor or Zetia), or high blood pressure (Caduet). And of course everyone knows what Viagra purports to cure.

To see how well the drug companies have branded their names into your brain, scan the list above and check off those that are familiar.

How many did you check? Perhaps more than you would have guessed. That's the power of advertising.

Very few new drugs make it to a list like this however. Only one out of ten earns FDA approval after three phases of exacting clinical trials. Most new drugs either have no effect or are harmful.
The unexpected rise out of Viagra makes clear that drug discovery is not by design.
Another outcome is the unexpected. Many are not aware that Viagra was originally tested to treat angina and by serendipity became famous with a surprising side effect. After the (probable) chuckles during trial testing subsided, the once meaningless v-i-a-g-r-a would become an official entry in Webster's dictionary. Perhaps one day it will even become a genericized trademark like zipper, kleenex, velcro, scotch tape, band-aid, coke and Q-tip.

The unexpected rise out of Viagra illuminates an important point: Drug synthesis and discovery is not really by design, although most chemists in the industry will not readily admit such. The chemical compounds are created with as much art as science by unnatural means, with elaborate apparatus to control physical conditions of temperature, pressure, etc. They're then evaluated via empirics with animals and humans, which is a fancy way of saying they watch for indications (good effects) and reactions (bad effects) and contraindications (bad combinations) with other drugs, chemicals, and nutrients. The entire process can take years or decades.

Because so many of the intricacies of human body chemistry are yet to be learned or explained, oftentimes how a drug works (its pharmacodynamics) is a mystery. The pamphlet insert (which hardly anyone bothers to read) for Aldara cream states, "The mechanism of action is unknown." In layman's terms it would read, "We have no idea how this stuff works."

That explains too why so little is known about the long-term impact of these synthetic concoctions, and why in some cases they have to be pulled from the market due to unforeseen negative complications (Vioxx).

Not coincidentally, the benign-sounding drug names drummed into us belie all this complexity. The trade name Plavix has the clunky generic name clopidogrel, which pales next to its chemical name:

(+)-(S)-methyl 2-(2-chlorophenyl)-
2-(6,7-dihydrothieno[3,2-c]pyridin-5(4H)-yl)acetate

Imagine putting music and images to clopidogrel, or worse — methyl dihydrothieno pyridin acetate. Would you be as easily swayed to "ask your doctor" as the Plavix advertisements implore?

Here's more:

Trade Name (Generic Name): Chemical Name

  • Celebrex (celecoxib): 4-[5-(4-methylphenyl)-3-(trifluoromethyl)pyrazol-1-yl]benzenesulfonamide
  • Lunesta (eszopiclone): (5S)-6-(5-chloro-2-pyridinyl)-7-oxo-6,7-dihydro-5H-pyrrolo[3,4-b]pyrazin-5-yl 4-methyl-1-piperazinecarboxylate
  • Levitra (vardenafil): 4-[2-ethoxy-5-(4-ethylpiperazin-1-yl)sulfonyl-phenyl]-9-methyl-7-propyl-3,5,6,8-tetrazabicyclo[4.3.0]nona-3,7,9-trien-2-one
  • Viagra (sildenafil): 1-[4-ethoxy-3-(6,7-dihydro-1-methyl-7-oxo-3-propyl-1H-pyrazolo[4,3-d]pyrimidin-5-yl)phenylsulfonyl]-4-methylpiperazine citrate
  • Boniva (ibandronic acid): [1-hydroxy-3-(methyl-pentyl-amino)-1-phosphonopropyl]phosphonic acid

If you've taken a course in organic chemistry you might comprehend more of this. But then you would better appreciate the complexity of the human body, and the precarious chances taken when ingesting these novel
Plow a furrow of skepticism in your brow as harvests past yielded unexpected results.
chemicals never before seen in nature — created in laboratories with apparatus that defies evolutionary rules and the eons of time needed to attain harmony with biological systems.

Nevertheless, the next time you find yourself "workin' the pharm" by asking your doctor about a drug you saw on TV or in a magazine, don't be afraid to get your hands dirty beforehand by digging up your own proverbial dirt via the Web or elsewhere to learn "what's in a name" — and then muster the vigor to plow a furrow of skepticism in your brow as harvests past have tended to yield unexpected crops.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:23 AM

Tuesday June 1, 2010

Switching Gears...

While trying to peer into the future I also find enjoyment looking backwards in time. A particular fascination is juxtaposing the past to the present and future by flipping through old magazines from decades past. Revealing are the ads that reflect the mood of the time as well as general economic conditions. (Unfortunately this ability may well be lost going forward if much of today's print media falls victim to purely online content.)

On a recent treasure hunt, I found in the stacks at a nearby university library a musty bound collection of Scientific American magazines from 1954. Most of the ads were by manufacturers touting their engineering and technical might to attract clients as well as recruit talent. The ads reflect unabashedly the post-WWII booming optimism and fixation on progress — that era's buzzword. (It would be at least a decade before "progress" would become almost a pejorative term in hindsight of the unintended consequences of unbridled technical advancement of that time.)

Here's just a sample of the testosterone-infused ads from those vintage magazines:

[A gallery of six vintage Scientific American ad images]


Ohio's Cleveland Tool Company "discovered how to shrink motors by floating a screw on a stream of balls," eliminating the need for excess power to overcome friction.

California's Hewlett-Packard Co., then a "World leader in electronic measuring instruments," brags that "advanced electronic test instruments are invaluable in rocketry, nuclear physics and research into interstellar phenomena."

Air Research Manufacturing Co. in Los Angeles answered the U.S. Air Force's call to build a
The U.S. was an energized hotbed of industrial activity.
jet engine starter four times as powerful as anything before and only slightly larger than the original.

Haynes Alloys Co. from Kokomo, Indiana produced "alloys for every wear condition shaped to your specifications." All you had to do was send them a blueprint of a part that was prematurely wearing out and they'd solve your problem.

Lycoming Co. in Stratford, Connecticut crows about "Peak performance by any product requires big performance from small parts. Lycoming's skill at producing such custom parts explains why so many leading manufacturers look to Lycoming with its 2 million feet of floor space, and 6,000-plus machine tools ready to serve you."

Ford Instrument Co. in Long Island City, New York boasts that "Taming the monster power of a nuclear reactor requires precision control of all the elements. Ford Instrument is designing controls that seek and hold the optimum power level of the pile and keep the rods so exactly set that the reactor's energy is harnessed safely, securely."


As I continued to flip the pages, Doelcam Corporation from Boston touted micro-precision synchros. Maryland's Bendix Aviation Corp. bragged that Ni-Span diaphragms were heat treated in a vacuum furnace and tukon[!] tested for hardness. California's Kollsman Instrument Corp. instructs that "The old Roman god Janus lives today in servo mechanisms, instruments and controls which take past information and use it to guide the future." Bersworth Chemical Co. in Framingham, Massachusetts for more than a quarter century devoted all their time, talent and energies to the study of chelate chemistry.

And this goes on and on — page after page, ad after ad. What becomes obvious is the U.S. was an energized hotbed of industrial activity.

You just don't see ads like this anymore and I thought for sure I'd never see them in today's
"Made in U.S.A." is rarely seen today.
magazines. That's why I almost fell off my chair when reading a relatively current Business Week and turning to a full-page ad filled with swagger, bravado and masculine images of meshing gears and heavy-duty gearboxes. The advertiser was Shanthi Gears, "turning the wheels of industries worldwide" with "no compromise, total trust and quality at its best manifestation."

Business Week


Wow! This page would fit quite comfortably inside those old Scientific American magazines.

Then I noticed the one major difference from the old days:
Shanthi Gears is not located in California, Ohio, Maryland, Massachusetts, Indiana, Connecticut or New York. It's manufacturing those gears in Tamil Nadu, India. And while it is turning the wheels of industries worldwide, the U.S. has shifted into high gear toward a service economy. The once-ubiquitous "Made in U.S.A." is rarely seen today.

Dennis the Menace
Dennis the Menace. Reprinted with Special Permission of North American Syndicate.

Looking backwards in time adds perspective when trying to peer into the future. Questions to ponder include whether shifting to a service economy is wise in the long term, and whether India (and of course China) will one day switch gears to follow that same path.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 6:19 AM

Saturday May 1, 2010

A Subscription Rx Redux

Last month's "A Subscription Rx" generated some thoughtful feedback. The two most popular themes: magazines are expensive and not good for the environment, nor are they conducive to deep intellectual enlightenment. Here are two representative submissions:
Considering the amount [of money] that you spent on your magazine subscriptions, you could probably afford an iPad (or at minimum, a Kindle).

Not only can you get your magazine news articles, but you'll be considered "Green," as you will not be contributing to the mass destruction of our forests for the perpetuation of a business that is dying.

Submitted by Howie L.

I have never understood the fascination with magazines especially auto magazines — it’s like reading the owner's manual for your automobile; full of facts and boring, and that accounts for almost half of your reading.

Then there’s the information magazines like Time and NewsWeek, that’s similar to reading the OPED pages in your Local Newspaper. I have found that people who read these fading pieces of paper stapled together rarely are into reading hard covered books, which have a way of taking you into the past, the present and sometimes the future.

I had the opportunity to take a tour of Trinity College in Dublin built in 1592 with one of the oldest libraries in the world; books going back to year 800 and not a magazine to be found.

Magazines started around 1900 in this country and your auto magazines around 1940 so they really have not been around that long and have been getting smaller and now gradually disappearing.

I think you made a wise decision [canceling your subscriptions] and can’t imagine the money that went down the drain and the amount of garbage you created, you should be ashamed of yourself.

Now if you want a form of escapism get yourself a good novel. It will take you on a fantasy and adventure at times; it can lead you into a world that is more interesting and exciting than your own. Plus you can go to your local library to borrow a great book; it doesn’t cost you a dime unless you're late returning it — not to mention you pay taxes for the facility so use it. Good reading!!!

Submitted by John S.

In addition to feedback, several timely indicators were encountered reaffirming the contention that magazines are on their way out.

First was an email from Newsweek almost begging for subscription renewal, followed by paging to an advertisement funded by the magazine industry disputing their ominous future, and then receiving notification that BusinessWeek would be ending its 80-year run — a consequence of McGraw-Hill's unloading it to Bloomberg News, who will leverage its news organization in an attempt to keep the publication afloat. (It won't be surprising when it eventually sheds the BusinessWeek name altogether.)

Found too was an essay (ironically published in the April 19 issue of Time) written by Alan Brinkley, Professor of History at Columbia University, entitled "What Would Henry Do?", contemplating how Time co-founder Henry Luce would deal with the challenges of today's digital age and whether he'd be as successful in revolutionizing the spread of knowledge. Brinkley is not optimistic: "I fear the fallout from the decline of traditional media will endanger our ability to understand our world. But I hope for the best."

Finally, an impromptu trip to the library revealed a downsized periodicals section with many empty slots.

Newsweek gets desperate: "Come Back! 90% Off!"
The magazine industry gets delusional: "Swimming in magazines?"
Time magazine's past and future: "The Luce magazines have not fared well."
Goodbye BusinessWeek: another publishing institution bites the dust...
Library shelves emptying.

*****

Who would have predicted that magazines would fade away into obscurity after decades of influence and significance? Certainly not me — which is proof that having an interest in predicting the future is by no means a qualification to do so. However, Phrenicea did predict way back in 1999 that there would be "no news" in the short story The Engagement of Phrenicea.

Time will tell...

posted by John Herman 11:49 PM

Thursday April 1, 2010

A Subscription Rx

I've decided to cancel most of my magazine subscriptions — a self-imposed prescribed proscription.

Here I thought I would one day impress my offspring with published letters alongside the likes of Microsoft's Bill Gates (read) and Lee Iacocca of Chrysler fame (read). Many others found their way into print through the years, including rants excoriating the hubris of a technical genius (read & read) and even a clever quote in Time (read).

Less impressive but still gratifying was Fortune magazine's inclination to print my submissions including prescient views on boomer retirement (read), career advancement (read), career vs. family (read), rude GenXers (read), and even a sentimental encomium to their magazine's elegant beginnings (read).

More fun was seeing my first letter published in Motor Trend bragging of my power of persuasion (read), a nostalgic tribute to General Motors' Craftsman Guild (read), and mini-essays describing my three-decade-old battle with Mother Nature (read), my innate interest in cars and model building (read), and a tongue-in-cheek lament about the dangers of playing with DNA (read).

But who would have predicted that magazines would fade away into obscurity after decades of influence, significance, and power in terms of advertising might? Certainly not me — which is proof that having an interest in predicting the future is by no means a qualification to do so.

Nevertheless, I expect to save a lot of time and money. Here is my current subscription list:
     - Time
     - Newsweek
     - BusinessWeek
     - Fortune
     - Motor Trend
     - Car and Driver
     - Automobile
     - Road and Track
     - Collectible Automobile
     - Consumer Reports
     - Scientific American Mind
     - AutoRestorer
     - Science News
     - World Future Society
     - BBC Knowledge
     - US News & World Report

The periodicals industry has been predicting its own demise, rendering it a self-fulfilling prophecy. There's a dramatic contrast in content from the past, both literally and figuratively. For example, Time magazine is shrinking in pages and physical dimensions (look).
'Who would have predicted that magazines would fade away into obscurity?'
Back in 1969 each issue held about 80 pages, measured 8.5" x 11", and cost fifty cents. This week's issue comes in at 7.75" x 10.5" with just 56 pages, too many photos and too little content. The flimsy weekly hardly appears worth its $4.95 cover price.

Not to pick on Time alone: Newsweek morphed from a competing clone into what now resembles a compilation of editorials of questionable worth.

US News & World Report, soon after my latest renewal and without warning, went monthly in print and weekly on the web, as if I have time to read it with any degree of concentration on my computer. And then the car enthusiast's Autoweek followed suit.

It's such a shame; publications like Time, Newsweek and US News were bastions of respected world and national reporting. High school students were encouraged by their teachers to subscribe, to complement their textbooks and provide source material for term papers (look). And even recalcitrant youngsters learned grammar and syntax while (surreptitiously) consuming automotive magazines like Motor Trend and Hot Rod.

The obvious weakening of substantive content engenders a self-imposed fait accompli: it discourages sales, which in turn brings in less advertising revenue, forcing staff cutbacks that lead to further decreases in content, and a spiraling sales decline.

Then the desperation becomes obvious: Fast Company recently offered a year's subscription for $5. Fortune's solicitation was $10 for a year. I discarded both, with the expectation that they'd go the online route like US News, or fold altogether.

Times change. Technology infiltrates. Life magazine was huge in the 1950s in terms of readership and famous for its photography; it was supplanted by television. Will the Internet now kill its one-time sister Time magazine and its ilk?

Perhaps this is just an inexorable future unfolding — as Phrenicea predicted way back in 1999 that there would be "no news" in the short story The Engagement of Phrenicea.

Time will tell...

posted by John Herman 4:12 AM

Monday March 1, 2010

Epigenetics and You...

The study of biology is difficult. The study of physics is difficult. The study of chemistry is difficult. Once learned, we tend to view these and other scientific disciplines as our inventions — often flaunting erudition with advanced-degree titles — when essentially all they are is our explanations of the complexities of nature, or worse, imagined competencies that can beget the arrogance to disturb natural equilibriums.

In February 1953 James Watson and Francis Crick, after several years of leveraging their (and others') understanding of nature's complexity, announced to the world the chemical structure of DNA. Over the years much has been learned about the molecule that is common to all life on earth — but only recently has it become a colloquial term related primarily to forensics to identify perpetrators of crime. For most of the lay public, that is about as much as they know about the acronym.

But what is DNA?

'Imagined competencies can beget arrogance to disturb natural equilibriums.'

Deoxyribonucleic acid is a long-chained chemical molecule that functions as life's blueprint. It can be visualized simplistically as a ladder twisted like a spiral staircase — with each step (for illustration purposes) being one of four colors: black, white, red, or green.

Long stretches of DNA with many steps become genes. Many genes strung together become chromosomes. Every human has, in each of their trillions of cells, 23 different chromosomes, each containing thousands of genes. There are actually two sets of 23 chromosomes, one set from each parent. And each of the 23 has specific functions assigned — like growing fingernails or hair, or shaping the nose, etc.

The order or sequence of the steps' colors is important and unique to each person — and can reveal identity conclusively, just like fingerprints. In other words, a person's uniqueness is determined by the sequence of the steps one would encounter when climbing their DNA ladder. (For example: red, green, green, green, black, red, red, white, … red = Jane Doe.)
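To make the ladder analogy concrete, here is a toy sketch in Python — the sequences and names are entirely hypothetical, and the letters A, C, G, T simply stand in for the four step colors — showing how the order of the steps alone distinguishes one "person" from another:

```python
# Toy model: a DNA strand as a sequence of four "step colors."
# These sequences are made up for illustration; real DNA is billions of steps long.
jane_doe = "ACGTTACGGA"
john_doe = "ACGTTACGTA"  # differs from jane_doe at one step

def same_person(seq1, seq2):
    """Two samples match only if every step of the ladder agrees."""
    return seq1 == seq2

def first_difference(seq1, seq2):
    """Return the index of the first mismatched step, or None if identical."""
    for i, (a, b) in enumerate(zip(seq1, seq2)):
        if a != b:
            return i
    return None

print(same_person(jane_doe, jane_doe))       # True: identical sequences
print(first_difference(jane_doe, john_doe))  # 8: the ninth step differs
```

Even a single differing step is enough to tell the two ladders apart — which is, in cartoon form, why DNA can identify a person as conclusively as fingerprints.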

The study of DNA, genes and chromosomes and how they function is called genetics. Like biology, physics and chemistry — genetics is difficult too.

And what about epigenetics?

Epigenetics is a new sub-field of biology that studies how chemicals in our environment can play havoc "on top of" genes — acting like switches to turn them on or off abnormally. (The "epi" prefix means "on top of," as in epidermis — the outer layer of skin "on top of" the dermis.) So even though the gene appears normal and has not changed (or mutated), it does not function properly. Imagine those colored steps mentioned above blocked at various points, preventing them from being useful even though they are still there, undamaged.
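The switch idea can be sketched the same way. In this toy model (the gene names and switch settings are hypothetical), the genes themselves never change; a separate layer of on/off marks "on top of" them decides whether each gene actually functions:

```python
# Toy model of epigenetic switches: the gene sequences stay intact,
# but a mask of marks "on top of" them controls whether each is expressed.
genome = {"gene_A": "ACGT", "gene_B": "TTGA"}   # hypothetical genes

# True means an epigenetic switch blocks (silences) that gene.
epigenetic_mask = {"gene_A": False, "gene_B": True}

def expressed(gene):
    """A gene functions only if it exists AND no switch blocks it."""
    return gene in genome and not epigenetic_mask.get(gene, False)

def inherit(mask):
    """The unsettling part: the switch settings can be copied
    to the next generation along with the genes themselves."""
    return dict(mask)

print(expressed("gene_A"))  # True: normal sequence, no blocking switch
print(expressed("gene_B"))  # False: sequence is undamaged, yet silenced
child_mask = inherit(epigenetic_mask)  # child starts with parent's settings
```

Note that `gene_B`'s sequence is untouched in this sketch — only the mask differs — which mirrors the point above: the gene has not mutated, yet it no longer works.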

Why is Epigenetics Significant?

Epigenetics is becoming important because it appears that many of the 80,000 man-made chemicals and pollutants in our environment can inadvertently create these epi-switches — and incredibly, these can be passed on to future generations. The scary part, then, is that what your parents or grandparents were exposed to before you were born might be more detrimental cumulatively than anything you are exposed to directly in your lifetime; and the new switches you accumulate with exposure to today's many new toxins can in turn be passed on to your children and grandchildren.

So what does this mean?

It means that with each generation the impact is potentially cumulative — and many scientists in the field are convinced that Alzheimer's, Parkinson's, cancer, autism, obesity, diabetes, asthma and perhaps many more maladies may be caused by the effects of epigenetics.

Biology, physics, chemistry and genetics may be difficult — but epigenetics is downright frightening!

For the optimists reading this, we can alternatively view epigenetics from a positive viewpoint:
     Way back in 1999, the short story The Engagement of Phrenicea prophesied, "artificial genes producing hormonal parameters and switches — in simplistic terms — could be set or reset by Phrenicea to control or monitor behavior 'officially' defined as antisocial, criminal, etc." [No crime!]
     And in 2004 we teased, "What if you could achieve and maintain the health that physical exercise promised, without any effort? If you could stay in so-called perfect physical condition without exercise, would you still work out? By mid-century this fantasy becomes reality. It was discovered that embedded within our DNA are gene expression-controlling switches tailored to 'sculpt' your preferred body shape. This epigenetic information can help to regulate the quantity and quality of the various chemical and structural components that make up our bodies. Ironically, it is the so-called 'junk' DNA, ignored for 50 years, that harbors these growth-controlling switches." [No exercise!]

Time will tell...

posted by John Herman 6:53 AM

Monday February 1, 2010

The Evolution of Status

After many years of blood and sweat, I was fortunate enough to have the opportunity to purchase a house constructed from the ground up. It was an exciting experience watching barren land evolve into livable structures; the genesis of yet another suburban development supplanting what was once a dairy farm.

As the homes were completed and occupied in almost perfect sequential order, the display of owner status in various degrees became evident and resembled a stadium crowd performing "the wave" at a sporting event.

After the wave subsided, what followed was a hushed assessment of each owner's financial means and personal taste. Judgment was based on observations such as: Who had splurged on the fanciest window coverings? Who installed the most (and most expensive) exterior lighting fixtures to replace the builder cheapos? Who laid the lushest sod lawn? Who erected the largest or most original custom-made curbside mailbox? Who was able to watch broadcast TV without a fuzzy picture, as witnessed via an antenna on the roof? Who were among the first to install automatic irrigation systems? Who had resplendent landscape designs immediately realized as manicured mini-arboretums? Who had their monotone wheat-colored walls professionally refinished with faux finishes, murals and other elaborate wall coverings?
And on and on...

About eight months later cable TV finally arrived, followed by an incredible aerial flip-flop. Those who'd brandished clear reception via rooftop antennas quickly removed their one-time status symbols, since it was now embarrassing to be perceived as one not paying the premium.

As the years passed it became more difficult to recognize changes that might be discerned as enhanced cachet, but not for long. Without warning, a new wave swept through swelling heads ever higher — leaving in its wake huge curbside dumpsters signaling interior renovations or extensions. Paving stone became the rage too, and out went plebeian concrete walks and blacktop driveways. And not long after, expensive foreign and sports cars graced the spiffy new driveways.
And on and on...

Finally, after two decades and with all visible forms of home status exhausted, the ultimate bragging right today is to erect a "For Sale" sign, host a garage sale and move out to a carefree leisure village where it's warm and sunny all year 'round.

*****

So you have to wonder — where does this pettiness and vainglory stem from? It's apparently embedded in our designer genes, going back thousands of years. Anthropologists actually consider this to be advanced behavior — when compared to our more ape-like ancestors that is.

New York University's Randall White explains:
"One of the things that we know from studying modern humans is that personal adornment and the symbolic communication of a social identity is involved in maintaining differences within a society. By studying artifacts we imagine that what was going on 40,000 years ago was the first time in human evolution that we have the internal subdivision of human societies into different categories of social persons."

And after thousands of years we're still at it. We can make just about anything into a symbol of status. But one person's object of distinction might be another's folly. So we have to be among like-minded people to make an impression. And you can't even take it with you!

You have to ask then, after all these millennia, isn't it about time we evolved beyond this small-minded behavior? [Nah!]

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:12 AM

Feedback: Your Two-Cents!

Date: February 6, 2010 3:07:37 PM EST

I believe what you are calling evolution is what we understood as trying to keep up with the Joneses — or the grass always looks greener on the other side of the fence. This is a relatively new phenomenon; I know as a kid in the '50s I did not see this with any of my relatives. I also spent close to 4 years in Great Britain in the early '60s, and they were basically under a socialist government at the time (Labour Party), and there was no such thing as competition or keeping up with the Joneses.

Now my children, who are close to 40 years old, spend a lot of time upgrading their abodes to keep up with their friends. Needless to say, they have some debt that they are now concerned about. So the moral of this reply is yes, we have the option to behave like this (such as my children and you), but in countries that have socialism everybody is dragged down to the same common denominator — and they really don't have this option to much of any extent. The way this country (USA) is moving you may get your wish — we will all be forced away from this small-minded behavior, but it's definitely not evolution — it's a choice!!!

Regards,
John S

Friday January 1, 2010

The Twelve Blogs of '09

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last three years' cop-outs to lethargically review the "Twelve Blogs of 2009" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant.

December's blog celebrated one student's epiphany:

Many of today's high school and college students often wonder (vociferously!) why they need to memorize boring equations, formulae and other seemingly trivial or useless information. What many times is not emphasized by their teachers is the origin and significance of man-made expressions of what is essentially describing the workings of nature. They're not taught that many of these discoveries required lifetimes of effort — often by iconoclasts, eccentrics, heretics and recluses...(Read more)

November's entry contemplated whether the part of the brain that facilitates music appreciation is influenced en masse:

Arguably today's popular music is not as rich lyrically or musically vis-à-vis past genres. And why is that? Today's technological sophistication should encourage greater complexity. Why don't kids today approach the intricate styles of the Beatles' later work, Jethro Tull, Pink Floyd or Yes? Or, why don't they embrace big band jazz? Why not disco? (Read more)

October's blog reassessed whether the world is catching up with the Phrenicea scenario of the future:

If the current rate and course of technological change continues, which indeed it will, our cubicle scenario of the future will be minimally a metaphorical interpretation of the social isolation that will ultimately result...(Read more)

September's blog lamented a disturbing trend among chiropractors tempted with franchising to boost profits:

Chiropractors traditionally have spent lifetimes developing successful practices of respectable size, which reflect their character and values. They're now tempted with visions of big profits by wooing masses of clients less sophisticated and discerning — and more receptive to being dazzled with faux technologies and procedures. Secondary is keeping existing patients that may be sharp enough to see through the hype and perceptive enough to sense the tawdry goal of profit...(Read more)

• In August we were down-to-earth about a flighty plan to revive a manned space program to the moon:

Little has changed since Apollo's cancellation to warrant reviving a space program on the same scale. Much of the technology would have to be relearned and today we have the Iraq war, bankrupt corporations and an economic slowdown. A more worthwhile pursuit in this very different twenty-first century would be a crash program to develop solar energy for practical use — to finally escape dependence on fossil fuel... (Read more)

• In July we were preoccupied with losing our Inno"Since":

Keep an ear out and an eye open for the next time you hear or see an advertisement touting "In business since....," "founded in...," or "established..." Contrary to its intent or implication, the practical value of the boast is questionable. Perhaps this newfound awareness can be termed The End of Inno"Since"... (Read more)

• In June we suffered the recurring affliction of "water on the brain":

Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." We all tend to waste water and take its current abundance for granted. Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations? (Read more)

May's entry was up-front about rear-end vanity:

Have we come that far in auto technology that it's laughable now to think that Ford would deem the now mundane TC [Traction Control] worthy of rear-end vanity? But Ford is not alone in silly trunk bunk. Through the years I've studied many a car's derrière and have seen banality forged in chrome... (Read more)

• In April we compiled "Then and Now" images to acknowledge the passage of time and visibly observe how things have changed through the years:

Time is fascinating when you take the time to think about it. We tend to consider time a man-made commodity as if we created it — when in fact all we really do is measure it. Fooling ourselves in the process, we parse time into nanoseconds, microseconds, seconds, minutes, days, weeks, months, years, decades, centuries, millenniums, periods, ages, eras, eons, epochs... (Read more)

• The March blog loathed hero pilot Sullenberger as he succumbed to the self-serving advances of the media and opportunistic politicos:

Chesley B. (Sully) Sullenberger is a hero to the people that he saved with his miraculous jetliner landing in New York's Hudson River after Canada Geese incapacitated both engines. I don't consider Sullenberger my hero however — and I'm tired of the media trying to convince me that he is... (Read more)

• In February we pondered whether too much time is being squandered on skill-less gratification:

While watching an old Star Trek episode I realized that I and fellow Trekkies have been watching Kirk and crew beaming up via the transporter for over forty years. I wonder now after watching the series for so long, whether I'd be thrilled if the old-fashioned transporter (as well as warp drive, phasers, and photon torpedoes) finally became reality.... (Read more)

January's entry was tinged with nostalgia and torpor — and guilt for not writing a new essay:

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last two years' cop-outs to lethargically review the "Twelve Blogs of 2008" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant... (Read more).

You may or may not agree, but after reviewing the "Twelve Blogs of 2009" it appears most are still relevant as we roll into the New Year. And somehow after recalling the energy that instigated the blogs to begin with, we don't feel quite as lethargic.

Maybe that's the real benefit of looking backward, reminiscing and singing "Auld Lang Syne" — not for nostalgia per se, but to recharge our batteries and begin another year with renewed energy and enthusiasm.

Time will tell...

Happy New Year!

posted by John Herman 1:12 AM

Tuesday December 1, 2009

One Student's Epiphany

Many of today's high school and college students often wonder (vociferously!) why they need to memorize boring equations, formulae and other seemingly trivial or useless information.

What many times is not emphasized by their teachers is the origin and significance of man-made expressions of what is essentially describing the workings of nature. They're not taught that many of these discoveries required lifetimes of effort — often by iconoclasts, eccentrics, heretics and recluses willing to shed lots of sweat and probably tears in order to solve nature's mysteries.

Still, many students past and present have stumbled upon these truths on their own, often with epiphanic delight.

Below is one finally-getting-serious college student's "Epiphany," written way back in 1971, stripped bare with numerous misspellings illustrating a misspent youth, yet with genuine astonishment that this seemingly simple realization took so long to gel. It was handwritten, pen to paper, and found in a musty old box after 35 years. (Today's student might blog such a personal thought — with little chance of rediscovery years hence.)

A message to those who are in the same plight as I:

If you question the ways of the sciences — concepts — rediculous [sic] equations — symbols etc and become completely fatalistic toward them — think back a moment to your forefathers who devised these methods.

These are just building blocks to understanding. Just as you need tools to produce a manual task — tools are essential in building knowledge.

Nature does what is does without any influences (until recently however). Man has not and will never harness nature by merely understanding its processies [sic]. This form of study makes use of abstract concepts to make understanding less tedius [sic] and to standardize the methods of expressing our understanding of them possible and eliminate a caotic [sic] consequence.

This must be remembered if excelence [sic] in any science is achieved.

Written by J. Herman some time in 1971

Perhaps some day education will go beyond mere memorization and copying teachers by rote to include a real appreciation of our current state of knowledge — knowledge that allows us to not only understand the workings of nature, but to leverage and alter them for our benefit, as well as our peril.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:53 AM

Sunday November 1, 2009

Bebop, Doo-Wop & Hip-Hop

Hip-hop is mainstream these days, ringing up substantial sales in the music industry. The demographic is young, with ages between 13 and 30.

This begs the question, using yesteryear's "cool" vernacular: Why do today's youth "dig" hip-hop?

Empirical observation tends to reveal that musical taste follows a generational track. It seems people bracketed by parameters defining a "generation" assent to a particular musical genre making it popular or "in," to a degree where it's translated into measured sales on pop music charts and download sites.

Why is this? Is it additives in the food? Pollution in the air? Society's morals and values at the time? Is it the effect of that other medium — television?

Somehow the part of the brain that facilitates music appreciation is influenced similarly en masse. Consequently, the popular music of the latter 20th century can be categorized as follows (with some overlap notwithstanding):

the 1940s was the decade of big band swing;
the 1950s was the decade of bebop, doo-wop and simple rock'n'roll;
the 1960s was the decade of experimental, progressive rock;
the 1970s was the decade of disco and punk rock;
the 1980s was the decade of new wave;
the 1990s was the decade of grunge, super-slick r'n'b, and lip-synched girl and boy bands.

So now it's hip-hop. Arguably today's popular music is not as rich lyrically or musically vis-à-vis past genres. And why is that? Today's technological sophistication should encourage greater complexity.
The part of the brain that facilitates music appreciation is influenced en masse.
Why don't kids today approach the intricate styles of the Beatles' later work, Yes, Jethro Tull, or Pink Floyd? Or, why don't they embrace big band jazz? Why not disco?

And why do the adolescents from decades past, the boomers in particular, abhor hip-hop? Like their parents, they continue to cling to their generation's music. Could it be the effects of atomic testing? Howdy Doody? Silly Putty?

Public television has discovered this phenomenon and to its delight is breaking all kinds of pledging records by producing more and more nostalgic concert programs. Featured are wrinkled, grayed and frayed performers — some that can barely move or hold a tune. Yet they evoke delight for themselves and their audiences, as the music floods their clouding minds with memories of youth and the "good old days" gone by.

The once super popular groups and solo performers seem incredulous that they're up there again on stage. And surely the so-called "one hit wonders" never dreamed they'd be performing their one song ad infinitum five decades later!

Incredible too is the irony that what was once thought to be throwaway tunes — trash by previous standards and deemed junk by parents — are instead being performed so many years later by these original performers in front of their original fans. That's a miracle of technology — not musical but medical!

All this begs another question: Fifty years from now — in the Age of Phrenicea — will today's youth still be listening to what might by then be considered a new form of classical music... Hip-Hop?   ;-)

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 6:49 AM

Feedback: Your Two-Cents!

Date: November 23, 2009 8:56:05 PM EST

Hip-Hop started almost 40 years ago, my granddaughter is now into it but just the other day she had it blaring on my car radio and I said, "Do me a favor and listen to this one song on a CD I had." The song was "That's Life" sung by Michael Buble. She said "Woo great voice and song," but of course it's not what the radio is playing on most stations.

Now there are singers out there like Michael Buble, Harry Connick Jr., Jamie Cullum, Robbie Williams, Melody Gardot, Kurt Elling, and even some old rock artists like Rod Stewart, who restarted his career with sold-out concerts and CDs singing all the old Cole Porter and Irving Berlin type songs. Look at American Idol — these young kids are singing the standards. I like to think music is cyclical, but 400 years from now you will still be listening to these young singers and of course Sinatra. Rap, disco, and punk rock will be something not even heard. My son, who is 39, and daughter, 37, are gradually coming around to listening to the old standards because of the great lyrics. Of course the greatest interpreter of these songs is Sinatra. The Beatles are also in a class of their own; they had great lyrics and a sound. Paul McCartney sold out the new Citi Field twice last year. They wrote songs that people like Sinatra, Matt Monro, and Shirley Bassey sang and the New York Philharmonic played. They will also be around 400 years from now. I love the song that George Harrison wrote, "Something" — what great lyrics. Now the TV and radio can fool some of the people some of the time, but not all the people all of the time — and most people gradually come around to great works of art. (I hope.)

John Sertic

PS I saw Sinatra seven times at Carnegie Hall. It was always a mixed audience of young and old, and most times he finished the concert by saying, "I wish you all good health and I hope 400 years from now the first thing you hear is my voice."

Thursday October 1, 2009

Cubicle Dwelling? Not Yet...

Now is as good a time as any to assess whether the world is catching up with the decade-old Phrenicea scenario.

So, is it?

Well, not really. We're not yet donating our brains to the Phrenicea braincomb. Money, pets, and newspapers are still around. Cars have not been banned, although they're getting very expensive to operate. Human cloning hasn't replaced procreation, but at this point we probably wouldn't be too surprised to hear of a successful attempt. And no, we're not living in cubicles — yet.

Nevertheless, it probably could be said that the Phrenicea scenario today is perceived as a bit less bizarre than when it was unveiled way back in May 1999. Of course this conclusion is based on objective data — that being the volume of visitor ranting via email. The hysterical ones have been on a steady decline, although the predominance of critical feedback we get still centers on the "ridiculousness" of the Phrenicea scenario as presented on the website.

A very (un)popular idea continues to be that of people living in cubicles. Granted, compared with today's mobile population, it does seem unbelievable. But we are becoming increasingly isolated from fellow human beings in an imperceptibly incremental fashion.

Right under our noses, there's been a gradual reduction of human interaction fostered by technology. It began way back in the 1950s with TV and the sprouting of the first "couch potatoes." After TV's novelty wore off, many found it preferable to just stay indoors than to socialize or pursue physical activity.

As technology marched on, the need to interface with real people declined even further. Some examples:

the elimination of personalized attention at self-service gas stations, supermarkets, home centers, etc.
answering machines and voicemail replacing conversation; business conducted via telephone tag
live customer service reps replaced by impersonal automated systems with annoying nested menus and digitized voice
DJ radio personalities supplanted with computer programmed "Jack" and similar formats; essentially mechanized music shuffling
impersonal email, instant messaging and phone-based texting replacing the spoken word
iPod zombies existing in their own little worlds.

Imagine that strangers a century ago would actually greet each other with a "Hello!" and then strike up a conversation. Today, with near-total detachment from fellow humans the norm, that "strike" is ever more likely to be a violent one.

If the rate and course of technological change continues, which indeed it will, our cubicle scenario of the future will prove to be, at minimum, a metaphor for the social isolation that will ultimately result.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:36 AM

Tuesday September 1, 2009

Chiropractic Tactic?

The Phrenicea scenario of the future depicts chiropractic supplanting what is today considered traditional medicine — a profit-driven medical/insurance industry treating various states of disease with drugs and/or surgery. Unfortunately, this idealistic vision appears to be in jeopardy with the emergence of a new and ominous trend among chiropractors.

To compete with today's established "big business" of medicine, many chiropractors have been participating in a trend to further legitimize their profession and increase revenue by expanding their application of physical manipulation of the spine and other body structures to include the pursuit of "wellness." This may include the addition of other alternative, non-traditional modalities such as acupuncture, naturopathy, massage therapy, yoga and more. The wellness approach attempts to proactively prevent disease holistically with proper lifestyle, rather than treating symptoms of disease already in progress.

The proliferation of wellness centers is a trend that is expected to continue, since chiropractors have found receptive, health conscious, well-heeled clients willing to pay out of their own pockets for treatments that appear to be effective, as well as for expensive natural supplements.

A more recent trend, and an ominous one in my opinion, is the growth of franchised centers of wellness with the chiropractor as the hub. The franchise concept is not new to the service industry — certainly not with fast-food like burgers. Over the years it has expanded to include niche restaurants, haircutting, package shipping, travel, eyeglass fitting, chimney cleaning, home inspections, lawn maintenance, maid service and more. It is quite new to the business of chiropractic however.

It's apparent that the goal of franchising in this profession is to "McDonald's-ize" the chiropractic experience in terms of creating recognizable brands to a broader population, while bringing in extra cash for programs of questionable benefit — for the patient at least.

Chiropractors traditionally have spent lifetimes developing successful practices of respectable size, practices that reflect their character and values. They're now tempted by visions of big profits from wooing masses of clients who are less sophisticated and discerning, and more receptive to being dazzled with faux technologies and procedures. Keeping existing patients becomes secondary: patients who may be sharp enough to see through the hype and perceptive enough to sense the tawdry goal of profit; the same patients at risk of being alienated by slick, sterile, generic branding, flashy placards and gee-whiz computer-facilitated tests of dubious value.

It's not clear, at least to me, whether the cookie-cutter success associated with the formulaic, sanitized operation of a franchise is transferable to the practice of chiropractic. And even if it is, should it be? (A moot question if more profit is the primary motive.)

The inevitable result of franchising in general is the lowering of service quality to merely acceptable or tolerable, vis-à-vis stand-alone businesses. (How many four-star restaurants are members of a franchise? How many top-notch hair cutters?) This settling consequence might be acceptable for a meal or a haircut — but for healthcare?

If this scenario of greed plays out as it did with traditional medicine, there may have to be a new alternative to today's alternative healthcare.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 5:06 AM

Feedback: Your Two-Cents!

Date: September 4, 2009 12:56:21 PM EDT

Regarding your blog on the Chiropractic industry — having been going to Chiropractors for years, I know that they (as a whole) have been petitioning the AMA and the insurance agencies (for years) to get acceptance. Which, they recently received. Most medical insurance companies now pay (at least a portion of) the charges for chiropractic care, and some medical doctors will refer you to a chiropractor for care.

This, in my opinion, was a double-edged sword. Chiropractors now get more clientele, through referrals and walk-ins. So, with volume up, they need to run their business like the "big business" that they never were before.

Now that my chiropractic care is covered by my medical insurance, I have more forms to fill out, and more approvals to get. There are longer waits at the office, and he spends less "quality time" with me.

The price we pay is more than the actual price.

Signed
Howard Levine — bad back sufferer

Saturday August 1, 2009

That's One Small Step for [a] Man,
One Not-so-giant Leap for Mankind

Now that the 40-year-anniversary hoopla for the Apollo 11 moon landing has subsided, we can look towards the 50th when we'll again assess the legacy of that incredible* accomplishment — and quote repeatedly Neil Armstrong's misquote.

Although he got all the "right stuff" right, like successfully landing the lunar module by the seat of his pants after its computer crashed — with just ten seconds of fuel to spare — he flubbed what was to become its legacy in words. He had planned to say "a man" meaning himself. But as he states, "On flight tapes, I leave a lot of syllables out. I think reasonable people will realize the 'a' was intended." So the famous quote is redundant since "for man" and "for mankind" are synonymous.

Worse than quoting misquotes, the landing's Golden Anniversary will yet again ignore poor Michael Collins. He went through all the rigorous training just to remain in the command module, a plight captured in Jethro Tull lyrics by Ian Anderson:

I'm with you L.E.M.
though it's a shame that it had to be you.
The mother ship is just a blip
from your trip made for two.
I'm with you boys, so please employ just a little extra care.
It's on my mind I'm left behind
when I should have been there.
Walking with you.

The moon landing just for two was supposed to be a stepping stone to exploring our solar system. Upon the astronauts' return to earth, then-President Nixon gushed, "The heavens have become part of man's world." Not quite. In school we were told we were fortunate to be growing up in the Age of Space. We would see men (and women) traveling to Mars and even beyond. So many of us were convinced that there would be incredible advances and benefits derived from the effort.

One optimistic student was convinced that the moon's lack of atmosphere and slight gravity attraction would benefit industry and certain areas of study:

• Telescopes could be constructed on the moon for a clearer view
• Radio telescopes could escape Earth's atmosphere and interference by operating on the lunar surface
• Minerals contained in unlimited numbers of meteors could be a great source of valuable resources
• Electronic components could be manufactured more cheaply on the lunar surface
• Welding of metals could be done more easily and efficiently
• Research scientists could work on the moon for experiments requiring a vacuum
• Spaceships could be launched using less fuel
• Heart patients could be brought to the moon to recover

As we know, none of this came to fruition, except perhaps that today's heart patient bills are sky high. It should be realized, however, that the twentieth century's Space Race was more than a technological challenge; it was first and foremost a political maneuver. President John F. Kennedy seized the anxiety associated with the Soviet threat and the perception spawned by Sputnik that the U.S. had fallen behind in science and technology. It was a brilliant plan to galvanize a nation into action, one that would ultimately prove our political system superior.

An ancillary benefit besides the political power play was the technology spawned from the research and development. Although the space-centric predictions above never became reality, there were real benefits, though not ones that could have been envisioned back then:

• Freeze-dried foods (developed to prevent food spoilage in space)
• Solid-state technology leading to today's consumer electronics
• Sunscreens (formulas based on digital image analysis)
• Wireless headphones (based on headsets used on the moon)
• Cool suits for race car drivers and firefighters (based on moon suits)
• Bicycle helmets (foam developed to make astronauts comfortable)
• Cordless vacuums (derived from tools for capturing moon samples)

Despite the promise of further progress, the Apollo program was terminated with its final three missions canceled, the reasons being the loss of political will (the U.S. had regained technological superiority over the Soviets) and the perceived misdirection of funds given the Vietnam War and changing attitudes toward space exploration. The thrill was gone once the effort seemed to become commonplace. Typical shortsighted thinking dismantled the project's infrastructure and mothballed documentation and records.

*In hindsight the feat was incredible given the relatively primitive technologies of the time. The onboard computer upon which three lives depended was equivalent to today's hand-held calculator, which itself was in part derived from the space program's research and development. Slide rules and adding machines — artifacts from the Stone Age of Space — amazingly got us there and back. Incredible too is that just twelve years prior, the U.S. had been trumped by the Soviet Union's launch of a tiny satellite that would orbit the earth. (Imagine anything of magnitude getting done today in twelve years!)

There is talk nevertheless of resurrecting a vibrant space program. Landing on the moon again — or a planet like Mars for that matter — will never evoke the awe and wonder of 1969's landing, which became a worldwide event. To generations raised on special effects and computer games, reality is boring in comparison.


Dana Summers
© Tribune Media Services, Inc. All rights reserved.

Little has changed since Apollo's cancellation to warrant reviving a space program on the same scale. Much of the technology would have to be relearned, and today we have the Iraq war, bankrupt corporations and an economic slowdown. Russell "Rusty" Schweickart, pilot of the first lunar module, agrees: "It doesn't sound like the right thing to do." And retired Grumman president Joe Gavin is hesitant: "I don't think bringing back a sack of rocks is a good reason."

A more worthwhile pursuit in this very different twenty-first century would be a crash program to develop solar energy for practical use — to finally escape dependence on fossil fuel.

Time will tell...

posted by John Herman 7:06 AM

Feedback: Your Two-Cents!

Date: August 9, 2009 7:50:48 PM EDT

Why can't we do both [solar energy and] lets hope we beat the Chinese to the moon, they are talking about a manned launched next few years. Instead of this space station and shuttle operation, we should have built a station on the moon. It would have given us some knowledge of how to live and work on another planet and from there launch to Mars you name it. I can see some budget restraints on NASA now with our government running automobile companies, banks, insurance companies, railroads, post offices, health system, major investments in some of our big cities and the used car business. Excuse while I vomit!

Cheers
JS

Wednesday July 1, 2009

The End of Inno"Since"

Recently I spotted an ad for "Heating and Air Conditioning Service, in business since 1954." It occurred to me to ask, "What does that really signify?" All I came up with is that all those who made the business successful back then are gone. It had to be an okay service to survive initially when the original principals ran things. But the fact that they're still in business does not necessarily translate into anything superior today. They could be in a downturn. We see it all around us today:
• Chrysler was founded in 1925. Honda did not begin producing automobiles until the 1960s. Which builds a better car?

• Bank of America was founded in 1874 and Chase in 1877. There's been a lot of turnover since! Commerce Bank (recently swallowed up by TD Bank) is a relative newcomer, founded in 1973. It blew them all away in customer satisfaction, garnering top J.D. Power accolades.

• The U.S. Congress has been around since 1787. Enough said about that.

Nevertheless, advertising business birth is still very popular. Googling "in business since" returns almost two million results. Similarly, googling "founded in" yields 51 million results. Several are very well known:
• Heinz (best known for ketchup) was established in 1869 and this is proudly declared on their label. Its longevity probably has no bearing on sales today — its products happen to be very good. Remarkably, they've managed to retain smart management with the common sense to not mess up the original recipes with new processing technologies.

• Ditto for Gulden's Mustard established 1862, first made by Charles Gulden, now manufactured by the agricultural giant ConAgra, in keeping with the original secret recipe.

• Coke almost blew it with New Coke in 1985, 99 years after the original was introduced. Its management back-stepped only three months later with old-new Coke Classic and eventually dropped the Coke that failed. A close call averted by facing failure boldly and quickly.

• Perhaps the most ridiculous business type to claim value in longevity is a restaurant. It could be on the downslide with the original owners replaced by their spoiled or less-driven progeny, or not-as-committed outsiders. Even with consistent ownership, chefs tend to migrate from one establishment to another like bees collecting pollen, and wait staff can be as ephemeral as cumulus clouds. Maybe if they could brag, "Serving our clientele with the same owner, same chef and same support staff since...." Now that would be meaningful.
Then there are those that can advertise longevity but choose — perhaps wisely — not to:
• Microsoft was founded in 1975. In this case boasting "in business since" probably would be a negative, as newer technology companies like Google are perceived to be more nimble and innovative. Microsoft is smart not to advertise their relative longevity since for them even not-so-old is too old.

• KODAK has been in business since 1892, which means nothing given the technology of photography is primarily electronic today — and not chemical processing as it was for almost a century. The company that once touted film (remember Kodachrome?) and paper quality survives only because its management maintained its brand "image" while photo technology marched digitally forward.

• Ford Motor Company does not advertise that it's been in business since 1903. Perhaps because the company has had its back to the wall in recent years and extreme ups and downs throughout the 20th century. Fortunately its products have become excellent and rival the best of the Japanese and Europeans. But until the public's perception catches up with reality, it's probably better for them to stress the present rather than the past.

• The famous (infamous?) Yankee baseball team was established in the early 1920s. So what? What distinguishes one team from the other nowadays? With player turnover, team rosters can completely change within several years. Diehard fans brandish their enthusiasm with logoed shirts and caps. But what are they really fans of — the pinstripe uniform? The phrase "old is new again" is apropos.
(Ditto for your favorite team?)

And finally, there's a company that has found a way to advertise longevity and approach it in a clever, iconoclastic manner — for a product that is about as boring as straw. Post Foods has come up with a cereal-killer campaign to advertise their old-fashioned Shredded Wheat, with a theme deriding "Progress." The ad copy harks back to the groovy, Luddite, anti-establishment 1960s: "Honestly, what thanks do we owe progress? We're up to our necks in landfill, down to the wire on resources, and climate change is out to get us — or at best leave us with nasty sunburn." It continues, "Henry Perky created the Original Shredded Wheat in 1892. One man. One ingredient. One machine. We didn't give it any add-ons or plug-ins.... All we did was make it Spoon Size in 1961. Did we go too far?"
Shredded Wheat — brilliantly made hip with tongue-in-cheek after 117 years.

*****

So keep an ear out and an eye open for the next time you hear or see an advertisement touting "In business since....," "founded in...," or "established..." Contrary to its intent or implication, the practical value of the boast is questionable. Perhaps this newfound awareness can be termed The End of Inno"Since."

Time will tell...

posted by John Herman 4:54 AM

Monday June 1, 2009

Water On the Brain

Now that the lazy, hazy, crazy days of summer are practically upon us, it's a good time to use the sultry weather as an opportunity to revisit our feverish condition of "water on the brain." Way back in 2001 our "H2Ouch!" page began recommending the following:
"Pretend you were to pay $1/gallon the next time you take a shower or bath, brush your teeth, flush a toilet, wash the dishes, or God forbid — water the lawn! Begin to use less water than the average person. Set an example. Prevent H2Ouch!"

We still believe this is good advice, but perhaps the "hyperhydro" proposal was and still is naïve. The problem is that there is little incentive to conserve fresh water from the tap — given its ridiculously low price. For example, I recently received my "Annual Water Supply Report" from my local water company and was dismayed at how little fresh water costs. Here's the breakdown:

Quarterly Water Rates — Residential

  Consumption (gallons)      Charge
  Up to 8,000                $10.00 minimum
  8,001 – 58,000             $0.90 / thousand gallons
  58,001 – 100,000           $1.15 / thousand gallons
  Over 100,000               $1.40 / thousand gallons
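For the curious, the tiered schedule above can be turned into a quick back-of-the-envelope bill calculator. This is a minimal sketch that assumes the tiers are marginal — each rate applying only to the gallons within its bracket, with the $10.00 minimum covering the first 8,000 gallons — since the water company's report doesn't spell out how the brackets combine:

```python
def quarterly_bill(gallons):
    """Estimate a quarterly residential water bill from the tiered rates.

    Assumes marginal tiers: the $10.00 minimum covers the first 8,000
    gallons, and each additional thousand gallons is billed at the rate
    of the bracket it falls in. (An assumption, not the utility's stated
    method.)
    """
    bill = 10.00          # minimum charge, first 8,000 gallons
    prev_cap = 8_000
    # (upper bound of bracket, rate per thousand gallons)
    for cap, rate in [(58_000, 0.90), (100_000, 1.15), (float("inf"), 1.40)]:
        if gallons <= prev_cap:
            break
        billable = min(gallons, cap) - prev_cap
        bill += billable / 1000 * rate
        prev_cap = cap
    return round(bill, 2)

# A 20,000-gallon quarter: $10 minimum plus 12,000 gallons at $0.90/thousand
print(quarterly_bill(20_000))   # 20.8
```

Even a heavy 120,000-gallon quarter comes out to roughly $131 — which is the point: at about a dollar per thousand gallons, the meter gives almost no price signal to conserve.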

A dollar for 1,000 gallons of clean, fresh tap water? That's insane! By comparison, bottled water by the gallon costs about $1.99. Not bad, but that's $1,990 for 1,000 gallons. Why is there such a cost disparity with tap water? How can anyone be motivated to conserve water at these low rates, other than via a guilty conscience? And let's face it; there aren't many turning on their taps laden with guilt. (If the water companies got savvy they'd upmarket their image with exotic brand names, pricing and refillable bottles with fancy labels adding cachet to their product. Imagine having bragging rights to elite-sounding potable water! It's not that silly a suggestion, since that is essentially what Coca-Cola did with Dasani and PepsiCo with Aquafina. They're both filtered municipal tap water.)

Actually what we really need is a "watershed moment": a trickle-down epiphany to appreciate how finite and precious our water supply is. The first step should be to make users conscious of their water consumption — and that can be accomplished handily by raising the price per gallon and using a more dramatic cost gradient for excessive use. It sounds crazy, but those concerned about conservation should lobby for pricing increases.

Another way to raise awareness might be to move our water meters out from their usual obscure locations into full view in kitchens and bathrooms — fitted with big, red digital read-outs displaying gallons used in real time. Education on where our water comes from and how it's treated,
stored, delivered and renewed would also serve to engender an appreciation of what is the major constituent of all living things.

There's an old saying attributed to spendthrifts that says they "spend money like it's water." Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." Since we all tend to waste water and take its abundance for granted — it even unintentionally spills over into the comics:



© 2007 Baby Blues Partnership. Reprinted with Special Permission of King Features Syndicate.

Did Zoe and Hammie's tub really have to be filled to the brim?

*****

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations?

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:12 AM

Friday May 1, 2009

Stoplight HindSight

While stopped at a traffic light behind an old Ford Windstar minivan, I snickered at the chrome appliqué announcing in bold letters that it was equipped with "Traction Control." Wow, imagine that!

Have we come that far in auto technology that it's laughable now to think that Ford would deem the now mundane TC worthy of rear-end vanity? But Ford is not alone in silly trunk bunk. Through the years I've studied many a car's derrière and have seen banality forged in chrome about:

  • engines — V-6, V-8, Hemi, Rotary, Turbo, Fuel Injection, Tri-Carb, Quadrajet, OHC
  • transmissions — Powerglide, Hydramatic, Automatic, 5-Speed
  • drivetrains — FWD, AWD, 4wd, 4 Matic, QuadraDrive, ABS

Then there are meaningless tags like GT, Touring, Limited, Unlimited and the dated Deluxe, Super and Custom.

In the good ol' days, most vehicles were simply named and branded eponymously (Chrysler, Ford, Toyota (Toyoda), Studebaker, Olds(mobile), Cord, Ghia, Dodge, Chevrolet, Buick, Mercedes-Benz, Hudson, Duesenberg, Tucker, Kaiser-Fraser, etc.). Specific models were typically distinguished with letters or numbers (Model T, Model 55, Series D, etc.).

Eventually, with the proliferation of models within brands, cars were named after:

  • animals — Impala, Mustang, Falcon, Tiburon, Stag, Bronco, Ram, Cobra, Barracuda, Lark, Rabbit, Jaguar, Colt, Eagle, Cougar, Stingray, Charger, Hornet, Beetle, Pinto, Hawk
  • gods and mythology — Mercury, Dragon, Titan, Fury, Demon
  • wind — Zephyr, Scirocco, Tempest
  • geographic places — Monte Carlo, Ventura, Sedona, Eldorado, Tucson, Lucerne, Bonneville, Plymouth, Capri, Windsor, Seville, LeMans, Manhattan, Monterey, Belvedere, Continental, Fleetwood, Monaco, Calais, Riviera, Biscayne, Park Avenue, Malibu, Bel Air, Catalina
  • macho types and legends — Cadillac, Chieftain, DeSoto, Ranger, Matador, Rebel, Lancer, Commodore, Valiant, Champion, Maverick, Challenger, Intrepid
  • royalty — Ambassador, Regal, Royal, Tudor, Signet, President, Victoria, Crown, Imperial, Coronet
  • weapons — Cutlass, Dart, Torpedo, Javelin, Armada, Excalibur, Arrow, Laser, Magnum
As brands and models continued to proliferate — and necessity being the mother of invention — newly coined words would, with clever marketing, define the vehicle's image (Corvette, Galaxie, Electra, Chevelle, Polara, Camaro, Futura, Altima, Celica, Toronado, Jetta, Camry, Forenza, Corvair, Invicta, Sportage, Mystere, Impreza, Sentra, Boxster, Acura, Lexus, etc.).

Mercedes-Benz resisted this appellation temptation and continued with letters and numbers to distinguish their models. In the mid-1950s, its SL two-seater sports car debuted designating "sport light." A few years later came the SEL denoting an S-Class car with fuel injection (Einspritz) and a long wheelbase. (And just to add to the confusion, the "S" in SEL was not the same as the "S" in SL.)


American cars toyed with the idea in the 1960s with the Ford XL and LTD, Chevy SS, Javelin SST and AMX, Dodge GTS, Plymouth GTX, Barracuda S, Cougar XR7 and Pontiac GTO.

As Mercedes' status and prestige spread worldwide, other manufacturers dove into the alphabet soup. (In 1996 Acura incredibly renamed its flagship Legend to RL, discarding a decade's worth of valuable brand equity.)

Now it's all the rage to name vehicles with meaningless (guess the brands) letter combinations like DTS, CTS, STS, XLR, SRX, ESV, EXT, TL, TSX, RSX, MDX, FCX, FX, QX, LS, ES, RX, IS, GS, SC, LX, LT, MX, RX, CX, CLK, CLS, CRX, SLR, SLK, SVT, GL, HHR, NSX, XC, XJ, XK, TT, C, A, E, G, Q, M, G, H, R, S... Zzzzzzzzzzzz. In an attempt to stay awake and in the game, Lincoln joined the chaos in 2007 by rebadging its year-old Zephyr to MKZ, followed by the introduction of MKX and MKS models. And the MKT is expected to debut in 2010. Got all that?
[With acceptable letter combinations going so fast, soon only PU and BO will be left!]

Even more prestigious today is to be able to brandish on a car's hindquarters "Hybrid" or a chrome "H" indicating "this car is electric, gets great mileage and averts global warming."

But we'll just have to wait a decade or so to find out — when stopped at some traffic light in the future staring at these no-longer-shiny chrome badges of status — whether today's state-of-the-art green technology will have become ubiquitous and familiar enough to elicit a snicker.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 8:04 AM

Wednesday April 1, 2009

Then and Now

Time is fascinating when you take the time to think about it. We tend to consider time a man-made commodity, as if we created it — when in fact all we really do is measure it. Fooling ourselves in the process, we parse time into nanoseconds, microseconds, seconds, minutes, hours, days, weeks, months, years, decades, centuries, millennia, periods, ages, eras, eons, epochs.

As we're measuring we can have a great time; a sad time; a boring time; a wonderful time; even an exceptional time. We can also waste time, spend time, give up time, give back time, and squander time.

Our perception of time can vary. It can go fast; go slow; even stand still — like watching a DVD with a remote. And at times it may seem to repeat itself as Yogi Berra famously observed, "This is like déjà vu all over again." Unfortunately time cannot be rewound or made to go backwards. (Why is that?)

The musically gifted write songs about time: Turn Back the Hands of Time; Time Is On My Side; Time Passages; Does Anybody Really Know What Time It Is?; Till the End of Time; and As Time Goes By are just a few.

We also have colloquial phrases like "It's about time," "Time for change," and "Time is precious." Some are heard more often than others. (The recent U.S. presidential campaign comes to mind.)

We can use time to describe ourselves and others. We can be Big-time. They are only Small-time.

We can even use time to punish: "You're in time-out!" and "25 years to life!"

We can live our lives according to time, as in "Early to bed and early to rise." Most strive to live for the moment. Many yearn to go back to the old days. Others impatiently anticipate the future.

Time can be ahead of us or behind us, which becomes more important for each of us as time goes by.

The age of time is very old. Physicists tell us it's been around since the birth of the universe about 15 billion years ago. (This used to sound like a big number before all of the recent financial bailouts.) During our relatively brief time here on earth we learn that some things disappear, new things arise, and some things stay relatively the same.

An interesting activity is compiling "Then and Now" images to see how things have changed — noticeably, and in some cases so much so as to almost revert to the way they were.

Here are some examples:

The more things change...
  TV in a Console (1964) vs. TV on a Console (2009)
...the more they stay the same.

The more things change...
  Johnny in 1973 vs. Johnni today
...the more they DON'T stay the same.

Gripping advertising...
  Hawking tires (1965) vs. Hawking tires (2009)
...never loses traction.

Some things thankfully never change...
  Then and then (1968) vs. Now
...at least on the inside.

In-flight departures from bias and stereotyping...
  "How come our girls are so capable?" (1951) vs. "Best cabin crew in the Middle East."
...thankfully arrived.

More great hits and stars! By snail mail!
  Wow! A record club! (1968) vs. Wow! No record club!
Even more great hits and stars! Instantly!

Envisioning push buttons in 1959...
  Then, then and 1959's future vs. Now
...remembering push buttons today.

Recycling sheet metal...
  1968 Toyota 2000GT vs. 2009 Pontiac GXP
...in the figurative sense.

When you take the time to think about it, time really is fascinating. After seeing Johnny with his TDK tape cassettes, the stewardesses enduring insipid questions, and push-button phones once defining our future — would anyone really want time to go backwards?

Time will tell...

posted by John Herman 4:48 AM

P.S.
Perhaps in time we'll enhance this site with additional "Then and Now" examples.

Sunday March 1, 2009

Desperate for a Hero

Chesley B. (Sully) Sullenberger is a hero. If there's anyone who has not heard this they must be living in a cave. He is a hero for sure — to the people that he saved with his miraculous jetliner landing in New York's Hudson River after Canada Geese incapacitated both engines.

I don't consider Sullenberger my hero however — and I'm tired of the media trying to convince me that he is. I have my own modern day heroes, thank you. Generally, most of us do and they reflect ideals based on our values and experiences. And we tend to keep them private. But not any more!

Arnold Schwarzenegger is a big hero of mine. I admire his incredible story taking him from penniless bodybuilder, to seven-time Mr. Olympia, to Hollywood superstar, to governor of perhaps the most complex U.S. state to oversee. If you've ever lifted weights you can appreciate the mental and physical self-discipline involved to become a bodybuilding champion. The rest was a cakewalk. Well, perhaps not.

Musician Ian Anderson is another. Self-taught, he managed to turn the once-delicate flute into a "heavy metal" instrument worthy of integration with blaring electric guitars and booming drums — naming the unlikely amalgam Jethro Tull. Anderson crafted scores of timeless tunes that classically trained flautists never did, could or would — garnering worldwide renown while inspiring countless young flutist wannabes.

Being involved with banking for almost three decades, another personal hero is the late Walter Wriston, CEO of Citibank from 1967 to 1984. His legacy is leveraging computer technology to transform the company which was a big player in a relatively sleepy industry
into "The Citi Never Sleeps" 24/7 powerhouse. It then led with innovation for two decades with ATMs, credit cards, new money vehicles and online banking. His protégé John Reed, who technically implemented Wriston's vision, foolishly succumbed to the street-smart charm of Sandy Weill to create the financial giant Citigroup. Time magazine recently cited Weill as one of the prime culprits in melting down our financial system. Lesson learned here is that heroes do not necessarily beget heroes.

Bodybuilder, rock musician and corporate CEO: I prefer a diverse hero portfolio. Their selection was personal — I chose them without external prodding. I object to having heroes pushed upon me by anyone, particularly the media, who are starving for hero types for self-serving reasons.

It used to be politicians were our heroes. Presidents Roosevelt and Kennedy were looked up to by many. Then Nixon tarnished the office. There was hope with Reagan, but then Clinton and Bush quashed that. Used to be professional athletes were our heroes. Mickey Mantle, Hank Aaron, Jackie Robinson, Joe Namath and Muhammad Ali come to mind. Today's diminished sports figures include Pete Rose, Roger Clemens, Alex Rodriguez, OJ Simpson and most recently Olympian Michael Phelps, caught taking a marijuana hit.

Is it a dearth of character in modern times, or is it just the increased media scrutiny that has raised the bar? Perhaps both. Nevertheless, when someone with hero potential surfaces out of obscurity, the media latch onto them with almost pathological zeal.

The CBS Early Show aired "Miracle on the Hudson," dragging out Sully-and-crew snippets through the entire two-hour show. Here are some of hosts Harry Smith and Maggie Rodriguez's insipid questions, posed after they incredibly stated, "Instead of mourning a tragedy like we should be doing, we're celebrating the miraculous landing."

• "Is all this surreal for you?"
• "What made you so confident?"
• "Have you had a release, a moment where you just either cried, or screamed, or let it out somehow?"
• "What's the difference between landing on water and landing on a runway as far as how it feels?"
• "Do you all think about this every day?"
Then it became apparent that spontaneous answers to the same questions, asked over and over, had morphed into contrived analogies. Sullenberger summed up his lifelong preparation for the event: "For 42 years I had made small, regular deposits of education, training and experience — and the experience balance was sufficient that on January 15 I could make a sudden, large withdrawal." Ugh.

Finally, in what might politely be described as very poor taste, one of the survivors — Emma Sophina, an aspiring singer-songwriter from Australia — tried to jump-start her career by capitalizing on the attention, writing and performing the monotonous ditty "Send Another Prayer to Heaven." Yuck.

The most distasteful and bizarre treatment of the mishap, however, was the making light of the whole incident on David Letterman's show. Letterman joked that Sullenberger had been on so many shows he would next be seen on Rachael Ray's show cooking up a recipe for Sullenburgers. It would be funny if it weren't almost believable.

Letterman began the interview with a blow-by-blow chronology of the take-off:

• 3:25:48pm, Flight 1549 leaves LaGuardia for Charlotte
• 3:27:01pm, bird strike at 3200 feet

Then wisecracked:
• 3:28pm, snacks and beverages were served.

Tasteless quips followed, and incredibly not only by Letterman:
• Letterman joked that it was good there was plenty of space for standing passengers, and asked, "In what other aspects of your life have you heard 'brace for impact'?"
• Letterman asked if it would have been worse in the ocean. Sully replied "Yeah, but the Hudson stinks."
• Letterman mentioned that they cleared the George Washington Bridge by 1000 feet. Copilot Jeffrey Skiles joked, "Better than zero feet."
Other banter included chuckling about failing to get the engines restarted, joking that sometimes "it's hard to get good help," and mocking the unsolicited letters and drawings for jet-engine screens meant to thwart future engine failures from birds.

Worse, the segues into and out of commercial breaks featured tunes like "Take Me to the River."

In between all of this, Sully sensibly revealed that he's "More than tired of telling this story," and Skiles indicated he just wanted his old life back. Letterman should have reminded them that actions speak louder than words.

After being so limelighted on TV — 60 Minutes, The Early Show, Good Morning America, Letterman, Larry King, Obama's Inaugural, the Super Bowl — Sullenberger showed up yet again in
an essay featured in Newsweek lamenting that he hasn't spent many nights at home; that he and his family are trying hard to remain true to themselves and not change, "but there's a steep learning curve."

What learning curve? Just be true to yourselves and go home already! Chalk it up to your (drawn-out) fifteen minutes of fame. Unless of course there'll be book and movie deals. It will be very interesting to see if any of the crew goes back to their old lives. Can they possibly, given the lure of the media and the intoxicating effects of fame?

With that question yet to be answered: will Sully ever become one of my heroes?

I lost respect for the regular guy "just doing his job" when he succumbed ad nauseam to the self-serving advances of the media and opportunistic politicos. If he does go back to piloting and there's nary a sign of a book, movie or TV hosting job — then perhaps.

Time will tell...

posted by John Herman 2:28 PM


As expected, Sullenberger took the bait — even sooner than anticipated, however. Crain's New York Business reported on March 11 that he "landed [pun intended?] a book deal with the William Morrow publishing company and got an advance worth as much as $3 million for two books."

So much for the "just-doing-his-job" regular guy. Oh, the burden of becoming a hero.

Gee, imagine all the overworked and underpaid pilots knowing that millions await them if they just crash-land their planes.

Now that's a scary thought.

Postscript by John Herman March 13, 2009

Sunday February 1, 2009

Ersatz Lives Imitating Art?

While watching an old Star Trek episode entitled "Return to Tomorrow," I realized that my fellow Trekkies and I have been watching Kirk and crew beaming up via the transporter for over forty years. I wonder now, after watching the series for so long, whether I'd be thrilled if the old-fashioned transporter (as well as warp drive, phasers, and photon torpedoes) finally became reality. With the ever-increasing sophistication of special effects that can sometimes seem more real than for real: Are we becoming desensitized not only to the present — but also to the yet-to-be?

Looking back, the goal of audiovisual technologies in the early days was to make the artificial seem real. Phonographic recording media began as scratchy cylinders and shellac platters and progressed to crystal clear multi-channel discs that can sound better than live. Films evolved to do the same, from the primitive "silents" to the first "talkies" to Technicolor and widescreen, leading to today's mind-blowing digital IMAXs. With all this amazing technology, the real — when it is remarkable in some fashion — now evokes the artificial.

For example, living through a tornado might remind one of the movie Twister. Seeing a dangerous car chase could elicit Gone in 60 Seconds. If and when we're invaded by aliens, it'll probably recall Alien, ET, Close Encounters or War of the Worlds — depending on their demeanor. If the experts warning of global warming are correct, one day reality will summon The Day After Tomorrow. If biologists succeed in cloning extinct species, we'll all be able to visit Jurassic Park. Maybe life imitating art will be the norm and not the exception.

Perhaps. But how can we continue to consume escapist entertainment when the unreal and the real become so blurred inside our heads? It used to be that we were only living the movie while watching it. Today movies are etched in our minds for life, just waiting to be educed by circumstance. And as special effects continue to dazzle beyond physical reality, we'll have trouble living in the relatively staid present without suffering mental spasms of withdrawal.

The Phrenicea scenario envisions future inhabitants swapping their present with the past to escape their boring existence — reconstituting moments lived by others by bidding on bits of memories stored within Phrenicea's braincomb. Will they just be experiencing old TV and movie clips?

Today, many young adults watch Sex and the City, Friends and Seinfeld et al. until they can recite every line from every episode, fantasizing that their real-life relationships will be as interesting as idealized fiction. Looking at it from a different perspective: would they watch a Friends episode portraying the characters watching a Friends episode? Sounds awfully boring. And that's the point — are they wasting their corporeal lives watching TV, DVDs and other vicariousnesses like YouTube? Ironically, the popularity of reality shows might reflect a yearning to experience reality, albeit via TV! Has real living become that tedious?

More worrisome, the fake is overtaking the "for real" in other aspects of life as well. The music video game Guitar Hero simulates the playing of guitar using just five colored fret buttons; more than 20 million copies have been sold. Tournaments and competitions are the rage — all to electronically mimic the real thing without learning music terminology, theory, notes or chords. Mastery of hand-eye coordination is the only requisite. Is too much time being squandered on skill-less gratification rather than on authentic knowledge acquisition and appreciation? Are there too many ersatz lives imitating art?

Star Trek's "to boldly go..." this isn't.

Time will tell...

posted by John Herman 8:32 AM

Thursday January 1, 2009

The Twelve Blogs of '08

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow the last two years' cop-outs to lethargically review the "Twelve Blogs of 2008" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant.

• December's blog warns of the ominous "Attack of the RFIDs!"

The bad scenarios are Big Brother scary. Governments could become Tinkerbells sprinkling RFID dust on unsuspecting individuals for tracking purposes. Inhaled and trapped in the lungs, you'd become an ambulatory transmitter for life... (Read more)

• November's entry declared, "What we really need now is Common Sense 2.0!"

This is more than silly. To ascribe version numbers to the earth with the overblown illusion that it is now our creation because of our innate arrogance and egocentricity is laughable... (Read more)

• October's blog laments how America's Founding Fathers would shudder at how their idealistic vision has blurred:

How did it come to this? Technology! As in sophisticated systems for marketing and polling. As in TV and satellites; (ab)used by both political camps to capture the attention of the harried and hurried, 24/7 connected lifestyles that afford little time to consume more substantive data even if it was presented... (Read more)

• September's entry objectively evaluates subjective automobile styling:

Late to the Ugly Bug Ball is Acura. Traditionally conservative, parent Honda is throwing caution to the wind to have their upscale brand noticed in 2009 — for better or for worse. The latest designs that arguably out-ugly the competition found inspiration unfortunately from an oddball concept... (Read more)

• In August we were weighed down with heavy thoughts about lighter-than-air hydrogen:

We envision solar-based energy to be the way of the future. Mother Nature found a mechanism to capture the enormous energy of the sun with chlorophyll and photosynthesis, and we've been consuming the lowest hanging fruit of that miracle in the form of petroleum... (Read more)

• In July we worried about recycling dangerous technology:

There's little doubt that rising fuel prices and global temperatures are pressing issues requiring action. We owe it to ourselves and to posterity to become sufficiently educated to intelligently evaluate potential options. It's popular now to pander to those outraged over $60 fill-ups and brand global warming "bad" because that's been the predominant message... (Read more)

• In June we pondered the ephemeralness of our personal memories:

It seems impossible now, but there will come a day when the mechanism for memory assembly and storage within the human brain is elucidated. A next logical step would be to try to save or replicate these memories, perhaps for recollection by others... (Read more)

• May's blog exposed the significance of treating angina with Viagra:

The unexpected rise out of Viagra illuminates an important point: Drug synthesis and discovery is not really by design, although most chemists in the industry will not readily admit such. The chemical compounds are created with as much art as science by unnatural means... (Read more)

• In April we did the math to sum up whether we've experienced exponential progress since entering this century:

A lot happened in the 20th century making its end appear quite different from its beginning. So, let's evaluate the most significant achievements/events that occurred in those incredible ten decades vs. the 21st century to date... (Read more)

• The March blog contemplated why many of today's richest, most influential and creative individuals broke from the mold and charted their own paths to success:

The solution to increased competition from an ever-growing global workforce will be attaining two important qualities sooner rather than later: maturity and passion. Unfortunately many of today's youth are lacking in both, and that quest seems to elongate with each succeeding generation... (Read more)

• In February we became more observant of our dependence on the baggage in what appears to be a burgeoning Bag Age:

Does convenience beget stupidity? Does stupidity beget convenience? Or do they complement each other synergistically? I contemplate this now as I find myself inundated with plastic bags carried home from virtually every type of retail store. And the problem seems to be getting worse day-by-day as behavior accommodates their omnipresence and so-called convenience... (Read more)

• January's entry was tinged with nostalgia and torpor — and guilt for not writing a new essay:

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow last year's precedent-setting cop-out to lethargically review the "Twelve Blogs of 2007" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant... (Read more).

You may or may not agree, but after reviewing the "Twelve Blogs of 2008" it appears most are still relevant as we roll into the New Year. And somehow after recalling the energy that instigated the blogs to begin with, we don't feel quite as lethargic.

Maybe that's the real benefit of looking backward, reminiscing and singing "Auld Lang Syne" — not for nostalgia per se, but to recharge our batteries and begin another year with renewed energy and enthusiasm.

Time will tell...

Happy New Year!

posted by John Herman 1:27 AM

Monday December 1, 2008

Attack of the RFIDs!

As you shop 'til you drop this holiday season be on the alert for RFIDs! Pronounced almost like Triffid or aphid, they're neither fictitious sci-fi creatures nor common insects. Yet, you may soon be endangered by or infested with them. Perhaps you have one on your person already, or in your car for electronic toll collection.

RFID is an acronym for Radio Frequency Identification. RFID tags are tiny microchips with antennae that transmit data to receivers — account numbers, physical location, product information, price, color, size, purchase date, etc. (To be technically correct, an RFID tag actually transmits a link into a database on a computer somewhere that stores the information it is reporting, such as the account number.)

Some of the larger RFIDs have batteries while the tiniest don't — the passive ones draw their power from the radio signal of the transmitters asking for their information.

In addition to being on a key chain or stuck on a windshield, radio tags are to be attached to or embedded in credit cards, clothing, grocery items, drug bottles, books, magazines, cell phones, computers, currency, tires, passports and even you — beneath your skin for access to buildings and for storing your ID and medical records.

The scary part is that you will become a mobile device beaming all sorts of data to who knows who — through barriers and from as far away as 700 feet. Imagine if your charge cards, clothing and shoes were all transmitting data — even the age and color of your underwear! Inquiring types could identify or profile you and track you everywhere. You could even get hassled to buy replacements when your stuff reaches its expected lifespan. And because RFIDs last about ten years, they can potentially transmit information for purposes well beyond their original intention.

RFID infiltration is already underway. Besides Mobil with its SpeedPass, banks are issuing key chains and credit cards with MasterCard's PayPass that can be used at subway stations, 7-Eleven, McDonald's and movie theaters. Their marketing literature hypes the benefits:

• No need for cash!
• Amazingly quick and easy way to pay!
• Feels like magic!
• Now you can fly through the checkout!
• Be the first to get what's next! [Be the first on your block!]
This sounds great, but because these tags are linked to one of your accounts, they will have the means to eavesdrop and monitor your life — recording where you go and what you buy. There is a concern too that thieves will be able to hack your radio tags to make unauthorized purchases.

Not to pick on banks per se — retailers, too, want RFIDs to replace traditional barcodes to better track their inventory. They'll also be able to scan you coming and going, and perhaps direct you to aisles based on their perception of your needs. [Your underwear is how old?]


And as is typical when technology is involved, things will get even more complicated. Hitachi has developed RFIDs that are so tiny they could fit between ridges of a fingerprint. They're small enough to be referred to as "powder RFIDs."

These powder tags have the potential to identify "trillions and trillions" of items. They'll be incorporated in just about anything you may purchase, as they can be embedded in the packaging. A positive scenario to envision: bagging your groceries as you navigate the aisles of your supermarket. When you're finished, a scanner computes the total of your entire cart as you approach the cashier station.

The bad scenarios are Big Brother scary. Governments could become Tinkerbells sprinkling RFID dust on unsuspecting individuals for tracking purposes. Inhaled and trapped in the lungs, you'd become an ambulatory transmitter for life — a life which might become shortened by this high-tech asbestos.

The Phrenicea scenario of the future envisions the total loss of privacy. Perhaps this is the way it will begin.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 2:11 AM

Saturday November 1, 2008

Two Point Uh-Oh

The term "Web 2.0" is a popular expression referring to the second generation of the World Wide Web. The media has created a "2.0" craze by slapping the trendy suffix (implying next or new) onto just about anything. We addressed this originally way back in May 2007 — and it seems there is no end in sight.

Version numbers like 2.0 are adopted from the software industry — from what's commonly referred to as the "development life cycle." The confounding terminology is being unabashedly co-opted by advertisers, making it part of our vernacular. Although their target market may not truly grasp the pretentious technical jargon being exploited, what is undoubtedly implied is "new and improved."

To imagine where we would be today if this silly jargon began early in the 20th century, let's for fun try to retroactively apply (loosely) "something-point-something" version numbers to several ground-breaking advances from the past — innovations unleashed upon the masses at a time before software and its perpetual upgrades began controlling yet-to-be-developed computer hardware.

For example, "Radio 1.0" would enable millions of regular folks to acquire console-sized, beautifully crafted, wood-encased receivers to experience entertainment programming geared towards a wide audience. Breaking news would propagate across the land almost instantaneously. A new sense of mass identity and community would be established using technology. For many, the radio would become a focal point in the home — a place to gather each night.

So what innovation could we ascribe to Radio 2.0? The car radio perhaps. Radio 3.0? That might be the tiny, tinny, plastic and portable transistor radio — the pocket marvel that enabled millions of baby boomer teens to revel in their music, rock and roll, defining a generation while driving parents crazy. For many, the transistor radio became a focal point of the self — a personal gadget to hide away with. Taking radio's progression further:
Radio 4.0 - FM
Radio 4.5 - FM stereo
Radio 5.0 - XM/Sirius satellite
Radio 6.0 - ?

Now let's try to apply version numbers to pre-recorded music:
Music 1.0 - phonograph - 78rpm
Music 1.2 - phonograph - 45rpm
Music 1.5 - phonograph - 33 1/3rpm "Long Playing"
Music 2.0 - hi-fi and stereo
Music 3.0 - 8-track tape
Music 3.5 - cassette tape
Music 4.0 - CDs
Music 5.0 - iPod
Music 6.0 - ?

How about culture-transforming electronic point-to-point communication:
Point-to-Point 1.0 - telegraph
Point-to-Point 2.0 - telephone rotary
Point-to-Point 2.5 - telephone pushbutton (touchtone)
Point-to-Point 3.0 - wireless cell phone
Point-to-Point 4.0 - BlackBerry
Point-to-Point 5.0 - ?

And finally the evolution of locomotion or travel:
Travel 0.1 - flagellum
Travel 0.9 - four-legged ambulation
Travel 1.0 - two-legged ambulation
Travel 2.0 - tamed horse
Travel 3.0 - covered wagon
Travel 4.0 - trains
Travel 5.0 - horseless carriage
Travel 5.5 - automobile
Travel 6.0 - prop(eller) planes
Travel 6.5 - jets
Travel 7.0 - transporter?

If applying version numbers retroactively seems silly — it probably is.

So, maybe we should view applying them willy-nilly nowadays as just as silly. It's apparently not too silly though for Scientific American magazine. Perhaps they took a cue from our original Phrenicea tongue-in-cheek analogies to seriously name a new magazine — Earth3.0 — based on the pretentious premise that humankind in a scant century or so has already outpaced 4.5 billion years of natural entropy.

Here's their justification (edited for brevity):

This planet is no longer simply the home of our species: it is also our creation.[sic] And as with any product, sometimes it is prudent to upgrade its quality.[sic]

Earth 1.0 was the world that persisted and evolved for billions of years, up until very recently. Even after we humans developed agriculture, which considerably enlarged our footprint on the environment, our overall influence was fairly small and localized. That changed two centuries ago with the arrival of Earth 2.0, when the industrial revolution gave the human race the leverage to achieve unprecedented health and prosperity but at the price of wanton consumption of natural resources.

Today we have unwittingly become the major drivers of potentially disastrous climate change. We have extinguished species at a rate not seen since the end of the dinosaurs. We have depleted ocean fisheries so severely they could collapse by midcentury.

Earth 3.0 is thus the new way forward that we need to establish, one with all the prosperity of 2.0 but also the sustainability of 1.0. And it is in that spirit that we present Earth3.0.

Click to read more


This is more than silly. To ascribe version numbers to the earth — with the overblown illusion, born of our innate arrogance and egocentricity, that it is now our creation — is laughable. Planet Earth will inevitably do just fine without us — whenever that ends up being.

Nevertheless, maybe the idea for this new magazine was a good one, capitalizing on Scientific American's good name and our present day concerns, albeit with an unabashed goal to generate additional revenue.
Scientific American has been in existence since 1845. It's a blue-chip publishing brand, built slowly over more than a century and a half, that has garnered a reputation for quality. In contrast, how long will the faddish Earth3.0 last? Unfortunately, everyone seems to think short term today — and that is why we are in a world of crisis on more than one front.

Like General Motors et al from the past, the publisher is leveraging the time-honored and respected name Scientific American for a quick profit. You'd think by now that lessons would have been learned given the state of companies like GM, who cheapened their once powerhouse names and reputations to virtually valueless trash.

What we really need now is Common Sense 2.0.

Time will tell...


Click here for more sightings of Point Uh-Oh terminology in the media.

Presented as a "Timely Yet Timeless" post by John Herman 8:58 AM

Wednesday October 1, 2008

In Sound Bites We Trust

Through all of the 2008 U.S. presidential campaigning hoopla, there's been little substantive discussion of important issues like energy, education, boomer retirement, immigration, conservation, environment, health care, evolution/intelligent design, bioethics and stem-cells, lingering security threats — as well as the fragile global economy.

Instead, we are bombarded daily with cursory, carefully crafted sound bites, incessant accusations, tough talk, innuendo, decades-old misdeeds, and of course poll results. Both political parties have employed character-manipulating handlers and verbal SWAT teams to attack and counterattack each other with clever pyrotechnics that can instantly shoot around the world for anyone still interested.

How did it come to this?

Technology! As in sophisticated systems for marketing and polling. As in TV, satellites, cable, the Internet/web, PCs and hand-held devices. The unfortunate result is a desperate attempt by both political camps to capture the attention of those with harried, hurried, 24/7 connected lifestyles that afford little time to consume more substantive data even if it were presented.

America's Founding Fathers would shudder at how their idealistic vision has blurred with the worst that technology has to offer.

Perhaps the most powerful influence is TV, although the Internet/web is closing in. Even before the fateful debate between Kennedy and Nixon, Eisenhower employed the marketing powerhouse Rosser Reeves (known for M&Ms' "Melts in your mouth, not in your hand" as well as other famous commercial tag lines) to subtly portray him in the TV ads as authoritative yet likable.

It was the first use of a political "ad campaign" with short 30-second "spots" to manipulate the viewer's perception of the candidate. Its success set the stage (so to speak) for the employment of ever more sophisticated principles and techniques of legal brainwashing en masse.

[Editorial cartoon by Wayne Stayskal. © Tribune Media Services, Inc. All rights reserved.]

At about the same time, the science of polling "Galluped" ahead and provided the so-called pulse of the nation — which can itself influence opinions exponentially.

Finally, the ubiquitous electronic medium facilitates the infamous "debates," which have become anything but. The intense negotiations that precede them strive to minimize spontaneity and maximize the probability that there'll be no major gaffes that would be aired ad infinitum — leading to the destruction of the unfortunate perpetrator. News anchor Jim Lehrer explains, "It is the only opportunity to evaluate candidates side by side. For the candidates, it's their last chance to close the deal with voters." The candidates echo Lehrer:

  • "You're on guard; you don't want to make a mistake; you don't want to say anything that's going to offend." - George H.W. Bush
  • "It was intense and confrontational from beginning to end." - John Edwards
  • "The stakes are very, very high." - John Kerry
  • "Now, was I glad that the damn thing [1992 debate] was over? Yeah, maybe that's why I was looking at [my watch]. Only ten more minutes of this crap." - George H.W. Bush

*****

So when will this end?

Short term: after the 2008 U.S. election, when there's a collective sigh of relief among its citizens that it will be at least a year (hopefully!) before the 2012 election is mentioned.

Long term: when the institutionalization of Phrenicea occurs. "Phrenicea ensures that no voter is uninformed. Every eligible voter has complete access to the entire unbiased history of each candidate — objective factual and biographical information resident within the Phrenicea braincomb — to decide which candidate is the best choice. Any vote based even partly on subjective criteria is summarily rejected."

Sounds too good to be true.

Time will tell...

[Cartoon © 2008 by King Features Syndicate, Inc. World rights reserved.]

Presented as a "Timely Yet Timeless" post by John Herman 2:27 AM

Monday September 1, 2008

What's Acura Spelled Backwards?

Late summer/early fall is typically the time of year when the buzz begins for new car introductions, touting engineering innovations and "all new" styling.

Unlike objective evaluations of technical advancement, a car's appearance is purely subjective, although most often a consensus eventually emerges translating into sales success or failure. Controversial designs that come to mind are Ford's Edsel and Pontiac's Aztek, both disastrous sales duds. Yet, germinal "ugly duckling" styling on a prestigious brand apparently can breed a litter of copycats as BMW's Bangle bungle has with complex curves, humps, bumps and creases.

Late to the Ugly Bug Ball is Acura. Traditionally conservative, parent Honda is throwing caution to the wind to have its upscale brand noticed in 2009 — for better or for worse. The latest designs, which arguably out-ugly the competition, unfortunately found inspiration in an oddball-looking concept car from 2006.

Acura's initial press release for its 2009s is typical corporate braggadocio, touting "the all-new, completely redesigned TL employs dramatic new styling...with a dynamic, new look." The new RL is "Ahead of the Curve." The TSX "is expected to appeal to image seekers who will appreciate its innovative styling..." And in a world of mundane and meaningless taglines, Acura follows through with just one word, "Advance."

The auto press seems not to agree, however. Autoblog.com states, "Acura's love-it-or-hate-it new shield-grille face is affixed to the front end..." Nextautos.com is more blunt: "The rather ostentatious front end treatment, most notably the chevron shaped front grill, could prove a deal-breaker for some potential buyers." Car and Driver is none too kind: "It's Official: It's Ugly, But Should be Fast." Autoweek is blunter still: "[Acura's] response to make it more distinctive may result in the baby being thrown out with the bath water." Ouch!

These are subjective comments based on myriad influences. We all have biases derived from our personal experiences. Case in point: being a fan of sci-fi B-movies, I cannot help but associate Acura's new corporate face with the robot from 1954's Tobor the Great.

Could this likeness to a robotic visage be just a coincidence? Or did Acura designers decide to look backwards (Tobor is r-o-b-o-t spelled backwards) in their quest to be forward looking?

Of course this off-the-wall association is a subjective one. Nevertheless, perhaps the professed resemblance can be amusingly presented herewith:
[Image gallery presented as a B-movie double feature — "TOBOR 1954!" paired with the "2009 RL!": Tobor saying "cheese" beside the cheesy TL; Tobor beaming beside the high-beamed TSX; "E2EG" comparisons of TOBOR and the TL; and a dénouement asking, "The End...of Acura?"]

Tobor the Great is a timeless 1950s classic with the then-common plot elements of technology's omnipotence, the Space Race and Cold War paranoia. Who would have predicted that 55 years later its main character would resemble a 21st-century automobile? Unfortunately, what's cute on an old robot doesn't necessarily translate well onto a new car.

Acura's plot in 2009 is yet to be written. Its unfolding depends upon how potential buyers react to what many auto experts are calling cheesy styling. Will the new comedic front-ends be the beginning of Acura's end? Will its future be turned upside down foreshadowing a downturning frown? After more than two decades of fine cars, that would be a tragedy:

[Image: Greek comedy and tragedy masks — Acura's comedy or tragedy? How will Acura's play end?]

Upon further reflection, a Greek tragedy might be a better metaphor than a '50s B-movie given Acura's potentially suicidal makeover. Perhaps to protect its hard-earned brand equity, Acura should mask its identity and rebadge these grinning models "Aruca." (That's A-c-u-r-a spelled backwards.)

Time will tell.

posted by John Herman 8:34 AM

Friday August 1, 2008

Ubiquitous, and Ridiculous

The Phrenicea scenario envisions the eventual adoption of solar power for dwellings and transportation. So why did we not choose ubiquitous hydrogen, now being peddled to hopefuls — mostly by politicians and a General Motors teetering on bankruptcy — as the fuel of the future?

With the U.S. presidential election fast approaching and oil prices figuratively in the stratosphere, there's hopefully going to be some debate about our (in reality the world's) dependence on oil and the vulnerabilities that that engenders, not to mention the deleterious impact on the environment. So-called green alternative fuels that sound too good to be true, such as hydrogen, will be broached superficially — and a gullible public will once again be lulled into a false state of optimism. "Ok, next topic folks..."

Likely not discussed: the U.S.'s eventual loss of its monopoly on living standards supported by the high consumption of the earth's resources — as the so-called "have not" nations aggressively embrace capitalistic-like models, perhaps crudely at first. This will more than offset any progress that might be garnered with expensive high-tech conservation efforts by the "already haves." The consequence is that things are going to get much worse before they get any better.

Not to digress further: why not hydrogen, a potentially clean fuel?

It's true that hydrogen is literally everywhere, but not in a state at our (sea) level where it can be readily consumed. Hydrogen is found in nature usually tightly bound to other elements, the most abundant being in combination with oxygen to form water. If it is bound to itself (H2), it is literally in the stratosphere in trace amounts, being about 14 times lighter than air.

Significant energy is required to split hydrogen from the bonds of a strong molecular grip. Where would this energy come from? It can be "dirty," from fossil fuel — but that would be ludicrous. Or it can be clean from wind, waves or other (fanciful) sources — but that is frankly wishful thinking.
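To put "significant energy" in perspective, here's a back-of-envelope sketch (our own, not from any cited source; the textbook enthalpy value and the 70% electrolyzer efficiency are assumptions) of the electricity it takes to split a kilogram of hydrogen out of water:

```python
# Rough electrolysis arithmetic: electricity needed per kilogram of H2.
HHV_KJ_PER_MOL = 285.8   # enthalpy to split one mole of liquid water (kJ)
H2_G_PER_MOL = 2.016     # molar mass of hydrogen gas (grams per mole)
KJ_PER_KWH = 3600.0      # unit conversion

# Theoretical minimum: energy per mole, scaled to a kilogram of H2
kwh_per_kg_ideal = (HHV_KJ_PER_MOL / H2_G_PER_MOL) * 1000 / KJ_PER_KWH

# Real electrolyzers waste energy; 70% efficiency is an assumed round figure
kwh_per_kg_real = kwh_per_kg_ideal / 0.70

print(f"ideal: {kwh_per_kg_ideal:.1f} kWh/kg, practical: {kwh_per_kg_real:.1f} kWh/kg")
```

Tens of kilowatt-hours per kilogram — roughly a household-day's electricity for a few gallons-of-gasoline's worth of energy — before a single joule is spent compressing, chilling or moving the stuff.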

In addition, transporting and storing hydrogen is cumbersome and dangerous — requiring either cryogenic temperatures or 5,000-10,000 pounds of pressure per square inch. Imagine living beside — or worse — driving around with storage tanks under such stress. And just imagine the cost to build a transport infrastructure as intricate as what has evolved for fossil fuels over the past century. The mind boggles.

Consequently, we envision solar-based energy to be the way of the future. Mother Nature found a mechanism to capture the enormous energy of the sun with chlorophyll and photosynthesis, and we've been consuming the lowest hanging fruit of that miracle in the form of petroleum. With enough research dollars, we're optimistic we'll one day match her adroitness. The price paid will be significant in terms of monetary funding — and perhaps initially with a diminished freedom to travel about and more modest abodes (cubicles?).

Presented as a "Timely Yet Timeless" post by John Herman 3:04 AM

Tuesday July 1, 2008

Nuclear Redux — or Déjà vu?

Our QuikPoll #17 revealed, surprisingly, that there is more concern with fossil-fueled global warming today than with the long-lived and deadly radioactive waste (and weapons-grade fuel) produced by nuclear power plants. This is astonishing, especially with the threat of terrorism and the third world's desire for nukes.

It's amazing how attitudes can vacillate over the course of time — about 60 years in this case. At the dawn of the Atomic Age, there was optimism that not only would "going nuclear" fuel power plants to generate electricity — it would also power our planes, cars and rockets — and cook our food! (Even Walt Disney was convinced, producing the movie and book, "Our Friend the Atom.")

That initial optimism was eventually supplanted with concern about bomb proliferation; paranoia associated with a perceived loss of U.S. technological supremacy to the Soviet Union; and uncertainty as to the long term effects of the radiation from weapons testing. The many science fiction movies from the era depicting awakening dinosaurs, giant insects, and incredible shrinking men attest to the period's uneasiness associated with radiation. Eventually Three Mile Island and Chernobyl would be the final straws that broke nuclear's back. But, memories fade.

Fast forward to today. Politicians and the nuclear industry are capitalizing on a recycled nuclear blaséness with talk of new reactors helping to counter skyrocketing fuel prices and ameliorate global warming. Cameco, the "world's largest uranium producer," crows the slogan "NUCLEAR. The Clean Air Energy." Entergy, self-described as the second-largest nuclear generator in the United States, obfuscates with the tagline "The Power of People," whatever that means.

Embedded in a past issue of Fortune magazine is a nuclear-industry-funded feature posing as objective content proclaiming a "Nuclear Redux." The piece, cleverly written by freelancer Robert McGarvey (who might have sold his professional soul here), subtly conveys an environmentally friendly green theme with an innocuous graphic, a green pull-quote highlighting nuclear's return to center stage, and a green text box incredibly proclaiming that "Radiation is good for you"! How subtle. How frightening!

A more recent Fortune (June 9) is host to an essay titled "The Case for Nukes," where writer/blogger Elizabeth Spiers argues that "as oil climbs to where no one can afford it," we'll not be able to afford being afraid of nuclear energy, and that accidents like Chernobyl "would never happen again." [Oh yeah?]

And McGarvey, by now soulless, has been paid yet again to aid and abet the radioactive propaganda machine in an ad section published in Fortune's July 7 issue, gushing about today's "Nuclear Renaissance" and its metaphoric journey from pariah to prom queen, spurred by soaring energy costs.

*****

There's little doubt that rising fuel prices and global temperatures are pressing issues requiring action. However, the answer may not be to recycle dangerous technology. We owe it to ourselves and to posterity to become sufficiently educated to intelligently evaluate potential options. It's popular now to pander to those outraged over $60 fill-ups and brand global warming "bad" because that's been the predominant message; and nuclear energy as comparatively "good" because it's been out of the tabloid news for over twenty years.

Don't listen passively. The world and its inhabitants are at stake — and the clock is ticking.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 4:38 AM

Feedback: Your Two-Cents!

Date: July 17, 2008 10:06:27 AM EDT

John
Please see http://www.channel4.com/science/microsites/G/great_global_warming_swindle/ or Google Channel4 Global warming Swindle. Those who are interested in this great fraud perpetrated by Al Gore do some research on the Mini Ice Age, you will discover that we have had numerous centuries where the temperature has gone up and gone down. (Greenland was lush with greenery and the Vikings settled there around 1300, but then came the ice age and they all perished.) In-fact in 1975 scientists were predicting a global cooling, see Time magazine 1975 and Newsweek 1974. They have also determined by numerous soil samples that between 25-50 million years ago there was much more CO2 in the atmosphere than there is now, due to lots of volcanic activity. It is amazing how the public has been swindled, and taking this hook, line and sinker - it goes to show what happens when we are dominated by the liberal press like Newsday, hard to find balanced news.

Oh, by the way the polar bear has been on this earth for at least 100 thousand years and has survived the warming and cooling trends, I think it will make it through this warming cycle.

John Sertic

Sunday June 1, 2008

Memories: Firsthand & First Person

Our memories, and the memories of us, are precious. Are they not? Well, at least to some authors they appear to be.

Cliff Pickover, prolific writer and research staff member at IBM's T. J. Watson Research Center, admits that much of his motivation to publish comes from his desire to exit this world with something to leave behind for future generations. He laments:

After you die, will the world remember anything you did? Most of us rarely leave marks, except on our immediate family or a few friends. We'll never have our lives illuminated in a New York Times obituary or uttered by a TV news anchorperson. Even your immediate family will know nothing of you within four generations. Your great-grandchildren may carry some vestigial memory of you, but that will fade like a burning ember when they die — and you will be extinguished and forgotten.

That's pretty depressing. As we try to live each day to the fullest — being productive, learning, and ultimately creating memories for ourselves and others — we rarely ponder the ephemeralness of it all. (Although the often-heard dispassionate phrase, "Who will care a hundred years from now?" stems from the sad reality of our short time here on earth.)

But does it have to be this way?

The Phrenicea scenario envisions a time when all our memories and experiences will be stored forever, within our own brain as well as within others' — firsthand memories that are deemed rich enough to be bought and sold at auction — like memorabilia traded today on eBay.

Imagine sharing the actual memory of one accepting a Nobel Prize or an Academy Award, winning a marathon, falling head-over-heels for your favorite actor, starving in the third world, learning of a terminal disease; even of dying!

Most are tempted to say, "Yeah right! No way."


It seems impossible now, but there will come a day when the mechanism for memory assembly and storage within the human brain is elucidated. A next logical step would be to try to save or replicate these memories, perhaps for recollection by others.

When it does come to fruition, imagine the regret for the many first-person memories already — or soon to be — lost forever:
- the excitement of witnessing man's first flight
- the despair of the 1929 Stock Market Crash
- the horror of the Holocaust and Hiroshima
- the excitement of purchasing a new color TV in the 1950s
- the anticipation of finding out "Who Shot JR?"
- the relief of learning of a cure for polio
- the nostalgia of catching the last feature at the local drive-in movie theatre just before its closing forever
- the shock of President John F. Kennedy's death
- the thrill of setting foot on the moon
- the marvel of the first "horseless carriage," telephone, "talkie" (talking movie), ballpoint pen, transistor radio, Polaroid camera, VCR

Of course you could read or see videos about these events. But nothing can approach an actual memory — just ask Neil Armstrong.

Gee, when you think about it, memories really are precious.

Time will tell...

Presented as a "Timely Yet Timeless" post by John Herman 8:38 AM

Thursday May 1, 2008

Workin' the Pharm:
"Ask your doctor about Vesi • lev • esta • vix • rum • vix • ami • cele • max • gel • ser • iza • vet • avo • cal • zet • via • ara • nex • xol • quel • xyz • luc • lis • tis • tor • itra • ast • tia • ica • gra • iva...!"

We're bombarded nowadays with drug ads in print and on TV, so-called "direct to consumer advertising." Their proliferation is the result of a 1997 FDA change allowing pharmaceutical companies to promote drugs without having to elaborate on the negative side-effects. A great sales model immediately emerged: recruit consumers into believers, who then cajole their doctors into prescribing drugs.

Here's a "short" hawking list compiled from TV and a few magazines:

Celebrex
Amitiza
Viagra
Vytorin
Nexium
Crestor
Plavix
Actonel
Lunesta
Evista
Nasonex
Asmanex
Boniva
Symbicort
Vesicare
Rozerem
Flomax
Caduet
Lipitor
Avodart
Singulair
Januvia
Zetia
Reclast
Levitra
Xyzal
Lucentis
Cialis
Seroquel
Xolegel

Marketing firms are paid big money to create these arbitrary (meaningless) names that are then registered as unique trademarks. The guidelines appear to be short names, five to eight letters, constructed from a common set of syllables — perhaps explaining why they all have a "drug-sounding" resonance.

Reading the list above for the first time you'd probably guess that they were drugs. The challenge then for creating new names is to find syllable combinations not yet coined. Choosing six syllables from the heading above — born here are Nexquel, Estavix and Vexizet. They sure sound like drugs, and per Google they're not in use. (The futuristic entity Phrenicea was coined analogously nine years ago by combining phrenic and panacea!)
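The name-coining game above is mechanical enough to automate. Here's a toy sketch (the syllable list is our own sampling from the heading; nothing about it reflects how marketing firms actually work) that recombines "drug-sounding" syllables into five-to-eight-letter candidates:

```python
# Recombine drug-sounding syllables into short trademark-style candidates.
import itertools

# A handful of syllables sampled from the post's heading (an assumption)
syllables = ["nex", "quel", "esta", "vix", "vexi", "zet", "lis", "tor"]

def coin_names(parts, min_len=5, max_len=8):
    """Return capitalized two-syllable combinations within the length range."""
    names = []
    for a, b in itertools.permutations(parts, 2):
        word = a + b
        if min_len <= len(word) <= max_len:
            names.append(word.capitalize())
    return names

candidates = coin_names(syllables)
print(len(candidates), candidates[:5])
```

The post's own coinages — Nexquel, Estavix, Vexizet — fall straight out of the same recipe, which perhaps explains why every entry on the checklist above has that familiar pharmaceutical ring.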

The intent of pharmaceutical companies is to make their names familiar with unrelenting advertising using memorable jingles like Viva Viagra!, cryptic hints on par with "When the moment is right, you can be ready" and citing identifiable conditions such as acid reflux (Nexium), osteoporosis (Boniva), allergies (Singulair), heart risks (Lipitor), diabetes (Januvia), bladder urges (Vesicare or Flomax), enlarged prostate (Avodart), high cholesterol (Crestor or Zetia), or high blood pressure (Caduet). And of course everyone knows what Viagra purports to cure.

To see how well the drug companies have been able to brand their names into your brain, scan the list above and check off those that are familiar.

How many did you check? Perhaps more than you would have guessed. That's the power of advertising.

Very few new drugs make it to a list like this, however. Only one out of ten earns FDA approval after three phases of exacting clinical trials. Most new drugs either have no effect or are harmful.

Another outcome is the unexpected. Many are not aware that Viagra was originally tested to treat angina and by serendipity became famous for a surprising side effect. After the (probable) chuckles during trial testing subsided, the once meaningless v-i-a-g-r-a would become an official entry in Webster's dictionary. Perhaps one day it will even become a genericized trademark like zipper, kleenex, velcro, scotch tape, band-aid, coke and Q-tip.

The unexpected rise out of Viagra illuminates an important point: Drug synthesis and discovery is not really by design, although most chemists in the industry will not readily admit such. The chemical compounds are created with as much art as science by unnatural means, with elaborate apparatus to control physical conditions of temperature, pressure, etc. They're then evaluated via empirics with animals and humans, which is a fancy way of saying they watch for indications (good effects) and reactions (bad effects) and contraindications (bad combinations) with other drugs, chemicals, and nutrients. The entire process can take years or decades.

Because so many of the intricacies of human body chemistry are yet to be learned or explained, oftentimes how a drug works (its pharmacodynamics) is a mystery. The pamphlet insert (which hardly anyone bothers to read) for Aldara cream states, "The mechanism of action is unknown." In layman's terms it would read, "We have no idea how this stuff works."

That explains too why so little is known about the long-term impact of these synthetic concoctions, and why in some cases they have to be pulled from the market due to unforeseen negative complications (Vioxx).

Not coincidentally, the benign-sounding drug names drummed into us belie all this complexity. The trade name Plavix has the clunky generic name clopidogrel, which pales next to its chemical name:

(+)-(S)-methyl 2-(2-chlorophenyl)-
2-(6,7-dihydrothieno[3,2-c]pyridin-5(4H)-yl)acetate

Imagine putting music and images to clopidogrel, or worse — methyl dihydrothieno pyridin acetate. Would you be as easily swayed to "ask your doctor" as the Plavix advertisements implore?

Here's more:

Trade Name | Generic Name | Chemical Name
Celebrex | celecoxib | 4-[5-(4-methylphenyl)-3-(trifluoromethyl)pyrazol-1-yl]benzenesulfonamide
Lunesta | eszopiclone | (5S)-6-(5-chloro-2-pyridinyl)-7-oxo-6,7-dihydro-5H-pyrrolo[3,4-b]pyrazin-5-yl 4-methyl-1-piperazinecarboxylate
Levitra | vardenafil | 4-[2-ethoxy-5-(4-ethylpiperazin-1-yl)sulfonyl-phenyl]-9-methyl-7-propyl-3,5,6,8-tetrazabicyclo[4.3.0]nona-3,7,9-trien-2-one
Viagra | sildenafil | 1-[4-ethoxy-3-(6,7-dihydro-1-methyl-7-oxo-3-propyl-1H-pyrazolo[4,3-d]pyrimidin-5-yl)phenylsulfonyl]-4-methylpiperazine citrate
Boniva | ibandronic acid | [1-hydroxy-3-(methyl-pentyl-amino)-1-phosphonopropyl]phosphonic acid

If you've taken a course in organic chemistry you might comprehend more of this. But then you would better appreciate the complexity of the human body, and the precarious chances taken when ingesting these novel chemicals never before seen in nature; created in laboratories with equipment that defies evolutionary rules, forgoing the eons of time needed to be in harmony with biological systems.

Nevertheless, the next time you find yourself "workin' the pharm" by asking your doctor about a drug you saw on TV or in a magazine, don't be afraid to get your hands dirty beforehand by digging up your own proverbial dirt via the Web or elsewhere to learn "what's in a name" — and then muster the vigor to plow a furrow of skepticism in your brow as harvests past have tended to yield unexpected crops.

Time will tell...

posted by John Herman 1:28 AM

Tuesday April 1, 2008

A Tough Act to Follow

It's been a while since we took Ray Kurzweil to task, the self-proclaimed "visionary futurist," whom we fondly call Chiphead, our respectfully disrespectful appellation for the champion of chip-based intelligence. He predicts "We won't experience 100 years of progress in the 21st century — it will be more like 20,000 years of progress (at today's rate)."

We love skewering Kurzweil because of his unabashed arrogance. A lot happened in the 20th century making its end appear quite different from its beginning. So, let's evaluate the most significant achievements/events that occurred in those incredible ten decades vs. the 21st century to date, which is already approaching one-tenth of its 100 years. You can then adjudge the veracity of Kurzweil since we should have already progressed 10 percent of 20,000 years. (Do the math, that's 2000 years of progress since entering this new century.)
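For the record, the arithmetic behind that jab fits in a few lines. This sketch takes Kurzweil's figure at face value and prorates it linearly, as the paragraph above does (Kurzweil himself argues for exponential, not linear, progress — so treat this purely as the post's reading):

```python
# Prorating Kurzweil's claimed 20,000 "progress-years" for the 21st century
# to the eight calendar years elapsed at the time of this post (April 2008).
claimed_century_progress = 20_000   # Kurzweil's figure, in "today's-rate" years
elapsed_years = 2008 - 2000
prorated = claimed_century_progress * elapsed_years / 100
print(prorated)  # 1600.0
```

Call it 1,600 progress-years strictly prorated, or the round 2,000 the post reaches by treating eight years as "approaching one-tenth" of the century. Either way, the burden on the decade's short list below is enormous.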

Significant achievements/events in the 20th Century:

  • broadcast radio; cultural homogenization
  • airplane; shrinking of the globe
  • telephony; connecting the world
  • tungsten light bulb; 24 X 7
  • automobile; suburbia, petroleum economy, pollution
  • TV; cultural homogenization, couch potatoes
  • synthetic chemistry; Nylon, Dacron, PVC, Big Pharma and more — the end of "all natural"
  • Pearl Harbor; U.S world predominance
  • drive-in theater; sci-fi B-movies*
  • atomic bomb; fear of non-fungal mushrooms, MAD
  • transistor; gadgets galore
  • antibiotics; (perceived) microbial domination
  • DNA double helix; life built from lifeless molecules
  • rock 'n' roll; generational segregation 1.0
  • credit cards; easy debt
  • Sputnik; space race, spin-off technology
  • The Pill; equality of the sexes
  • Cable TV; cultural splintering
  • microprocessor; PCs and computerized everything
  • MTV; generational segregation 2.0
  • Internet/Web; digital revolution, offshoring

Wow! It could be debated though whether these are the most significant. Perhaps there should be more, maybe fewer. And which is the most influential? Depending upon your perspective, it might be the automobile, The Pill, or the microprocessor. Then again, an event's influence back then might not be as significant in this new century.

An objective approach is to use some measuring criteria. In keeping with this month's spirit of April Foolery*, we hereby propose the "B-movie plot test," using the fodder of drive-in theaters from days gone by. (From Wikipedia: The term B-movie originally referred to a motion picture made on a modest budget and intended for distribution as the less-publicized, bottom half of a double feature. U.S. production of movies intended as second features largely ceased by the end of the 1950s.)

Based on the following plethora of radiation-based B-movie plots, we submit that the atomic bomb had the biggest overall impact during the 20th century. Just feast on the beasts below:

  • The Beast From 20,000 Fathoms: an atomic test in the arctic thaws a dinosaur so it can migrate back to New York to cause havoc.
  • Godzilla: American hydrogen bomb tests awaken and mutate the monster Godzilla. You know the rest.
  • Them!: the first U.S. nuclear test causes ants to mutate into giants.
  • The Beast of Yucca Flats: a defecting Russian scientist is chased by the KGB and winds up amidst a nuclear mushroom cloud. The radiation turns him into a killing beast.
  • The Amazing Colossal Man: a U.S. soldier suffers serious burns following exposure to plutonium from a bomb blast. He survives but the radiation causes him to grow into a giant.
  • The Crawling Eye: a radioactive cloud sitting atop a mountain leaves its climbers decapitated without explanation.
  • Hideous Sun Demon: decades before sunblock and SPF, a scientist exposed to a radioactive isotope devolves into a scaly reptilian when caught in the rays of the sun.
  • The Giant Behemoth: at a science conference it is noted that atomic tests have contaminated plankton, fish, and birds in a "biological chain reaction" of radiation culminating with a monster that burns flesh with radioactive waves.
  • The Incredible Shrinking Man: a man is subjected to a radioactive mist that causes him to shrink beyond detection.

You could dismiss all this as April Fools' folly. But the 1950s were indeed scary and the atomic bomb is right up there in terms of centurial impact.

Now for a comparison let's list the significant aspects of the 21st Century to date:

  • Bush wins presidency; Al Gore popularizes (invents?) Global Warming
  • cell phone mania; piercing ringtones, texting, rude cacophony
  • 9/11; global terrorism
  • Google IPO; domination of "search" and...?
  • iPod; ubiquitous earbuds, cultural isolation
  • human genome mapping; TMI — revelations that we'll not want to cope with
  • Web 2.0; virtual socialization (seeds of Phrenicea?)
  • Internet dependence; on par with electricity and fresh water

What will be the most significant? Bush? 9/11? The iPod? Dependence on the Internet? It's hard to infer from the narrow perspective of the present day (and sans the B-movie test!). But one thing's certain — we'll not have 2000 years of progress in this decade, despite what Kurzweil predicts and proselytizes.

*****

In sensible and realistic terms, the sweeping change of the 20th century is going to be a tough act to follow.

Time will tell...

posted by John Herman 5:37 AM

Saturday March 1, 2008

A Model Shattered?

The world has changed. Jobs are being lost to technology, outsourcing and the "flattening" of organizations. As intellectual skills rise globally, many of the jobs left are lower paying, heeding the old law of supply and demand.

This new reality shatters the half-century old model for graduating high school, attending college, and then getting a good "white collar" job.

You may have heard stories like these:

  • A fine arts graduate working as a "sales associate" at an electronics chain
  • A serial changer of undergraduate majors who becomes a perpetual full-time student, pursuing one or more bachelor's degrees well beyond the expected four years
  • A mathematics graduate stocking shelves at a health food store
  • Add one or more of your own here

Typically these are students who aren't sure of their purpose in life, and thus are funneled into a college course of study for the wrong reasons: pleasing parents; it's what's expected; anticipation of good money after graduation; a prestigious title; or perhaps the lure of landing a job with summers off and generous benefits.

The solution to increased competition from an ever-growing global workforce will be attaining two important qualities sooner rather than later: maturity and passion. Unfortunately many of today's youth are lacking in both, and that quest seems to elongate with each succeeding generation. Today's extended reliance on parents precludes the development of independence and can encourage a preoccupation with proving adulthood through self-destructive means like binge drinking and worse. If young adults truly had adult responsibilities, such conspicuous indulgences would be pointless.

This is not folly. The February 25 issue of U.S. News reports on a new film, "Two Million Minutes: A Global Examination" comparing American teenagers' attitudes to those in India and China. The conclusion: U.S. students are preoccupied with "having fun," and are less focused and motivated.

It's not necessarily the kids' fault. They're pushed, prodded and subjected to structured programs in academics and sports that rob them of extemporaneous life experiences and a sense of personal accountability. With little time to experiment and make self-inflicted mistakes, absent is the benefit from the consequent lessons that would result. (Ironically, many times it's the accumulated wisdom from mistakes that is most beneficial years hence.)

Here's an excerpt from one student's college essay:

The competition is fierce. I've heard the infamous question, "What do you want to be when you grow up?" ever since I first started school. The expectation of teachers, guidance counselors and even family members contributes to the constant pressure for young adults to distinguish their future. When I was younger, I promised myself that I would end up doing something I truly loved. Feeling overwhelmed makes it harder for students to discover who they really are as people. Before all the stress, I used to have a long list of potential professions in mind, but now it all seems faint.

Not to paint an overly bleak picture: in reality many are ready to attend college right after high school; they're the fortunate ones. Too many are not, however. For them, this is not a recommendation to skip college, but a suggestion that alternate pursuits not be deemed inferior if not congruent with entrenched expectations. Entering college prematurely, before some self-initiated responsibility and maturity have been garnered, is an invitation to disappointment.

It's not obvious today that the expectation of attending college is a relatively new phenomenon, first fueled by the post WWII GI Bill and mature, serious-minded veterans determined to climb the corporate ladder. But given the flood of intellectual talent nowadays, that ladder oftentimes morphs into a horizontal plank.

Perhaps it's time to go back to the model that existed prior to WWII. Very few went to college; most found jobs right out of high school. The benefit, however, was that the responsibilities of adulthood came very fast.

Just imagine a parent encouraging their college-aged son or daughter to:

Go out and experience life. Get a job. McDonald's, Home Depot, Starbucks and a zillion retail stores are starving for competent help. Then strive to earn some responsibility. Become a supervisor or assistant manager. Learn to effectively deal with customers, and employees both more senior and junior than you. These jobs are hard work. They are at times monotonous and can even be degrading. Become aware of what it's like to begin work without a specific skill set. Still, there are invaluable lessons to be learned. Wisdom to be acquired. The attainment of compassion and empathy. The opportunity to earn respect from others as well as for yourself.

Then after this experience perhaps you'll be better prepared for college. And you won't need your parents hounding you to earn good grades. You'll want them all by yourself.

Heresy? Of course this goes against the grain of today's expectations. But you may have heard of several well-known and successful individuals who followed unorthodox career paths, including Bill Gates (co-founder of Microsoft), Richard Branson (entrepreneur, Virgin Records & Airways), Arnold Schwarzenegger (bodybuilder/actor/governor), Steve Jobs (co-founder of Apple), Michael Dell (founder of Dell Computer), Craig Venter (iconoclastic genomic pioneer, first to decipher human genome), Billy Joel (pop musician), Ian Anderson (self-taught musician, salmon farm entrepreneur).

Is it a coincidence that many of today's richest, most influential and creative individuals broke from the mold and charted their own paths to success?

With the ever-increasing worldwide competition for jobs, the key to success may be discovering very soon after high school innate talents and a passion for something; anything. (As Google's super-successful executive chef Josef Desimone imparts, "I never wanted to be an astronaut... I only wanted to cook.")

What accompanies passion is an inexhaustible reservoir of energy for accomplishment. To quote an old proverb: "Do work you love and you'll never work a day in your life."



© Zits Partnership. Reprinted with Special Permission of King Features Syndicate.

*****

Maturity and Passion — a fortuitous combination — perhaps the 21st-century passport to success. Luckiest are the ones that have it at an early age. Fortunate are the ones that acquire it eventually. Sorry will be the ones unable to achieve it in time, or at all.

Time will tell...

posted by John Herman 4:28 AM

Friday February 1, 2008

Bag Age Baggage

con•ven•ience
Freedom from difficulty, discomfort, or trouble.
(Merriam-Webster)

stu•pid•i•ty
The quality or state of being stupid; given to unintelligent decisions or acts.
(Merriam-Webster)

*****

Does convenience beget stupidity? Does stupidity beget convenience? Or do they complement each other synergistically?

I contemplate this now as I find myself inundated with plastic bags carried home from virtually every type of retail store. And the problem seems to be getting worse day-by-day as behavior accommodates their omnipresence and so-called convenience.

The phenomenon has made me more observant of our dependence on the baggage in what appears to be a burgeoning Bag Age:

  • When buying six replacement glass chimney light shades at a local big-box home center, the clerk first wrapped each individually with unopened plastic bags before placing them in bags for carrying, two per bag. That's nine bags! (In times past they would have been wrapped in newspaper and placed in an old cardboard box.)
  • While I fumbled for cash to pay for a watch battery, the clerk placed the tiny two-inch-square cardboard package and a three-foot-long receipt inside a large two-foot-square bag. Unsuccessful in giving it back, I self-consciously carried the nearly empty bag out and almost lost it to a gust of wind. (What happened to small paper bags? This is convenience?)
  • A local beer outlet insisted that I not walk the five feet to the exit carrying a six-pack naked of a plastic bag. "It's the law," chided the proprietor. (Government mandated stupidity or a manufactured "convenience"?)
  • Suddenly this year, curbside Christmas trees that filled homes with holiday cheer were discarded in huge white plastic bags — large enough to cocoon a small car. (I cannot fathom any convenience here.)
  • When I took advantage of the annual supermarket "can can" sale, the clerk did me a "favor" by packing only two cans per plastic bag. No amount of coaxing would convince otherwise. (Convenience or stupidity?)
  • As I was entering a supermarket checkout line, the elderly shopper before me said to the cashier, "You can put more items in those bags." The cashier replied, "I don't want you to have to work too hard loading them into your car!" I tried to count the mountain of bags in her cart but was unsuccessful. (I left the store in a state of dismay.)
  • Upon exiting a supermarket checkout, the cashier asked the shopper behind me, "Plastic or paper, sir?" He chose plastic. (Which definition above applies here?)
  • Waiting for a train I watched as a garbage bin maintenance worker mindlessly emptied bin after bin replacing the mostly empty plastic liners with not one, but two new bags. That's waste times two. (I could only shake my head at the moronic stupidity of a management that most likely mandated this policy.)

Pathetically, these scenarios play out, day in and day out. Unfortunately, it's now become business as usual.

I've tempered some of my own guilt by reusing bags as garbage liners. A decade ago I purchased a half dozen "eBaskets." They're simply but ingeniously designed to accept standard-issue bags, enabling them to complete their usable life cycle more responsibly. It's been that long since I purchased bags that are Glad just to carry trash. Particularly annoying then are odd-shaped bags that are too tall and narrow or too short and wide — and good for absolutely nothing afterwards.

Still, having so many more standard-sized bags than trash needing a liner, I end up annually with bags stuffed full of themselves from supermarkets, department stores, stationery stores, etc. Fortunately my municipality recycles them. Even so, how many just thoughtlessly toss them out?

Evidently too many — since the company that produced the eBaskets, Green Earth Products, went out of business in 2007. All the politically correct rhetoric nowadays about going "green" apparently is just that. Many talk the talk. But when it comes to walking the walk, the difficult part that's just too much trouble, that's when we fall short.

Now I have to ask:

  • What happened to good old biodegradable paper bags? (Plastic is now cheaper.)
  • How did it come to this? (Wasting ubiquitous resources easily becomes habit forming.)
  • Are reusable cloth bags a viable solution? (Probably not without an outright ban on plastic. Beyond the perception of being a hassle, with repeated use they could become unsanitary, since it's doubtful many would bother laundering them.)
  • What will this (now worldwide) plastic waste eventually do to the environment, given that the bags are non-degradable polyethylene that will last at least 1000 years? (Time will eventually tell.)

So, in final analysis:

  • Does convenience beget stupidity?
  • Does stupidity beget convenience?

I deem yes on both counts here — unless we one day have the wherewithal to sack today's Bag Age baggage.

Time will tell...

posted by John Herman 8:27 AM


Click here for what might be construed as a calculated publicity ad by Target Stores.

Postscript by John Herman — April 13, 2008

Tuesday January 1, 2008

The Twelve Blogs of '07

Here we are again, lamenting yet another year's passing while suffering post-celebratory writer's block. Rather than trying to cobble together a half-hearted essay, we'll sheepishly follow last year's precedent-setting cop-out to lethargically review the "Twelve Blogs of 2007" — with premature nostalgia and perhaps feigned interest as to whether they're still relevant.

• December's blog bemoaned an unintended social experiment whose impact appears generationally exponential:

Even casual observation of random family interactions reveals that selfishness has eclipsed selflessness. Today it's often difficult to discern who is in charge — the parent or child. Recalcitrant behavior is the norm in schools, where teachers field disciplinary problems with too little authority to reprimand. And it's evident in society at large, where common politeness and respect for others has gone by the wayside...(Read more)

• In November we wrung our hands over being "all thumbs":

Looking at the impact of our silicon-infused, technological society it is obvious that very little is done "by hand" nowadays — by young and old alike... (Read more)

• October's blog turned to humble introspection:

Why do we think we're so smart? It seems no matter what the year or age, our perception is that we're always on the cutting edge of information and technology — even (or especially) if it contradicts previous wisdom...(Read more)

• September's entry lamented the metaphorical switching of gears:

Shanthi Gears is not located in California, Ohio, Maryland, Massachusetts, Indiana, Connecticut or New York. It's manufacturing those gears in Tamilnadu, India. And while it is turning the wheels of industries worldwide, the U.S. has shifted into high gear towards a service economy. The once ubiquitous "Made in U.S.A." is rarely seen today...(Read more)

• In August one smart cookie crumbled:

I can't wait to order Chinese takeout again. Will I crack open another smart cookie to find an abstruse, recondite maxim with a scientific theme? Or will I have the misfortune of reading just another silly platitude? I'm really hoping for the former.

Forever puerile, I'll imagine my next postprandial surprise to be the work of some underemployed subatomic particle physicist, shedding an optimistic photon beam on my two left feet so I can finally show off some dance floor prowess...(Read more)

• In July we suffered the recurring affliction of "water on the brain":

Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." We all tend to waste water and take its current abundance for granted.

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations? (Read more)

• In June big thoughts about small-minded behavior prevailed:

And after thousands of years we're still at it. We can make just about anything into a symbol of status. But one person's object of distinction might be another's folly. So we have to be among like-minded people to make an impression. And you can't even take it with you! (Read more)

• May's entry exposed the "Two Point Uh-oh" media disorder of applying the trendy 2.0 suffix onto almost everything:

Version numbers like "2.0" are adopted from the software industry, from what's commonly referred to as the "development life cycle." The confounding terminology is being unabashedly usurped by advertisers making it part of our vernacular. Although their target market may not truly grasp the pretentious and technical jargon being exploited, what is undoubtedly implied is "new and improved" ...(Read more)

• In April we maintained a cool head on global warming:

If every energy spendthrift of modern society performed a one-eighty lifestyle change by adopting a conservation mindset, the synergy of "the power of one" and "strength in numbers" would likely reduce consumption and demand for energy sufficiently to render the global warming argument moot...(Read more)

• The March blog pitted brainy scientists against nerdy computer whizzes:

A while back I emailed Dr. Terrence Deacon, professor of Biological Anthropology and Neuroscience at University of California-Berkeley. He's an honest-to-goodness, genuine brain scientist — as opposed to certain computer scientists who are brain-expert wannabes, typically proselytizing in the media their tenuous computer-brain analogies and artificial intelligence predictions... (Read more)

• In February we escaped the harsh reality of winter through the wonders of armchair TIME-travel:

Entranced into an armchair TIME-traveler steeped in a '60s mindset, I randomly opened to page 52, featuring "Welcome to Wi-Fi-Ville." Imagine how puzzled I would be reading about: free wireless Internet, wireless-fidelity (wi-fi) network, the Web, sunbathers Web surfing, municipal wi-fi, broadband prices, high-speed access to rural areas stuck with dial-up, VOIP (voice-over-Internet protocol), telcos, EarthLink, DirecTV, DSL, Yahoo, Google, surfing porn and downloading... (Read more)

• January's entry was tinged with nostalgia and torpor — and guilt for not writing a new essay:

Along with the passing of 2006 comes inevitable reflection on what was and wasn't accomplished — as well as looking ahead to 2007 with hopeful optimism. Another consequence is a bit of laziness from too much of too much... (Read more).

You may or may not agree, but after reviewing the "Twelve Blogs of 2007" it appears most are still relevant as we roll into the New Year. And somehow after recalling the energy that instigated the blogs to begin with, we don't feel quite as lethargic.

Maybe that's the real benefit of looking backward, reminiscing and singing "Auld Lang Syne" — not for nostalgia per se, but to recharge our batteries and begin another year with renewed energy and enthusiasm.

Time will tell...

Happy New Year!

posted by John Herman 7:52 AM

Saturday December 1, 2007

Spock's Children's Children's Children

All the fuss over Tom Brokaw's latest book, Boom! Voices of the Sixties, is partly and deservedly due to its excellent predecessor, The Greatest Generation. With Boom! he tackles the "spoiled-est generation" during their heyday (hair day?) in the 1960s — the decade of hormone- and drug-infused baby boomer post-pubescence; arguably the 20th century's most significant decade in terms of cultural change. (The book no doubt will be a sales success given boomers' infatuation with themselves.)

As expected, the electronic media and major newsweekly magazines gushed over the book's release. Newsweek scooped an exclusive excerpt, while Time and U.S. News featured Brokaw interviews.

When asked by Time whom he considered "the most influential person[s] in the last 40 years," Brokaw answered Mikhail Gorbachev, Ronald Reagan, and Osama bin Laden. I found his choices quite underwhelming. (Perhaps, not anticipating the question, Brokaw answered extemporaneously.) Given that the theme of Time's questioning was about his new book and the 1960s, I would have expected more boomers to make his list. Bill Clinton or George Bush perhaps. Definitely Bill Gates or Steve Jobs.

Despite surprise at his picks, my top choice too would not be a boomer. Without hesitation, I would nominate whom I consider the "Father of the Boomers," the late Dr. Benjamin Spock, born way back in 1903.

"What?" you might exclaim, "Who is Dr. Spock?" (No, he's not Star Trek's Mr. Spock with a medical degree.) In 1946, pediatrician Spock published a paperback titled The Common Sense Book of Baby and Child Care, a handbook of iconoclastic child-rearing views.

The book immediately struck a chord with other "progressive" doctors and was an unexpected best seller in America, and eventually internationally. The book would be translated into 39 languages and sell more than 50 million copies (second in sales only to the Bible).

Spock's advice to parents was laced with permissiveness, treating children more like equal partners in their development. This was in sharp contrast to other childcare books of the time that preached traditional authoritarianism.

Although boomers proudly proclaim that they defined their generation, it was Dr. Spock who unintentionally took on the role of surrogate parent and defined their values. Postwar parents who lived through the depression and WWII wanted nothing but the best for their children. Better educated and with economic conditions improving, they were receptive to a book purportedly written by an expert promulgating juvenile well-being through a more liberal disciplinary path.

So how did Dr. Spock come to write the most influential parenting book ever? It was not by design. Its popularity was a surprise even to him. Nor was he especially qualified to wield such authority over the raising of an entire generation. While specializing in pediatrics he studied psychoanalysis for six years, making him the only practicing pediatrician of his time with this combination of training. Without controlled clinical study, however, widespread adherence to his methods turned out to be a worldwide social experiment whose impact seems to be generationally exponential.

Even casual observation of random family interactions reveals that selfishness has eclipsed selflessness. Today it's often difficult to discern who is in charge, the parent or child. Recalcitrant behavior is the norm in schools, where teachers field disciplinary problems with too little authority to reprimand. And it's evident in society at large, where common politeness and respect for others has gone by the wayside.

Dr. Spock's legacy is far reaching, affecting by now "his" children's children's children. Nevertheless, Brokaw's book incredibly makes just one reference to Spock: when he joined the rebellious cohort he helped create to speak out against the war in Vietnam.

Missing the opportunity with Boom!, perhaps Brokaw's next book should assay the (rude) behavior spawned by Spock as evidenced in the present day, and entitle it The Ungrateful Generations.

Time will tell...

posted by John Herman 7:07 AM

Thursday November 1, 2007

Look, No Hands!

Given that most websites are maintained using powerfully sophisticated software, I usually elicit shock (well, perhaps astonishment) when explaining that I update the Phrenicea site "by hand," utilizing MS Notepad and basic HTML coding. (Being somewhat fastidious, at least I'm content knowing that every HTML character and tag has been hand-keyed, sans extraneous source code that's automatically generated.)

Beyond the tedious development of Phrenicea however, the terms "by hand" and "handmade" seem to be on the wane — and it stems from our growing disconnect from the physical world that unfortunately begins today early in childhood.

A case in point, this is from the December 1960 issue of Popular Mechanics:

Model making has replaced stamp collecting as the nation's number one hobby. A week after the U.S. Air Force announced its Starfighter jet set a new altitude record, miniature construction kits of the plane were sold out in stores from coast to coast.

It's hard to fathom today with YouTube, Facebook, MySpace, cell phones and iPods that model making back then was the favored pastime of America's youth — and that stamp collecting ever was! (I wonder if kids today desensitized by computerized fiction would even get excited about a manned Mars landing.)

Looking at the impact of our silicon-infused, technological society it is obvious that very little is done "by hand" nowadays — by young and old alike. For example:

  • What happened to the popularity of Erector Sets with their nuts, bolts, screws and mechanical parts? What about paint-by-number kits, Lincoln Logs and chemistry sets? These required tactile finesse while nourishing portions of the brain that no computer could possibly stimulate.
  • What happened to mending and darning of clothes? Most likely the blemished garment is just tossed out or given away.
  • What about home-cooked meals? This activity is diminishing with prepackaged and fast-food becoming evermore popular — expanding waistlines the world over.
  • Who maintains their own cars? Who would even have a clue how?
  • Who attempts to fix... anything? A throwaway mentality prescribes buying new, as "good with their hands" skills eventually become extinct.

Those that are savvy have taken advantage of our evolution towards being "all thumbs." Given their rarity in today's mechanized world, the terms "made by hand" and "handcrafted" are implied synonyms for quality and usually an excuse to command excessively high prices. Other adjectives often associated with pricey handmades are "fine" and "exclusive." Watch out for them.

The comparatively few who have bucked the highbrow trend to attend college, pursuing instead one of the hands-on trades, are now enjoying very successful and lucrative careers — especially if their work is top quality. They are in the minority vis-à-vis today's dime-a-dozen, white-collar, so-called professionals stuck in cubicles with few practical physical skills. The blue-collars today conceivably are living better than they ever dreamed and doubtless have more work than they can possibly handle.

Since 1999, the Phrenicea scenario of the future has predicted a priority bestowed upon hands-on work. In fact, what are termed "blue collar" jobs today are projected to be the highest paid professions in the future. We may be seeing signs of that already today.

*****

A lot was done "by hand" in the old days. Will it be done as well — or at all — in the future?

Time will tell...

posted by John Herman 5:37 AM


Handmade Vipor

Postscript with irony by John Herman November 3, 2007 (from New York's NEWSDAY November 2, 2007)

Monday October 1, 2007

We Sure Are Smart!
            -and-
What's Good is Bad (and vice versa)

Why do we think we're so smart? It seems no matter what the year or age, our perception is that we're always on the cutting edge of information and technology — even (or especially) if it contradicts previous wisdom.

Take coffee for example. Recent studies indicate that drinking coffee could lower your risk of diabetes, Parkinson's disease, and colon cancer. It might even reduce the consequences of smoking and heavy consumption of alcohol. But it wasn't long ago that coffee was suspected of causing bladder, pancreas and even colon cancer. So those brave (addicted?) people who went on drinking coffee regardless of that warning helped to provide the statistical data that now says it can be good for you!

*****

Doting parents self-assessed as intellectually sophisticated, armed with a little knowledge and determined to give their newborns an advantage, spent millions on infant-targeted DVDs to stimulate their nascent senses with music and colorful images. The latest studies suggest, however, that tiny babies watching TV could have delayed development and are better off being stimulated by mommy and daddy, as nature intended.

*****

Warning! Although the following is a gross simplification, it may be too esoteric for many. If you find that is the case, just skip to the smiley face and continue reading this month's weblog.

In 1953 Watson and Crick discovered "The Secret of Life," which of course is the structure of DNA, a double-helical chain of billions of molecules of just four types, denoted by the letters A, T, G, and C. Soon after, researchers found that these letters combine in different sequences of three to code for specific amino acids, which in turn combine in various combinations to form our proteins. This became known as the "central dogma" of genetics.

The four letters A, T, G, C can form exactly 64 three-letter combinations (codons). But because there are only 20 types of amino acids, it was surmised and proven true that more than one codon can code for the same amino acid. For example, GCC and GCU (written with U because RNA, the working copy of DNA, substitutes U for T) both code for the same amino acid. Your heredity could determine which of the two codons your cells prefer.

So for years, it was thought that some of us could have the same proteins even though our genes might be a little different. But of course nature is not that simple. It turns out that in our example, GCC might hook up with its amino acid faster than GCU, and that the difference in speed actually makes the resulting protein a different shape, which is important to our cells (one shape might promote cancer). So, like the word right (as opposed to left) and right (as in correct), what might appear the same can be quite different.
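For readers who like to check arithmetic, the codon counting above is easy to verify. Here is a minimal Python sketch; the four-codon excerpt for alanine is taken from the standard genetic code (GCC and GCU really are synonymous), while the variable names are simply illustrative:

```python
from itertools import product

# The four RNA bases; in messenger RNA the DNA letter T is read as U.
BASES = "ACGU"

# Every three-letter combination of four bases: 4**3 = 64 codons.
codons = ["".join(triplet) for triplet in product(BASES, repeat=3)]
print(len(codons))  # 64

# A real excerpt of the standard genetic code illustrating degeneracy:
# four different codons all specify the same amino acid, alanine.
CODON_TO_AA = {"GCU": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala"}
print(CODON_TO_AA["GCC"] == CODON_TO_AA["GCU"])  # True
```

With 64 codons covering only 20 amino acids, this kind of redundancy is unavoidable — which is exactly why two people's genes can differ while still naming the same protein building blocks.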

:-)

Will we ever learn to be humble?

*****

In 1957 the city of Tulsa, Oklahoma celebrated the golden anniversary of Oklahoma's statehood by burying a time capsule to be unearthed 50 years later, at the statehood centennial. The ambitious plan was to entomb a brand new Plymouth Belvedere car along with various other artifacts of the day. Every precaution was taken to ensure the giant coffin would be impermeable and protected from Mother Nature's penchant for making things deteriorate. A special concrete "tub" was built onsite with state-of-the-art technology. The car was slathered with preservative and cocooned in a special plastic cloth. A thick concrete slab topped it off before it was covered over.

On June 15, 2007 there was a huge town gathering for the exhumation, anticipating that the car would appear sparkling new and be an instant collector's item. Here are the before and after photos that speak for themselves:

Plymouth Before    Plymouth After
We sure are smart!

*****

In the early 1970s an environmental project endorsed by the US Army Corps of Engineers was launched to deposit about two million used tires, bound into large bundles with steel bands, off the shore of Fort Lauderdale, Fla. The project was touted as a win-win: discarding the tires responsibly while creating an artificial reef to lure more game fish to the area.

Unfortunately Mother Nature's tendency to corrode steel eventually unleashed the tires to wicked waves and wind, stripping off whatever life clung to them. Even worse, they became mobile weapons destroying the fragile natural coral reefs nearby. A massive tire clean-up is now underway.

*****

Of course we all know that vitamins are good for us, right? Vitamin C prevents scurvy. Vitamin D prevents rickets. And the B-complex vitamins,

  • thiamine (B1)
  • riboflavin (B2)
  • niacin (B3)
  • pantothenic acid (B5)
  • pyridoxine (B6)
  • cyanocobalamin (B12)
  • folic acid
  • biotin
do all sorts of good things like help to build protein, form blood cells and produce DNA. We're advised to take a multivitamin pill daily, and we see vitamins fortifying many of the foods we buy, including cereal, bread and more. Then studies in the 1990s indicated that a certain amount of folic acid taken before and during pregnancy also helps prevent birth defects, including spina bifida. So before the decade was out it was added to rice, flours and pastas.

Now a recent study indicates that folic acid added to too many foods may cause colon cancer. Yet again, unforeseen negative consequences may result from something seemingly benign.

**********

Woody Allen's 1973 movie "Sleeper" makes fun of this flip-flopping tendency when his character Miles Monroe wakes up after 200 years of being cryogenically frozen to find out that all that was bad for you is deemed beneficial two centuries later — including smoking, deep fried fatty foods and marbled beef. We can laugh but it's at ourselves. We're incessantly gullible suckers blowing with the intellectual winds of change.

So, what are you doing today that is supposed to be good for you? Eating oily fish high in omega-3 fatty acids perhaps? Going organic? Exercising five times a week? Consuming smaller portions? Restricting saturated fats? Limiting processed foods? Ingesting antioxidants? Avoiding the sun? Trusting in herbal therapies?
Drinking lots of coffee?

Just wait for the next study.

Time will tell...

posted by John Herman 6:47 AM

Saturday September 1, 2007

Switching Gears...

While trying to peer into the future I also find enjoyment looking backwards in time. A particular fascination is juxtaposing the past to the present and future by flipping through old magazines from decades past. Revealing are the ads that reflect the mood of the time as well as general economic conditions. (Unfortunately this ability may well be lost going forward if much of today's print media falls victim to purely online content.)

On a recent treasure hunt, I found in the stacks at a nearby university library a musty bound collection of Scientific American magazines from 1954. Most of the ads were by manufacturers touting their engineering and technical might to attract clients as well as recruit talent. The ads reflect unabashedly the post-WWII booming optimism and fixation on progress — that era's buzzword. (It would be at least a decade before "progress" would become almost a pejorative term in hindsight of the unintended consequences of unbridled technical advancement of that time.)

Here's just a sample of the testosterone-infused ads from those vintage magazines:

Scientific American ad scans (click images to magnify)


Ohio's Cleveland Tool Company "discovered how to shrink motors by floating a screw on a stream of balls," eliminating the need for excess power to overcome friction.

California's Hewlett-Packard Co., then a "World leader in electronic measuring instruments," brags that "advanced electronic test instruments are invaluable in rocketry, nuclear physics and research into interstellar phenomena."

Air Research Manufacturing Co. in Los Angeles answered the U.S. Air Force's call to build a jet engine starter four times as powerful as anything before and only slightly larger than the original.

Haynes Alloys Co. from Kokomo, Indiana produced "alloys for every wear condition shaped to your specifications." All you had to do was send them a blueprint of a part that was prematurely wearing out and they'd solve your problem.

Lycoming Co. in Stratford, Connecticut crows about "Peak performance by any product requires big performance from small parts. Lycoming's skill at producing such custom parts explains why so many leading manufacturers look to Lycoming with its 2 million feet of floor space, and 6,000-plus machine tools ready to serve you."

Ford Instrument Co. in Long Island City, New York boasts that "Taming the monster power of a nuclear reactor requires precision control of all the elements. Ford Instrument is designing controls that seek and hold the optimum power level of the pile and keep the rods so exactly set that the reactor's energy is harnessed safely, securely."


As I continued to flip the pages, Doelcam Corporation from Boston touted micro-precision synchros. Maryland's Bendix Aviation Corp. bragged that Ni-Span diaphragms were heat treated in a vacuum furnace and tukon[!] tested for hardness. California's Kollsman Instrument Corp. instructs that "The old Roman god Janus lives today in servo mechanisms, instruments and controls which take past information and use it to guide the future." Bersworth Chemical Co. in Framingham, Massachusetts for more than a quarter century devoted all their time, talent and energies to the study of chelate chemistry.

And this goes on and on — page after page, ad after ad. What becomes obvious is the U.S. was an energized hotbed of industrial activity.

You just don't see ads like this anymore and I thought for sure I'd never see them in today's magazines. That's why I almost fell off my chair when reading the August 11 issue of Business Week, turning to a full-page ad filled with swagger and bravado and masculine images of meshing gears and heavy-duty gearboxes. The advertiser was Shanthi Gears, "turning the wheels of industries worldwide" with "no compromise, total trust and quality at its best manifestation."

Business Week


Wow! This page would fit quite comfortably inside those old Scientific American magazines.

Then I noticed the one major difference from the old days:
Shanthi Gears is not located in California, Ohio, Maryland, Massachusetts, Indiana, Connecticut or New York. It's manufacturing those gears in Tamilnadu, India. And while it is turning the wheels of industries worldwide, the U.S. has shifted into high gear towards a service economy. The once ubiquitous "Made in U.S.A." is rarely seen today.


Looking backwards in time adds perspective when trying to peer into the future. Questions to ponder include whether shifting to a service economy is wise in the long term, and whether India (and of course China) will one day switch gears to follow that same path.

Time will tell...

posted by John Herman 7:48 AM


BusinessWeek

Postscript by John Herman June 16, 2008 (from BusinessWeek June 23, 2008)

Wednesday August 1, 2007

One Smart Cookie...

Help Wanted

FORTUNE COOKIE WRITER:
PhD in biology required. Sociology BA helpful. Brevity and a belief in the supernatural a plus. Only smart cookies will become fortune(ate) candidates.

The above want ad sounds ridiculous of course. But that is what I envisioned recently after opening a Chinese cookie to find the following fortune:

"Even as the cell is the unit of the organic body, so the family is the unit of society."

A socio-bio-based aphorism in a fortune cookie? Could this have been created by an underemployed graduate of biology or sociology?

Perhaps this is early evidence bolstering the Phrenicea scenario of the future that envisions not enough jobs to employ most of the world's population. As higher education becomes the norm globally, college graduates will have to assume lower scale jobs formerly taken by those with high school diplomas or less.

Already, more and more sales associates, telemarketers, customer service reps, bank tellers, bookkeepers, etc. are bringing to the job the benefit of a four-year degree.

But who would have thought about a "fortune cookie writer"?

With the popularity of Chinese food on the rise, there may be a strong demand for offbeat fortune writers! Fortune cookies are a lot like horoscopes. Our intellectual side tells us it's all not to be believed. Yet, just as many feel compelled to read what the day may bring for their zodiac sign, opening a fortune cookie for its bit of wisdom or prediction can be irresistible. And if the theme just happens to coincide with a life situation, that provides reinforcement to look forward to a next time.

Ever the skeptic, I saved a few choice fortunes through the years that had some relevance to see if their message would ever be realized. So here goes:

"Good fortune is just around the corner."
Unfortunately, I haven't turned this corner yet.
"You will soon gain something you have always wanted."
This might have come true;
but it was probably so trivial I didn't realize it.
"Financial hardship in your life is coming to an end. Enjoy!"
All I know is that I'm still waiting for this end to come.
"Two small jumps are sometimes better than one big leap."
I have no idea why I kept this one, although I'm wondering now if Neil Armstrong took that fortune cookie job.
(What does one do after landing on the moon?)

Even though it's evident I've not had much luck with fortune cookies, I still can't wait to order Chinese again. Will I crack open another smart cookie to find an abstruse, recondite maxim with a scientific theme? Or will I have the misfortune of reading just another silly platitude? I'm really hoping for the former.

Forever puerile, I'll imagine my next postprandial surprise to be the work of some underemployed subatomic particle physicist, shedding an optimistic photon beam on my two left feet so I can finally show off some dance floor prowess:

"You will learn to be like a meson — a strongly interacting boson — a hadron with integral spin!"
I'll certainly save this one. "Dancing with the Stars" look out!
*****

Imagine — physics graduates crafting cookie fortunes. An inane exaggeration? Perhaps, but let us hope that's not how the cookie crumbles for many other fields of study the world over.

Time will tell...

posted by John Herman 7:34 AM

Sunday July 1, 2007

Water on the Brain

Now that the lazy, hazy, crazy days of summer are here again, it's a good time to use the sultry weather as an opportunity to revisit our feverish condition of "water on the brain." Way back in 2001 our "H2Ouch!" page began recommending the following:
"Pretend you were to pay $1/gallon the next time you take a shower or bath, brush your teeth, flush a toilet, wash the dishes, or God forbid — water the lawn! Begin to use less water than the average person. Set an example. Prevent H2Ouch!"

We still believe this is good advice, but perhaps the proposal was and still is naïve. The problem is that there is little incentive to conserve fresh water from the tap — given its ridiculously low price. For example, I recently received my "Annual Water Supply Report" from my local water company and was dismayed at how little fresh water costs. Here's the breakdown:

Quarterly Water Rates — Residential

Consumption (gallons): Charges
  • Up to 8,000: $10.00 minimum
  • 8,001 - 58,000: $0.90 / thousand gallons
  • 58,001 - 100,000: $1.15 / thousand gallons
  • Over 100,000: $1.40 / thousand gallons

A dollar for 1,000 gallons of clean, fresh tap water? That's insane! By comparison, bottled water costs about $1.99 per gallon. Not bad, but that's $1,990 for 1,000 gallons. Why is there such a cost disparity with tap water? How can anyone be motivated to conserve water at these low rates, other than via a guilty conscience? And let's face it, there aren't many turning on their taps laden with guilt. (If the water companies got savvy they'd upmarket their image with exotic brand names, pricing and refillable bottles with fancy labels adding cachet to their product. Imagine having bragging rights to elite-sounding potable water! It's not that silly a suggestion, since that is essentially what Coca-Cola did with Dasani and PepsiCo with Aquafina; both are filtered municipal tap water.)
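The disparity is easy to check with a back-of-the-envelope calculation. Here's a quick sketch, normalizing both prices to dollars per thousand gallons (the tap rate and bottled price are the ones quoted above):

```python
# Compare the quoted residential tap-water rate with bottled water,
# both normalized to dollars per thousand gallons.
TAP_RATE_PER_KGAL = 0.90    # $ per thousand gallons (the 8,001-58,000 tier)
BOTTLED_PER_GALLON = 1.99   # $ per gallon, as quoted

bottled_per_kgal = BOTTLED_PER_GALLON * 1000   # $1,990 per thousand gallons
ratio = bottled_per_kgal / TAP_RATE_PER_KGAL

print(f"Bottled: ${bottled_per_kgal:,.2f} per 1,000 gallons")
print(f"Tap:     ${TAP_RATE_PER_KGAL:.2f} per 1,000 gallons")
print(f"Bottled costs roughly {ratio:,.0f} times more")
```

At that tier, bottled water runs more than two thousand times the price of tap — exactly the disparity that makes conservation such a hard sell.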

Actually, what we really need is a "watershed moment": a trickle-down epiphany to appreciate how finite and precious our water supply is. The first step should be to make users conscious of their water consumption — and that can be accomplished handily by raising the price per gallon and using a more dramatic cost gradient for excessive use. It sounds crazy, but those concerned about conservation should lobby for pricing increases.
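To make the "dramatic cost gradient" idea concrete, here is one hypothetical sketch of a steeper quarterly rate schedule. The tier boundaries mirror the real table above, but the rates themselves are invented purely for illustration:

```python
def quarterly_bill(gallons: int) -> float:
    """Hypothetical quarterly water bill with a deliberately steep
    cost gradient (tier boundaries from the real schedule; rates invented)."""
    bill = 10.00  # flat minimum charge covering the first 8,000 gallons
    tiers = [     # (from, to, $ per thousand gallons)
        (8_000, 58_000, 2.00),
        (58_000, 100_000, 8.00),
        (100_000, float("inf"), 32.00),
    ]
    for lo, hi, rate in tiers:
        if gallons > lo:
            bill += (min(gallons, hi) - lo) / 1000 * rate
    return round(bill, 2)

print(quarterly_bill(20_000))    # frugal household: $34.00
print(quarterly_bill(120_000))   # heavy lawn-waterer: $1,086.00
```

The point of the jump from $2 to $32 per thousand gallons is that basic household use stays affordable while profligate use becomes genuinely painful — pricing as a conservation signal rather than a revenue afterthought.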

Another way to raise awareness might be to move our water meters out from their usual obscure locations into full view in kitchens and bathrooms — fitted with big, red digital read-outs displaying gallons used in real time. Education on where our water comes from and how it's treated, stored, delivered and renewed would also serve to engender an appreciation of what is the major constituent of all living things.

There's an old saying attributed to spendthrifts that says they "spend money like it's water." Maybe we should instill among ourselves a new idiom-cum-mantra to "spend water like it's money." We all tend to waste water and take its current abundance for granted.

Wouldn't it be prudent to pay more today to change our wasteful habits, while adopting a mindset focused on conservation to build "aqua equity" for future generations?

Time will tell...

posted by John Herman 7:56 AM

Friday June 1, 2007

The Evolution of Status

After many years of blood and sweat, I was fortunate enough to have the opportunity to purchase a house constructed from the ground up. It was an exciting experience watching barren land evolve into livable structures; the genesis of yet another suburban development supplanting what was once a dairy farm.

As the homes were completed and occupied in almost perfect sequential order, the display of owner status in various degrees became evident and resembled a stadium crowd performing "the wave" at a sporting event.

After the wave subsided, what followed was a hushed assessment of each owner's financial means and personal taste. Judgment was based on observations such as: Who had splurged on the fanciest window coverings? Who installed the most numerous and most expensive exterior lighting fixtures to replace the builder cheapos? Who laid the lushest sod lawn? Who erected the largest or most original custom-made curbside mailbox? Who was able to watch broadcast TV without a fuzzy picture, as witnessed via an antenna on the roof? Who were among the first to install automatic irrigation systems? Who had resplendent landscape designs immediately realized into manicured mini-arboretums? Who had their monotone wheat-colored walls professionally refinished with faux finishes, murals and other elaborate wall coverings?
And on and on...

About eight months later cable TV finally arrived, followed by an incredible aerial flip-flop. Those who'd brandished clear reception via rooftop antennas quickly removed their one-time status symbols, since it was now embarrassing to be perceived as one not paying the premium.

As the years passed it became more difficult to recognize changes that might be discerned as enhanced cachet, but not for long. Without warning, a new wave swept through swelling heads ever higher — leaving in its wake huge curbside dumpsters signaling interior renovations or extensions. Paving stone became the rage too, and out went plebeian concrete walks and blacktop driveways. And not long after, expensive foreign and sports cars graced the spiffy new driveways.
And on and on...

Finally, after two decades and with all visible forms of home status exhausted, the ultimate bragging right today is to erect a "For Sale" sign, host a garage sale and move out to a carefree leisure village where it's warm and sunny all year 'round.

*****

So you have to wonder — where does this pettiness and vainglory stem from? It's apparently embedded in our designer genes, going back thousands of years. Anthropologists actually consider this to be advanced behavior — when compared to our more ape-like ancestors that is.

New York University's Randall White explains:
"One of the things that we know from studying modern humans is that personal adornment and the symbolic communication of a social identity is involved in maintaining differences within a society. By studying artifacts we imagine that what was going on 40,000 years ago was the first time in human evolution that we have the internal subdivision of human societies into different categories of social persons."

And after thousands of years we're still at it. We can make just about anything into a symbol of status. But one person's object of distinction might be another's folly. So we have to be among like-minded people to make an impression. And you can't even take it with you!

You have to ask then, after all these millennia, isn't it about time we evolved beyond this small-minded behavior? [Nah!]

Time will tell...

posted by John Herman 7:17 AM

Tuesday May 1, 2007

Web 2.0 — New & Improved?

Suddenly the term "Web 2.0" seems to be everywhere in popular media. From a layman's perspective, it's defined as a more interactive web beyond passive browsing that enables social networking, and content creation and collaboration. Applications commonly associated with Web 2.0 include Wikipedia, MySpace, Second Life, Facebook, Gold Rush, Digg, Twitter, Yahoo! Answers and YouTube.

Version numbers like "2.0" are adopted from the software industry, from what's commonly referred to as the "development life cycle." The confounding terminology is being unabashedly co-opted by advertisers, making it part of our vernacular. Although their target market may not truly grasp the pretentious technical jargon being exploited, what is undoubtedly implied is "new and improved." But is it really?

To attempt to answer that question, let's, just for fun, try to retroactively apply (loosely) "something-point-something" version numbers to several ground-breaking advances from the past — innovations unleashed upon the masses at a time before software and its perpetual upgrades began controlling yet-to-be-developed computer hardware.

For example, "Radio 1.0" enabled millions of regular folks to acquire console-sized, beautifully crafted, wood-encased receivers to experience entertainment programming geared towards a wide audience. Breaking news would propagate across the land almost instantaneously. A new sense of mass identity and community was established using technology. For many, the radio became a focal point in the home — a place to gather each night.

So what innovation could we ascribe to Radio 2.0? The car radio perhaps. Radio 3.0? That might be the tiny, tinny, plastic and portable transistor radio: the pocket marvel that enabled millions of baby boomer teens to revel in their music — rock and roll — defining a generation while driving parents crazy. For many, the transistor radio became a focal point of the self — a personal gadget to hide away with. Taking radio's progression further:
Radio 4.0 - FM
Radio 4.5 - FM stereo
Radio 5.0 - XM/Sirius satellite
Radio 6.0 - ?

Now let's try to apply version numbers to pre-recorded music:
1.0 - phonograph - 78rpm
1.2 - phonograph - 45rpm
1.5 - phonograph - 33 1/3rpm "Long Playing"
2.0 - hi-fi and stereo
3.0 - 8-track tape
3.5 - cassette tape
4.0 - CDs
5.0 - iPod
6.0 - ?

How about culture-transforming point-to-point communication:
1.0 - telegraph
2.0 - telephone rotary
2.5 - telephone pushbutton (touchtone)
3.0 - cell phone
4.0 - BlackBerry
5.0 - ?

And finally the evolution of locomotion:
0.9 - four-legged ambulation
1.0 - two-legged ambulation
2.0 - tamed horse
3.0 - covered wagon
4.0 - trains
5.0 - horseless carriage
5.5 - modern automobile
6.0 - prop(eller) planes
6.5 - jets
7.0 - transporter?

They say rarely is a sequel as good as or better than the original. Nevertheless, in each case above the successive "upgrades" were indeed improvements. I don't think any of us would want to go back to Locomotion Version 3.0 — the covered wagon. Or Version 1.0 for point-to-point communication. Or even Version 1.2 for pre-recorded music.

However, there are downsides to progress, as would be expected. The evolution of point-to-point communication has now diminished the distinction between work and leisure. Progress in locomotion has brought us pollution, (potential) global warming, wide swathes of concrete, airport and traffic jams — and too many fatalities. And now the iPod has the potential to disconnect us from reality as we go about our business with earbuds stuck in our ears.

Today the Internet's web is a treasure trove of easily accessed information and knowledge. It would be a shame for it to parallel TV's evolution towards mediocrity and blatant commercial profit. Indeed, considering the impact of the web so far, and that we're only approaching version "2.0," it's hard to imagine what a Web 5.0 might bring.

Which leads us to the inevitable question: will we one day wish for the good ol' days of Web 1.0?

Time will tell...

posted by John Herman 12:23 AM


Click here for unusual sightings of "2.0" terminology in the media.

Postscripts by John Herman — May 16, May 29, and July 18, 2007

Sunday April 1, 2007

Wanted: Cooler Heads!

Thanks to Al Gore, Global Warming is again a "hot topic." Given the renewed popular debate, we thought it appropriate to bring attention to an email posted on our Q&A page exactly four years ago to highlight how little attitudes have changed.

Hopefully the situation doesn't get "too hot to handle" as we squander precious time arguing back and forth.

John S. H. wrote:
One of Phrenicea's Q&A responses made the declarative statement, "that global warming will become a worldwide concern beyond just talk." How is Phrenicea going to respond to those thousands of celebrated and famous scientists that disagree with the global warming theory? And also to me!

Mother Nature since the beginning of time has been creating havoc with the earth's environment with volcanic eruptions, which spread dust and a multitude of gasses into our atmosphere, jungle rot, forest fires, as do our treasured wet lands that emit all sorts of gasses and strange odors. Mother Nature however, eventually cleans up all that mess with a magic formula that neutralizes all of the above natural phenomena; otherwise our civilization would not exist.

Incidentally, where are all the proponents of global warming whenever there is a never-ending deep freeze as it is occurring this winter season? But as soon as there is a 3-day heat wave during the summer, our liberal press come out and spew their unproven agenda all summer long. Phrenicea is controversial, but on the subject of global warming it is definitely leaning towards the liberal camp.

Phrenicea, I believe, did not give this subject sufficient thought because global warming is still an unproven theory and too soon to become a scientific fact. Phrenicea has a lot to learn, revise, and change as we all do as time goes on.

Phrenicea replied:
First we want to make it clear that we are responding to your email, and not to the "thousands of celebrated and famous scientists" wherever they are.

Next we want to define "global warming." It is a fact, at least since we're able to measure, that the average worldwide temperature is increasing. That is "worldwide" — not your easy chair, your state or even your hemisphere. It is a fact that the last several years have been the warmest on record. The debatable issue is whether this warming is ultimately deleterious to our environment and our way of life.

History is replete with scientific controversy. Very few believed Copernicus when he proposed the sun and not the earth was the center of the universe [solar system]. Again, very few believed Galileo when he offered proof with his telescope. Few could conceive of sailing around the globe until Magellan's expedition returned from its circumnavigation. Few believed Darwin's theory of evolution — that such complex life forms could evolve. Eventually, with further study and tests, the majority of learned persons were convinced of these radical propositions.

This may or may not turn out to be the case with the global warming controversy. But given our relative ignorance now, wouldn't it be wise — for our children's sake — to err on the side of caution?

You mention Mother Nature's "magic formula." That magic formula is nothing more than time, large chunks of it. Nature's tendency is to establish a state of equilibrium after what could be described by us as catastrophe or a "wild card" event. But the quantity of time required is enormous — thousands and millions of years. To put our time here on earth in perspective: picture a tall skyscraper to represent the age of the earth. We humans have been around for what would only be a thin coat of paint on the roof. And it's only the last couple of centuries that we increased the rate of change on this earth beyond what was natural before.

Our egocentric thinking often masks the reality that nature does not care about us, and is not there to protect or ensure our survival. She couldn't care less whether we live or die. She is not liberal, conservative or cognizant of our vainglory and pettiness. The stark realization then is that our survival as a species is now up to us — and that our penchant to change the earth is outstripping nature's capability to repair the damage we inflict. Worrisome too is that the "we" will soon expand to include the developing nations as they strive to catch up with the U.S., Japan, Russia, Canada, Australia and Europe.

*****

That was our reply four years ago.

Today we propose to cool down our hot (sorry!) heads and approach this potential problem from the bottom up (green grassroots?), instead of waiting for or depending on government edicts. It's time to act more responsibly as individual consumers. Regardless of what side of the debate you're aligned with, there's no excuse for being so wasteful with our precious natural resources.

Grab a copy of Al Gore's DVD — if for no other reason than to read the inside jacket suggesting "ten things to do" to mitigate your own consumption. Even if the theory of catastrophe is one day proved incorrect, there'd be no harm done if everyone were less profligate and more frugal going forward.

In conclusion...

If every energy spendthrift of modern society performed a one-eighty lifestyle change by adopting a conservation mindset, the synergy of "the power of one" and "strength in numbers" would likely reduce consumption and demand for energy sufficiently to render the global warming argument moot.

Our children and grandchildren would be grateful.

Time will tell...

posted by John Herman 12:27 AM


TIME Quote

Postscript by John Herman April 18, 2007 (from TIME Magazine April 23, 2007)

Thursday March 1, 2007

Brains and neurons and computers! Oh, my!

The premise of the Phrenicea scenario of the future is that we give up trying to copy the human brain utilizing computer technology and artificial intelligence, and instead appreciate its amazing complexity by immuring our brains upon death and keeping them alive within a "braincomb" to function and assist those still living their ambulatory lives.

A while back I emailed Dr. Terrence Deacon, professor of Biological Anthropology and Neuroscience at the University of California-Berkeley. He's an honest-to-goodness, genuine brain scientist — as opposed to certain computer scientists who are brain-expert wannabes, typically proselytizing their tenuous computer-brain analogies and artificial intelligence predictions in the media. (We at Phrenicea like to call them "chipheads.")

I asked Prof. Deacon for his views on computer technology and artificial intelligence, and even the feasibility of keeping brains alive a la the Phrenicea scenario. I also asked, "Given that 'brains are not designed the way we would design any machine' [a direct quote by Dr. Deacon], do you agree or disagree with my major assumption that we will not (within the next 100 years) be able to mimic the brain's function utilizing non-biological technology?"

Given Dr. Deacon's lofty stature and busy schedule, I was pleasantly surprised and appreciative to receive his prompt and cogent reply as follows:

1. Brains are quite unlike computers. The closest thing to computation in the brain is probably the performance of highly over-learned rapid ballistic movements and systems that regulate visceral systems. And even these are in effect the production of "virtual computations" — not true computations but simulations of very simple computation by stochastic approximation. Most of cognition is much more analogous to chaotic attractor dynamics and evolutionary processes.
[My interpretation: Brains are not like computers. (Yes!) Learning physical movements to the point where they become "automatic" (becoming proficient in a sport, musical instrument or even driving a car) might cause the brain to act as if it were computing from our perspective — and not an exact form of computing at that, more like a controlled anarchy of processes.]

2. Despite the incredible network size of brains and the additional fact that even the subcellular level of intraneuronal information processing rivals current "neural net" simulations, neither complexity nor unattainable dynamical organization offer an unsurpassable threshold.
[My interpretation: We'll eventually unravel the mysteries of how the brain works.]

3. Once we get past our fascination with the brain-as-computer dead-end metaphor it should not be impossible to begin to build (or more likely grow) devices that utilize this kind of information-generation logic (notice I didn't say "information processing") and at least some of the adaptive and agentive features of brains will be possible to achieve (and I don't mean simulate) this way.
[My interpretation: The brain is not analogous to, and does not work like a computer. However, we may be able to one day grow biological structures that mimic the brain in some limited ways.]

4. Neurons don't live forever and are subject to spontaneous functional degradation even in perfect metabolic support conditions.
[My interpretation: The envisioned Phrenicea "braincomb" will need more research and development to sustain human brains forever. Back to work guys and gals!]

5. Also, contra sci-fi stories, the idiosyncrasy of representational encoding in different brains and the incompatibility of cognition and computation processes will make transfer of a person's memories, thoughts, personality traits, and experiences to different "media" (e.g. brain to machine) impossible.
[My interpretation: Our knowledge, memories, etc. will never be transferred to silicon or any other inorganic computer-memory substrate.]

Terry

*****

Prof. Deacon in a very short space provided authoritative views on computer technology and artificial intelligence vis-à-vis the human brain. He is pessimistic about our ever mimicking the brain's function utilizing non-biological technology. He even addressed the feasibility of keeping brains alive a la the Phrenicea scenario.

You may or may not agree with all of his assessments (or my interpretations!). Regardless, this perhaps is the most exciting field of study today — and probably will be for most of the 21st century.

Time will tell...

posted by John Herman 4:54 AM

Thursday February 1, 2007

Armchair TIME-traveler...

For years I've subscribed to TIME, long ago touted as "The Weekly Newsmagazine." It, along with the other mainstays of the genre, Newsweek and US News & World Report, usually arrives by mail on Monday or Tuesday to competently report and analyze in almost "Monday morning quarterback fashion" the prior week's events.

Then suddenly the January 15 issue of TIME arrived on Friday the 5th. Was this a case of TIME-travel or had TIME's publishers found a way to supercharge snail mail?

Believing neither, and curious as to how TIME beat the others, I began flipping its pages for some clue. Sure enough, on page six was my answer. The managing editor explained, "In fact, it's the first copy of TIME magazine to go on sale on Friday in more than 50 years. We've moved our publication schedule because the news environment has shifted..." He then boasted, "The traditional newsmagazine was retrospective, looking back at what happened the previous week. But today's TIME is much more forward-looking..."

Conversely, on TIME.com it's easy to travel backwards in TIME by viewing past issues. It's really a neat feature. The covers reflect the most important events and concerns of the day and are well executed with illustration or photography. And it turns out those old-TIMEer editors were pretty "forward-looking" too, though unintentionally. While moseying about I found covers from the 1960s asking "WHAT IF WE JUST PULL OUT?" [of a then-unpopular war] and "WHAT'S WRONG WITH U.S. MEDICINE?". Others featured: "WATER: Worldwide Use and Misuse," "THE DEMOCRATS REGROUP," "TODAY'S TEENAGERS" and "THE COMPUTER IN SOCIETY."
All could just as well be cover stories from today!

Ironically, while losing myself in TIME.com I stumbled upon the April 12, 1968 issue proclaiming, "We try to channel the flow of events into a coherent pattern of stories, to emphasize the important details and, whenever possible, to provide perspective." I guess that's the "traditional" retrospective approach that's being abandoned. Gee, how TIMEs change!

Extemporaneously, I then pictured my newly arrived January 15 issue miraculously intermixed amongst those from the '60s — and wondered what it would've been like to read it way back then. Would anything be relevant? Intelligible? Would any of the advertisements be familiar?

Entranced into an armchair TIME-traveler steeped in a '60s mindset, I randomly opened to page 52, featuring "Welcome to Wi-Fi-Ville." Imagine how puzzled I would be reading about: free wireless Internet, wireless-fidelity (wi-fi) network, the Web, sunbathers Web surfing, municipal wi-fi, broadband prices, high-speed access to rural areas stuck with dial-up, VOIP (voice-over-Internet protocol), telcos, EarthLink, DirecTV, DSL, Yahoo, Google, surfing porn and downloading!

Now, if only eBay existed for me to sell a magazine from the future!

*****

Oh well, back to reality. I guess there can be a lot of changes over the course of 40 years — including new strategies on how to publish a weekly newsmagazine. But if TIME really wants to be "forward-looking," they could deliver (implant?) by Friday an issue from 2045. I'm curious now as to whether it would be totally incomprehensible — plus I'd get to sell it on eBay.

TIME will tell...

P.S.
A TIME Trivia Question:
What issue was the last to proclaim on its cover, "The Weekly Newsmagazine"?
Click here for the answer.

posted by John Herman 5:29 AM

Monday January 1, 2007

Y2K + Seven ? Yikes !

It's hard to believe that the infamous Y2K scare was seven years ago already. Time flies as "they" say — and the past twelve months surely did.

Along with the passing of 2006 comes inevitable reflection on what was and wasn't accomplished — as well as looking ahead to 2007 with hopeful optimism. Another consequence is a bit of laziness from too much of too much.

So it is with this lethargic spirit that we revisit the "Twelve Blogs of 2006" with premature nostalgia — and perhaps feigned interest as to whether they're still relevant.

December's blog posed many questions without answers:

Does our preoccupation with spiritual perpetuity stem from our species' ignorance of what death was centuries ago, when the living became lifeless? Was it our forebears' utter confusion about death that led to speculation on where the animation went — like into the heavens?

What is it about humans that makes most of us think we're superior to all other life on earth? Is it an evolutionary trait that helped us survive in the wild — empowering us to conquer not only natural prey but predators as well (and eventually wrecking the balance of nature in the process)? And since this perception is almost universal, is it genetically encoded rather than cultural? (Read more)

• In November we intended to address unintended consequences resulting from continued advancement of scientific knowledge:

Nevertheless, as our quest for scientific knowledge marches on (making our lives ever more complex) — particularly related to biology — we are going to have to face the fact that we are not all created equal.

Just as our disruption of the macro-environment has led to pollution, species extinction and global warming — analogous unintended consequences await our intrusion into the ancient inner workings of cells... (Read more)

October's blog questioned Dow Chemical's weird advertising campaign:

These days it seems the importance of marketing is elevated beyond the product. In this example it has taken on a life of its own, surpassing any practical objective. It seems to function mainly to mold subjective feelings.

Is it because we're beyond being impressed with lubricants, epoxies, and analgesics? How else should a chemical company advertise in an age where the behind-the-scenes, nuts-and-bolts of society are deemed boring — or worse, beyond general comprehension? (Read more)

September's entry was up-front about rear-end vanity:

Have we come that far in auto technology that it's laughable now to think that Ford would deem the now mundane TC [Traction Control] worthy of rear-end vanity? But Ford is not alone in silly trunk bunk. Through the years I've studied many a car's derrière and have seen banality forged in chrome... (Read more)

• In August we ran from the "Attack of the RFIDs!":

The scary part is that you will end up being a mobile device beaming all sorts of data to who knows who — through barriers and from as far away as 700 feet. Imagine if your charge cards, clothing and shoes all were transmitting data (even the age and color of your underwear)! Inquiring types could identify or profile you and track you everywhere... (Read more)

• In July it was too hot to be serious so we played some games:

When the Phrenicea scenario of the future was first presented on this website six years ago, its pages stated that all the world's knowledge was immediately available via engagement with Phrenicea, and that boredom had ensued without the challenge to learn the traditional (aka hard) way with study. Could we already be approaching this point with access to the amazing capabilities of today's search engines? (Read more)

June's blog braved the waves of offshoring crashing upon these shores:

Will there be new faddish, jargon-laced management-speak still to come whose consequence will reduce jobs and salaries even more — and further level the highest standards of living with the poorest on earth? How many more employees with years of dedicated service, along with new job seekers with expensive academic degrees, will yet find that their career path is ultimately a horizontal plank? (Read more)

• In May we tackled the vices of phone answering devices:

Nowadays, many households have both parents working — and with several jobs, so it's time to stop apologizing for not being where others might think we belong for their convenience. Recorded messages to convey an apology or to mollify a caller with humor or niceties are vestigial behavior from the days when much more time was spent at home, and leisure was truly that — not catching up on errands, attending classes, pursuing entrepreneurial endeavors, or sifting through email... (Read more)

April's blog celebrated one student's epiphany:

Many of today's high school and college students often wonder (vociferously!) why they need to memorize boring equations, formulae and other seemingly trivial or useless information. What many times is not emphasized by their teachers is the origin and significance of man-made expressions of what is essentially describing the workings of nature. They're not taught that many of these discoveries required lifetimes of effort — often by iconoclasts, eccentrics, heretics and recluses...(Read more)

March's entry bemoaned a disgraced General, as in Motors:

GM is struggling with another type of baggage — the consequences of short-term strategies and practices that wrecked its spectrum of brands (Chevrolet to Pontiac to Oldsmobile to Buick to Cadillac), and the dismal reputation it earned so well in the '60s, '70s and '80s for producing cars that frankly were just plain inferior... (Read more)

• In February we suffered the first wave of the "Baby" Boomer generation turning 60 years of age. (Ugh!):

Now turning 60, this famously spoiled (sans "baby"?) boomer demographic anomaly will be a force to be reckoned with. We'll be retired, but not retiring gray-haired activists with plenty of time on our hands to leverage our generational clout to lobby for our interests. By sheer demographic heft we'll again effect change as in years past. We'll read cover stories at the 70- and 80-year milestones chronicling our deterioration and the burden our cohort will be inflicting upon younger generations... (Read more)

January's blog lamented a disturbing trend among chiropractors tempted with franchising to boost profits:

Chiropractors traditionally have spent lifetimes developing successful practices of respectable size, which reflect their character and values. They're now tempted with visions of big profits by wooing masses of clients less sophisticated and discerning — and more receptive to being dazzled with faux technologies and procedures. Secondary is keeping existing patients who may be sharp enough to see through the hype and perceptive enough to sense the tawdry goal of profit...(Read more)

You may or may not agree, but after reviewing the year's blogs it appears most are still relevant as we roll into the New Year. And somehow after recalling the energy that instigated the blogs to begin with, we don't feel quite as lethargic.

Maybe that's the real benefit of looking backward, reminiscing and singing "Auld Lang Syne" — not for nostalgia per se, but to recharge our batteries and begin another year with renewed energy and enthusiasm.

Time will tell...

Happy New Year!

posted by John Herman 9:58 AM

Friday December 1, 2006

Questions, ad infinitum...

Time. Newsweek. Wired. U.S. News & World Report. The Humanist. (Did I miss any?)

All of these publications recently ran stories dealing with various themes associated with consciousness, the soul, life after death and the belief in God.

The debate over these topics in the media can be vociferous and has gone on for decades. I can vividly recall Time magazine's then-shocking April 1966 cover asking "Is God Dead?" and the ensuing controversy.

The least this media dialog is good for is to get us to think beyond our day-to-day realities and to ask ourselves important questions such as:

Does God exist? Were we divinely created? Or are we merely another evolved life form on earth that happens to be conscious of our own existence and able to contemplate the past, present and future? As U.S. News & World Report recently pondered, "If consciousness exists only to respond more effectively to information in service of life, then [perhaps] we are nothing more than Darwinian survival machines." Yet so many do believe in a God or creator. Given that we have no observable proof, why is that?

What is it like after death? Is there an afterlife? Or is death like sleep but without dreaming — totally unaware? It's depressing, and perhaps that's why the widespread belief in an afterlife came about — as a crutch against the reality of impermanence for those who cannot bear to face the possibility of being totally extinguished upon death. Unfortunately the living will probably never experience death and come back to tell about it. And even if they eventually could — would it be like waking someone up from a deep sleep who's unable to describe it?

Does our preoccupation with spiritual perpetuity stem from our species' ignorance of what death was centuries ago, when the living became lifeless? Was it our forebears' utter confusion about death that led to speculation on where the animation went — like into the heavens? (And this preoccupation with eternal existence can make humans do astonishing things. Do you think the Japanese kamikaze pilots would have committed aerial suicide if they believed this life was all there is?)

And still more questions abound...

What is it about humans that makes most of us think we're superior to all other life on earth? Is it an evolutionary trait that helped us survive in the wild — empowering us to conquer not only natural prey but predators as well (and eventually wrecking the balance of nature in the process)? And since this perception is almost universal, is it genetically ingrained and not cultural?

Many might argue that our very consciousness of ourselves is what makes us superior. But as science unravels the mechanisms within the brain, it might just corroborate philosopher Daniel Dennett's position that "consciousness is about fame in the brain." So, are we all just legends in our own minds?

Does technology perpetuate our attitudes of superiority? Just as hydraulics and electronics can empower a relatively tiny person to operate machines many times their size — an airliner for example — technology can at the same time inflate our perception of ourselves. But is this genuine prowess? In the grand scheme of things, are we still minuscule entities with just an overgrown sense of importance?

Finally, here's a really important question. Are state-sponsored lotteries fair?
What?!

A local newspaper featured a story about a lottery winner with a spate of bad luck who attributed his win to his recently deceased wife "looking down upon him." Imagine if this were fact. Wouldn't that be unfair to all the other lottery players? Imagine if every lottery winner was the recipient of some dead relative's influence. Should those whose deceased relatives opposed gambling in life bother to enter?

This thinking is more common than not. How many times have you heard someone say that the weather miraculously worked out for their special occasion — as if to say that because of their occasion, the weather was made to accommodate them by some entity or someone watching over them? This raises the question: Why are humans so egocentric?

*****

There are so many unanswered questions — too many in fact to even debate the unknowns. But we can still continue to ask them while acknowledging our ignorance, and perhaps in the process become more tolerant of those with incongruent views.

Time will tell...

posted by John Herman 5:43 AM

Sunday October 1, 2006

How Now DOW?

Isn't marketing amazing? It can come at you subtly or be right in your face. It can turn you off or lure you into its lair.

Or if you're like me, it can trigger a small effort investigating a company's marketing history.

What motivated this month's Two-Cents weblog is an unusual advertising campaign by Dow Chemical.

For what seemed like the entire summer, I kept opening the front covers of Newsweek, Business Week, Scientific American and several others to find a prominent ad spanning two pages. Impossible to miss on the left-hand page was an in-your-face facial portrait overlaid with the large letters "HU," followed by the word "HUMAN" and the unusual notation "7E+09."

For weeks I would glance at the face, the letters, the notation — and while not really focusing on the right-hand page, subliminally recognize the company Dow by the red diamond logo. Then I would just turn the page.

After perhaps the tenth iteration of this scenario I finally read the accompanying copy. What hit me was that this was an advertisement for a chemical company, yet there was no mention of a product or service. Instead there was flowery prose like "Bonds are formed between aspirations and commitments..." and "...energy released from reactions fuels a boundless spirit..." What? This is a company that manufactures chemicals!

Curious now how this advertising approach differed from years past, I searched through some musty boxes of old Scientific American magazines. (Crazy, yes?) Sure enough I was able to find a few Dow advertisements.

In the booming post-war 1950s and 1960s, the function of marketing was simple and direct — to sell.

  • In 1954 Dow bragged about their permanent silicone lubricant. Imagine that it gave "consistent performance over a wide temperature span" and "lasted 30 times longer" and was "semi-organic and inherently stable." (Today that might make great copy for a prophylactic!)
  • In 1959 Dow boasted, "Today, the imposing list of high quality pharmaceutical chemicals supplied by Dow in abundance includes bromine, medicinal salicylates, epsom salt, chloroform, analgesic drugs and elemental iodine..." (Great, especially if you got a bad cut!)
  • And in 1962 Dow broached the sticky subject of resins, "Consider first, an extremely pure epoxy resin with a viscosity of 4,000 - 6,400 cps, and color 1 max. This is essentially the pure diglycidyl ether of bisphenol..." (Yes, now that's a real macho chemical company ad!)

These days it seems the importance of marketing is elevated beyond the product. In this example it has taken on a life of its own, surpassing any practical objective. It seems to function mainly to mold subjective feelings.

Why is that?

Is it because we're beyond being impressed with lubricants, epoxies, and analgesics? How else should a chemical company advertise in an age where the behind-the-scenes, nuts-and-bolts of society are deemed boring — or worse, beyond general comprehension?

Is it the negative connotations associated with the word "chemical," given past incidents of pollution? Is it the unexpected consequences and deleterious effects that have been associated with man-made concoctions? Or is it that we are so detached nowadays from the "brick and mortar" of what keeps the complex infrastructure running?

*****

So what do "HU" and "7E+09" mean in the Dow ad? For that I had to visit the Dow website and read the "Around Dow" company newsletter. Sure enough, my observations were confirmed regarding the absence of product in order to concentrate on building Dow's reputation.

I can just imagine the Dow executives drinking up their marketing firm's mumbo-jumbo proposal to adopt chemistry's periodic table as the basis for the "Human Element." And that the contrived atomic weight of the human population, 7E+09 (scientific notation for seven billion), would be symbolic of Dow's commitment to humanity's well-being.

But is it smart advertising when it's incomprehensible to the casual observer? Or even to those intrigued enough to try to figure it all out? Should one have to search a company's website in order to garner the intent of their ad campaign?

Worse still, will there come a time when most advertising supplants the mundane indignity of pushing product with haughty subliminal messages, propagated by a self-indulgent marketing profession bent on impressing peers while attempting to make us feel good about them?

Time will tell...

posted by John Herman 8:53 AM

Friday September 1, 2006

Stoplight HindSight

While stopped at a traffic light behind what must have been an old Ford Windstar minivan (even the current models look old), I snickered at the chrome appliqué announcing in bold letters that it was equipped with "Traction Control." Wow, imagine that!

Have we come that far in auto technology that it's laughable now to think that Ford would deem the now mundane TC worthy of rear-end vanity? But Ford is not alone in silly trunk bunk. Through the years I've studied many a car's derrière and have seen banality forged in chrome about:

  • engines — V-6, V-8, Hemi, Rotary, Turbo, Fuel Injection, Tri-Carb, Quadrajet, OHC
  • transmissions — Powerglide, Hydramatic, Automatic, 5-Speed
  • drivetrains — FWD, AWD, 4wd, 4 Matic, QuadraDrive, ABS
Then there are meaningless tags like GT, Touring, Limited, Unlimited and the dated Deluxe, Super and Custom.

In the good ol' days, most vehicles were simply named and branded eponymously (Chrysler, Ford, Toyota (Toyoda), Studebaker, Olds(mobile), Cord, Ghia, Dodge, Chevrolet, Buick, Mercedes-Benz, Hudson, Duesenberg, Tucker, Kaiser-Fraser, etc.). Specific models were typically distinguished with letters or numbers (Model T, Model 55, Series D, etc.).

Eventually, with the proliferation of models within brands, cars were named after:

  • animals — Impala, Mustang, Falcon, Tiburon, Stag, Bronco, Ram, Cobra, Barracuda, Lark, Rabbit, Jaguar, Colt, Eagle, Cougar, Stingray, Charger, Hornet, Beetle, Pinto, Hawk
  • gods and mythology — Mercury, Dragon, Titan, Fury, Demon
  • wind — Zephyr, Scirocco, Tempest
  • geographic places — Monte Carlo, Ventura, Sedona, Eldorado, Tucson, Lucerne, Bonneville, Plymouth, Capri, Windsor, Seville, LeMans, Manhattan, Monterey, Belvedere, Continental, Fleetwood, Monaco, Calais, Riviera, Biscayne, Park Avenue, Malibu, Bel Air, Catalina
  • macho types and legends — Cadillac, Chieftain, DeSoto, Ranger, Matador, Rebel, Lancer, Commodore, Valiant, Champion, Maverick, Challenger
  • royalty — Ambassador, Regal, Royal, Tudor, Signet, President, Victoria, Crown, Imperial, Coronet
  • weapons — Cutlass, Dart, Torpedo, Javelin, Armada, Excalibur, Arrow, Laser, Magnum
As brands and models continued to proliferate — and necessity being the mother of invention — newly coined words would, with clever marketing, define the vehicle's image for which they were named (Corvette, Galaxie, Electra, Chevelle, Polara, Camaro, Futura, Altima, Celica, Toronado, Jetta, Camry, Forenza, Corvair, Invicta, Sportage, Mystere, Impreza, Boxster, Acura, Lexus, etc.).

Mercedes-Benz resisted this appellation temptation and continued with letters and numbers to distinguish their models. In the mid-1950s, its SL two-seater sports car debuted, the letters designating "sport light." A few years later came the SEL, denoting an S-Class car with fuel injection (Einspritzung) and a long wheelbase. (And just to add to the confusion, the "S" in SEL was not the same as the "S" in SL.)

American cars toyed with the idea in the 1960s with the Ford XL and LTD, Chevy SS, Javelin SST and AMX, Dodge GTS, Plymouth GTX, Barracuda S, Cougar XR7 and Pontiac GTO.

As Mercedes' status and prestige spread worldwide, other manufacturers dove into the alphabet soup. (In 1996 Acura incredibly renamed its flagship Legend to RL, discarding a decade's worth of valuable brand equity.)

Now it's all the rage to name vehicles with meaningless (guess the brands) letter combinations like DTS, CTS, STS, XLR, SRX, ESV, EXT, TL, TSX, RSX, MDX, FCX, FX, QX, LS, ES, RX, IS, GS, SC, LX, LT, MX, RX, CX, CLK, CLS, CRX, SLR, SLK, SVT, GL, HHR, NSX, XC, XJ, XK, TT, C, A, E, G, Q, M, G, H, R, S... Zzzzzzzzzzzz. (Here's the latest: For 2007, Lincoln's year-old Zephyr model will be known instead as MKZ.)
[And a note to manufacturers: Acceptable letter combinations are going fast. Soon only PU and BO will be left!]

Yet, even more prestigious today is to be able to brandish on a car's hindquarters "Hybrid" or a chrome "H" indicating "this car is electric, gets great mileage and averts global warming."

But we'll just have to wait a decade or so to find out — when stopped at some traffic light in the future staring at these no-longer-shiny chrome badges of status — whether today's state-of-the-art green technology will have become ubiquitous and familiar enough to elicit a snicker.

Time will tell...

posted by John Herman 7:56 AM

Tuesday August 1, 2006

Attack of the RFIDs!

Watch out for RFIDs! Although pronounced almost like triffid or aphid, they're neither sci-fi creatures nor common insects. Yet, you may soon be infested with them. Perhaps you have one on your person already, like a Mobil Speedpass, or in your car for electronic toll collection.

RFID is an acronym for Radio Frequency Identification. RFID tags are tiny microchips with antennas that transmit data to receivers: account numbers, physical location, product information, price, color, size, purchase date, etc. (To be technically correct, an RFID tag usually transmits just an identifier that links to a record in a database on some computer, where the information it reports, such as an account number, is actually stored.)
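The identifier-plus-database model in that parenthetical can be sketched in a few lines of Python. This is a hypothetical illustration only: the tag IDs, field names and records below are invented for the example.

```python
# Hypothetical sketch of the RFID model described above: the tag itself
# transmits only a short identifier; the meaningful data lives in a
# back-end database keyed by that identifier.
tag_database = {
    "04:A2:19:7F": {"kind": "payment", "account": "PayPass-1234"},
    "04:B7:02:1C": {"kind": "product", "item": "paperback", "price": 7.99},
}

def read_tag(tag_id):
    """Simulate a reader resolving a transmitted tag ID to its stored record."""
    return tag_database.get(tag_id, {})  # unknown tags resolve to nothing

record = read_tag("04:A2:19:7F")  # the reader now "knows" the linked account
```

The privacy point follows directly from this design: whoever controls the database, not the tag, decides what a captured identifier reveals.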

Some of the larger RFIDs have batteries while the tiniest don't — they get their power from the radio transmitters asking for their information. Amazing, isn't it?

In addition to being on a key chain or stuck on a windshield, radio tags will soon be attached or embedded in credit cards, clothing, grocery items, drug bottles, books, magazines, cell phones, computers, tires, passports and even you — beneath your skin for access to buildings and for storing your ID and medical records.

The scary part is that you will end up being a mobile device beaming all sorts of data to who knows who — through barriers and from as far away as 700 feet. Imagine if your charge cards, clothing and shoes all were transmitting data (even the age and color of your underwear)! Inquiring types could identify or profile you and track you everywhere. You could even get hassled to buy replacements when your stuff reaches its expected lifespan. And because RFIDs last about ten years, they can potentially transmit information for purposes well beyond their original intention.

RFID infiltration is already underway. Besides Mobil with its Speedpass, banks have begun issuing key chains with MasterCard's PayPass that can be used at subway stations, 7-Eleven, McDonald's and movie theaters. Their marketing literature hypes the benefits:

• No need for cash
• Amazingly quick and easy way to pay
• Feels like magic
• Now you can fly through the checkout
• Be the first to get what's next [Be the first on your block!]
This sounds great, but because these tags are linked to one of your accounts, the issuers will have the means to eavesdrop and monitor your life — recording where you go and what you buy. There is a concern too that thieves will be able to hack your radio tags to make unauthorized purchases.

Not to pick only on banks: retailers, too, want RFIDs to replace traditional barcodes to better track their inventory. They'll also be able to scan you coming in and perhaps direct you to aisles based on their perception of your needs. [Your underwear is how old?]

The Phrenicea scenario of the future envisions the total loss of privacy. Perhaps this is the way it will begin.

Time will tell...

posted by John Herman 6:05 AM


Click here to learn how banks are planning to play you for a fool by positioning RFID cards as "upscale."

Postscript by John Herman — June 12, 2007

Saturday July 1, 2006

Summertime Fun

Now that the lazy, hazy, crazy days of summer are here, it's time to chill out for a change. For those yearning to read more serious matter, just scroll down to previous blogs. Unfortunately most of them are still relevant.

So how about a "connecting the dots" puzzle!

In your mind's eye, or on paper if you draw the dots as shown, starting from any point, draw four connected straight lines (without lifting your pen or pencil) so that each of the nine dots has at least one line running through it.

Although the solution will be alluded to below, it will not be posted here so feel free to enjoy this blog to the end without your fun being ruined.

.     .     .

.     .     .

.     .     .


When you've finished (or have given up!) with the Nine Dots Puzzle or want to multitask, here's another one. Try to determine the next term based on the following pattern of numbers:

1
11
21
1211
111221
312211
13112221
1113213211

What's the next term?

Did you get it? Did you give up?

When these puzzles were posed to me (at a Six Sigma seminar break no less), I, like you, was in front of a PC. Rather than racking my brain for an answer like many of my colleagues, I cleverly googled on Yahoo! (I crave contrarianism) "9 dots puzzle" and immediately had pages and pages of links to the correct answer. Try it!

For the second puzzle I turned to a search engine once more, entering the last number of the sequence in quotes, as in "1113213211" — and boom, I again had pages and pages of links to the answer and its simple derivation. I don't think I would have ever figured out the logic for this one on my own. Give it a try!
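For readers who would rather see the trick than search for it (spoiler alert!), the "simple derivation" is a read-it-aloud rule: each term describes the digit runs of the one before it ("1211" is "one 1, one 2, two 1s," giving "111221"). A minimal sketch in Python, with the function name my own invention:

```python
from itertools import groupby

def look_and_say(term):
    """Next term of the puzzle: read each run of identical digits aloud,
    e.g. '1211' is 'one 1, one 2, two 1s' -> '111221'."""
    return "".join(str(len(list(run))) + digit for digit, run in groupby(term))

# Regenerate the printed sequence from its first term:
seq = ["1"]
for _ in range(7):
    seq.append(look_and_say(seq[-1]))
# seq now ends with "1113213211", the last term shown in the puzzle;
# one more call to look_and_say on it yields the answer.
```

This is the classic "look-and-say" construction; no arithmetic is involved at all, which is why staring at the numbers as numbers gets you nowhere.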

Later on at a more serious point in the seminar it hit me that for both of these puzzles I did not even have to think — and I'm not sure this is a good thing. (I did not experience any sense of accomplishment either, although I was first to arrive at the answers to the puzzles.)

When the Phrenicea scenario of the future was first presented on this website six years ago, the engage and somnam pages stated that all the world's knowledge was immediately available via engagement with Phrenicea, and that boredom had ensued without the challenge to learn the traditional (aka hard) way with study.

Could we already be approaching this point with access to the amazing capabilities of today's search engines?

Time will tell...

posted by John Herman 7:57 AM

Thursday June 1, 2006

Offshore = On the Beach?

A recent issue of Business Week addressed morale problems the "Big Three" auto manufacturers (GM, Ford, and Chrysler) are facing related to recent and expected layoffs:
"And because the companies are erasing layers of management, the opportunities for advancement are dwindling. Many industry professionals believe the tough medicine will help their companies, but the turmoil is enough to wear down even the most determined optimist."
The following week's issue then sensationalized IBM's epiphany to reorganize its global workforce in the mold of the start-up tech companies sprouting in India; the first step being the elimination of 15,000 veteran IBMers. One software specialist divulged:
"They came and said our job is being outsourced and you'll train your replacement. If you don't, you won't get the severance package."

These articles recall an editorial letter I submitted to Fortune magazine that was published almost 12 years ago (October 17, 1994) with a pull-quote that now appears timeless:

The corporate ladder has already taken the form of a horizontal plank.
The trends back then that instigated my letter included downsizing, reengineering and outsourcing. Today we have the latest buzzword "offshoring" and its deceptively benign-sounding synonym "globalization" added to the mix.

The pursuit of these strategies industry by industry is almost inevitable given the perception of a competitive disadvantage otherwise. Downsizing eliminates jobs usually with the known consequence of harsher working conditions for the survivors and often with a concomitant reduction in the quality of product or service — a reality which unfortunately we've grown accustomed to. Reengineering eliminates jobs primarily with the use of (or more efficient use of) technology. (The frustration of menu-driven automated voice answering systems immediately comes to mind.) Outsourcing is the process of transferring job functions to external entities more specialized and usually more efficient, again reducing jobs. Now offshoring is the newest darling of corporate management. It is similar to outsourcing, only the jobs being replaced end up outside national borders in order to pay well below market — yielding huge cost savings for ever more companies. Regardless of the latest buzzword that's in vogue however, the overall impact is the permanent loss of jobs.

A question that should be asked now after more than a decade of job cuts is, Where will this all end?

Will there be new faddish, jargon-laced management-speak still to come whose consequence will reduce jobs and salaries even more — and further level the highest standards of living with the poorest on earth? How many more employees with years of dedicated service, along with new job seekers with expensive academic degrees, will yet find that their career path is ultimately a horizontal plank?

The Phrenicea vision of the future predicts that there will not be enough work to go around for a worldwide population that is essentially overqualified. Perhaps this grim scenario is not too farfetched after all.

Time will tell...

Do you believe the corporate ladder has already taken the form of a horizontal plank?

posted by John Herman 5:40 AM

Monday May 1, 2006

Before the Beep

It seems many people don't care to conserve time when recording their automated phone greetings. And why should they? It's the callers that are paying to give ear to them! And with almost universal usage, I'm beginning to wonder how much extra revenue the phone companies are passively procuring as a result. And contrary to conventional wisdom, we don't always get what we pay for.

On hectic days, I occasionally get annoyed trying to reach someone not available and the obvious is stated: "...I'm not here right now, blah blah blah" or "I can't come to the phone right now..." Or when I suspect they should — but obviously will not record: "I'm standing right here next to the phone but don't want to pick up and speak with you right now!"

Some, in the process of recording prolonged salutations, unintentionally project their personalities into their greetings. I have endured a variety of types — from maudlin: "I am so sincerely sorry for not being at the phone for you and would truly rather be talking to you than be where I am now, boohoo..." — to inanely cheerful: "... and even though I'm not here right now, I want to wish you a happy, wonderful, bright, sunny and especially nice day!" Give me a break, please!

Those with a so-called sense of humor can choose from a host of pre-recorded messages. I've been greeted by ersatz British butlers and maids with French accents. The more creative add special background effects to add yet another dimension. I've been soothed to the roll of ocean waves, serenaded with classical music and hip-hopped with rap. These are innocuous but can be frustrating if you have more productive pursuits pending.

It also seems that those who test your patience with long-winded greetings give you the least amount of time to respond. After what is perceived to be only seconds, an ominous beep interrupts implying "you just got cut off, ha-ha." I then have to re-dial to finish, but must first sustain the same protracted electronic monologue. I'm tempted to conclude that the disparity between greeting and message time is a subconscious power thing.

An unintended benefit of using this technology is that you can covertly gauge when a household's kids have reached adolescence. A sure sign is when the staid, long and humdrum greeting of a friend or colleague is suddenly supplanted by background chaos and a squeaky kid's voice rising above the din — whose tone proclaims, "My parents have lost control over me!" And the consequence of this and other valueless data that may be garnered before the beep? At month's end your earful will have been translated into an eyeful of phone charges!

I've noticed too that electrical outages can cause havoc with some of the solid-state units. Those affected reset themselves to the default simulated voice greeting: "Pleeeease leeeave aaah mess-saage aaff-terr thaaa beeeep." At least it's short! And then I can have some harmless fun by complimenting the owner on their new high-tech greeting, knowing they were unknowing that their interminable, carefully crafted silicon-chipped annoyance got vaporized.

Voicemail, an evolutionary step up from answering machines, has provided even more opportunity for us to prolong a call and jack up the phone bill. Replacing prolix dialogue are cold menu choices, now becoming popular even on personal lines: "Touch one for this, two for that, and three or four or five..." I usually forget what the earlier options were and inevitably have to hang up and then get charged for another call. And you know ever more businesses love this technology to avoid personal contact — especially when we're paying the toll. Once after a violent storm I called a landscaper to remove a fallen tree. After getting knee deep in options, and finally choosing the appropriate one, the message nicely indicated the mailbox was full. "Call again later, we appreciate your business," it said. "Thanks a lot," was my vain retort, while yearning for simpler times when I could at least leave a message!

There is hope however — even if only on the home front, but it will take a mind-set change. Nowadays, many households have both parents working — and with several jobs, so it's time to stop apologizing for not being where others might think we belong for their convenience. Recorded messages to convey an apology or to mollify a caller with humor or niceties are vestigial behavior from the days when much more time was spent at home, and leisure was truly that — not catching up on errands, attending classes, pursuing entrepreneurial endeavors, or sifting through email. If used properly, phone-answering technology can make our lives easier with shorter and fewer calls — saving precious time which in this case IS also money.

So, until we can engage each other without the need for phones and answering technology, why not just say "hi, please leave a message" before the beep and leave it at that?

Time will tell...

posted by John Herman 6:27 AM

Saturday April 1, 2006

An Epiphany

Many of today's high school and college students often wonder (vociferously!) why they need to memorize boring equations, formulae and other seemingly trivial or useless information.

What their teachers often fail to emphasize is the origin and significance of these man-made expressions, which essentially describe the workings of nature. They're not taught that many of these discoveries required lifetimes of effort — often by iconoclasts, eccentrics, heretics and recluses willing to shed lots of sweat and probably tears in order to solve nature's mysteries.

Still, many students past and present have stumbled upon these truths on their own, often with epiphanic delight.

Below is one finally-getting-serious college student's "Epiphany" written way back in 1971, stripped bare with numerous misspellings illustrating a misspent youth, yet with genuine astonishment that this seemingly simple realization took so long to gel. It was handwritten, pen to paper, and found in a musty old box after 35 years. (Today's student might blog such a personal thought — with little chance of rediscovery years hence.)

A message to those who are in the same plight as I:

If you question the ways of the sciences — concepts — rediculous [sic] equations — symbols etc and become completely fatalistic toward them — think back a moment to your forefathers who devised these methods.

These are just building blocks to understanding. Just as you need tools to produce a manual task — tools are essential in building knowledge.

Nature does what it does without any influences (until recently however). Man has not and will never harness nature by merely understanding its processies [sic]. This form of study makes use of abstract concepts to make understanding less tedius [sic] and to standardize the methods of expressing our understanding of them possible and eliminate a caotic [sic] consequence.

This must be remembered if excelence [sic] in any science is achieved.

Written by J. Herman some time in 1971

Perhaps some day education will go beyond mere memorization and copying teachers by rote to include a real appreciation of our current state of knowledge — knowledge that allows us to not only understand the workings of nature, but to leverage and alter them for our benefit, as well as our peril.

Time will tell...

posted by John Herman 6:46 AM

Wednesday March 1, 2006

A Disgraced General

Get ready to hear a lot about Toyota vs. General Motors, as Toyota is poised to overtake GM later this year as the largest auto manufacturer in the world.

Toyota was once a gnat and General Motors was once the largest corporation ever in the United States, and one of the largest employers in the world. In 1953 GM president Charles E. Wilson was tapped by President Eisenhower to be Secretary of Defense. It was during the Senate hearings that he uttered the famous phrase, "...what was good for the country was good for General Motors and vice versa."

That perhaps was true then, and some would like it to still be true now, given GM's looming plunge into bankruptcy. More than losing its Number One global position, GM incredibly is on the brink. Executive egotism, greed, arrogance and misjudgment are to blame, although the current stewards are doing their best to rewrite corporate history. Their scheme is to pin the behemoth predicament on the backs of the retired rank and file — those that mass produced product that ultimately funded senior management's exorbitant salaries, perks and golden parachutes.

It's a valid point that some corporations have retiree costs that make them less competitive. But to blame those costs for General Motors' financial woes and impending market position loss to Toyota is bogus.

GM is struggling with another type of baggage — the consequences of short-term strategies and practices that wrecked its spectrum of brands (Chevrolet to Pontiac to Oldsmobile to Buick to Cadillac), and the dismal reputation it earned so well in the '60s, '70s and '80s for producing cars that frankly were just plain inferior.

And after leveraging its once powerful brand equity to excess — yielding five tarnished, almost fungible nameplates, what did top management do next? They created yet another brand — Saturn, wasting billions in the process. Imagine if all that capital was instead applied to advanced technology, like gas-electric hybrids!

Admittedly, these days GM's products are better built. But the damage is done. It would take as many years to repair the company's reputation as it took to destroy it — several decades perhaps. Today, those cognizant of quality would rather play it safe with a Toyota (or a Honda or Nissan; and soon maybe a Hyundai) regardless of price. GM and the other US manufacturers underestimated Toyota — but more significantly — they underestimated the consumer.

It appears now that as a result of years of US corporate mismanagement, no matter the industry, what today is good for the country is good for a foreign company — a reality euphemistically touted as globalization.

Time will tell...

posted by John Herman 5:58 AM

Wednesday February 1, 2006

"Baby" Boomers Turn 60 — Ugh!

Tick...tick...tick. The "baby boomer" population bulge is approaching 60. I'm a leading-edge boomer and frankly am having trouble accepting that fact. I seem to have a lot of company too — most of my friends, coworkers, and fellow (grand?) parents are boomers and many of them feel similarly distressed. Why are we having so much trouble accepting this aging thing?

Through the years I've developed a belief that it's because baby boomers were raised to think of themselves as privileged, special and young, as the moniker implies. We are the product of an unprecedented birthrate and as youngsters were bathed in attention — regardless of whether it was for our potential in those promising and optimistic postwar years or for profit by opportunistic marketeers. We were the center of attention and the target market virtually from birth. Consider the following:

Pediatrician Dr. Benjamin Spock wrote a baby and child care book just for us — to ensure that our parents would provide a nurturing environment to help us grow to our full potential, both mentally and physically. It was a best seller for years.

Our parents, who experienced deprivation during the Great Depression and World War II, were determined to give us what they did not have: a comfortable home, plenty of food, new clothes and a good education which included a college degree. Everyone would be "white collar." Yes, we were really special.

Whole neighborhoods were built just for us, a phenomenon that became known as "suburban sprawl." Virtually everything was new. We had new houses, roads, stores, schools, buses, playgrounds and pools. We were handed new books and new tests which were designed to measure us if we simply darkened the proper circle with #2 pencils. Eventually we would even have "new math."

The Golden Age of TV catered to us with Captains Video, Midnight and Kangaroo; Space Cadet and Space Patrol; Kukla, Fran & Ollie; Davy Crockett, Superman, Howdy Doody, Mickey Mouse Club, Roy Rogers, Lassie, Mr. Wizard et al. We watched and matured with wholesome situation comedies depicting the idealized kid-centered family like Ozzie & Harriet, Father Knows Best, Donna Reed and Leave it to Beaver. TV would later contribute to and reflect our loss of naïveté with such ground-breaking shows as Laugh-In, All in the Family and Saturday Night Live. Television's impact cannot be overstated — we were the first to grow up with it and it helped to reinforce our preoccupation with youth and Self.

A plethora of toys and gadgets were invented just for us: Slinky, Mr. Potato Head, Barbie, Etch A Sketch, Silly Putty, Hula-Hoop, Frisbee, a watch with a Mickey Mouse face and the world's biggest toy — Disneyland. Naïvely optimistic slogans professed things like "Progress is Our Most Important Product." Planned obsolescence brought us the "new and improved" designed for quick and easy disposal. We had transistor radios and record changers playing 45s. Stereo LPs, 8-track and cassettes would soon follow. So would calculators, computer games, and digital watches. Yes, we were spoiled and literally growing up like no generation before us.

We were comforted to know that we would always have all the food we would ever need — with farm mechanization and the liberal use of pesticides. The threat of serious disease was virtually eradicated for us with "miracle drugs" such as antibiotics and vaccines. Advanced surgical techniques would enable us to change our features, our gender and even replace internal organs when they wore out. We learned that we could take drugs to cure infertility or The Pill to ensure it. What power was brought forth for us!

We were dazzled with jets, rockets and space flight spurred by Sputnik. One day we would all travel through space. Impatient families would ride in automobiles sporting bullet-nosed bumpers and huge tail fins. I can remember being lectured at school that we should feel privileged growing up in the Modern Age; and that our eyes would eventually see what no others saw before us.

But we didn't have to wait. We saw tremendous progress during those prosperous 1950s and '60s and were often reminded that we would be the prime beneficiaries because "our whole lives were ahead of us." Y-O-U-N-G was etched into our psyche. We felt catered to and it made us feel very important — and that the world should revolve around us.

This attitude combined with our critical mass gave us leverage to have an impact on society beginning in our late teens and twenties. We began to question everything and inadvertently turned into a political force when our implanted optimism was replaced with disillusionment. We became demonstrators, hippies, college drop-ins and establishment drop-outs — and we didn't trust anyone over 30. We affected the style of dress, hair, and music; and hedonistically espoused "free love." We marched to end the arms race, the Vietnam War, and to give Peace a chance. The unintended effects of the so-called miracle drugs were beginning to show up with cancer and birth defects, and we read Rachel Carson's Silent Spring. We became concerned about the impact "progress" was having on the environment. And many tried but failed to escape this reality explosion with mind-altering drugs.

Then we turned 30. To remain trustworthy we selfishly redefined Young vs. Old. Our idealism was replaced with good old-fashioned materialism. These would be the years of the Me Generation. "Free love" was replaced with free spending. We sold our ideals for the good life that selfishly included expensive vacations, cars and houses. While immersed in our indulgence, we reached "the big 4-O." We were now a massive group of "yuppies" and although we read cover stories about how we were losing our hair and gaining weight, we still were unconvinced that we were getting old.

Then in the blink of an eye we hit 50 — that (gasp) half-century milestone. Although chronologically defined as hard-core middle-agers, we still refused to succumb to the stereotype of age that we held for our parents. We continued to rock'n'roll, work out to stay fit and try to eat healthy — or at least to follow the fads from oat bran to pasta to tofu. It was a good thing too; we needed all the energy and stamina we could muster to survive the waves of downsizing, rightsizing, outsourcing and reengineering. Although many succumbed to early retirement, that did not stop us from learning new tricks; from embracing the Internet to becoming entrepreneurs. We'll survive, we told ourselves — we're resilient and Y-O-U-N-G.

So, what does all this have to do with the future? Plenty...
Now that we're turning 60, this famously spoiled (sans "baby"?) boomer demographic anomaly will be a force to be reckoned with. We'll be retired, but not retiring, gray-haired activists with plenty of time on our hands to leverage our generational clout to lobby for our interests. By sheer demographic heft we'll again effect change as in years past. We'll read cover stories at the 70- and 80-year milestones chronicling our deterioration and the burden our cohort will be inflicting upon younger generations. And the marketeers will be there pushing the latest pharma miracles, adult diapers, adjustable mattresses, hearing aids and life insurance. They'll capitalize on our penchant for nostalgia, which will inevitably precipitate an onslaught of hyped retro-fads. We'll call ourselves rockers even when we're sitting in them nodding off with Modern Maturity. And still we won't believe we're O-L-D.

Time will tell...

posted by John Herman 5:58 AM

Sunday January 1, 2006

Chiropractic Tactic?

The Phrenicea scenario of the future depicts chiropractic supplanting what is today considered traditional medicine — a profit-driven medical/insurance industry treating various states of disease with drugs and/or surgery. Unfortunately, this idealistic vision appears to be in jeopardy with the emergence of a new and ominous trend among chiropractors.

To compete with today's established "big business" of medicine, many chiropractors have been participating in a trend to further legitimize their profession and increase revenue by expanding their application of physical manipulation of the spine and other body structures to include the pursuit of "wellness." This may include the addition of other alternative, non-traditional modalities such as acupuncture, naturopathy, massage therapy, yoga and more. The wellness approach attempts to proactively prevent disease holistically with proper lifestyle, rather than treating symptoms of disease already in progress.

The proliferation of wellness centers is a trend that is expected to continue, since chiropractors have found receptive, health conscious, well-heeled clients willing to pay out of their own pockets for treatments that appear to be effective, as well as for expensive natural supplements.

A more recent trend, and an ominous one in my opinion, is the growth of franchised centers of wellness with the chiropractor as the hub. The franchise concept is not new to the service industry — certainly not with fast-food like burgers. Over the years it has expanded to include niche restaurants, haircutting, package shipping, travel, eyeglass fitting, chimney cleaning, home inspections, lawn maintenance, maid service and more. It is quite new to the business of chiropractic however.

It's apparent that the goal of franchising in this profession is to "McDonald's-ize" the chiropractic experience in terms of creating brands recognizable to a broader population, while bringing in extra cash for programs of questionable benefit — for the patient at least.

Chiropractors traditionally have spent lifetimes developing successful practices of respectable size, which reflect their character and values. They're now tempted with visions of big profits by wooing masses of clients less sophisticated and discerning — and more receptive to being dazzled with faux technologies and procedures. Retaining existing patients becomes secondary: patients who may be sharp enough to see through the hype and perceptive enough to sense the tawdry goal of profit; the same patients at risk of being alienated by slick, sterile, generic branding, flashy placards and gee-whiz computer-facilitated tests of dubious value.

It's not clear, at least to me, whether the cookie-cutter success associated with the formulaic, sanitized operation of a franchise is transferable to the practice of chiropractic. And even if it is, should it be? (A moot question if more profit is the primary motive.)

The inevitable result of franchising in general is the lowering of service quality to merely acceptable or tolerable, vis-à-vis stand-alone businesses. (How many four-star restaurants are members of a franchise? How many top-notch hair cutters?) This settling consequence might be acceptable for a meal or a haircut — but for healthcare?

If this scenario of greed plays out as it did with traditional medicine, there may have to be a new alternative to today's alternative healthcare.

Time will tell...

Happy New Year!

posted by John Herman 8:28 AM


Monday November 14, 2005

Survivor — Everyday Reality

While sitting in my car at a neighborhood mega mall parking lot, situated beside a busy intersection leading to a circuitous exit path towards Suburbia Major, it occurred to me that every passing car was a survivor — literally — of millions of split-second decisions over the course of its operating life.

I could see this firsthand in real-time as drivers approached, stopped, waited (some confident, some sheepish, some blowing-horn impatient), and finally darted into the fray of streaming vehicles filled with occupants impatient to get to their next destination.

Newer autos had the benefit of being free of the cosmetic consequences of human decisions, with shiny, dent-free flanks and bumpers. Older models reflected their years with amazing precision — analogous to the wrinkled skin of an elderly human. Their hard lives of stop and go, left and right, high and low, and forward and reverse were divulged by their dents, chips, dings, touch-ups and dull paint.

Was it luck or the good judgment of each car's driver that it was still on the road after several years or perhaps a decade or two? (Probably both.) And for a select few it appeared to be that, plus good old-fashioned elbow grease. These pristine, eat-off-the-paint beauties would rumble proudly through the queue — recipients of lovingly applied plastic surgery and makeup — turning heads all the way in their battle against entropy.

And to think, all of this orderly chaos is the result of minuscule electrochemical signals traveling within our brains — that amazing gray, convoluted organ sans any visible moving parts. But then looking around at the mall itself — the big box stores, gaudy signage, sculptured pavilion artwork, and the occasional plane flying overhead — all of it was designed, constructed and maintained as a result of countless neurochemical operations. It is truly incredible when you take time to ponder it.

Miraculous too is to be able to wonder where these almost infinite brain processes will take us next. With the many problems and issues facing us today — worldwide trends pointing in directions both good and ominous — it's really up to us all to focus and channel our thoughts and energies responsibly.

But, will we?

Time will tell...

posted by John Herman 7:19 AM

Thursday October 27, 2005

Knowledge is Power — and a Burden

"Knowledge is Power" is a well-known quotation, first uttered by Sir Francis Bacon in 1597. Unfortunately in today's world knowledge is also a burden; a cause of worry.

Example: The anticipation of a new bird flu pandemic.
Many around the world are busy envisioning horror scenarios should the H5N1 bird-borne virus mutate into a form that can be transmitted among humans — eliminating birds as the primary host. Just enough is known about viruses and past pandemics to ponder the consequences. Much energy is expended discussing "What if?", brewing worldwide fear. It may never happen. But then again, it might.

Example: Global Warming.
Much worry is attributed to our knowledge of carbon dioxide emissions and the theoretical impact of high concentrations in the atmosphere. Endless cerebral energy is spent debating whether weather is made heavier and more erratic, and if/when coastal cities will submerge. Yet it still may be proven to be a false theory. [We don't think so.] But might we be better off being dumb and happy?

Example: Events calculated to be "overdue" based on mathematical laws of probability.
Anxiety can be engendered in a locale or region susceptible to devastating hurricanes or earthquakes if, based on historical data, it is deemed at risk "any year now." It makes for great TV news on a slow day, and is a cause of (needless?) apprehension.

Example: The mapping of the human genome.
As knowledge of genetics becomes more sophisticated, susceptibility to ever more diseases will be determined by markers (gene sequences) within a person's genome. Imagine the agony knowing that a terminal or debilitating disease is inevitable, well before any symptoms and before an effective cure becomes available. Many will find themselves cheated out of a joyful life knowing that someday they are likely to succumb to an inherited genetic disorder.

Could our sophisticated state of knowledge be burden enough to instigate proactive worry — and contribute to the reasons why a great many people smoke, drink and participate in other activities that assist them to take leave of their senses?

*****

Looking ahead to the future, too much knowledge might even be mentally debilitating. Imagine an over-the-top scenario [again?!] where we had to know from birth an important aspect of our future — the date and time of our demise. Once cognizant, would we not count each day in anticipation of "The End," effectively zapping our lust for life? Who knows, conditions related to population or some other critical parameter may eventually warrant a predetermined appointment with death... To donate a brain perhaps?     ;-)

So, is knowledge beneficial or baneful?

Time will tell...

posted by John Herman 10:59 AM


[Dilbert comic strip, 10/28/2007]

Postscript by John Herman October 28, 2007

Saturday August 20, 2005

Nuclear Redux — or Déjà vu?

Our current QuikPoll so far indicates surprisingly that there is more concern with fossil-fueled global warming than with the long-lived and deadly radioactive waste (and weapons-grade fuel) produced by nuclear power plants. This is astonishing, especially given the threat of terrorism and the third world's desire for nukes.

It's amazing how attitudes can change over the course of time — about 60 years in this case. At the dawn of the Atomic Age, there was optimism that not only would "going nuclear" fuel power plants to generate electricity — it would also power our planes, cars and rockets — and cook our food! (Even Walt Disney was convinced, producing the movie and book, "Our Friend the Atom.")

Initial optimism was eventually supplanted with concern about bomb proliferation; paranoia associated with the loss of U.S. technological supremacy (on this date in 1953 the Soviet Union acknowledged it had tested a hydrogen bomb); and uncertainty as to the long term effects of the radiation from weapons testing. The many science fiction movies from the era depicting awakening dinosaurs, giant insects, and incredible shrinking men attest to the uneasiness associated with radiation. Chernobyl and Three Mile Island seemingly were the final straws that broke nuclear's back. But, memories fade.

Politicians and the nuclear industry are capitalizing on today's radioactive blaséness with talk of new reactors helping to solve the global warming predicament. Cameco, the "world's largest uranium producer," crows the slogan, "NUCLEAR. The Clean Air Energy." Embedded in Fortune magazine's August 8 issue is a nuclear-industry-funded feature posing as objective content proclaiming a "Nuclear Redux." The piece, cleverly written by freelancer Robert McGarvey (who might have sold his professional soul here), subtly conveys an environmentally friendly green theme with an innocuous graphic, a green pull-quote highlighting nuclear's return to center stage, and a green text box incredibly proclaiming that "Radiation is good for you"! How subtle. How frightening!

Recycling dangerous technology may not be the answer to global warming.

There's little doubt that global warming is a pressing issue requiring action. However, the answer may not be to recycle dangerous technology. We owe it to ourselves and to posterity to become sufficiently educated to intelligently evaluate potential options. It's popular now to brand Global Warming "bad" because that's been the predominant message; and nuclear energy as comparatively "good" because: (1) it's been out of the tabloid news for twenty years; and (2) that's what some politicians and the industry would now have us believe.

Don't listen passively to either view. The world and its inhabitants are at stake and the clock is ticking.

Time will tell...

posted by John Herman 7:22 AM

Monday August 1, 2005

Cubicle Dwelling? Not Yet...

Now is as good a time as any — during these lazy, hazy days of summer — to assess whether the world is catching up with the Phrenicea scenario.

So, is it? Well, not really. We're not yet donating our brains to the Phrenicea braincomb. Money, pets, and newspapers are still around. Cars have not been banned, although they're getting awfully expensive to operate. Human cloning hasn't replaced procreation, but at this point we probably wouldn't be too surprised to hear of a successful attempt. And no, we're not living in cubicles — yet.

Nevertheless, it probably could be said that the Phrenicea scenario today is perceived as a bit less bizarre than when it was unveiled way back in May, 1999. Of course(!) this conclusion is based on objective data — that being visitor ranting via email. The hysterical ones have been on a steady decline, although the predominance of critical feedback we get still centers around the "ridiculousness" of the Phrenicea scenario as presented on the website.

A very (un)popular idea continues to be that of people living in cubicles. Granted, compared with today's mobile population, it does seem unbelievable. But we are becoming increasingly isolated from fellow human beings in an imperceptibly incremental fashion.

Right under our noses, there's been a gradual reduction of human interaction fostered by technology. It began way back in the 1950s with TV and the sprouting of the first "couch potatoes." After TV's novelty wore off, many found it preferable to just stay indoors than to socialize or pursue physical activity.

As technology marched on, the need to interface with real people declined even further. Some examples:
- the elimination of personalized attention at self-service gas stations, supermarkets, home centers, etc.
- answering machines and voicemail replacing conversation; business conducted via telephone tag
- live customer service reps replaced by impersonal automated systems with annoying nested menus and digitized voice
- DJ radio personalities supplanted with computer programmed "Jack" and similar formats; essentially mechanized music shuffling
- impersonal email, instant messaging and phone-based text replacing the spoken word
- iPod zombies existing in their own little worlds.

Imagine that strangers a century ago would actually greet each other with a "Hello!" and then strike up a conversation. Today, with near-total detachment from fellow humans the norm, any "striking" between strangers is ever more likely to be a savage perpetration.

If the rate and course of technological change continues, which indeed it will, our cubicle scenario of the future will just be a physical manifestation of the social isolation that will ultimately result.

Time will tell...

posted by John Herman 7:22 AM

Saturday, June 11, 2005

Memories: Firsthand & First Person

Our memories, and the memories of us, are precious. Are they not? Well, at least to some authors they appear to be.

Cliff Pickover, prolific writer and research staff member at IBM's T. J. Watson Research Center, admits that much of his motivation to publish comes from his desire to exit this world with something to leave behind for future generations. He laments:

After you die, will the world remember anything you did? Most of us rarely leave marks, except on our immediate family or a few friends. We'll never have our lives illuminated in a New York Times obituary or uttered by a TV news anchorperson. Even your immediate family will know nothing of you within four generations. Your great-grandchildren may carry some vestigial memory of you, but that will fade like a burning ember when they die — and you will be extinguished and forgotten.

That's pretty depressing. As we try to live each day to the fullest — being productive, learning, and ultimately creating memories for ourselves and others — we rarely ponder the ephemeralness of it all. (Although the often-heard dispassionate phrase, "Who will care a hundred years from now?" stems from the sad reality of our short time here on earth.)

But does it have to be this way?

The Phrenicea scenario envisions a time when all our memories and experiences will be stored forever, within our own brain as well as within others' — firsthand memories that are deemed rich enough to be bought and sold at auction — like memorabilia traded today on eBay.

Imagine sharing the actual memory of one accepting a Nobel Prize or an Academy Award, winning a marathon, falling head-over-heels for your favorite actor, starving in the third world, learning of a terminal disease; even of dying!

Most are tempted to say, "Yeah right! No way."

Memories bought and sold at auction — like memorabilia on eBay.

It seems impossible now, but there will come a day when the mechanism for memory assembly and storage within the human brain is elucidated. A next logical step would be to try to save or replicate these memories, perhaps for recollection by others.

When it does come to fruition, imagine the regret for the many first-person memories already — or soon to be — lost forever:
- the excitement of witnessing man's first flight
- the despair of the 1929 Stock Market Crash
- the horror of the Holocaust and Hiroshima
- the excitement of purchasing a new color TV in the 1950s
- the anticipation of finding out "Who Shot JR?"
- the relief of learning of a cure for polio
- the nostalgia of catching the last feature at the local drive-in movie theatre just before its closing forever
- the shock of President John F. Kennedy's death
- the thrill of setting foot on the moon
- the marvel at the first "horseless carriage," telephone, "talkie" talking movie, ballpoint pen, transistor radio, Polaroid camera, VCR

Of course you could read or see videos about these events. But nothing can approach an actual memory — just ask Neil Armstrong.

Gee, when you think about it, memories really are precious.

:-)

Time will tell...

posted by John Herman 7:43 AM

Wednesday May 25, 2005

Too Old to iPod?

Should so-called mature, grown-up parents enthusiastically adopt hot gadgets like the iPod that are targeted more at the younger crowd?

A recent (May 16 issue) Newsweek "My Turn" segment features a self-described busy, stay-at-home mom who nevertheless found sufficient time to download over 800 songs ("and counting"), create sophomorically titled playlists for her equally mature friends, and enjoy the tiny gadget "at any given time," even at sporting events. Immersed so deeply in her technology-enabled musical cocoon, she's apparently unaware why she's garnering double takes from fellow spectators eyeing her peripheral engagement.

She also proudly proclaims that her kids love their iPods too, which isn't surprising given the example she's setting.
It might be too soon to succumb to a life of earbudded zombiism.
Is this really a good thing? Steve Jobs has created a landmark in entertainment, but with a potential for irresponsible abuse rivaling TV's a generation ago.

So — should grown-ups embrace today's technology to be more like their kids?
Alternatively, adults with so much leisure time might consider volunteer work or a part-time job. There are many local and global issues and problems needing attention — where still-youthful enthusiasm could be applied productively.

Despite our infatuation with gizmos and technology, it might be too soon for any of us to succumb to a life of earbudded zombiism.

What do you think?

Time will tell...

posted by John Herman 7:17 PM

Friday April 22, 2005

Bebop, Doo-Wop & Hip-Hop

Recently it was reported that rapper 50 Cent has monopolized the pop singles chart in a manner approaching the Beatles' dominance in the 1960s.

This raises the question, using yesteryear's "cool" vernacular: Why do today's youth "dig" hip-hop?

Empirical observation tends to reveal that musical taste follows a generational track: people within the brackets that define a "generation" assent to a particular musical genre, making it popular or "in" to the degree that it translates into measured sales on the pop music charts.

Why is this? Is it additives in the food? Pollution in the air? Society's morals and values at the time? Is it the effect of that other medium — television?

Somehow the part of the brain that facilitates music appreciation is influenced similarly en masse. Consequently, the popular music of the latter 20th century can be categorized as follows (with some overlap notwithstanding):

- the 1940s was the decade of big band swing;
- the 1950s was the decade of bebop, doo-wop and simple rock'n'roll;
- the 1960s was the decade of experimental, progressive rock;
- the 1970s was the decade of disco and punk rock;
- the 1980s was the decade of new wave;
- the 1990s was the decade of grunge, super-slick r'n'b, and lip-synched girl and boy bands.

So now it's hip-hop. Arguably today's popular music is not as rich lyrically or musically vis-à-vis past genres. And why is that? Today's technological sophistication should encourage greater complexity.
The part of the brain that facilitates music appreciation is influenced en masse.
Why don't kids today approach the intricate styles of the Beatles' later work, Jethro Tull, Pink Floyd or Yes? Or, why don't they embrace big band jazz? Why not disco?

And why do the adolescents from decades past, the boomers in particular, abhor hip-hop? Like their parents, they continue to cling to their generation's music. Could it be the effects of atomic testing? Howdy Doody? Silly Putty?

Public television has discovered this phenomenon and to its delight is breaking all kinds of pledging records by producing more and more nostalgic concert programs. Featured are wrinkled, grayed and frayed performers — some that can barely move or hold a tune. Yet they evoke delight for themselves and their audiences, as the music floods their clouding minds with memories of youth and the "good old days" gone by.

The once super popular groups and solo performers seem incredulous that they're up there again on stage. And surely the so-called "one hit wonders" never dreamed they'd be performing their one song ad infinitum five decades later!

Incredible too is the irony that what were once thought to be throwaway tunes — trash by previous standards and deemed junk by parents — are instead being performed so many years later by these original performers in front of their original fans. That's a miracle of technology — not musical but medical!

All this raises another question: Fifty years from now — in the Age of Phrenicea — will today's youth still be listening to what might by then be considered a new form of classical music... Hip-Hop?   ;-)

Time will tell...

posted by John Herman 7:22 PM

Saturday, March 26, 2005

Us vs. Nature vs. Us

The extraordinary events associated with the attempts by Terri Schiavo's parents to keep her alive have become subject matter for worldwide debate.

What has brought this about? Technology!
The ability to keep one alive artificially for more than a decade questions our notion of what constitutes life.

This debate about compassion and ethics would not have been possible a century ago — before feeding tubes, respirators, heart-lung machines, etc.

Similarly, the technical ability to extract embryonic stem cells from tiny, days-old blastocysts beclouds the definition of when life begins — and whether destruction of the life potential can be considered unethical at the least, or murder at the extreme.

As our medical technical prowess continues to advance, these hard questions will multiply and become ever harder. Unfortunately there really are no "correct" positions; regardless of which side of the argument you adopt, either view can harbor substantive points.

Hence, judging by our record of adoption, adaptation and cultural inurement thus far, the real question becomes: How rapidly will our tolerance evolve to accept the application of our technical abilities for the so-called good?

And just imagine the debate several decades from now...

  • when we'll likely have the capability to perform face transplants and even brain transplants, muddying the very definition of identity and self;
  • when intelligence, innate mental and physical skill sets, and the minimization of disease potential become programmable before birth or even conception, to those with the wherewithal to afford the micro-tinkering.
Will it be ethical to tamper with nature's plan? Will it be ethical to permit the well-to-do to feast on the fruit of the latest technologies, while the rest of us rely on nature's crapshoot?

And finally, imagine the global debate towards midcentury when the first person chooses to pursue immortality — by volunteering their brain to be incorporated into the Phrenicea braincomb!   ;-)

Time will tell...

posted by John Herman 5:31 PM

Tuesday, March 1, 2005

Taglines du Jour

The quest for "Corporate Identity" is big business. In the hope of creating something positive and memorable in the minds of the consumer, major companies spend hundreds of thousands, if not millions, of dollars on brand development.

A major component is the tagline — a word, phrase, or (lately) a musical clip or lyric that attempts to convey the corporate mission du jour. The hope is to "brand" their brand into your brain.

Consequently, we're constantly bombarded with taglines — most of which barely make it to our conscious level. A few make it to legendary status and popularity. Here's a sample extemporaneous list:

Aetna, I'm Glad I Met Ya! (Aetna)
Where's the beef? (Wendy's)
Melts in your mouth, not in your hand (M&Ms)
Have it your way (Burger King)
See the USA in your Chevrolet (GM)
Look, Ma! No cavities! (Crest)
It's the Real Thing (Coca Cola)
The Uncola (7 UP)
You deserve a break today (McDonalds)
Please don't squeeze the Charmin (Charmin)
Do you know me? (American Express)

These famous and memorable taglines were simple, in that they generally were based on some factor or characteristic of the product. Their very success paved the way for today's inordinate emphasis on taglines by marketing powerhouses that take themselves all too seriously, employing psychological theories and esoteric mumbo jumbo.

Curiously, the effectiveness of a tagline does not guarantee its longevity. You would think that their recognition power would be leveraged and promoted forever. It certainly would decrease the monies devoted to marketing and perhaps eliminate many marketing jobs. (Hmmm.)

For example, do you know what Aetna's current tagline is?

We want you to know.
Get it? Probably not.

But we're not the only ones not getting it. For all the money spent, companies today are not getting their marketing money's worth. You can be sure the agency that developed the Aetna ad campaign was paid handsomely after presenting convincing brand-speak nonsense for senior management's consumption. And they fell for it!

So as not to pick on Aetna, here's a list of similarly poor taglines gathered from a couple of business magazines. Far from being based on anything tangible, most are meaningless and unmemorable — and most so vapid as to be interchangeable within the list of companies:

High performance. Delivered. (Accenture)
Dreams Made Real (Agilent Technologies)
Never Follow (Audi)
it's all inside. (JCPenney)
it's the cola (Pepsi)
Taking you forward (Ericsson)
Call Me (Fidelity)
With Sprint, Business is Beautiful. (Sprint)
Inspire the Next (Hitachi)
There's no better way to fly. (Lufthansa)
Good life. Great price. (Sears)
Get more from life (T-Mobile)
a vital part of your world (Tyco)
that was easy. (Staples)
good goes around (Delta)

"good goes around" is almost laughable. So is Sprint's and Hitachi's. (Notice the subtle variations in the use of lower case and punctuation; surely the agencies concocted a load of gobbledygook rationalizing how each would stand out from the crowd.)

Sadly, you can be sure their corporate managements collectively paid zillions to relinquish creativity to "brand architects" to develop these sorry examples. Beyond boring, they would be difficult to purposely memorize if one were motivated enough to try.

Admittedly, some taglines are cute or clever [Petco — Where the pets go.]. Some are staid [Expect More. Pay Less. (Target)]. Some are even weird [What can brown do for you? (UPS)], suspicious [The Pharmacy America Trusts (Walgreens)], or puzzling [Make Progress Every Day (Verizon)]. Some can be ultimately embarrassing, like when your spokesman becomes a "smokesman" of pot [Dude, you're getting a Dell! (Dell)].

And to the chagrin of generation X, Y and Zers, ever more companies are now borrowing from the past in trying to capture good ol' boomer feelings using decades-old hit songs:

"Draggin' the Line" by Tommy James (Mitsubishi)
"Takin' Care of Business" by Bachman Turner Overdrive (Office Depot)
"Be My Baby" by The Ronettes (Cialis)
"The Weight" [...and you put the load right on me] by The Band (Cingular)
"Just what I needed" by The Cars (Circuit City)
"Dream On" by Aerosmith (Buick)
"Rock and Roll" by Led Zeppelin (Cadillac)
This too shall pass.

In the near future we'll come full circle and be back-to-basics with meaningful taglines developed without the benefit of a marketing budget — their clever brilliance destined to become classics in their own time:

Visit Your Future (Phrenicea)
Chronicling the Future (Phrenicea)
The Future — It's All In Your Head! (Phrenicea)

;-)

Time will tell...

posted by John Herman 6:18 AM

Tuesday, February 01, 2005

Darwinism vs. Creationism Redux

Recent issues of both Time and Newsweek magazines report on the 80-year on-again, off-again dispute over whether to teach Darwin's theory of evolution in public schools. The latest rendition is spurred on by a clever sect of neocreationists who maintain that biological organisms are just too complex to have evolved without the hand of some form of intelligence (perhaps God, perhaps extraterrestrials).

What's new is that these followers of creationism are well-funded, wear a scientist's mask, and are positing an alternate theory called "Intelligent Design," which essentially states that complex biological entities (such as the eye or brain) could not have evolved by chance. Their surreptitious strategy is to outfox the U.S. Supreme Court's 1987 ruling that declared it unconstitutional to intermingle science and religion in the classroom.

We consider this decades-old sparring between near fanatics on both sides a waste of cerebral energy. So here's our brief take on this now ancient argument:

Intellectually, we are limited to arriving at conclusions based on what our five senses can detect. It is fruitless to argue and debate over beliefs that rely on more than these senses — or our manmade tools that enhance them — can tell us.

Faith is believing in things that lie beyond both our realm of detection and our ability to discretely analyze. However, believing in science per se is not necessarily antithetical to adopting one or more spiritual faiths — of which there are many. There are countless scientists who harbor religious beliefs that can't stand up to the scrutiny of the "scientific method."

The fact that there are so many forms of religion — and religious-like beliefs — is the reason we align with those who believe there must be a separation between what is taught in public school (facts based on knowledge obtained through worldly observation) and what might be better discussed in houses of worship. There are just too many religions practiced today to teach them secularly with any chance of competence, thoroughness or equitable fairness.

Those that are skeptical of the accuracy of our scientific knowledge, hypotheses and theories arrived at through empirics need only acknowledge the splitting of the atom, our landing on the moon, or the mapping of the human genome. All incredible accomplishments indeed — based on precise measurement, formulae and equations that apparently reflect the realities of nature.

To those that scoff at the value of spiritual or religious beliefs — scientific studies show that those that make faith a part of their lives are in general happier and live longer.

After all the years of squabbling, why don't the two ideological sides just acknowledge that we're likely never to prove either belief beyond a doubt — and let the schools and houses of worship coexist to perform their functions as originally intended?

Time will tell...


posted by John Herman 9:31 PM

Saturday, January 01, 2005

The holidays and New Year are a time for family gatherings, gift exchanges and contemplation.

Being a last minute (last day!) shopper — not even priority shipping could save me — I reluctantly headed for the crowded mall.

A born mall-o-phobe, I quickly realized how spoiled I've become ordering merchandise online. At the mall's bookstore, I was amazed how impatient I became walking through seemingly miles of aisles, with little assistance from the clueless temporary help. Ditto the experience at the music/video chain. Of course they did not have what I was looking for — or maybe they did, but neither I nor anyone else there could find it. Next year I'll have to remember my annual New Year's resolution to shop early for gifts — online!

Holiday time provided another opportunity for me to realize that the Internet bubble that burst in mid-2000 is afloat again — this time at a family gathering.

After the exchange of some gifts, I found my niece with the TV remote replaying the Smallville DVD theme music over and over. She found me watching with a puzzled look, and awkwardly explained that she LOVED the series' intro song.

"Ah ha," I said to myself. I left the room and grabbed my wireless laptop, brought up my favorite search engine and typed "smallville theme." A split second later I had the song title and artist, then clicked my iTunes icon to search for "Remy Zero." I was instantly looking at a list of songs — and there it was — "Save Me"! Clicking on the sound clip, I confirmed that it was indeed the right song. A second later, it was purchased and transferred to my iPod.

Returning to the gathering, I handed my niece the iPod and she spent most of the evening listening to the song in its entirety with the now familiar white ear buds. To the chagrin of her parents, thanks to me, you can bet an iPod will be at the top of her wish list next year.

I then imagined the same scenario just several years ago...

How would I proceed to find the song associated with the artist?
Even if he was credited on the DVD case or booklet, would I be able to find it at a retail store — searching miles of aisles? In desperation, would I try to explain my quest to a blank-faced clerk? Was it even released as a song? What today required a couple of minutes could have taken days or weeks without the amazing Internet — if it could be done at all.

What's incredible is that most of us are refusing to acknowledge or realize the Internet's quiet impact on our lives. Today it's unfashionable to praise the Internet — probably because for many a bad taste from the so-called bust still lingers. Many were hurt by the dotcom crash. The loss of a job or a small fortune in the stock market is not easy to forget.

But — we now have to stand back to again acknowledge the Internet's influence and appreciate how our lives are continually changing. There's a quiet revolution progressing right under our noses!

The Internet is back. Or perhaps it never left.

Time will tell...

Happy New Year!


posted by John Herman 8:32 AM

Monday, December 06, 2004

The December '04 issue of Fast Company magazine is quite good with an emphasis on creativity, containing several interesting articles. My favorite addresses the apparent dearth of innovation at Microsoft.

As a self-proclaimed "creative" person, I found that the piece confirmed my instinctive belief that, as with love, you can't buy creativity. The pursuit of creativity with the primary goal being income or profit is a fool's endeavor. Ever notice that once brilliant but struggling musicians become commercially famous, their music-writing ability seems to fall flat?

When the primary goal becomes the mighty dollar, rather than the thrill of writing a new tune (or a novel search algorithm!), the creative process shuts down. I would bet that Google's Sergey Brin and Larry Page, although excited about some kind of future success, did not fathom the extraordinary riches that have now befallen them. Their fame and fortune will soon begin to take its toll; Google will eventually become another Microsoft — fat and self-absorbed.

With all of its resources and apparently not much to lose at this point, Microsoft should pursue a bold experiment by retiring its entire R&D staff to a lifetime of paid leisure and recreation — with a long string attached. If they are truly creative individuals, they will by innate drive persevere unfettered by corporate influence — exercising their natural talents to satisfy their own curiosity and to chase some nebulous fortune, of which Microsoft would prehend 10%.

Given the R&D horde that would be unleashed and released "into the wild," Bill & Co. could profit handsomely from the many Google-like successes that would likely result.

Time will tell...


posted by John Herman 7:24 AM

Friday, November 12, 2004

Now that we're free(!) from the distraction of the U.S. presidential election, which already seems old, we can get back to addressing more substantive subject matter. (However, the unexpected "landslide" results that delegitimized the too-close-to-call polls and too-far-out pols and pundits can serve to preface this post's theme of how tenuous our knowledge is.)

Although it was difficult to imagine that anything else was going on in the world, given the saturated news coverage given to the election, life did go on — albeit almost invisibly.

Unnoticed and under-appreciated were two mind-shaking magazine/journal articles illustrating once again that we don't know nearly as much as we think we know — and that nature is more complex than we can imagine.

The first, as explained in the October issue of Scientific American, puts a chink in what has become known as DNA's "Central Dogma" (quickly cast in concrete four decades ago), which is what we confidently thought was the process that translates DNA into proteins. The incredible unraveling of the mystery of how DNA is constructed, within the unseen world of molecules — how it unzips and how it spawns m- and t-RNA to translate its blueprint to produce proteins from amino acids — was no doubt a feat of intellectual prowess, adding considerable bulk to our already overflowing storehouse of intellectual bravado. But, it turns out that this elegant scenario derived by studying lesser forms of life (E. coli from the bowel no less) is too simple to explain our complexity. Defining the "Central Dogma" for all life forms was premature.

Thanks to the Human Genome Project, we now know that humans have not-too-many-more genes than lower forms of life such as a flatworm(!). We also just about match a chimp gene for gene! Not to be humbled too long, humans — in the form of bio scientists — quickly pursued finding out how we can be so complex with relatively so few DNA building blocks. It turns out there is a whole lot more going on that went unseen, and it has to do with RNA and all the "junk" DNA that had been ignorantly dismissed. We won't go into details here (read the great article), but it appears to be controlled biochemical chaos (yes!) that makes us human. Once again, Nature embarrasses us and shows she's no simpleton.

The second humbling article comes from the journal Nature, which was also given minute space by some newspapers and newsweeklies, buried deeply under the weight of election coverage.

Just as it appears that our DNA story has to be rewritten, so too does the story of our evolving from more primitive ancestors struggling to walk, talk and think. The discovery of a primitive, miniature hominid that apparently existed just 14,000 years ago, side by side with species sapiens, puts a huge chink into our (formerly) firmly planted evolutionary tree — which appears now to be a huge bush where many of the outermost branches existed together. (Some are even whispering that we sapiens might have out-survived these lesser hominids through carnivorous means!)

The lessons to be learned once again are:
How many other stories that we have already naïvely cast into concrete will have to be re-written?

How much more don't we know, yet egotistically think we already know?

When will we finally respect the complexities of Mother Nature?

Heeeello...! Ray Kurzweil! Are you listening?

Time will tell...

posted by John Herman 9:33 AM

Wednesday, September 29, 2004

Is Technology Thwarting the Democratic Process?

It appears our previous post overestimated the level of discourse that would help in evaluating whether challenger John Kerry would be a better U.S. president than incumbent George W. Bush. There's been nary a word about the impending energy crisis and consequent economic chaos as the rest of the world strives to catch up to the U.S.

Instead of serious discussions of important issues like energy — as well as the global economy, environment, conservation, health, peace and stability — we are bombarded daily with carefully crafted sound bites, incessant personal attacks, accusations, decades-old misdeeds, and... poll results! Both major political parties have employed verbal SWAT teams to instantaneously attack and counter-attack their opponent's charges that literally can be instantly heard 'round the world for anyone still interested.

How did it all come to this? Technology! As in TV, satellites and cable. As in sophisticated systems for marketing and polling. And the harried and hurried, 24/7 connected lifestyles that afford little time to consume more substantive data even if it was presented.

America's founding fathers would shudder at how their idealistic vision has blurred with the worst that technology has to offer.

Perhaps the most powerful influence is TV. Even before the fateful debate between Kennedy and Nixon, Eisenhower employed the marketing powerhouse Rosser Reeves (known for "Melts in your mouth, not in your hand" as well as other famous commercial taglines) to subtly portray him in the TV ads as authoritative yet likable.

It was the first use of a political "ad campaign" with short 30-second "spots" to manipulate the viewer's perception of the candidate. Its success set the stage (so to speak) for ever more sophisticated principles and techniques of legal brainwashing en masse.

At about the same time, the science of polling "Galluped" ahead and provided the so-called pulse of the nation — which can itself influence opinions exponentially.

Finally, the ubiquitous electronic medium facilitates the infamous "debates," which have become anything but. The intense negotiations that precede them strive to minimize spontaneity and maximize the probability that there'll be no major gaffes that would be aired ad infinitum — leading to the destruction of the unfortunate perpetrator.

So when will it all end?

Short term: after the U.S. election, when there's a collective sigh of relief among its citizens that it will be at least a year (!) before the 2008 election is mentioned.

Long term: When the institutionalization of Phrenicea occurs. The capabilities of Phrenicea lead to less complicated lifestyles, so simple in fact as to resemble 18th and 19th century living, if not before.

Time will tell...

posted by John Herman 7:10 PM

Wednesday, July 14, 2004

The Phrenicea scenario envisions the eventual adoption of solar power for dwellings and transportation. So why did we not choose hydrogen, now being peddled to hopefuls — mostly by politicians — as the fuel of the future?

With the U.S. presidential election fast approaching, there's inevitably going to be some debate about our (in reality the world's) dependence on oil and the vulnerabilities that that engenders, not to mention the deleterious impact on the environment. So-called green alternative fuels that sound too good to be true, such as hydrogen, will be broached superficially — and a gullible public will once again be lulled into a false state of optimism. "Ok, next topic folks..."

Likely not discussed: the U.S.'s eventual loss of its monopoly on living standards supported by the high consumption of the earth's resources — as the so-called "have not" nations aggressively embrace capitalistic-like models, perhaps crudely at first. This will more than offset any progress that might be garnered with expensive high-tech conservation efforts by the "already haves." The consequence is that things are going to get much worse before they get any better.

So why not hydrogen, a potentially clean fuel?

It's true that hydrogen is literally everywhere, but not in a state at our (sea) level where it can be readily consumed. Hydrogen is found in nature usually tightly bound to other elements, the most abundant being in combination with oxygen to form water. If it is bound to itself (H2), it is literally in the stratosphere in trace amounts, being about 14 times lighter than air.
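
That "about 14 times lighter" figure can be sanity-checked with a quick back-of-the-envelope calculation: at the same temperature and pressure, a gas's density is proportional to its molar mass (ideal gas law), so comparing air's average molar mass to hydrogen's gives the density ratio. The constants below are standard textbook values, not from the original post:

```python
# Sanity check: how much lighter than air is molecular hydrogen?
# At equal temperature and pressure, density scales with molar mass.
M_H2 = 2.016    # g/mol, molecular hydrogen (H2)
M_air = 28.97   # g/mol, average for dry air

ratio = M_air / M_H2
print(f"Air is about {ratio:.1f} times denser than H2")  # ~14.4
```

So the post's rounded "14 times" holds up.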

Significant energy is required to split hydrogen from the bonds of a strong molecular grip. Where would this energy come from? It can be "dirty," from fossil fuel — but that would be ludicrous. Or it can be clean from wind, waves or other (fanciful) source — but that is frankly wishful thinking.
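
To put a rough number on "significant energy," here is a minimal sketch of the thermodynamic floor for splitting water, assuming the textbook enthalpy of formation of liquid water (about 285.8 kJ per mole of H2 produced); these figures are illustrative assumptions, not from the original post:

```python
# Thermodynamic minimum energy to split water into H2 and O2 (a sketch).
DELTA_H = 285.8e3   # J per mol of H2 produced (higher-heating-value basis)
M_H2 = 2.016e-3     # kg of H2 per mol

energy_per_kg = DELTA_H / M_H2   # J required per kg of H2, at theoretical best
print(f"At least {energy_per_kg / 1e6:.0f} MJ to produce 1 kg of H2")  # ~142 MJ
```

Since burning that kilogram of hydrogen returns at most the same energy, and real electrolyzers fall well short of the theoretical minimum, hydrogen is an energy carrier rather than an energy source — which is exactly why the question "where would this energy come from?" matters.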

In addition, transporting and storing hydrogen is cumbersome and dangerous — requiring either cryogenic temperatures or 5,000-10,000 pounds of pressure per square inch. Imagine living beside — or worse — driving around with storage tanks under such stress. And just imagine the cost to build a transport infrastructure as intricate as what has evolved for fossil fuels over the past century. The mind boggles.
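
For readers more comfortable with metric units, the quoted tank pressures convert as follows (the pascals-per-psi factor is the standard one, supplied here as an illustration):

```python
# Convert the quoted 5,000-10,000 psi storage pressures to bar and atmospheres.
PSI_TO_PA = 6894.757   # pascals per pound-per-square-inch

for psi in (5_000, 10_000):
    pascals = psi * PSI_TO_PA
    print(f"{psi:>6,} psi = {pascals / 1e5:.0f} bar = {pascals / 101_325:.0f} atm")
```

Roughly 345 to 690 bar — several hundred times atmospheric pressure — which underscores why living beside, or driving around with, such tanks gives one pause.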

Consequently, we envision solo- and solar-based energy to be the future. The price paid will be significant — diminished freedom to travel about and very modest abodes (aka cubicles).

Time will tell...

posted by John Herman 2:56 PM

Monday, May 17, 2004

Our "Insects Exercising" page ponders the fact that although we are the most intelligent species on earth, we're the only species dumb or vain enough to waste precious energy exercising with little more than sweat to show for it. (If we were really smart we'd jury-rig this selfish effort to generate electricity or something useful!)

That observation may soon change however. The company Takara recently announced the development of a new gadget called Cat Attack, which in their words is "the world's first remote control cat toy for the 21st century cat." (Twenty-first century cat?!!!) Basically, it is a human-operated device that attempts to engage our "sedentary cats with weight issues" to get them off their furry behinds to "exercise." With an estimated 77 million pet cats in the U.S. alone, you can guess the company's incentive for developing this novelty.

The toy purportedly can mimic the spontaneous movements of a mouse in an attempt to revive the cat's feral instinct, which evolved to maximize survival in the wild. How does it do this? Here's the kitty litter according to Takara's press release:

"Cat Attack uses the latest research in chaos theory and complex systems to simulate the movements and personality of a cat's favorite prey. This "virtual mouse" technology utilizes algorithms modeled upon the neural network of a real mouse..."
(Visit http://www.takara-usa.com/ for more info.)
Chaos theory? It's chaos alright! So, now that we've supplanted evolution to remake the domestic descendant of species silvestris in our own image — a fat and lazy couch potato — we're attempting to entice these coddled animals to exercise. This is pushing anthropomorphism to the extreme.

Nevertheless, all this will eventually be made moot if you subscribe to the Phrenicea scenario of the future, where all live pets — predator and prey alike — will be banned and replaced with artificial ones for the sake of the environment.

Time will tell!


posted by John Herman 10:26 PM

Tuesday, April 06, 2004

I am often asked what the purpose of the Phrenicea website is, and if it's lucrative. ("With respectable traffic," they say, "why not display banner ads?") When replying that it's a not-for-profit pursuit, I sense that I am perceived as naïve or idealistic, and perhaps that I am wasting good time that could be better spent in a moneymaking endeavor.

This repeated experience for me prompts the question, Should all of our time be devoted to acquiring wealth and material gain?

So much today is geared towards money accumulation, often as a primary objective, rather than as a byproduct of an excellent product or service. For many corporations, steadily increasing profit is their overarching goal. I've noticed too that many times it's these companies in particular that boast the loudest about having a "mission" to serve their customers; and that their customers are "number one." But the empty catchphrases are transparent to the hapless customers and especially to their employees — who see the hypocrisy through the hype.

A great case study is General Motors — still big, but once the undisputed leader in auto sales as well as styling — which became particularly greedy in the 1970s, when the "bean counters" were elevated above the engineers and stylists to pursue a suicidal strategy of maximizing profits by sharing components among the various divisions. At first it was a cost saver and very successful, until buyers caught on to the homogenization. It did not take long for GM's prestige brands, Buick and Cadillac, groomed for decades by solid engineering and ingenious marketing strategies, to be viewed as glorified Chevys and Pontiacs. Oldsmobile did not even survive the debacle. Having learned a hard and very costly lesson, GM is concentrating again on "product" in its attempt to regain credibility vis-à-vis the European and Japanese competition.

(Will Google soon make analogous mistakes? It's going to be interesting to watch what occurs when it takes the plunge from private to public. It would be a shame to see their from-the-heart creativity and innovation; their bold and brash approach to pursuing new content and features; and their naïve, light-hearted and fun approach quashed by a profit mentality that is synonymous with most public corporations "answering" to their stockholders. Will every decision-making effort be scrutinized for its impact on revenue generation? Oh well.)

So, after that bloated blog-empowered diversion, What IS the purpose of the Phrenicea site?

At the risk of once again being considered naïve or idealistic — it is to gain satisfaction from inducing introspection, thought or discussion about potential futures as well as what is going on today. I will be very gratified if:

  • someone visits the "H2Ouch" page and becomes cognizant of wasting fresh water;
  • someone visits the "Chiphead" page and begins to wonder at all the complexity that is their brain;
  • someone visits the "Hot Topic" page and ponders our impact on the environment;
  • someone visits the "No Pets" page and becomes enraged at the blunt scenario, but then begins to think about the irresponsible neglect of countless homeless pets today;
  • someone visits the "No Money" page and considers the now-unbelievable possibility of replacing a computer-based money system with a biological one;
  • someone visits the "Construction" page and is tempted to learn more about the current state of brain research.
If even a minority of Phrenicea visitors becomes more curious or inquisitive about the issues of today — or tomorrow, then it's worth the time and effort. If that is naïve or idealistic, then so be it.

Time will tell...


posted by John Herman 7:01 PM

Saturday, March 06, 2004

How about a lighthearted blog for a change!

Is our happiness inversely proportional to the amount of choices that we have? I am beginning to believe that it is.

It wasn't that long ago when we didn't have hundreds of TV channels; when it didn't seem so difficult to find something interesting or enjoyable to watch — even with all the commercials. Curiously, so many complain today that they just can't figure out what to watch with so many programs available. Is the quality that poor now or is it just too many from which to choose? Watching some of the old TV series on DVD or on the nostalgia channels, I don't believe it's a quality issue!

There was also a time when movie theaters played just one movie for a week or more. Imagine! So there you were, stuck in there to watch the entire show to get your money's worth, even if it was boring or bad. But how many movies seemed that way, until you watched the whole thing? Only then would you begin to appreciate it. Sometimes days later. Sometimes years!

Gone With the Wind was that way the first time. It seemed to go on forever. I surely would have left it for Godzilla if I had the chance. But then I would have missed that powerful ending. It's still worth waiting for Rhett Butler to stiff Scarlett O'Hara with that "I don't give a damn" line. Yes!

Now, at the megaplexes with ten or more different movies playing simultaneously, a new phenomenon has emerged: "movie hopping," where the bored can simply get up and sneak into another movie in progress. It's against the rules but it's happening. It's even got a name, and it's cool — or whatever the latest colloquial jargon is today (da bomb?). Unfortunately though, the hoppers will never have the chance to appreciate the movie they "thought" they would not like. Sadly, they may not really enjoy any of them — with the missed beginnings, endings and in-betweens.

Another observation: Why is it that I tend to enjoy a movie more when it's on broadcast TV, even if accompanied with commercials, than when watching it on DVD? (Even if I own the DVD that is... somewhere!) Is it because I don't have the option of rewinding, fast forwarding or pausing at will? Because I don't have that compelling choice just a remote-control button away? I'm beginning to believe that is the answer!

*****
Of course I now have to weave this thread of thought into some comment about Phrenicea. So — imagine if you had all of the world's knowledge at your command (by merely engaging Phrenicea). Would you Phrenicea-hop here, there and everywhere to the point of not concentrating on anything?

Time will tell...


posted by John Herman 10:17 AM

Monday, February 02, 2004

Internet anarchy seems to be getting worse each day.

Literally hundreds of junk emails are dumped into our mailboxes daily, many with triple-X-rated animated porn. Phony online missives are cleverly disguised with genuine logos and HTML layouts that could convince even the savviest PC user to provide banking, eBay or AOL information to "reactivate" accounts — resulting in instant identity theft.

Worms and viruses are tailored to penetrate the membrane of Microsoft's soft(ware) underbelly — held at bay only by hard and soft firewalls that must be perpetually fortified. We're witness to a dizzying escalation of the matériel of cyberspace, rivaling the MAD-plagued world that followed WWII.

And now there's booble.com, launched January 20, with a brilliant branding strategy to become one of the best-known sites overnight. It's an all-too-obvious rip-off of Google's name and web presentation — for searching that subset of the web classified as "adult oriented."
Booble executives dismiss Google's claims of trademark infringement, stating that the site is merely a "funny parody" of Google. Google in turn is being sued for selling advertisers keywords that are not merely generic terms but registered company names.

Where will it all end? We predict it's going to get much worse — the rotten fruit of unimaginable ingenuity — to the point where we'll gladly embrace with open arms a centralized authority to keep things under tight control. Might this be Phrenicea?

Time will tell...

posted by John Herman 7:15 PM

Monday, December 15, 2003

Our very own John Herman, Director of Phrenicea, has a thoroughly opinionated letter published in the 29 December 2003 issue of Fortune magazine:
"With all due respect to W. Brian Arthur, the information revolution's continued advance in breadth and sophistication will not lead us into decades of prosperity but will level the social and economic playing field for all of the world's population. The result will be a lowering of the standard of living for today's developed nations, as Third World and developing nations acquire ready access to the same knowledge and information base.

Arthur should extrapolate on a global scale the U.S.-India relationship described in "Where Your Job Is Going" in the same issue, where all of the world's population will then compete for whatever few jobs might be left that his vaunted technology has not yet eliminated."


It seems that the eminent Citibank professor W. Brian Arthur published an essay in the 24 November issue of Fortune postulating that "the main hope for future economic golden eras remains [the] cluster of technologies we call information technology [since] it's too early for biotech or nanotech to transform anything..." This was more than enough to get Herman's goat, prompting him to fire off an editorial letter to the magazine.

If you're at all familiar with the Phrenicea scenario of the future, it's our take that biotech will be the predominant force of change in this century — offsetting the social and economic damage done by information technology.

Time will tell!


posted by The Editors 7:16 PM

Wednesday, November 19, 2003

We recently received a thoughtful email on the subject of Phrenicea's Polynutriment:

Subject: Krill and Plankton as Polynutriment? Are you sure?

While reading through your website, I found it rather interesting, but I also found something else. About the creation of this Polynutriment, the food that is to be a "custard-like porridge" and will take care of our food needs:

You say it is to be made of krill and plankton. However, while reading an article (which may or may not be true) I found that, as the world's population is growing, we no longer have so much food as to suffice for everyone's appetite (have we ever?). Consequently, we have been scouring the ocean for krill. You may well know that whales eat krill and plankton, and as we eradicate their food supply, they will become extinct. "By reaping the smallest life forms, we may well eliminate the largest," is the ending quote of the article.

This suggests that there isn't enough krill and plankton to endlessly make the Polynutriment, unless we are to live in a world in which no life forms except humans (and possibly bacteria) exist. So, the question (or rather, questions) is, are there even enough krill and plankton left to make an endless supply of Polynutriment?

Also, will we make animals extinct (namely, whales, so we can eat "their" krill)?

Thank you. Great site!
Max


Dear Max,
Thank you for your email, and kind comments concerning our website.

To answer your first question, "Are you sure?" The answer is... No! Of course no one can predict the future with any certainty. Our mission is to stimulate thought and discussion regarding the future and the present — and it appears we have been successful in your case!

We are very concerned about the potential near-term extinction of many ocean-dwelling species due to the super-mechanized techniques that have been developed to snare ever-larger catches. Species at the lower rungs of the food chain are being targeted already, as traditional catches become smaller. Analogous concerns linger in regard to mechanized mega-farms on land.

The over-the-top Polynutriment scenario hopes to bring these concerns to light for visitors of the Phrenicea site.

Your comment regarding the availability of plankton and krill as a sole source to produce Polynutriment is a valid one.

Visit the new Alfred Hitchcock-inspired web page to learn of an additional source!

Thanks for the excellent email!

posted by John Herman 7:20 PM

Wednesday, October 08, 2003

It appears now that California will benefit from the naïve intelligence that we alluded to way back in August ("Two Cents" below). The state's voters have spoken loud and clear — there will be a Governor Arnold (easier to pronounce). The desperate efforts by the established politico types to disgrace The Terminator could not mask his sincere conviction to try his best to do good for the people.

It will not be easy going, however. After years of mismanagement, the state is a mess. Arnold will be underestimated at first, and no doubt he faces a tremendous learning curve. And there will be threats of more recalls.

Regardless, we will go on record here to predict that history will one day record that Arnold was one of the best governors the state ever had.

Only time will tell of course, as is typical with this futurist stuff.

Naïve intelligence — watch it work wonders.

posted by John Herman 7:38 PM


Hmmm... If only time will tell then maybe TIME already has!

Postscript by John Herman June 7, 2007 (from the TIME Magazine 100 issue — May 14, 2007)


Hmmm... If only time will tell then maybe TIME already has!

Postscript by John Herman June 18, 2007 (from TIME Magazine — June 25, 2007)


Hmmm... If only time will tell then maybe WIRED already has!

Postscript by John Herman October 21, 2007 (from WIRED Magazine — May, 2007)


Hmmm... If only time will tell then maybe US NEWS already has!


Postscript by John Herman November 15, 2007 (from WIRED Magazine — May, 2007)

Tuesday, August 12, 2003

Ahhhhnold! You made the cover of Time magazine! And Newsweek! Incredible! An incredible life story — from literally nothing to running for governor of the state with the largest population in the U.S.

Why are we at Phrenicea excited? It has nothing to do with politics — a fray we try to refrain from stepping into. Arnold's journey, and especially his most recent pursuit, is a wonderful example of "naïve intelligence" or "intelligent naïveté."

Schwarzenegger's serious opponents, as well as the news media, will inevitably zero in on his political inexperience as a major liability. We disagree. His political naïveté is an asset — which will foster creative solutions to surmount constraints erected by those calcified with formal training (law in this case) and by the political stalwarts with a limiting mind-set that says "it's just not done that way" or the condescending "that's not how things work."

We often see examples of Intelligent Naïveté without acknowledging its impact:

One of our favorite musicians, Ian Anderson of Jethro Tull, wrote the best material of his career while playing the flute as if it were an electric guitar, with fingering learned (wrongly) without the benefit of lessons. Ironically, subsequent retraining and cognizance of proper technique produced more complexity, but the music seems to lack the simple beauty and timelessness of his original work. This same headfirst naïveté drove him to develop one of the most successful salmon farms in Scotland.

Jackie Gleason arguably peaked with his 39 scrappy episodes of The Honeymooners, and then became so self-conscious of his intelligence and "greatness" that he could never match their simple magic.

Michael Dell, in his campus dorm room, began selling PCs he assembled to fellow students. Little did he know that this simple model would revolutionize manufacturing methods and eventually be taught at major business schools.

And we all know about the most famous college dropout, Bill Gates — hoodwinking "Big Blue," the then-colossus of computing.

The point we're trying to drive home is that it is not typically the experts who are extraordinary. It's often the intelligent but naïve — those brave or ignorant enough to push on where others would not tread.

And the germane point here is that our passion — the study of the future — does not have to be limited to the "experts" or those claiming to have the proper training. Futurism is an open field of study where the intelligent but naïve can have just as much of a chance to correctly predict future events or trends as one claiming to be a professional futurist.

So go Ahhhhnold! We'd love to see you as Governor Arnold and watch you flex your brain muscle to turn conventional wisdom on its head.

posted by John Herman 11:26 AM

Wednesday, July 23, 2003

The Phrenicea scenario of the future depicts a worldwide society that has abandoned sexual intercourse, conjugation, pregnancy and birthing in favor of clinical cloning.

As time passes, this scenario first presented in the 1999 short story "The Engagement of Phrenicea" is becoming tame. Fleshed out somewhat on our website since then, the scenario envisions women no longer enduring pregnancies and labor.

According to the story/web scenario, each individual would be allowed to initiate one clone, of himself or herself — or another chosen individual. Originally considered a bold prediction, it pales with today's reality.

The July 12 issue of the British magazine New Scientist editorially condemns an effort being pursued by a team in Israel to take the eggs from the ovaries of aborted fetuses with the intent of fertilizing them in vitro. If eventually successful, the resulting child would have a mother that was never born!

Perhaps the Phrenicea scenario was unknowingly prescient to insist on cloning of (once) living individuals that were the product of a natural conjugation. Suddenly human cloning almost seems benign.
posted by John Herman 6:38 PM

Saturday, June 21, 2003

Here's Two Cents from a thoughtful visitor of our Phrenicea site on the subject of "Will computers ever rule the World?":

When I was in college, I was haunted by the question, "What is a computer?" I started out in theoretical Computer Science and ended up in Computer Engineering. I also spent twenty years thinking about Artificial Intelligence and have the following conclusions:

A computer, based on the von Neumann Architecture, is a dumb, stupid machine. And a von Neumann computer WILL ALWAYS BE just a dumb stupid machine. The popular mythology of the day is that computers will evolve to have superhuman intelligence. How this will happen is just beyond my imagination. You can't violate the laws of physics just because you want to or say so. Computer bits are like an arrangement of rocks in a field. We humans attribute meaning to that arrangement, but essentially it's just rocks. von Neumann computers based on discrete mathematics will always have limitations.

We are currently in a period of human cultural evolution where "the computer" is mythologized to be a God. Just like Superman, "the man of steel," was mythologized to be a God 60 years ago in the age of monster high-rise buildings. It seems popular culture must have fictional characters that rise above that which currently amazes us. To leap tall buildings in a single bound, bend steel, run faster than a locomotive. For computers we have Commander Data of Star Trek and Neo of Matrix. These characters have the ability to rise above it and be masters of their universes.

Hard AI people will argue humans are bound by the same limitations as computers. That's just not completely true. From the perspective of discrete mathematics, then yes. But the human mind goes beyond the von Neumann architecture in the following respects:
Thoughts in the human mind come into existence in non-discrete, non-deterministic ways. The electrical activity of the brain has analog side effects, which give rise to creativity. The human mind on the surface appears to be logical but at the core it is fundamentally non-deterministic. Although the final states can be simulated, the causality of the states, or that which causes state transitions, requires gray matter.

Creative solutions to problems are not synthesized through logic. Creative solutions are happened upon after many potential solutions are discarded without conscious thought. The way to solve a difficult problem is to "be" with the problem over time. This way of solving problems has nothing to do with the von Neumann Architecture. The longer you are "with" the problem the greater your chance of solving it.

Now if someone starts selling DNA-based computers, then we may develop hard AI that is equally smart (and stupid) as human intelligence. And even if we do have DNA-based computers, I don't think we can just assume that building human intelligence into a machine will automatically mean a lack of stupid behavior.

One of the things we used to say in college is that before you can have AI, you first have to have or understand intelligence. This led me into studying philosophy and eventually psychology. After many years it became clear to me that the human mind is only part of the equation of human intelligence. Jungian or depth psychology shows we are far more complicated.

I think our broader society has to understand that computers have profound limitations. My hope is that eventually the self-esteem of human intelligence is restored.

D.P.
Well, D.P. certainly does not appear to be a "chiphead," our respectfully disrespectful appellation for Ray Kurzweil, champion of chip-based intelligence.

So — please keep those cards and letters (aka email) coming!
posted by John Herman 8:34 AM

Saturday, May 31, 2003

Lots of good stuff spotted in the media recently:

Time magazine's June 2nd cover story asks the question, "Are you programmed from birth, or does life change the program?" Given the realization after the Human Genome Project's surprising finding that humans have only 30,000 genes vs. the overblown guess of 100,000 — or about twice the number of a lowly worm — scientists are rethinking how that could possibly be enough to make us human. The shocking theory now is that our environment has much more influence on our development — beginning in the womb and lasting throughout our lifetimes — than was ever imagined. "Genes are not static blueprints that dictate our destiny. How they are expressed [...] is affected by changes in the womb, by the environment and by other factors." This is revolutionary thinking after so many years of believing that the DNA sequence per se defined our destiny.

The May 30th New York Times reports that scientists at the University of Idaho and Utah State University cloned a mule [the sterile result of crossing a male donkey with a female horse], which heretofore had never reproduced. "This same technique," they state, "might also allow geldings, which are castrated horses, to have progeny." This is science fiction becoming reality!

In the May 30th issue of Cell, a scientific journal, researchers report discovering the "master gene" in embryonic stem cells that gives them their regenerative capabilities. This could lead to discovering how to create stem cells from any cell within the body, thus broadening the potential to clone individual body parts.

The May 10th issue of New Scientist magazine proclaims, "We could be on the verge of creating children from artificial eggs and sperm made in a lab dish." They surmise, "The applications might even include allowing infertile women or a pair of men to have their own children." This research is again related to gene expression — the turning on and off of key genes. Even more shocking — a man could pursue "self-fertilization," where his passel of genes would be used to create another variation of himself, not an identical clone.

The famous Cold Spring Harbor Laboratory's DNA-learning Web site features a free chapter download from perhaps the most successful biology textbook ever, DNA Science: A First Course, which states matter-of-factly: "We stand at the threshold of a new century with the whole human genome stretched out before us." It seemingly describes the Phrenicea website in stating that "the border between science fact and fiction will blur as we move further into the genome age." And again, mirroring the boldness that is Phrenicea: "A tenfold increase in the information capacity of photolithographic DNA arrays could condense the entire human genome onto a chip the size of a postage stamp." Suddenly, the thought of using our genome to identify who we are becomes technically feasible.
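That postage-stamp claim is easy to sanity-check with back-of-the-envelope arithmetic: the human genome runs roughly 3 billion base pairs, and with four possible bases each position carries just two bits of information. A minimal sketch in Python (the figures are round approximations, not exact measurements):

```python
# Back-of-the-envelope check: how much raw information is in the human genome?
# Figures below are round approximations, not exact measurements.

genome_bases = 3_000_000_000   # ~3 billion base pairs in the human genome
bits_per_base = 2              # four possible bases (A, C, G, T) -> 2 bits each

total_bits = genome_bases * bits_per_base
total_bytes = total_bits // 8

print(f"{total_bytes / 1e6:.0f} MB")  # prints "750 MB"
```

At roughly 750 MB of raw sequence, the entire genome is a modest payload even by early-2000s storage standards, which is what makes the chip-scale claim plausible.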

The next couple of decades will be exciting indeed. It seems that whatever we can fathom now, even if deemed preposterous, can become plausible after just a few years.

What color will you choose for your cubicle?
posted by John Herman 8:18 AM

Tuesday, May 06, 2003

The quote, "The Future is Now," is by now a cliché. However, things are happening in this world that are truly frightening.

We present a scenario on our "sterile" page depicting paranoid, super-precautionary lifestyles — a society with bizarre behavioral adaptations such as "donning surgical masks, caps and even gloves" when going about. Well, that image is already commonplace in China with the sudden impact of SARS — truly a "wildcard" event. (Newsweek magazine has an excellent "My Turn" piece depicting life with SARS.)

We also predict widespread cubicle dwelling, which can still be considered way-out by most — right now. In what may be signaling a burgeoning trend, the hosts of the Webby Awards recently announced that, "with concerns about traveling heightened by global events, [they] would bring [the awards] to the winners around the world rather than having the honorees travel to San Francisco, [since] a significant number of nominees have expressed concerns about traveling at this time." The International Academy of Digital Arts & Sciences will use Web technology to offset the dangers engendered by technology. (Technology offsetting technology.) Consequently, the winners will accept their award while planted in front of their PCs in their PJs, rather than basking in the limelight at a posh ceremony in San Francisco.

Cubicle dwelling seems like a wacky prediction and the future as per Phrenicea is not yet now. But at this pace, for how much longer?

posted by John Herman 6:57 PM

Monday, May 05, 2003

One of the more controversial aspects of the Phrenicea scenario, in terms of feedback, concerns the prohibition of keeping live pets to minimize fecal pollution. The "Two Cents" submitted by J.S.H. is fairly representative:

I have to make a comment on the subject that "all pets will be banned in 2030." This is one Phrenicea vision that will not occur for several reasons. A dog is one of man's best friends and for many so are cats and birds. There are billions of animal lovers in this world, and if their pets were banned, an enormous black market would develop. The other reason — there is an advocacy group that is for the preservation of all species and it's obvious they had the political clout to save the tsetse fly and the spotted owl.

A more logical solution to the [pollution] problem would be to impose a heavy tax on the purchase of pets and by law the owners would be responsible for the disposal of the biological waste on a daily basis. Of course that would discourage many pet owners because of the extra cost and task them daily to an unpleasant procedure. They would be candidates for having "matties" as [artificial] pets.

No Phrenicea, there will be bow-wows, meows and songs and chirps forever and ever. And nature will still provide society with the songs of the robin, the caw of the crows and warble of many other songbirds.

Perhaps Phrenicea by 2030 can find the formula to return the Canadian geese back to Canada, it's obvious they are the biggest poopers of all!

Sincerely,
J.S.H.

Perhaps J.S.H. is right. As we say elsewhere on this site, "Everyone is qualified — and no one is qualified — to predict the future." However, taxes and impositions would not diminish the volume of pet-generated fecal matter contributing to a calamitous pollution problem that can no longer be disguised, regardless of how "properly" it is disposed of.
posted by John Herman 6:52 PM

Thursday, March 13, 2003

Although it is not our place or purpose to side with the hawks or doves in regard to impending war, the current state of affairs is a crisis situation that causes concern and anxiety regardless of one's ideology.

Worrisome too is advancing technology and the consequent sophistication of weaponry designed to cause destruction. Whether that weaponry is wielded by identifiable governments or, more frightening, by countless individuals ultimately matters little — to the victims.

The hard reality is that the capability and potential for massive loss of life continues to grow. We as a species appear to harbor vestigial behavior that may have contributed to our survival when the most potent weapon was a stick or a sharp piece of stone. This behavior, when amplified with tools of incredible destructive power, just might ultimately end the human race.

"Survival of the fittest" got us to where we are today. But it's time to consciously redirect the clever behavior that served us well for millennia and apply it towards the pursuit of coexistence. As we say elsewhere on this site, the stark realization is that our survival as a species is now up to us.

It is a job for all to pursue and it's in all of our best interest.
posted by John Herman 7:28 PM

Wednesday, January 29, 2003

February 28, 2003 will mark 50 years from the Saturday in 1953 when 25-year-old James Watson and 37-year-old Francis Crick discovered the double helix structure of DNA. This event was perhaps the very first of many in what would eventually be called the Biotechnology Revolution, a phenomenon that is still proceeding at a rapid pace today (and will — in our opinion — continue throughout this century). Perhaps if the two were less ceremonial they might have been first to exclaim to the world, "That's one small step for [a] man, one giant leap for mankind," as Neil Armstrong did sixteen years later with his first footprint on the moon. (In retrospect, which achievement do you believe is more significant?)

There will be much fanfare in the general press marking the DNA discovery — and rightly so. Of all the "secrets of life" revealed before and since, elucidating this molecule is arguably the most momentous. It is rare that the complexities of nature, translated into terms understandable to man, become popularized and familiar to the layperson. This "golden anniversary" is therefore a celebration of more than the discovery itself.

Subsequent research performed by many others later revealed the elegance of DNA replication, RNA's role in assembling amino acids into proteins, and the tremendous importance attributed to the specific sequence of DNA's nucleotides, which when paired make up the "rungs" of the ladder-like structure. Later in life Watson would be instrumental in the effort to sequence the entire human genome — a very significant accomplishment that hasn't even begun to fulfill its potential.

The Phrenicea editors believe that we'll go beyond merely accepting nature's DNA construct and will take advantage of the supremely elegant scheme — to store non-somatic data on artificial DNA. Imagine an additional man-made 24th human chromosome composed of genes related to our own (comparatively trivial) complexities such as genealogy and family history, educational level, political persuasion, retirement account balances, medical history, credit card activity, marital status, cloning status[!], etc. Only time will tell.
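For illustration only, here is how trivially such non-somatic data could be serialized onto a synthetic sequence: with four bases available, each nucleotide holds two bits, so any byte string maps to four bases per byte. This Python sketch uses an arbitrary bit-to-base mapping of our own invention — it reflects no real encoding standard and nothing about the biology of gene expression:

```python
# Illustrative sketch only: encode arbitrary binary data as a DNA base
# sequence, two bits per nucleotide. The bit <-> base mapping below is an
# arbitrary assumption for demonstration, not an actual biological scheme.

BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides, most-significant bit pair first."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(seq: str) -> bytes:
    """Invert encode(): every four nucleotides become one byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BITS_FOR_BASE[base]
        out.append(byte)
    return bytes(out)

record = b"credit: 120"                  # a hypothetical non-somatic record
assert decode(encode(record)) == record  # round-trips losslessly
print(len(encode(record)))               # four bases per byte -> prints 44
```

The point of the sketch is only that DNA's four-letter alphabet is a perfectly ordinary digital medium: any record that fits in bytes fits in bases, at a fixed four-bases-per-byte cost.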

Nevertheless, it's now time to give hearty cheers to Watson and Crick for their brilliant discovery — and to all who have proceeded from their groundbreaking footsteps!

posted by John Herman 6:52 PM

Friday, January 03, 2003

So now the world awaits proof from Clonaid that they've succeeded in cloning a human being. Given the relative bizarreness of the Raelians, the sect that Clonaid is associated with, skepticism is warranted. We here at Phrenicea believe that their success is ultimately irrelevant in the long term. If they fail to provide proof and upon further investigation their claim is dismissed as a hoax, then it will only be a matter of time before another well-funded group makes a similar announcement — but accompanied with scientific proof. This will inevitably happen!

The Phrenicea scenario of the future depicts a worldwide society that has abandoned sexual intercourse, conjugation, pregnancy and birthing in favor of clinical cloning. Why do we propound this radical view? Because it is human nature to pursue a course that involves the least amount of pain and effort. (Just compare today's soft society with the rigors of pre-industrial life.) The catalyst will be biotechnology's ability to concoct the artificial milieu that can nurture a human stem-like cell to term — successfully replacing the rich and protective environment of the uterus. Once the need for a woman's body to produce another human being is totally eliminated, inconvenient and painful pregnancies will be history.

posted by John Herman 11:00 AM

Friday, November 22, 2002

We here at Phrenicea are sensitive to items in the news that, while just routine articles to many, are actually bellwethers of things to come.

Not one but two significant stories were reported today.

The first is an announcement by J. Craig Venter, the scientist who was instrumental in shortening by years the effort to map the human genome. His Institute for Biological Energy Alternatives has received $3M from the U.S. Department of Energy to synthesize a chromosome — an artificial chromosome that doesn't exist in nature. This is truly significant and congruent with our vision of the future where DNA-based monetary units will be stored on a man-made 24th chromosome pair — replacing current forms of money (paper, electronic, etc.). What once sounded completely ridiculous suddenly does not!

The second news item comes from Port Everglades, Florida — where the cruise ship Amsterdam sits idle for a thorough (every nook and cranny) disinfecting. The effort will have every pillow replaced and literally every surface of the ship's interior and deck disinfected. Eighty-seven passengers from the just-completed cruise had to leave the ship after developing symptoms of the Norwalk virus — including diarrhea, stomach pain and vomiting. For those that remained, drastic behavioral modifications were adopted. To stem contagion — damping what normally should have been a fun time — passengers rubbed elbows instead of shaking hands, washed hands repeatedly, refrained from eating buffet-style food, and were thoroughly mindful of coming down with the disease. We believe this behavior will become more the norm as the world's population commingles and as germs become more resistant to our efforts to eliminate them.

posted by John Herman 4:06 PM

Monday, November 04, 2002

An alert fan of Phrenicea (who wishes to remain anonymous) recently sent us an email indicating that our three-year-old scenario may be catching on.

She reports that the November-December 2002 issue of The Futurist contains an article (page 41) written by doctoral fellow João Pedro de Magalhães (University of Namur — Facultés Universitaires Notre-Dame de la Paix, in Belgium) addressing what might occur when a single person has the potential to destroy humanity. He concludes:
"The most likely future of humanity will involve an initial decrease in personal freedom for an increase in security. The percentage of humans with power will decrease with time, which might lead to humankind being ruled by a single mind. Even if taking over humankind is difficult and occurs far in the future, it is an irreversible step. Therefore, if we look far into the future, we see that one mind might eventually rule mankind, or what is left of humankind at that time."
We here at Phrenicea headquarters are flattered. It appears that the terrorism wild card has turned João into a Phrenicea aficionado. We expect — as the century unfolds — to find many more sightings congruent with our vision. We will hereby immodestly reference these with a newly coined word, "phreni-FINDs."

Phrenicea Web visitors are invited to submit phreni-FINDs wherever found, which we'll chronicle on our On-Target page.

posted by John Herman 11:02 AM

Wednesday, October 23, 2002

The tragic events unfolding in the Washington area seem to have inaugurated an unprecedented and potentially frightening phenomenon: televised press conferences meant not for the press or viewing public, but for perpetrators of violence.

Imagine the consequences of such a concept should this pattern of crime proliferate: CNN- and Fox-like dedicated channels for publicly communicating with criminals!

Could this be just another step towards the scenario depicted on phrenicea.com, where a centralized controlling entity is required to neutralize unacceptable destructive capability of a technology-enabled populace?

posted by John Herman 11:05 AM

Friday, September 13, 2002

The recent announcement by McDonald's that they're changing the type of oil they use to make their irresistible French fries is another opportunity to demonstrate the principle of UNINTENDED CONSEQUENCES. We here at Phrenicea keep harping on the phenomenon since it seems to be completely ignored by Ray Kurzweil, whom we respectfully call "Mr. Chiphead" and devote an entire Web page to.

In the very early days, most fries were immersed into sizzling saturated animal fat, such as lard or tallow. "Saturated" refers to the lack of empty locations on the fat molecule to bond with hydrogen; that is, it is already saturated with hydrogen. Through studies, it was found that saturated fat causes high cholesterol, clogged arteries and heart disease.

Unsaturated fat, whose source is vegetables (corn, soybean, cottonseed, etc.), has many empty spots along the molecule not bonded to hydrogen (hence "unsaturated"). These fats were found to be healthier; however, they tend to become rancid quickly in use. That's when "partially hydrogenated" oils emerged: unsaturated oils chemically modified to bond with some hydrogen, though not enough to become saturated. These seemingly had the best qualities of both. Yes, Man did it again and beat Nature.

HOWEVER, over time studies revealed that these "trans" fats were actually worse than the saturated fats. The man-made chemical bond resulting from the hydrogenation process was not like Nature's, but was of a type called "trans." It appears now that they increase the bad cholesterol and decrease the good cholesterol! Who would ever expect that! Anyway, as we often state here, "Man cannot outsmart Mother Nature." There will always be UNINTENDED CONSEQUENCES, so be skeptical when listening to the so-called experts expound on how we will become one with computers, how computers will be smarter than us, etc.

posted by John Herman 1:25 PM

Tuesday, September 10, 2002

Hooray! The topic "Future Studies" is given rare media attention this week with a special ten-page section in Newsweek titled "The World in 2012." Our friend Joe Coates is mentioned, as is the World Future Society's convention this past July. (Perhaps next time they will mention the Phrenicea site/scenario!)

Unfortunately, the overall assessment of the discipline is one of pessimism when compared to its heyday 30 years ago, after Alvin Toffler's popular "Future Shock" runaway best seller sparked widespread interest and thinking about the future.

The final topic, "So Predictably Unpredictable," does not paint a rosy picture of futurism's impact on current political thinking. It tries to prove the futility of predicting the future by highlighting, much as our own Oops pages do, predictions that were way off base. Of course Phrenicea's motive for including unfulfilled predictions is not to disparage the field of study, but to prove the point that so-called "experts" can be as fallible as anyone, regardless of their credentials and braggadocian comportment. All ideas concerning future events have value.

So even if contemplating the future is not as popular as it once was, that's not a reason to write it off. To repeat our philosophy, thinking about the future inevitably stimulates scrutiny of the present, with the hope of improving conditions for all.
posted by John Herman 3:29 PM

Thursday, July 18, 2002

This week's issues of Time and Newsweek prompted this "Two Cents" posting. Both magazines feature cover stories revealing the shocking results of a rigorous, federally funded scientific study, the Women's Health Initiative (WHI), designed to finally determine the long-term effects of Hormone Replacement Therapy (HRT), the fancy term for prescribing middle-aged women supplemental estrogen and progestin.

Way back in 1966 Robert Wilson, a gynecologist, published a book titled "Feminine Forever," which proposed that women ingest the "wonder drug" estrogen, incredibly (for back then) mass-produced by pharmaceutical companies from the urine of pregnant mares, to ameliorate the deleterious side effects of menopause triggered by the natural decrease of the hormone with age. (Subsequently, many women, with the cooperation of their physicians, got "hooked" on the drug's beneficial aspects: supple skin, vibrant hair, sustained libido, etc.)

Like Ray Kurzweil,* the wildly optimistic technophile who today envisions all sorts of incredible (for now) man-made chips, micro-robots, etc. embedded within our brains and bodies — merging man with machine — Wilson boldly (and ignorantly) proposed back then that, since it was within our power to create this hormone in quantity, why not artificially keep estrogen levels perpetually on par with those of younger women?

It seems so simple! Just step in as Nature steps out! We humans are clever, knowledgeable and have ability! (Never mind that Nature spent millions of years perfecting our biochemical balancing act!)

But aha!!! "Unintended Consequences" strikes again. Now, some 36 years later, it's determined that artificially prolonging youth this way can cause invasive breast cancer and severe cardiovascular problems. And we've not a clue why!

Regardless of pedigree, level of learning, accomplishment, professional status, and reputation — playing with Nature is a dangerous game.

Once again, UNINTENDED CONSEQUENCES! Repeat after me...

*Whom we here at Phrenicea lovingly lambaste and call "Chiphead."
posted by John Herman 10:57 PM

Monday, July 01, 2002

The June 24th issue of the Proceedings of the National Academy of Sciences contains a report compiled by Mathis Wackernagel (author of "Our Ecological Footprint: Reducing Human Impact on the Earth") et al., of a California organization called Redefining Progress. The report purports to measure how much of the Earth's resources are being consumed by humans, and concludes that it would now take 1.25 Earths to regenerate all that is being consumed.

They estimate that human demand has exceeded the Earth's capacity to regenerate since 1980. (Not surprising.) Wealthy countries are obviously the worst offenders; the U.S., for example, uses up to three times the resources per person of those in less developed areas.

The report focuses on six activities requiring the Earth's physical space:
- growing crops
- grazing animals
- harvesting timber
- fishing
- housing, transportation, power generation
- consumption of fossil fuel

We here at Phrenicea applaud the study, which formally addresses some of the concerns which have led us to propose what some consider a preposterous scenario of the future.
posted by John Herman 12:30 PM

Tuesday, June 04, 2002

At the risk of sounding presumptuous, it appears someone at Time magazine might have visited our Phrenicea site recently. Spotted today in the June 10th issue, there's an "Inside Business" bonus section with a somewhat familiar-sounding blockquote: "How biology is teaching us to build better software, computers and companies."

Inside the foldout page, writer Eric Roston states: "Software engineers will tell you that the longer they labor to solve complex problems by manually writing code, the more they respect the reasoning powers of the brain."

DUH!!!

And of course the article references our now-ubiquitous "Mr. Chiphead," Ray Kurzweil.

UGH!!!
posted by John Herman 9:36 AM

Wednesday, April 24, 2002

The Phrenicea scenario of the future envisions the cloning of humans — ultimately to the point where it supplants male/female mating and consequent pregnancy, due to its simplicity and convenience. Of course this sounds preposterous from today's perspective. Society and mores change ever so slowly, but they do change. And they will change as a result of very small steps along the path towards human cloning.

One of these small steps can be found in the April 29, 2002 edition of Fortune magazine. The article tells of a very wealthy man, John Sperling, who invested over $4 million to create a company, Genetic Savings and Clone, just to clone his dog, Missy. He pursued this because, after hearing about Dolly the sheep, its feasibility seemed within reach, and because he could not bear the thought of living without his Missy, whose characteristics he believed would never be found in another dog. Although a failure so far, the effort did result in the first cloned cat, appropriately named CC. With continued investment and experimentation, you can be sure that dogs will eventually be cloned, and that his company will thrive in duplicating pets for forlorn owners.

So how does this article corroborate the outrageous Phrenicea prediction? It does not take much imagination to visualize the next logical step: once the hurdles with animals are overcome, a well-to-do parent who loses a child pursues his or her replacement regardless of cost. Outlawed, you might say? Given enough financial incentive, there will always be havens for this kind of thing. And just as news today of a cloned sheep would hardly register — so too will the success of cats, dogs and other pets become mundane, as will the cloning of humans later in this century.

posted by John Herman 10:56 AM

Friday, March 22, 2002

Recently we received a very thoughtful letter from "rs," challenging some aspects of the Phrenicea scenario. Because the primary function of the Phrenicea website is to stimulate thought and discussion about the future as well as the present — we WELCOME contrary viewpoints. It is not our intention to persuade, but to agitate! Here's a shining example of our success:

While your Website is interesting and entertaining there are a couple of fundamental flaws.

Your logic is good but I don't agree with your conclusions.

I for one have no desire to eat a nutritious glop for 3 meals a day. I like flavor. I like texture. I like different and new foods. We now have artificial foods that mimic flavor and texture of certain foods. These were developed in order to provide people with healthier or cheaper equivalents. I've noticed that the market for the real foods in these cases has not declined.

I for one am not going to give up sex. Once again, there's nothing like the real thing. And by the way, people have been prophesying for at least 40 years that marriage and the traditional family is going away. I don't foresee that happening ever, certainly not in the next 40 years. People who make such predictions don't understand marriage.

Many of your conclusions assume that everyone will have the same wants and feelings. That is just not so. There are lots of folks who never tire of seeing what's around the next bend. Not knowing, wanting to see: these are major motivators in the advertising industry. Mitsubishi says "Wake up and drive!" You say driving is too dangerous, go back to sleep. I don't think so. In Europe, public transportation is so developed in some countries that no person is more than a 5-minute walk removed. They pay over $2 a gallon for gas, have to turn their engines off at certain intersections, and still a significant majority drive enthusiastically. I think the future will hold more in the way of built-in automotive safety controls like proximity sensors and smart roads than abolishing private conveyances.

For almost every conceivable problem we have in the world today there are people thinking about and working on solutions. I believe the solutions that survive will be palatable and marketable. I don't believe that any government which steps in and makes life unbearable will survive.

That's all I have time for now...
rs


Excellent letter indeed. Why not put in your "Two Cents"!
posted by John Herman 10:31 AM

Monday, March 11, 2002

After this most unusual winter (especially after these particularly mild, even warm days in March, which even included anomalous thunderstorms) it is perhaps a good time to think again about global warming, although I tend to believe that this year's conditions in the northeastern U.S. are NOT a result of that phenomenon.

Anyway, it now appears that there may finally be some real attention paid to the impact of worldwide warming! The acknowledged consequences, including wacky weather, disappearing coral reefs and melting polar ice were not enough to get the U.S. to participate in the initiative to acknowledge man's contribution and to take some responsible steps to stop or reverse the impact.

So what might possibly cause a change in attitude? How about the oriole, chickadee, quail, thrasher, and finches both purple and gold. Yes, birds! The National Wildlife Federation has reported that various U.S. states may lose their state birds to changes in avian migration patterns, driven by habitats shifting under global warming.

Could state pride initiate a fight to keep their birds? Could grassroots awareness lead to a national policy? Maybe our tiny feathered friends are smarter than we think!

posted by John Herman 3:42 PM

Friday, February 15, 2002

Okay, here we go again — the cloning of cloning announcements.

Yesterday the Wall Street Journal reported the cloning of a cat, called cc (for "carbon copy" — how clever) by a company named Genetic Savings and Clone (no kidding — how corny). Caught off guard, the British journal Nature had to quickly post the research paper on their website; it was slated to be published in their upcoming hardcopy issue.

This latest cloning announcement was at least substantive. The cloning effort yielded a real live cat, genetically identical to its host. With that, an ethicist at the University of Pennsylvania gushed, "The commercial future of cloning is absolutely in animals."

Well, that may turn out to be true in the near future. Emotionally attached pet owners willingly spend thousands prolonging their treasured animals' lifespans. The Phrenicea scenario indicates otherwise for the long term, as we predict live pets, whether potential hosts or resultant clones, will be banned so as not to contribute to the critical fecal waste problem.
posted by John Herman 2:43 PM

Tuesday, January 29, 2002

Recently, I was challenged to define "terrorism" in 20 words or less. With all the references to terrorism now — the Phrenicea website included — I agreed it would be appropriate to have at hand a cogent definition for reference. I initially found the task difficult, but after several go-arounds was ultimately able to reduce the word count to just two, albeit with some loss in precision.

So, here they are:

36 Words:
Heinous act committed with stealth and surprise, targeted towards an individual or group
which presents particular beliefs or resides within certain geographic boundaries;
perpetrated by hit-and-run cowards resulting in destruction, paralyzing fear and death.

23 Words:
Heinous act targeted towards those presenting particular beliefs or residing within certain
boundaries; perpetrated by cowards resulting in destruction, paralyzing fear and death.

3 Words:
Oh my God!

2 Words:
Oh sh.t!!!
posted by John Herman 2:45 PM

Thursday, December 27, 2001

Recently watched a PBS video that originally aired in May 2001, "Beyond Human," which explores a new world in which people are becoming more like machines and machines more like people. Yale University's Mark Reed is featured. (A brilliant researcher whose work is also celebrated in the final 2001 issue of the journal Science, billed as the "Breakthrough of the Year.") Unfortunately, like that Chiphead Raymond Kurzweil, Reed lets hubris ooze through his pores to the point where it's pretty scary. Sample:
The new technology that we have on being able to nano scale and [create] molecular-sized systems is to be able to make the devices so small, that we can pack a million million devices on this [one-inch square area] and have millions of times more computer power in the same amount of area. [Here comes the scary part.] We're starting to understand that we can control Nature truly at the atomic level and so I think we're starting to see really the Nano Age, being able to understand Nature, being able to control Nature far better than we could before.
Growing up in the halcyon 1950s, when GE and companies like it boasted, "progress is our most important product," we heard analogous optimism in relation to atomic energy, space travel, chemical engineering, vaccines and antibiotics. Today's scientists and researchers should be required to incant daily: "Unintended Consequences."

Ironically, the impact of unintended consequences, albeit on a different thread, is evident in the program itself just months after it was completed, when Sun Microsystems co-founder Bill Joy [naïvely, in hindsight] comments on nanotechnology:

It's the difference between crawling and a jet plane. The jet plane doesn't just get me from here to there faster: It changes geography, it changes trade, it changes commerce, it changes the world.
And since September 11th it changes the landscape. It took about 45 years, but no one was sufficiently prescient to predict that commercial jets would become lethal cruise missiles able to topple two 110-story skyscrapers — changing our lives unimaginably forever. Although it would have been difficult to conceive at one time, today's New York Newsday editorial almost nonchalantly reminded prospective passengers that "the human element — alert senses and willing hands — must be an essential part of travelers' defense."

UNINTENDED CONSEQUENCES! Repeat after me...
posted by John Herman 12:19 PM

Tuesday, December 04, 2001

It appears the normally humble editors at Phrenicea will have to go (respectfully) head-to-head with perhaps the most brilliant theoretical physicist since Einstein: Stephen Hawking, Lucasian Professor of Mathematics at the University of Cambridge.

In his new book, The Universe in a Nutshell, the professor takes a stab at envisioning the future, and in so doing straddles the fence between optimism and pessimism. His view (in a nutshell!) is that as long as we humans don't blow ourselves up to extinction, we will have a promising future. Curiously though, he states, "it is not clear that intelligence has much survival value," vis-à-vis the relative longevity of bacteria and other lower forms of life.

He acknowledges that today our own bodies are the most complex of systems. [Yes!] However, he believes man will acquire knowledge to "completely redesign our DNA," leading to an improved human race. [No way.]

He equates the size of the human brain with our intelligence, and predicts a genetically engineered enlarged brain [Nah!] enhancing that intelligence. [How then does he explain his own superior intellect given his average-sized brain?]

He also envisions "electronic intelligence" [No way.] and space travel to other planets. [Unlikely.]

These debatable points seem to warrant a dedicated Web page. Watch for it!

posted by John Herman 2:38 PM

Tuesday, November 27, 2001

Those familiar with the Phrenicea scenario know the somewhat pessimistic view that cloning essentially replaces all other forms of reproduction due mostly to convenience — to the point where human evolution ceases. All new human beings are basically copies of individuals that already exist. One tiny step toward this end was sensationally reported this week, which did little more than let the cloning genie escape the petri dish. Attempts to stop the practice might hinder, but will not prevent, the eventual duplication of a human being.

What was obvious this weekend is that science has changed since the days of Gregor Mendel and Charles Darwin (and even the 20th century's Barbara McClintock), when the full impact of a lifetime's worth of work (peas, species diversity and corn, respectively) was not published, realized or appreciated until the final days of their lives, or perhaps after death. Now the proverbial ink is barely dry in the researcher's log book when the sensational results are plastered on TV, online and in the tabloids. "Exclusive — The First Human Cloned Embryo," boasted a Scientific American Web article posted on Saturday, November 24. The news, traveling at the speed of light, made a splash on all the Sunday early-morning pundit shows, with the talking heads pontificating on the impact to society and the human species. Even President Bush felt it necessary to issue a (negative) statement.

What's amazing is that the cloning experiments were a FAILURE! The intent was to create stem cells — uncommitted, immature cells eager to become something distinctive, like a heart cell, a kidney cell, or even a brain cell. The scientists' dream is to someday be able to replace entire organs "grown" from stem cells — without the problem of tissue rejection. These experiments did NOT produce stem cells.

So what did they do?

Two methods were used to try to create stem cells, both utilizing donated ovarian eggs from women willing to participate:
- The first removed a woman's chromosomes from her egg and replaced them with chromosomes from another donor's mature cell. The egg was then coaxed with chemicals to divide a few times.
- The second coaxed an egg to divide with all of the woman's chromosomes intact. This is called parthenogenesis, which is analogous to the process of creating seedless fruit.

Both methods were successful in stimulating a few cell divisions, but not nearly enough to yield a ball of cells called a blastocyst, within which stem cells would reside. In the "old days" of science, these results would have been shared with fellow cognoscenti via journal publication for critical evaluation, but probably no further than that.

In conclusion, this was one small step for cloning, one giant leap for media hype.

posted by John Herman 2:31 PM

Wednesday, September 26, 2001

September 11, 2001: another "day which will live in infamy." The subsequent handling of the situation by President Bush and his administration effectively communicated that lassoing the culprits will be next to impossible, and that we will all have to be more vigilant going forward, with a new perspective on everything we heretofore took for granted. From my pessimistic perch here at Phrenicea headquarters, I see this as just the beginning of a realization that as the world's population grows and becomes ever more sophisticated, it will be next to impossible to live a civilized existence without a centralized entity (aka Phrenicea!) keeping track of every individual's comings and goings. Comments and opinions to the contrary are welcomed!
posted by John Herman 12:27 PM

Saturday, August 18, 2001

Phrenicea attempts to predict the future while raising relevant concerns about the present. To that end, it searches for current-day evidence bolstering its mostly pessimistic forward view. A recent New York Times article contains reporting that eerily parallels, at times word-for-word, the forewarning presented in "H2Ouch!" Some excerpts to temper even the most ardent optimists:

Parts of six counties in a region that borders one of the world's largest freshwater sources, Lake Michigan, could be in for serious water shortages within 20 years, the report by a regional planning commission said. And while the June report surprised people who live near a lake system that contains one-fifth of the world's surface fresh water, it did not surprise a handful of corporations that have been saying that WATER FOR THIS CENTURY WILL BE WHAT OIL WAS FOR THE LAST.

This year, with shortages appearing in places that have never doubted the future of their supply, many parts of the country have discovered WATER MAY INDEED BE A COMMODITY MORE PRECIOUS THAN OIL.

"What worries me a lot is when we start to think that drinking water can only come from a bottle," said Sandra Postel, an author of two books on water and director of the Global Water Policy Project, in Amherst, Mass. "We're not just talking about something like oil, or pipes and transfers. We're talking about a public good, something that keeps everything alive."

Already, bottled water costs more than gasoline in most stores, but nearly 90 percent of all municipal water systems are publicly owned. Enron, the nation's No. 1 marketer of natural gas and electricity, saw water as a commodity that would eventually be deregulated, just as electric power was in California. If that happened, Enron would be free to buy and sell water to the highest bidders — no different from oil or megawatts.

Some major American cities in the Southwest, including El Paso, San Antonio and Albuquerque, could go dry in 10 to 20 years. But a number of towns in New England and the well-watered half of the Midwest are also facing the prospect of running out of water in a generation's time.

Most of the nation's fresh water — about 60 percent — is out of sight. It comes from below ground, in rivers and pools known as aquifers. These aquifers are being depleted at the same time that surface water in lakes and rivers is stressed by growing demands and heat.


posted by John Herman 9:36 AM

Friday, June 15, 2001

George W. has, perhaps unintentionally, raised the awareness of Global Warming to new heights, if measured by the coverage in the news. Newsday's June 10th Sunday edition boasts a full-page story on China's Gobi Desert migrating towards its capital (Beijing), a city of 12.6 million people. The desert's advance purportedly has caused serious discussions about moving the capital. Of course this is blamed in part on global warming and the consequent effect of water evaporation.

That edition's editorial page, emblazoned with "A Potential Catastrophe in Slow Motion," addresses the worldwide phenomenon and desperately asks: "Must rising sea levels flood coastal plains and submerge island nations before [conservation] happens?"

If George W. continues to pursue promotion of his myopic view he just might raise enough awareness to help reverse the dangerous trend!

posted by John Herman 10:21 AM

Thursday, May 24, 2001

From "Trend Alert" published by (no relation) The Herman Group (Roger and Joyce Herman):
Citizens of the future will have bountiful opportunities to endure a greater volume of these messages. Continuous Partial Attention must be developed as a skill, a talent for the new millennium. Or is there an alternative? With the volume of information and stimuli increasing so rapidly, can we ever keep up with it?

Many will seek to reduce the number of things competing for attention. If there is too much coming in at the same time, we'll fear that our brains will hit overload. For some people, it hurts to experience the mental discomfort of Continuous Partial Attention. People will search for ways to turn off anything that sends stimuli continually.

A significant portion of our population will turn off radios and televisions most, if not all, of the time. They'll cancel subscriptions to newspapers and magazines. They'll retreat to the peace and solitude of their homes, withdrawing from stimulus-heavy society.

The years ahead will find some people heavily connected and involved with everything that happens in the fast-moving world around them. Others will reduce the load, not needing to win, but also not withdrawing from the race. And others will drop out, seeking quiet serenity.
posted by John Herman 11:12 AM

Tuesday, April 03, 2001

Time magazine apparently agrees that Global Warming is a "Hot Topic" since this week's cover depicts the Earth as the yolk of an egg in a fry pan!
posted by John Herman 4:08 PM

Sunday's Newsday contained an excellent editorial concerning human cloning. This is a piece that future generations might look upon as quaint, almost humorous. The Phrenicea scenario has human cloning actually replacing natural forms of reproduction — thus TRUNCATING HUMAN EVOLUTION BY THE RECYCLING OF THE HUMAN GENOME as it stood at mid-21st century. The opinion is right on target, however, that the cloning issue is beyond what a single nation can address. It requires centralized control. Maybe Phrenicea?!

"The prospect of a cloned baby coming to life within two years is quite real. It opens disquieting vistas of a world that could turn out to be as much a dystopia of eugenics experiments and human spare-parts farms as a utopia of perfectly formed replicas of lost loved ones. It poses urgent questions about ethics, scientific inquiry and the role of government in regulating this new technology...

Though cloning today is fraught with development pains, so has been every other new advance in biotechnology. Glitches eventually get worked out. An iron rule of science is that, if something can be done it will be done regardless of legal barriers or moral qualms; witness the first test-tube baby, once seen as an abomination of nature and now accepted as routine...

Will cloning be used in eugenics schemes to attempt to develop a race of perfect specimens or a subgroup of docile worker slaves? Will human clones be produced to harvest organs for transplants, as cloned pigs already may be? Those kinds of uses are far from unthinkable, unfortunately. To deal with them, there would be need of an international agreement with the force of a treaty. But it's a supranational problem, not one a single nation can address."

posted by John Herman 4:00 PM

Wednesday, March 28, 2001

J.R. McNeill, a professor of history at Georgetown University, states it excellently in his recent book Something New Under the Sun:
"The open land, pristine ecosystems, and untapped water reserves that formed humanity's ecological buffer against adversity have largely vanished. Along with the continued adverse impacts of a fast-growing population, our ecological future will likely bring less biodiversity, a scarcity of clean water and an increasingly warmer climate. It is impossible to know whether humankind has entered a genuine ecological crisis. It is clear enough that our current ways are ecologically unsustainable..."

posted by John Herman 1:10 PM

Sunday, March 25, 2001

In today's New York Times there's an article highlighting the technical difficulties of cloning. From the perspective of the Phrenicea scenario of the future, these concerns seem humorously quaint:
"All the evidence so far, scientists say, indicates that the breathtakingly rapid reprogramming in cloning can introduce random errors into the clone's DNA, subtly altering individual genes with consequences that can halt embryo or fetal development, killing the clone. Or the gene alterations may be fatal soon after birth or lead to major medical problems later in life. Some scientists say they shudder to think what might happen if human beings are cloned with today's techniques. While arguments over the ethics of human cloning have dominated the debate, these scientists say the real issue is the likelihood that clones would have genetic abnormalities that could be fatal or subtle but devastating. Until that problem is solved, they say, human cloning should be out of the question."
posted by John Herman 11:41 AM

Wednesday, March 21, 2001

Interesting piece in the New York Times concerning a water study conducted in Reno, Nevada, a city that has experienced significant growth and is a good place to study competing public attitudes about water values. The study concentrated on the Truckee River, which descends 2,500 feet along its 100-mile path and thus encompasses three distinct environments: upscale tourist communities, downtown Reno, and finally the irrigation canals of rural Nevada.

The study revealed that poor people practice water conservation more than the rich, while the wealthy have no qualms about paying more for water. All were worried about where water will come from in the future, and everyone hates "water cheats," those who water their yards on "no watering" days.

A conclusion was that peer pressure is a powerful force in spurring water conservation.

posted by John Herman 7:02 PM

Tuesday, March 20, 2001

Found on the Scientific American website: Collecting fog as a source of drinking water for many remote regions of the world.
"Fog is hardly different from rain. What differentiates the two is the size of the water droplets and the speed at which they fall. Raindrops range from five millimeters to 0.5 millimeter in diameter and shoot toward the ground at speeds between two to nine meters per second. Fog droplets, on the other hand, are a mere 40 to one micron in diameter (1,000 microns are 1 millimeter); they fall at only about one to five centimeters per second. Because they are so light and drop slowly, fog droplets travel almost horizontally, even in the lightest breeze. Consequently, you can't catch fog in a bucket. Instead a good fog collector is typically a vertical or almost vertical surface that fog droplets can drift onto and then run down. Following this basic design, trees, in fact, make great natural fog collectors.
Enter artificial fog collectors. Taking the fog-gathering technique of trees a step further, the artificial kind use large, vertical mesh panels. As the fog drifts through the mesh, some of the droplets hit the weave, run down the panel and are collected. This water can then be used for human consumption or agriculture or to reforest the area. In that last case, the resulting vegetation can then function as natural fog collectors, eventually passing part of the gathered water on to the soil, feeding other plants, wildlife and small streams that humans can use.
How much water a fog collector can net varies, depending on the frequency of fog and its thickness. The amount of water in a cubic meter of fog ranges from 0.05 gram to as much as 3 grams."
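The quoted figures are enough for a back-of-the-envelope yield estimate: the volume of fog swept through a mesh panel each second is its area times the wind speed, and multiplying by the fog's liquid water content and the fraction of droplets the mesh captures gives the collection rate. A minimal sketch — note that the panel size, wind speed and capture efficiency used below are illustrative assumptions, not figures from the article:

```python
# Rough estimate of daily water yield from a mesh fog collector.
# Only the liquid-water-content range (0.05 to 3 g per cubic meter)
# comes from the quoted article; everything else is an assumption.

def daily_yield_liters(panel_area_m2, wind_speed_m_s,
                       liquid_water_g_m3, capture_efficiency):
    """Water collected per day, in liters.

    Fog volume swept through the panel per second = area * wind speed.
    Multiply by the fog's liquid water content (grams per cubic meter)
    and the fraction of droplets the mesh actually captures.
    """
    grams_per_second = (panel_area_m2 * wind_speed_m_s
                        * liquid_water_g_m3 * capture_efficiency)
    seconds_per_day = 24 * 60 * 60
    return grams_per_second * seconds_per_day / 1000.0  # 1 L of water = 1000 g

# A 40 m^2 panel in a light 2 m/s breeze, with fog carrying
# 0.5 g/m^3 and the mesh catching 20% of the droplets:
print(daily_yield_liters(40, 2.0, 0.5, 0.20))  # about 691 liters per day
```

Even with such modest assumptions the yield is hundreds of liters a day, which is why the technique is attractive for remote communities with frequent fog but little rain.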

posted by John Herman 8:07 PM

Thursday, March 15, 2001

Just as a comparison of satellite data from 1970 and 1997 yielded what scientists say is the first direct evidence that so-called greenhouse gases are building up in Earth's atmosphere and allowing less heat to escape into space, President Bush pulled a "read my lips" gaffe à la George Sr., reversing his campaign pledge to mandate CO2 emission limits. It looks like global warming will remain a "Hot Topic," as the Phrenicea scenario predicts.
posted by John Herman 8:46 AM

Wednesday, March 14, 2001

From today's New York Times, which supports the pessimistic Phrenicea Vision in regard to global warming ("Hot Topic"):
Under strong pressure from conservative Republicans and industry groups, President Bush reversed a campaign pledge today and said his administration would not seek to regulate power plants' emissions of carbon dioxide, a gas that many scientists say is a key contributor to global warming.
posted by John Herman 12:31 PM

Monday, March 12, 2001

Found an excellent book at the library, Water: The Fate of Our Most Precious Resource by Marq De Villiers, that echoes Phrenicea's H2Ouch! concerns.

Amazon.com's review: "Water is a curious thing, observed the economist Adam Smith: although it is vital to life, it costs almost nothing, whereas diamonds, which are useless for survival, cost a fortune. In Water, Canadian journalist de Villiers says the resource is still undervalued, but it is becoming more precious. It's not that the world is running out of water, he adds, but that 'it's running out in places where it's needed most.' "

Too bad it won't be a bestseller.
posted by John Herman 12:15 PM

Friday, March 09, 2001

FOUND TODAY on one of Ford Motor's websites — the "golf cart-like jitney" envisioned by Phrenicea (actually me way back when in April '99):
"Meet your new TH!NK neighbor. The future of stylish, powerful and dependable personal mobility. A zero-emissions electric vehicle that could only come from the innovative, "eco-green" engineering at TH!NK. Make no mistake. This isn't an overloaded golf cart. The TH!NK neighbor offers much of the comfort, convenience and benefits of owning a car — at a fraction of the cost."

posted by John Herman 2:02 PM

Wednesday, March 07, 2001

Today's Newsday took advantage of the bad weather to address the "Hot Topic" global warming:
"Don't Like the Weather? Just Wait Another Century...

As the region recovers from a wild snowstorm, it may seem absurd to worry about global warming. But the reality is that temperatures are rising worldwide, a trend that carries the potential for irreversible climate changes that could alter the face of the Earth...

President George W. Bush has expressed strong opposition to the Kyoto draft agreements...

Bush's point, and it's well taken, is that the Kyoto document puts a disproportionate burden on the United States and virtually no limits on developing nations like China and India, where energy usage is rising rapidly. Now Bush needs to come up with some constructive alternatives, such as emissions trading with nations that produce less carbon dioxide and credits for reforestation to absorb greenhouse gases. In light of the emerging evidence, which shows that the Earth's average temperature could rise by as much as 10.4 degrees in the next century — the most rapid change in 10,000 years — doing nothing is unacceptable."
posted by John Herman 10:10 AM

Saturday, March 03, 2001

From the New York Times in regard to "Hot Topic":
The icecap atop Mount Kilimanjaro, which for thousands of years has floated like a cool beacon over the shimmering plain of Tanzania, is retreating at such a pace that it will disappear in less than 15 years, according to new studies.

The vanishing of the seemingly perpetual snows of Kilimanjaro that inspired Ernest Hemingway, echoed by similar trends on ice-capped peaks from Peru to Tibet, is one of the clearest signs that a global warming trend in the last 50 years may have exceeded typical climate shifts and is at least partly caused by gases released by human activities, a variety of scientists say.

Measurements taken over the last year on Kilimanjaro show that its glaciers are not only retreating but also rapidly thinning, with one spot having lost a yard of thickness since last February, said Dr. Lonnie G. Thompson, a senior research scientist at the Byrd Polar Research Center of Ohio State University.

Altogether, he said, the mountain has lost 82 percent of the icecap it had when it was first carefully surveyed, in 1912.
posted by John Herman 3:23 PM

An excerpt from a February New York Times editorial:
Americans have a special responsibility. The United States is the mightiest nation on the planet and the greatest contributor to the industrial component of global warming. The nation is wealthy and at peace. A mature approach would require certain sacrifices designed to provide a better environment for future generations of Americans and a more equitable relationship with neighbors around the world.

But that's only one approach. Another is to just ignore the problem and continue to feast like gluttons at the table of the world's resources. That will work for a while. Why not? All you have to do is convince yourself that damaging the planet is somebody else's problem.

posted by John Herman 3:07 PM

Friday, March 02, 2001

Early evidence that space travel is (no pun!) "not long for this world":
On March 1st, NASA canceled the billion-dollar X-33 and X-34 rocket projects that the agency had once hoped would lead to a replacement for the space shuttle, saying the benefits would not justify the costs. NASA invested $912 million in the X-33, and Lockheed eventually spent $357 million. NASA admitted that technology has not advanced far enough to develop a new reusable launch vehicle that substantially improves safety, reliability and affordability.
posted by John Herman 9:21 AM

Thursday, March 01, 2001

Everyone is busy these days. But we are shaping the future, whether we realize it or not. Are we shaping it positively?
posted by John Herman 2:47 PM

Chronicling the Future®

Your Two Cents!

Top of Page

Home


Please note that submitted comments will be reviewed prior to posting on this website.


Use of this website constitutes acceptance of the Phrenicea® Terms and Conditions.

This page belongs to
www.phrenicea.com


Entire site ©2000-2014 John Herman. All rights reserved.