Where Does The Future Come From? (Subtitled, "You Haven't Seen Anything Yet!")

Dr. Charles F. Urbanowicz/Professor of Anthropology
California State University, Chico / Chico, California 95929-0400
Telephone: 530-898-6220 [Office]; 530-898-6192 [Dept.] FAX: 530-898-6824
e-mail: curbanowicz@csuchico.edu and home page: http://www.csuchico.edu/~curban

12 March 2001 [1]

[This page printed from http://www.csuchico.edu/~curban/SP2001ReddingCA.html]

© [All Rights Reserved.] For a presentation at the Redding, California, Chamber of Commerce Luncheon meeting on March 12, 2001. Charlie Urbanowicz has been a member of the faculty at CSU, Chico since August 1973.

 

PRESENTATION SUMMARY IN REDDING, CALIFORNIA:

"I quote others only the better to express myself."
(Michel Eyquem de Montaigne [1533-1592] French philosopher/essayist)

Working from the present into the past, and with some suggestions about the future, I shall try and provide some cogent comments (ideas and information) concerning technology, the times we live in, and elaborate on the phrase "you haven't seen anything yet!" Events that occurred at various dates will be discussed, including 1961 (when I was in the process of flunking out of New York University, New York City); 1971 (when I was conducting my research for the Ph.D. in Anthropology); 1981 (when I was the Associate Dean for The Center for Regional and Continuing Education at CSU, Chico); 1991 (when I was back to full-time teaching at CSU, Chico); 2001, and, finally, the future.

The Pursuit of Destiny: A History of Prediction was published last year by Paul Halpern, Professor of Physics (University of the Sciences, Philadelphia, PA), and this volume is called to your attention. Halpern deals with "predictions" from the "Oracle at Delphi" (Greece) to contemporary times (computers and chaos theory), a scope which I am certainly not going to cover today! Suffice it to say, individuals have been interested in "predicting" the future for a long time: perhaps ever since Homo sapiens, or precursors of Homo sapiens, realized that today developed out of yesterday and that what is occurring right now develops into tomorrow. (Or, perhaps, as the phrase over the entrance to CSU, Chico's Kendall Hall states it in a slightly different way, "Today Decides Tomorrow").

 

IDEAS

"One of the symptoms of an approaching nervous breakdown
is the belief that one's work is terribly important." Bertrand Russell (1872-1970)

Alfred North Whitehead (1861-1947), philosopher and educator, once remarked that "it is the business of the future to be dangerous" and perhaps it is even more dangerous trying to predict the future: we can't predict it! However, the decisions we make today influence our choices about what we will do tomorrow. But can we "predict" tomorrow? I argue that (#1) we cannot predict the future (because by definition the future is the "time that is to be or come hereafter") and (#2) there is no such thing as "future shock" but merely individual unawareness of various aspects of "the present." In addition to being selectively unaware of what is occurring around us, we also have a tendency to selectively forget that which came before us: what we developed out of!

As an anthropologist (someone who is interested in people), I believe in evolution, a neutral scientific term used to describe the fact that an organism has adapted to its environment and may be able to pass DNA on to the next generation. Period! The "Human Genome" televised press conference on February 12, 2001 (the 192nd birthday of Charles Darwin - and also the 192nd birthday of Abraham Lincoln [1809-1865]) certainly stressed evolution throughout the presentation. Evolution is not goal-oriented and does not imply value-laden progress but simply describes a process that has been occurring for billions of years.

We have a "fear" of evolution, not because of the past but because of the future! Every generation believes itself to be "the best" and yet, chances are our children and our children's children will be "better adapted" to their environmental situations (in a relative sense) than we are, and that scares us and causes us to forget the past. As the most prolific author Michael Crichton (an M.D. who also has an M.A. in Anthropology) stated it:

"He had a term for people like this: temporal provincials--people who were ignorant of the past, and proud of it. Temporal provincials were convinced that the present was the only time that mattered, and that anything that had occurred earlier could be safely ignored. The modern world was compelling and new, and the past had no bearing on it." Michael Crichton, 1999, Timeline (NY: Ballantine Books), page 84.

 

IDEAS & BEHAVIOR  

"The average person now changes jobs 8.6 times between the ages of 18 and 32, according to the U.S. Bureau of Labor Statistics. Such upheavals in the labor market have forced colleges to adapt....[stress added]." Emily Bazar, 1999, Number of Students Over 40 Soaring At College Campuses. The Sacramento Bee, August 24, 1999, pages 1 and page A10, page 1.

In 1961, after I graduated from high school (Jersey City, New Jersey), I attended New York University (New York City), and forty years ago this month I was in the process of flunking out of NYU. Born in 1942, and based on my nineteen years of experience to 1961, there is no way I could have predicted that in 1971 I would be conducting fieldwork for my Ph.D. in Anthropology from the University of Oregon in the Polynesian Kingdom of Tonga (after completing four years of service in the United States Air Force from 1961-1965), and there is no way I could have predicted that I would be here today, in 2001, in Redding, California! We cannot predict the future, but we certainly can work on inventing it!

My wife and I, and our nine-month-old son, arrived in Chico in 1973 and there is no way we could have predicted that he would marry a fellow Chico State student in 1993, graduate in 1995, and that we would have two grandchildren in the year 2001. Likewise, there is no way I could have predicted the growth of this area since I first came up here when I began my job in Continuing Education; and as it appears on the web:

"Shasta County is located in the extreme northern end of the Sacramento Valley, equal distance between Los Angeles and Seattle on interstate 5. It is 160 miles north of Sacramento and 230 miles north of San Francisco. There are three incorporated cities in Shasta County, Redding, the county seat, Anderson and the City of Shasta Lake. Redding, bisected by the Sacramento River, is a growing center of commerce and industry and the nationally recognized metropolitan marketplace of northern California, serving the adjacent counties of Tehama, Trinity and Siskyou." [http://www.reddinganesthesia.com/redding/]

In 1990, Redding (according to my almanac sources) had approximately 66,176 residents and Shasta County had a population of approximately 115,613; by 1999, Redding had 78,700 (an increase of ~18% in nine years) and Shasta County was up to an estimated 165,400 (an increase of ~43% in those same nine years; the percentage arithmetic is sketched after the quote below). Growth and development appear to be inevitable when one looks around the north state, and the community of Chico has also increased in size since we arrived there in August 1973 and will continue to grow:

"This year [2001] could set new records for Chico building, if what's going on at the Planning Department is a measuring stick. City Planning Director Kim Seidler said it's typical for cities to see a 'winter lull,' or a time when planners aren't seeing people filing the papers necessary to build projects. Not so this year. ... Chico needs 'roughly 1,000 units a year' until 2020, according to [Jim] Mann [of the Building Industry Association]." Michelle MacEachern, Building Shows No Sign Of Slowing. The Chico Enterprise-Record, February 19, 2001, pages 1 and 8A.

Enrollment has also increased at Chico State. Over the years 1977->1988 I was the Associate Dean in The Center for Regional and Continuing Education at CSU, Chico, and our office was charged with the "distance education" of the times. I left Continuing Education in 1988, but over those years I worked with a team of individuals in the Instructional Media Center and Continuing Education which developed a distance education system that evolved from CSU, Chico, faculty driving to various locations in the north state, to Chico State faculty broadcasting closed-circuit ITFS (microwaved) courses (via the Instructional Television For Students network) to those locations, to satellite-delivered courses throughout North America. After I left my administrative role, the satellite-delivered courses continued to expand (including a course or two broadcast to Japan), and now the university is offering internet-based courses delivered directly to an individual's personal computer or workstation. And what of the future? We probably haven't seen anything yet!

"Sick of Work? ... In fact, 137 workers die each day nationwide from work-related diseases, and thousands suffer asthma, respiratory diseases, hearing loss and life-threatening cancers, according to the National Institute for Occupational Safety and Health in Washington [stress added]." Sabrina Jones, San Francisco Examiner, May 28, 2000, page J-1.

Education, just as life, is in a changing mode and the environment is changing. As the distinguished anthropologist Gregory Bateson (1904-1980), once a Trustee of the University of California System, wrote:

"The unit of survival [or adaptation] is organism plus environment. We are learning by bitter experience that the organism which destroys its environment destroys itself." Gregory Bateson, Steps to an Ecology of Mind, 1972: 483.

Turning to behavioral activities not necessarily related to "work" (although I have written professionally about it), I enjoy "games of chance," and the Halpern book had a quote with which I readily identified:

"Many experts in the predictive sciences become attracted, at some point in their lives, to gambling. Those whose inner fire compels them to understand how best to anticipate the future, often cannot resist expressing their talents [?] in a series of well-played hands." Paul Halpern, 2000, The Pursuit of Destiny: A History of Prediction (Perseus Books), page 130.

I like cards, but not the stock market, because the stock market is also a "gamble" (not simply an investment, as shareholders of PG&E and Lucent Technologies and Microsoft and....have become aware in recent months), and I quote the words of Samuel Clemens, better known as Mark Twain (1835-1910):

"October is one of the peculiarly dangerous months to speculate in stocks. The others are July, January, September, April, November, May, March, June, December, August, and February." In William A. Sherden, 1998, The Fortune Sellers: The Big Business of Buying And Selling Predictions (John Wiley), page 96.

If we knew what the stock market would do, would we not take advantage of it? If we could know the exact moment of our death, would we want to know? Isn't that the ultimate prediction? I don't want to know. (But perhaps you haven't seen anything yet! More below on medicine.) Finally, on behavior and games of chance please consider Anthony Holden's delightful 1990 publication Big Deal: A Year As A Professional Poker Player (page 61): "The good news is that in every deck of fifty-two cards there are 2,598,960 possible five-card poker hands. The bad news is that you are going to be dealt only one of them [stress added]."
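Holden's 2,598,960 figure is straightforward combinatorics: the number of ways to choose 5 cards from a deck of 52, or C(52, 5). A minimal Python sketch (my own verification, not from Holden's book):

    from math import comb, factorial

    # Number of distinct five-card hands dealt from a 52-card deck: C(52, 5)
    hands = comb(52, 5)
    print(hands)  # 2598960

    # The same count written out as 52! / (5! * 47!)
    assert hands == factorial(52) // (factorial(5) * factorial(47))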

 

IDEAS & BEHAVIOR & THINGS

"The difficulty is that modern human beings no longer directly perceive the world they live in and whose conditions affect them." James Burke and Robert Ornstein, 1995, The Axemaker's Gift: A Double-Edged History of Human Culture, page 280.

"Soon Diagnostic Tools May Include Cameras Swallowed Like A Pill. Researchers are finding a less invasive way to look inside a patient's body. ... The new procedure, undergoing clinical trials, instead uses tiny cameras encased in small capsules that the patient swallows as easily as a large vitamin. The cameras travel into the small intestine, flashing two pictures every second. About 50,000 photos are transmitted to a special belt worn by the patient, and the information is later downloaded into a computer. After their journey, the cameras are passed unnoticed by the patient and can be flushed down the toilet. The cameras cost about $300 [stress added]." Tara Parker-Pope, The Wall Street Journal, December 8, 2000, page B1.

The Human Genome press conference, televised on February 12, 2001, brought some basic science to the general public, as did recent meetings of the American Association for the Advancement of Science; heavily covered in the papers, these events yielded statements such as the following from a researcher in genetics: "I am absolutely convinced that my children--they're 6, 10 and 18--will be taking longevity drugs. I think they'll all very probably be centenarians, assuming we don't destroy the Earth First." Thomas E. Johnson, The San Francisco Chronicle, February 18, 2001, page A4. Consider the following, if you will, from an intriguing book published last year: "The growing body of evidence suggests that atherosclerosis and Alzheimer's disease--two of the most common and damaging chronic diseases--may be added within the next decade or so to the growing list of diseases caused by infection." Paul W. Ewald, 2000, Plague Time: How Stealth Infections Cause Cancers, Heart Disease, And Other Deadly Ailments (NY: Free Press), page 125.

The January-February 2001 issue of The Futurist, the publication of the World Future Society (Washington, D.C.), had an article on "Cultural Amnesia" and the author wrote that "Like Alzheimer's disease, cultural amnesia is a progressive and debilitating disease" because of various "memory killers" including the passage of time, selective memory, technology, and materialism. (Stephen Bertman, 2001, Cultural Amnesia: A Threat To Our Future, The Futurist, Vol. 35, No. 1, Jan-Feb, pages 46-51). The delightful Sue Grafton phrased the cause of our "memory problems" in a similar way in her 1999 publication "O" Is For Outlaw: "Our recollection of the past is not simply distorted by our faulty perception of events remembered but [also] skewed by those forgotten [stress added]." Sue Grafton, 1999, "O" Is For Outlaw (NY: Henry Holt & Co), page 25.

Consider, if you will, the following dates and, if you were alive, where were you and how were you living?:

"It is sometimes difficult to grasp the effects of constant doubling. Suppose that in 1959, when the first transistor was printed on silicon, a patch of seaweed in the Pacific Ocean [the largest geographical feature on this planet at ~64,000,000 square miles] measured one foot across. The seaweed patch doubled in size every year just as chips have doubled their number of components. By 1964 it [the seaweed patch] would have measured 32 feet across; 1964 chips contained 32 components. By 1970 it would be 2000 feet across. By 1984 it would have choked the entire Pacific [stress added]. James Martin, 1987, Technology's Crucible, page 15.

When the Intel 4004 microprocessor was introduced in a calculator in 1971, it could handle a whopping 400 instructions a second and in 1981 the IBM PC could handle 330,000 instructions a second. Where were you, what were you doing, and how old were you, in February 1986 when the following full-page advertisement appeared in Time magazine (February 10, 1986)?:

"IT'S SO FAST, YOU'LL FLY THROUGH YOUR WORK. Introducing the NCR PC6. Whoosh! That's information up on the new NCR PC6. The PC6 is NCR's most powerful personal computer yet. It's powered by the advanced Intel 8088-2 microprocessor. So you can process information nearly twice as fast as the PC XT.™ At that rate, you can load programs faster. Recall files in an instant. Calculate in a flash. And get home earlier. The PC6 stores a lot, too--up to 40MB of hard disk space, or about 7,575 single-spaced typewritten pages. Of course the PC 6 is compatible--running over 10,000 business software programs. In fact, a special switch lets you operate at either 8 MHz or 4.77 MHz, allowing you to run software that some other high performance PCs, like the PC AT,™ can't run. And, just in case, you can get a built-in streaming tape back-up system to guard against accidental erasures, disk damage, or coffee spills. The NCR PC6. To see it, fly on down to your NCR dealer today [stress added]."

It is now fifteen years later, and in December 2000 and February 2001 we read the following:

DECEMBER 11, 2000: "Intel Corp. says it has scored a scientific breakthrough that someday will help computers run about six times faster than they do now. ... it has built the world's smallest transistor, one that's about one-sixth the size of what's currently being produced. ... Intel's just-released microprocessor contains 42 million transistors and runs at 1.5 billion cycles a second [1.5 gigahertz]. With the transistor of the future...the company will be able to pack 400 million transistors onto a single chip. The company estimates that chip will run at 10 billion cycles a second. Intel said that the transistors are so tiny that a stack of 100,000 of them would be as thick as a sheet of paper [stress added]." Dabe Kasler, Intel Claims Speed Breakthrough. The Sacramento Bee, December 11, 2000, page D1.

February 21, 2001: "'New products and new technology will lead the way out of recession,' Intel Chief Executive Officer Craig Barrett said yesterday in a keynote address .... He also showed a 2-GHz version of the Pentium 4, the company's flagship desktop processor, which so far is shipping at speeds up to 1.5 GHz. The faster version is expected by the end of the year." Henry Norr, Intel Banks on New Chips. The San Francisco Chronicle, February 21, 2001, pages C1 and C14.

What will it be like in fifteen years, or 2016? You ain't seen nothing yet!

Incidentally, as I ask you to think about the dates that I mention and ask "where" were you and "what" were you doing, consider that in 1973 International Business Machines introduced the Mag Card II, a "typewriter" (remember those?) with an electronic memory for $11,800. In 1973 I was in my first year at CSU, Chico, and my beginning Assistant Professor salary was $12,500/year!

"September 1989 Weighty matters. Apple launches its first laptop computer, the Macintosh Portable. The machine, comparable in size to a portable typewriter, weighs in at 16 lbs. and retails for $6,500." Anon., Time Digital, September 6, 1999, page 17.

"The driving force in the semiconductor industry has been the theorem known as Moore's Law. First posited by Intel Corp. co-founder Gordin Moore in the 1960s, Moore's Law states that the number of transistors that fit on a chip will double every 18 months. ... Moore's Law has held true so far, with Intel's latest Pentium cramming 8 million transistors on a tiny sliver of silicon. The industry is confident that it can achieve even more astounding figures, such as 100 million transistors on a chip." (San Francisco Chronicle, August 10, 1998, page E1)

"The great thing about crummy software is the amount of employment it generates. If Moore's law is upheld for another 20 or 30 years, there will not only be a vast amount of computation going on planet Earth, but the maintenance of that computation will consume the efforts of almost every living person. We're talking about a planet of help desks [stress added]." Jaron Lanier, 2000, One-Half of a Manifesto: Why stupid software will save the future from neo-Darwinian machines. Wired, December 2000, 8.12, pages 158-179, page 174.

At times, do any of us really know how some of our technology actually works? In the San Francisco Chronicle on Sunday, February 18, 2001, there was a statement by David Hyman, President of a music-database company; in talking about what his technology does he is quoted as saying: "Most consumers don't know that it's us. They think it auto-magically happens" (Benny Evangelista, 2001, David Hyman's Connections. San Francisco Chronicle, February 18, pages C4 and C5, page C4). I loved "auto-magically" and it reminded me of a quote from the distinguished science fiction (science fact) writer Arthur C. Clarke, genius behind the film 2001: "Any sufficiently advanced technology is indistinguishable from magic" (Clarke's Third Law in Profiles of the Future: An Inquiry into the Limits of the Possible by Arthur C. Clarke, 1984, page 26).

The auto-magic science facts of 2001 would have been but speculative science fiction ideas in 1961, perhaps being considered in 1971, would have been on someone's drawing board by 1981, and probably in "Beta Testing" in 1991. What auto-magic science facts of 2041 (when I hope to be 99 years of age and in relatively good health) are being scoffed at in 2001 and will become a reality by....? Consider recent information combining "computers and biology," such as "Why Bioinformatics Is Hot Career" in the San Francisco Chronicle of March 4, 2001: an estimated 20,000 jobs will be created in the field by the year 2005 (Stacey Wells, 2001, Why Bioinformatics Is Hot Career. The San Francisco Chronicle, pages J1 and J2); and "IBM To Jump On Biotech Bandwagon," where an IBM Vice President is quoted as saying "Biology is the science that's driving high-performance computing today." (Michelle Kessler, 2001, IBM To Jump On Biotech Bandwagon. USA Today, March 6, 2001, Section B ("Money"), Page 1.) Where will computers and biology and....meet? You haven't seen anything yet!

 

IDEAS & BEHAVIOR & THINGS & WORDS

"Technologies acquire historical weight by reshaping the human condition. Gutenberg's press led to mass literacy, fostered the Protestant Reformation (by undermining the clergy's theological monopoly) and, through the easy exchange of information, enabled the scientific revolution. In the 19th century railroads created a truly national American market that favored mass production and the consumer society." Robert J. Samuelson, The Internet And Gutenberg. Newsweek, January 24, 2000, page 45. 

As previously stated, the anthropologist Gregory Bateson wrote that the "unit of survival [or adaptation I add] is organism plus environment" [stress added] (Steps To An Ecology of Mind, 1972, page 483) and this phrase has stuck with me for almost thirty years. I strongly argue that if we, as individuals (and as a collective), are to survive we must (a) be aware of and (b) adapt to the ever-changing electronic world around us. I am also interested in ideas associated with Charles Darwin (1809-1882) and as he once wrote, borrowing from the eminent sociologist Herbert Spencer (1820-1903), there is such a thing as "survival of the fittest." As an anthropologist looking at the future, I find (and often see) an organic (and clearly Darwinian) metaphor applied to changes in education, and (as stated above) "you ain't seen nothing yet!" Incidentally, in his quote, Bateson went on to write about the human mind:

"The unit of survival [or adaptation] is organism plus environment. We are learning by bitter experience that the organism which destroys its environment destroys itself. If, now, we correct the Darwinian unit of survival to include the environment and the interaction between organism and environment, a very strange and surprising identity emerges: the unit of survival turns out to be identical with the unit of mind" [italics in original; stress added]." Gregory Bateson [1904-1980], 1972, Steps To An Ecology of Mind (NY: Ballantine Books), page 483.

Universities and colleges and towns are changing, because they must:

"'We used to educate farmers to be farmers, factory workers to be factory workers, teachers to be teachers, men to be men, women to be women.' The future demands 'renaissance people. You can't be productive in the information age if you don't know how to talk to a diverse population, use a computer, understand a world view instead of a parochial view, write, speak.'" Byrd L. Jones and Robert W. Maloy, 1996, Schools For An Information Age: Reconstructing Foundations For learning And Teaching, page 15.  

The Renaissance (that time of re-birth and the revival of learning beginning in the 15th century) developed into the period known as the "Enlightenment" (where reason ruled over superstition), and all of us must prepare for the future.

The Educational Research Service (Arlington, Virginia) published an item last year (2000) entitled Ten Trends: Educating Children For A Profoundly Different Future (Gary Marx) and the trends are well worth considering. I shall not discuss them in depth, but please consider them (and possibly check out the volume); one could spend hours (if not days) on any "single" trend (and they are interconnected). The ten trends are (and these are all direct quotes from various pages in the ninety-six page book):

#1. "Our age is catching up with us! For the first time in history, the old will outnumber the young."
#2. "We have a new look. The country will become a nation of minorities."
#3. "What you know and who you know both count. Social and intellectual capital will become the primary economic values in society."
#4. "One size doesn't fit all. Education will shift from averages to individuals."
#5. "This problem has gone on long enough! The Millennial Generation will insist on solutions to accumulated problems and injustices."
#6. "The Status Quo: It isn't what it used to be! Continuous improvement and collaboration will replace quick fixes and defense of the status quo."
#7. "Move over atoms--here come the bits. Technology will increase the speed of communication and the pace of advancement or decline."
#8. "That triggers an idea! Knowledge creation and breakthrough thinking will stir a new era of enlightenment [stress added]."
#9. "What's the right thing to do? Scientific discoveries and societal realities will force difficult ethical choices."
#10. "Please apply! Competition will increase as industries and professions intensify their efforts to attract and keep talented people."

Consider, if you will, not only population growth in Chico or Redding or California, but in the entire USA (as well as in other nations of the world) in the years ahead:

 

COUNTRY                            POPULATION in 2000     POPULATION in 2025
China                                   1,261,832,000          1,407,735,000
India                                   1,014,004,000          1,415,274,000
USA                                       275,563,000            335,360,000
Indonesia                                 224,784,000            287,985,000
Brazil                                    172,860,000            209,587,000
Russia                                    146,001,000            138,842,000
Pakistan                                  141,554,000            211,675,000
Bangladesh                                129,194,000            179,129,000
Nigeria                                   123,338,000            203,423,000
Mexico                                    100,350,000            141,593,000
ESTIMATED GLOBAL POPULATION             6,080,142,000          7,840,660,000

Source: The World Almanac And Book of Facts 2001 (World Almanac Books), pages 861-862.

We are entering a new period of change and may it turn out to be a period of enlightenment through cyberspace!

Cyberspace was a term invented by William Gibson in Neuromancer (1984) to describe interactions in a world of computers and human beings. Cyberspace can also be viewed as another location to be explored and interpreted by anthropologists. I obviously believe that the "World Wide Web" is very similar to the period known as the Enlightenment, both a movement and a state of mind:

"The Enlightenment was both a movement and a state of mind. The term represents a phase in the intellectual history of Europe, but also serves to define programs of reform in which influential literati, inspired by common faith in the possibility of a better world, outlined specific targets for criticism and proposals for action. The special significance of the Enlightenment lies in its combination of principle and pragmatism [all stress added]." Geoffrey R.R. Treasure, 1994, The Enlightenment. The New Encyclopedia Britannica, Volume 18, pages 676-683, page 676.

If we substitute words like "web" for the "Enlightenment" above, would not the following sound true in March 2001?

The world wide web is both a movement and a state of mind. The web represents a phase in the intellectual history of the world, but also serves to define programs of reform in which influential literati, inspired by common faith in the possibility of a better world, outline specific targets for criticism and proposals for action. The special significance of the world wide web lies in its combination of principle and pragmatism.

You ain't seen anything yet! The distribution of the information on the Human Genome Project via the world wide web, and the sharing of "music files" via Napster, and all sorts of other web-distribution systems are an indication of the changes that are coming upon us as we are part of the new age of enlightenment!

"[Children] Born during a baby bulge that demographers locate between 1979 and 1994, they are as young as five and as old as 20, with the largest slice still a decade away from adolescence. And at 60 million strong, more than three times the size of Generation X, they're the biggest thing to hit the American scene since the 72 million baby boomers. Still too young to have forged a name for themselves, they go by a host of taglines: Generation Y, Echo Boomers, or Millennium Generation. ... Most important though, is the rise of the Internet, which has sped up the fashion life cycle by letting kids everywhere find out about even the most obscure trends as they emerge. It is the Gen Y medium of choice, just as network TV was for boomers. 'Television drives homogeneity,' says Mary Slayton, global director for consumer insights for Nike. 'The Internet drives diversity [stress added].'" Ellen Newborne et al., Generation Y. Business Week, February 15, 1999, pages 80-88, page 82-83.

"And then the revolution came. ... Computers and modems and the mighty Web are as ubiquitous in a child's vocabulary as the multiplication table. ... Experts say that computers, and more importantly, the Internet, are changing the way children learn, develop and think. Amanda Stanley had a computer in her home even when her family chose not to keep a TV or radio in the house. 'I've been around computers all my life,' she said. The 13-year-old [born 1987?], who comes from a family of computer enthusiasts, learned how to paint jeans at a camp last summer. Now she wants to sell her wearable art online. She is enrolled in Giga Gals, a program that started at the Austin [Texas] Children's Museum in February [2000]. Web designers help 9-to 18-year-olds get online and start their own sites, from Web diaries to e-commerce ventures [stress added]." Omar L. Gallaga, For High-Tech Kids, Computers Are The Norm, Not A Novelty. The San Francisco Chronicle, May 29, 2000, page B5.

I do not follow all "video game" technology, but consider the following from 1997, which is surely outdated by now. In 1997 Minoru Arakawa of Nintendo of America (which began in 1979) was interviewed in USAToday and the following appeared concerning 1997 games:

"Q: How do those older games from the '80s compare with games for the current system, the Nintendo 64? Arakawa: It's like a university compared with elementary school. The graphics are so much better. The sound is much better. Everything is much better [stress in original] (Nintendo Plans Zelda 64 For Next Big Play. Mike Snider, USAToday, June 23, 1997, page 10B.

Read about the games of 2001 and the computing power that is available to children:

"After years of development, Microsoft unveiled its highly anticipated Xbox gaming console Saturday [January 6, 2001], promising three times the graphics performance of its rivals and enough power to create real-time shadows, facial emotions and images previously only seen in movies like 'Toy Story.' ... Microsoft claims its 64 megabytes of memory, an Intel 733-megahertz processor and an 8-gigabyte hard drive make it the most powerful of any gaming console" [stress added]." May Wong, Microsoft's Xbox Game Console Steals the Show. The Sacramento Bee, January 7, 2001, page A5

This "toy" probably has more computing power than was available to take us to the moon in 1969 and those Gen X and Gen Y students are going to continue changing at an accelerating pace:

"Ferry Zuiderwijk is still too young to be a commercial pilot or hold an MBA. But at age 17 [born 1984?], he already knows how to keep the nose up while running a sizable airline. ... Simulation games--once the realm of joysticks and quick reflexes--are moving into new territory. Game players now can run theme parks, make Wall Street deals or operate a farm, all without leaving their living room. ... many companies--from aerospace to management firms--could likely start using simulations to enhance their own training efforts and offer virtual experience for would-be executive managers." Matt Moore, 'Sim' Games Grow More realistic. The San Francisco Chconicle, February 19, 2001, page B3.

In reading and thinking and speaking about evolution, combined with rapid technological changes (not to mention "memory killers"), it is easy to understand and sympathize with Jared Diamond when he wrote the following:

"...for almost the whole of our evolutionary history, our repository of knowledge lay in the memories of people, not in books. Therein lies a sad but potent reason for the status of old people being so much lower in many modern, literate societies than in traditional nonliterate societies: many individuals in literate societies no longer believe that their elders are important repositories of knowledge and thus do not perceive them as valuable. In addition, technology is now changing so quickly that certain kinds of experience are soon outdated, as is made clear to me when my eighteen-year-old students watch my sixty-three-year-old eyes glaze over at their efforts to teach me how to use computers" [stress added]." Jared Diamond, Threescore and Ten. Natural History, December 2000-January 2001, Volume 109, Number 10, pages 24-35, page 28.

As an anthropologist, I attempt to keep my eyes from glazing over; I study the present, with an eye on the past, to anticipate (or invent) my future. The Enlightenment of the past, combined with the industrial revolution that began in approximately the 1760s, created the world that we know today; all disciplines are changing and Anthropology is no exception:

"After dedicating their careers to studying exotic cultures in faraway lands, a few anthropologists are coming home. They're taking research techniques they once used in African shantytowns and Himalayan villages to Knights of Columbus halls, corporate office buildings and suburban shopping centers.... [The Anthropologists] study American families the way they would Polynesian cargo cults or Mongolian nomads--by inserting themselves into the daily lives of their subjects" [stress added]." Matt Crenson, Anthropologists Among Us. The Modesto Bee, July 17, 2000, pages D1 and D2.

There is an on-going project that anthropologists are currently involved in, entitled "The Silicon Valley Cultures Project" (on the WWW) and it is called to your attention as an example of some contemporary anthropological work: we cannot predict the future, but we can constantly invent and re-invent ourselves! Other individuals are "doing anthropology" in other not-so-esoteric areas:

"Calling it ethnographic or observational research, agencies are sending anthropologists and other trained observers into the field and the screening room to chart the hidden recesses of consumer behavior [stress added]." Gerry Khermouch, 2001, Consumers In The Mist: Mad[ison] Ave's Anthropologists Are Unearthing Our Secrets. Business Week, February 26, 2001, page 92-94, page 92.

"Dr. Steven Smith encountered a virtual smorgasbord of cultures when he started tending patients at Molina Medical Center in south Sacramento two weeks ago. ... along with such wide diversity come cultural conundrums.... That's where Margie Akin comes in. As a cultural and linguistic services specialist since 1999 for Long Beach-based Molina Healtchare of California, Akin has helped Molina's employees understand how patients' cultural backgrounds affect their approach to health care. With a doctorate in anthropology from the University of California, Riverside, Akin is uniquely qualified for the position, which is mandated by the state and held by an administrator at most health-care companies [stress added]." Erika Chavez, Cultural Specialists Helps Fine-Tine Health Care. The Sacramento Bee, February 18, 2001, page B1 and B7, page B1.

Anthropology, which is the study of people, must change as people change. Information is coming to human beings from various "new" sources, such as simulations and the Internet and the World Wide Web (when the computer and electricity and software all work together), but please don't forget books, which tell us about the past so we can understand the present and possibly invent the future:

"A home without a library lacks diversity of voices, opinions and world views. When you read a book, you enter another person's perspective. And because a reader can put the book down and think about what the author has said, a good reader enters a dialogue with the author or the characters created by the author. One can reread passages and linger over thoughts or ideas or savor the deliciousness of the language. Television, even at its best, lacks diversity and the ability of a viewer to carry on an inner dialogue with the speakers or the authors of the program. Books encourage thinking. A reader must create images from the words the author has supplied, must imagine the events described, must track the plot or the logic of the writer and must visualize the main characters in the mind's eye. The book is in your hands. You can return to passages if there is something you don't understand. You can argue with the author in your head; you can nod in agreement. You learn, unconsciously, the way words can fit together--sometimes so well that they seem inevitable and irresistible [stress added]." Charles Levendosky, Read a banned book, give one to your children. The Sacramento Bee, October 2, 1999, page B7)

Please note the words of Clifford Stoll, 1989 author of The Cuckoo's Egg: Tracking A Spy Through The Maze Of Computer Espionage and 1995 author of the best-selling Silicon Snake Oil: Second Thoughts On The Information Highway, who had the following to say concerning the relative value of computers and books:

"Today, however, the bargains are on paper, not on disk. Don't believe me? Spend seventy dollars on an atlas at your bookstore. While you're paging through it, notice its precise colors and logical layout. Now think of the hundred dollars you've saved by avoiding those map-making CD-ROMS, with cruder resolution and no topography. Twenty years from now, you'll still read that atlas and dream of faraway places; the software will be long since obsolete and unusable [stress added]." Clifford Stoll, 1995, Silicon Snake Oil: Second Thoughts On The Information Highway, pages 140-141.

Incidentally, lest I seem too exuberant about the future and technology (and hence my citation of Stoll), a caveat is also in order as one considers trends; as the creator of "Dilbert" has written: "Something unexpected always happens to wreck any good trend." Scott Adams, 1997, The Dilbert Future: Thriving on Stupidity in the 21st Century (Harper Collins), page 5.

 

CONCLUSIONS (OF A SORT)

"When this circuit learns your job, what are you going to do?"
Marshall McLuhan & Quentin Fiore (1967), The Medium Is The Massage, page 20.

"Any teacher who can be replaced by a computer deserves to be!
[stress added!]" (David Smith; as cited by Mike Cooley, 1999, Human-Centered Design.
Information Design, edited by Robert Jacobson (MIT Press), pages 59-81, page 73.

"Nothing is easier than self-deceit.
For what each man [or woman] wishes,
that he also believes to be true."
The Greek Orator Demosthenes, c. 384->322 B.C.

From my limited perspective, one of the problems in attempting to make "predictions" about the future is our collective hubris about both the present and the past! I am of the opinion that "experts" of every generation view the activities of that generation as the most exciting, important, and "progressive" events that ever occurred! In my classes I attempt to convey the importance of "the past" (or history) in understanding the present (or contemporary events) and stress the importance of evolution and placing things into perspective. Consider, if you will, the following words:

"Nobody who has paid any attention to the peculiar features of our present era, will doubt for moment we are living at a period of most wonderful transition which tends rapidly to accomplish that great end, to which indeed, all history points--realization of the unity of mankind. . . . The distances which separated the different nations and parts of the globe are rapidly vanishing before the achievements of modern invention, and we can traverse them with incredible ease; the languages of all nations are known, and their acquirement placed within the reach of everybody; thought is communicated with the rapidity, and even by the power, of lightning. On the other hand, the great principle of the division of labor, which may be called the moving power of civilization, is being extended to all branches of science, industry, and art. . . . The products of all quarters of the globe are placed at our disposal, and we have only to choose which is the best and cheapest for our purposes, and the powers of production are entrusted to the stimulus of competition and capital [stress added]."

These words come not from 2001 but from the May 1, 1851 Inaugural Address of the Prince Consort Albert (1819-1861), on the occasion of the opening of the "Great Exhibition of the Works of Industry of all Nations" held in London (Michael Sorkin, 1992, Variations On A Theme Park: The New American City And The End of Public Space, page 209). Every generation finds it difficult to believe that that which comes after the present generation might in fact be better than that which is happening right now.

Change is the natural order of things. Along with our inability to predict the future (we can only invent it), cities and colleges of the future will change. Colleges and Universities won't disappear, but they will be altered; consider, however, the not-so-sanguine 1997 statement by Peter Drucker in Forbes magazine: "Thirty years from now big university campuses will be relics" (Peter Drucker, Forbes, March 10, 1997, pages 126-127). Don Tapscott, 1998 author of the influential Growing Up Digital: The Rise of the Net Generation, had this to say about the article:

"Educators really took note when none other than Peter Drucker shocked the post-secondary world in the March 10, 1997 issue of Forbes magazine. Confirming leading educators' worst nightmare, he stated publicly: 'Thirty years from now big university campuses will be relics.' Referring to the impact of the digital revolution, Drucker said: 'It is as large a change as when we first got the printed book.' He continued: 'It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the big change. ... Already we are beginning to deliver more lectures and classes off campus via satellite or two way video at a fraction of the cost. The college won't survive as a residential institution. Today's buildings are hopelessly unsuited and totally unneeded [stress added]." Don Tapscott, 1998, Growing Up Digital: The Rise of the Net Generation (McGraw Hill), page 153; and see Robert Lenzner and Stephen H. Johnson, 1997, Seeing Things As They really Are. Forbes, March 10, 1997, pages 122-128.

Higher education aside, my best guess about the future is that it cannot be predicted: too many things can happen, including totally random events that no one has control over! The future can, however, be invented and we will have to learn new things "in the future." In a very interesting 1998 publication dealing with the field of "futures" by William A. Sherden, entitled The Fortune Sellers: The Big Business Of Buying And Selling Predictions, the author pointed out the following:

"Analysis of the track record of forecasters over the past several decades shows that long-term technology predictions have been wrong about 80 percent of the time. Such was the finding of Steven Schnaars, [1989] author of Megamistakes: Forecasting and the Myth of Rapid Technological Change and an associate of marketing at Baruch College...." William A. Sherden, 1998, The Fortune Sellers: The Big Business Of Buying And Selling Predictions (John Wiley & Sons), pages 169-170.

Schnaars's 1989 book, however, did point out that "not all forecasts failed. Some successfully foresaw events that produced vibrant market growths" (Steven P. Schnaars, 1989, Megamistakes: Forecasting and the Myth of Rapid Technological Change, page 107) and he came up with three general guidelines, the first of which places Drucker into some perspective, places my own technological biases into perspective, and perhaps explains (in hindsight) some of the dot-com failures we have witnessed in the past months:

"The most obvious advice to be gleaned from a study of past forecasts is to avoid falling in love with the underlying technology. The most outlandish errors uncovered in this study failed for this reason. [and] Ask Fundamental Questions About Markets [and] Stress Cost-Benefit Analysis: The most fundamental question to ask of a growth market forecast is whether or not the product upon which it is based provides customers with something special and does so at a price that both the customer and the manufacturer will accept [stress added]". Steven P. Schnaars, 1989, Megamistakes: Forecasting and the Myth of Rapid Technological Change, pages 143-147.

I guess this wasn't the case when we saw the following in December 2000:

DECEMBER 2000} "So far this month, U.S. Internet companies have cut 10,459 jobs, up 19 percent from November's record total of 8,789 cuts....That make for a total of 45,515 dot-com layoffs since last December...." The San Francisco Chronicle, December 28, 2000, page B1.

And then there was the poignant headline in The Wall Street Journal of March 8, 2001: "E-Business Booster Mary Meeker Becomes E-Lusive." Meeker reportedly had a pay package of $15,000,000 for 1999 and the Journal reprinted some of her "comments" (predictions?), such as the one from April 1999: "There is no doubt in my mind that the aggregate market value for the Internet sector will be a lot higher in three years than it is today." Considering that some of the stocks she recommended have fallen anywhere from 67% to 96% in the past year, she did say "three years" so....perhaps by 2002 we will see....? Randall Smith & Mylene Mangalindan, 2001, E-Business Booster Mary Meeker Becomes E-Lusive. The Wall Street Journal, March 8, 2001, pages C1 and C4.

In discussing the future, I state the following: if professional forecasters can't predict the future, what makes anybody think anyone can? We all make educated guesses and that is the best we can do.

I am reminded of the following phrase: a futurist is an expert who will know tomorrow why the things that were predicted yesterday didn't happen today! That definition is right in league with the one about a consultant: an individual who takes the watch off your wrist and tells you the time! Look around you and see what is happening now.

Recently, the following came to me off the Internet: you know you are living in the year 2001 when:

#1. You just tried to enter your password on the microwave.
#2. Your daughter sells Girl Scout cookies via her web site.
#3. You exchange e-mail with individuals from half-way-around the world, but don't know your neighbor.
#4. Your grandmother asks for a GIF file of your newborn so she can make a screensaver for her PC.
#5. Your idea of being organized is multiple-colored Post-It Notes.
#6. And you get most of your jokes via e-mail instead of in person!

And if this is happening in 2001, what about 2020? Another interesting article is called to your attention: "What you'll need to know in twenty years that you don't know now" (by Joseph D'Agnese, Discover, Vol. 21, No. 10, October, pages 58-61); among the items the author mentioned are these:

"By the year 2020, you will need to know stuff you can hardly guess today....you will need to know how to talk to your house....you will have to learn to drive a more automated car....you'll need to know enough to make more complicated medical choices....you'll need to access your betrothed's genetic map....you will always need to know if the facts you've dredged up are accurate and truthful....you will be forced to take on moral questions no human has ever faced. When will you find time to do that? How will you contemplate when everything is speeding up and time for reflection is practically nonexistent? That's you in 20 years. Like the machine that inspired your age, you will be constantly scanning, processing, sifting, searching for a code to guide you through. And yet the key, the compass, the answer, was once offered in a temple at Delphi. What you will need to know in 2020? Yourself [stress added]."

TO END: I like the power of words (and visuals!) and I finally begin to really end with "the world is full of butterflies" (Ivars Peterson, 1998, The Jungles of Randomness: A Mathematical Safari, page 145). It refers to the mathematics of "Chaos Theory" and that butterfly which became so popular in a 1972 paper by Edward Lorenz: "Does the flap of a butterfly's wings in Brazil set off a Tornado in Texas?" In brief, little things can have a tremendous impact, and we will never know...until we experience it. An article on February 25, 2001, was most appropriate concerning the impact of "little things" (in this case, a computer program):

"At 57 kilobytes, the tiny program is no more than the computer equivalent of a heartbeat. But what it does is frightening to anyone with a financial stake in the entertainment industry: It enables you to copy movies on any computer with a DVD [Digital Versatile Disk] drive and send them out unscrambled on the Internet [stress added]" Doug Mellgren, Hacker Hero. San Francisco Chronicle, February 25, 2001, pages D3 and D4, page D3.

Again, little things can have a tremendous impact.

We deal with the future by inventing it and by building on the known as one moves into the unknown: thank goodness that the world is full of some wonderful and interesting people and beautiful butterflies and the unknown! As my father-in-law, the American educator Ralph H. Thompson (1911-1987), once wrote:

"The cutting edge of knowledge is not in the known but in the unknown, not in knowing but in questioning. Facts, concepts, generalizations, and theories are dull instruments unless they are honed to a sharp edge by persistent inquiry about the unknown." Ralph H. Thompson, 1969, Learning To Question. The Journal Of Higher Education, Vol. XL, No. 6, pages 467-472, page 467.

Questioning is important but the answers that individuals provide are also important, for they tell you something about the respondents and tell you something about the educational system that they are building upon for their future; you might be interested in the following which appeared in the San Francisco Examiner on July 4, 1999:

For many Bay Area students, Independence Day means hot dogs, family picnics, fireworks--and not much else. Out of four dozen teens quizzed in an informal survey in San Francisco, Concord and Pacifica, most knew that the Fourth of July had something to do with America's independence, but less than half could name the country from which we won our freedom. 'Japan or something. China. Somewhere out there on the other side of the world,' said... [a 14 year old and a 17 year old added:] It's like freedom. Some war was fought and we won, so we got our freedom.' As to which country we had been fighting, 'I don't know.... I don't, even, like, have a clue [said a 17 year old 1999 high school graduate]. 'It wouldn't be Canada, would it?' guessed [a 13 year old high school freshman].... 'We're not in school right now, so you asked the wrong kids.' The unscientific survey was conducted at Stonestown Galleria in San Francisco, Sunvalley Mall in Concord and the Linda Mar shopping center in Pacifica. Many of those who correctly identified England as our adversary in the Revolutionary War did so only after some thought. 'Was it somewhere in Europe, like France? Germany? Russia? Let me think' [said a 17 year old 1999 high school graduate].... 'Wasn't it Great Britain? I just had to think.' 'I'm gonna have to go with Spain' [said a 14 year old high school freshman and someone else].... correctly answered that we fought the Revolutionary War before World War II. But was it before or after the Civil War? ... couldn't say. 'After. I think it was after' [said a 14 year old friend and a 19 year old high school graduate] ... who declined to give his last name, said he knew we celebrate the Fourth because it's Independence Day. But the country we were fighting with? 'That I don't (know),' he said. 'I want to say Korea. I'm tripping.' Asked how long ago it might have been...took a guess. 'Like 50 years,' he said. One student wondered aloud whether the Fourth of July was somehow related to Pearl Harbor. Another was not sure whether our independence came before or after the Vietnam War. ... A 1994 study of several thousand eighth- and 12th-graders across the country tested the students' knowledge of basic history. Thirty-nine percent of eighth graders scored at a level considered below their basic proficiency; an even higher number--57 percent--of high school seniors scored below the basic level. The study was conducted by the U.S. Department of Education's National Center for Education Statistics. Adults may do better. According to a Gallup Poll conducted last weekend, a majority of Americans can correctly identify what the Fourth of July is all about. When asked to name the country from which we gained our independence, 76 percent correctly named Great Britain or England. Nineteen percent were unsure. The results were based on telephone interviews of a randomly selected national sample of 1,016 adults [stress added]. Emily Gurnon, Fourth of July: Kids Unclear on Concept. San Francisco Examiner, July 4, pages 1 and A9.
and
A question for you before you set off for your Independence Day fireworks: Who was the American general at Yorktown? You have four choices: William Tecumseh Sherman, Ulysses S. Grant, Douglas MacArthur or George Washington. When that question was asked last year of 556 randomly chosen seniors at 55 top-rated colleges and universities, one out of three got it right [stress added]. David Broder, 2000, Those dummies from Harvard and Duke. The San Francisco Examiner, July 2, 2000, page C7.

I am really an optimist when it comes to the human condition, for as Sir Winston Churchill (1874-1965) once remarked, "I am an optimist. It does not seem too much use being anything else." The wonderful Jane Goodall (born 1934) wrote the following in 1999:

"My reasons for hope are fourfold: (1) the human brain; (2) the resilience of nature; (3) the energy and enthusiasm that is found or can be found or can be kindled among young people worldwide; and (4) the indomitable human spirit [stress added]." Jane Goodall [with Phillip Berman], 1999, Reason For Hope: A Spiritual Journey (NY: Warner Books), page 233.

More recently, I also like the words from an article in The Sacramento Bee of February 18, 2001:

"There's nothing extraordinary to act as if we have a bond with the future. It is something most of us do in our private lives without hesitation. We have children; we take out 30-year mortgages. We put away money in a college fund even before the children are out of diapers. We salt away dollars for retirement and put in landscaping even though we know we're likely to be transferred five times before the yard resembles the designers sketch. In our private lives, the future is real and tangible--real enough to plan and scrimp for [stress added]." Mark Paul, Tax Cut? No - Put A Trust Fund In Every Crib. The Sacramento Bee, February 18, 2001, Pages L1 and L3, page L1.

The future is real and tangible for all of us until the moment we cease to be alive (and that is an entirely different story) and while we absolutely cannot predict the future, we can certainly build and invent it! Finally, on "teaching" and computers (since that is what I do at CSU, Chico, I "teach" and utilize computers), I end with the following from the previously quoted Clifford Stoll:

"What equipment do you need for an ordinary classroom? Desks, chalkboard, and an eraser. The center of the classroom isn't a thing, but a person [stress added]." Clifford Stoll, 1999, High-Tech Heretic: Why Computers Don't Belong in the Classroom and Other Refledctions by a Computer Contrarian (NY: Doubleday), page 94.
# # #


This presentation draws upon some earlier ideas, words, and publications currently on the web (from most recent to oldest):

http://www.csuchico.edu/~curban/ChicoFireDeptMay2000.html [May 2000 paper]
http://www.csuchico.edu/~curban/LeadershipChicoMarch2000.htm [March 2000 paper]
http://www.csuchico.edu/~curban/14th_ICAES.html [July 1998 Gambling/Gaming Paper]
http://www.csuchico.edu/~curban/Jan'98_Millennium_Paper.html [January 1998 paper]
http://www.csuchico.edu/~curban/Unpub_Papers/1991AAAS.html [1991 paper on Education and Technology]
http://www.csuchico.edu/~curban/Forum/March1990.html [1990 paper on Science Fiction/Science Fact]
http://www.csuchico.edu/~curban/Unpub_Papers/1977SETIPaper.html [1977 paper on Evolution, Technology, and Civilization]

Other interesting (and somewhat appropriate) web sites include:

http://www.darwinawards.com/ [Official Darwin Awards]
http://www.sjsu.edu/depts/anthropology/svcp/ [The Silicon Valley Cultures Project]
http://www.wfs.org/ [The World Future Society]
http://www.nychinatown.com/eud_future.htm [Future Predictions} Various]
http://directory.google.com/Top/Society/Future/Predictions/ [Google Web Directory} Society->Future->Predictions]
http://kurellian.tripod.com/ [An Illustrative Speculative Timeline of Future Technology and Social Change]
http://www.unitedmedia.com/comics/dilbert/ [The Official Dilbert Web Site]

and finally

http://www.wired.com/wired/archive/8.04/joy.html [Why The Future Doesn't Need us} Provocative article by Bill Joy} co-founder and Chief Scientist of Sun Microsystems]


[1] © For a presentation at the Redding, California, Chamber of Commerce Luncheon meeting on March 12, 2001. Charlie Urbanowicz has been a member of the faculty at CSU, Chico since August 1973.





Copyright © 2001 (all rights reserved).

11 March 2001 by CFU


# # #