Dr. Charles F. Urbanowicz/Professor of Anthropology
California State University, Chico
Chico, California 95929-0400

(530-898-6220 [Office]; 530-898-6192 [Dept.] FAX: 530-898-6824)

5 January 1998[1]



Twenty-five years ago, in 1973, the Urbanowicz Family came to Chico; what might California State University, Chico, be like in 2023? What will the student of 2023 be like? Everything is relative to the year in which you were born and to the years in which you grew up and became adjusted to "your" culture. Professional forecasters cannot predict the future, and neither can anyone else. People make educated guesses, and that is the best we can do: we deal with the future by inventing it and by building on the known as we move into the unknown. Finally, thank goodness that the world is full of beautiful butterflies.


"The wisest prophets make sure of the event first."
Horace Walpole (1717-1797)

The title of this brief presentation is based on the fact that hindsight has been described as always being 20-20, while foresight, or seeing into the future, is more of a 50-50 chance! Either something will take place in the future or it will not; but with 20-20 hindsight (and hindsight is defined as "recognition of the nature and requirements of a situation, event, etc., after its occurrence"), we can look to the past and say "see what happened because of such-and-such? Why, it was inevitable!" Was it? As the distinguished Lawrence Peter (Yogi) Berra once remarked: "Prediction is very hard, especially when it's about the future." The "working title" I had for this presentation was "The Future Certainly Can't Be Predicted, But Perhaps It Can Be Invented!" This presentation will be augmented by transparencies to emphasize various points and will also involve some audience participation!


"How you think about who you are right now
has everything to do with what will happen to you in the future."
C.C. Carter, Chico Enterprise-Record, May 16, 1997, page 12A.

My choice of 25-25, as opposed to 20-20, stems from the fact that I was "looking" for something to "hang" this presentation on: twenty-five years ago, in 1973, the Urbanowicz Family (Charlie, Sadie, and nine-month old son Tom) came to Chico; what might the world look like twenty-five years into the future, in the year 2023, when my wife and I hope to be 81 years young and our son 51 years young, and our grandchildren merely in their twenties? What will California State University, Chico be like in 2023? What will the student of 2023 be like? Not to mention the faculty and staff and ever-changing technology!

I'll share some "guesses" about the future below (and I'll be asking you for some of your "best guesses" about the "futures" before us, and I'll be asking you for some "guesses" about previous events), but I do know that twenty-five years ago, on January 8, 1973, in the United States District Court in Washington, D.C., the five defendants in the "Watergate" burglary pleaded "guilty" in the courtroom of Chief Judge Sirica. Do you remember Watergate? Does the name John Sirica "ring a bell" to you? Watergate and Sirica mean something to me, but for most of the 130 students in my classes in Fall 1997 that was old history: most of the Freshmen in my Fall 1997 classes were born in 1979. For the student of 2023, the events of 1973 will really be ancient history, just as the events of 1948 are ancient history to the students of 1998!

The end of 1972 was also interesting because for the first time in its history, on November 14, 1972, the fabled Dow Jones Industrial Average reached a record 1,003.16 points, shattering the 1000 point barrier for the first time! Wow! But by December 1974, the DJI was back down to 577.6!

As of January 1, 1998, the DJI was at 7,908.25; could anyone have "predicted" that twenty-five years ago? No. Can anyone "predict" what the DJI will be in twenty-five years? No again. Incidentally, after the "great" 1000-point breakthrough of 1972, probably not very many people predicted that within ten years the DJI would be down again to 776.92 (in August of 1982), the year in which the "Bull Market" we are seeing today actually began. Consider also that $0.58 gambled on Microsoft in 1986, when Microsoft stock first went public, was worth approximately $142 at the end of 1997; $5.80 gambled on Microsoft would be worth $1,420; $58 would be worth $14,200; $580 would be worth $142,000; or...! Please note that I used the term "gamble" instead of the more popular term "investment," for regardless of what anyone says, I (and many others) believe the stock market to be very large-scale gambling, and the words of Samuel Clemens, also known as Mark Twain (1835-1910), are worth considering:

"October is one of the peculiarly dangerous months to speculate in stocks. The others are July, January, September, April, November, May, March, June, December, August, and February." In William A. Sherden, 1998, The Fortune Sellers: The Big Business of Buying And Selling Predictions (John Wiley), page 96.
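Twain's warning aside, the string of Microsoft multiples quoted above is simple compound scaling; a minimal sketch in Python (the $0.58 and $142 figures come from this text, not from market data):

```python
# Illustrative arithmetic only: split-adjusted figures as quoted in the text.
ipo_price = 0.58      # per-share "gamble" at the 1986 public offering
price_1997 = 142.00   # approximate value at the end of 1997

multiple = price_1997 / ipo_price  # roughly a 245-fold gain

for stake in (0.58, 5.80, 58.00, 580.00):
    print(f"${stake:>7.2f} gambled in 1986 -> ${stake * multiple:>12,.2f} at end of 1997")
```

Each tenfold increase in the stake simply shifts the result by one decimal place, which is all the essay's chain of examples illustrates.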

While many may view the stock market as the best "investment" for long-term gains, please remember that some stocks go "up" in value and some stocks go "down" in value: it is a gamble! I am reminded of some "gas" ventures in the north state many years ago, or perhaps kiwi "investments," or, as the San Francisco Chronicle pointed out on December 27, 1997, "How the Emu Turned Into a Giant Turkey" and the "investors" (please read "gamblers") who lost quite a bit of money thinking these birds would be the next best thing to....! (Jesse Katz, 1997, San Francisco Chronicle, page A4).

The year 1973 saw Roe v. Wade decided in the United States Supreme Court (by seven to two), the first twenty Prisoners of War returned from Vietnam, and 84 United States Congressmen sponsoring 22 bills calling for the impeachment of President Richard M. Nixon. The House Judiciary Committee began its impeachment inquiry in 1974, and the rest is history. Could anyone have predicted that a "third-rate burglary" would result in the 1974 resignation of Nixon and in President Ford becoming the first president of the United States to reach office without having been elected by the public? No. (Recall that Nixon's Vice-Presidential running mate, Spiro T. Agnew, resigned from office in October 1973, pleading nolo contendere to a federal charge of income tax evasion: failure to pay taxes on a bribe he received when he was Governor of Maryland in 1967! Who could have predicted that? No one.)

Alfred North Whitehead (1861-1947) once remarked that "it is the business of the future to be dangerous" and perhaps even more dangerous is trying to predict the future: because we can't predict it! By definition, the future is unknowable or unknown: it is "time that is to be or come hereafter; something that will exist or happen in future time." I will argue, however, that we can possibly invent the future; but we can invent the future only if (#1) we know what the present is all about and only if (#2) we know what the past was like that led up to the present.

In 1973, the first gene-splicing took place, East and West Germany finally established formal diplomatic relations, and M*A*S*H, Kojak, and Hawaii Five-O (along with the Mary Tyler Moore Show and Maude) were among the top television shows of the year; All In The Family was the number one American show, with some of the "immortal" words of Archie Bunker being: "Look. Archie Bunker ain't no bigot. I'm the first to say--look, it ain't your fault you're colored." The year 1973 also gave us Woody Allen's superb movie Sleeper, with its interesting futuristic perspective on various things, including the 20th century; and a late 1997 report seems to continue to confuse us, for as The Sacramento Bee (and other papers) pointed out on Christmas Eve day:

"Adding a rich dollop of confusion to the question of what's good for you, a new study found that the more saturated fat men eat, the less likely they are to suffer a stroke." (Associated Press, The Sacramento Bee, 24 December 1997, page A9)

In 1973 we got the first computerized brain scanner (and the CAT scan came into being), as well as the "multiple accurate reentry vehicle" (MARV), permitting "accurate multiple missile guidance" (Lois Gordon and Alan Gordon, 1990, American Chronicle: Seven Decades In American Life 1920-1989, page 509). Indeed, the "Cold War" was real, and sometimes it is amazing to me that we didn't annihilate ourselves in the 1960s, 1970s, or 1980s, considering that in 1974 India became the 6th nation with nuclear capabilities; and according to the Atomic Energy Commission, at the end of 1996 perhaps thirty-two nations had nuclear reactors in operation (not to mention at least twelve nations with the potential to unleash biological warfare).

When did nuclear proliferation finally begin to end? It wasn't when the "Three Mile Island" near-disaster occurred in Pennsylvania in 1979 but in April 1986, after the Chernobyl disaster in Ukraine, then part of the Soviet Union. Could anyone have predicted the results from Chernobyl? No. Could anyone have predicted the break-up of the Soviet Union or the (somewhat) peaceful reunification of Germany? No. Incidentally, two authors writing about the future, Richard Carlson and Bruce Goldman, in their 1994 book entitled Fast Forward: Where Technology, Demographics, And History Will Take America And The World In The Next 30 Years, described the "Cold War" as World War III and wrote that World War IV has already started: "The chief conflict among developed nations over the next decades will be economic in nature...." (1994, page 7).

See my point? Everything is relative to the year in which you were born and to the years in which you grew up and became adjusted to "your" culture. Incidentally, on January 8, 1973, Elvis Presley celebrated his 38th birthday: he wasn't to die until 1977, and does anyone remember when "Elvis-the-Pelvis" was censored from prime-time television in the 1950s for his bodily gyrations? And now on television one has....what will the entertainment of 2023 be like? On television, on the Internet, or with interactive CD-ROMs?!


"When you ferret out something for yourself, piecing the clues together unaided, it remains for the rest of your life in some way truer than facts you are merely taught, and freer from onslaughts of doubt." Colin Fletcher, 1968, The Man Who Walked Through Time, page 109.

Moving quickly into the future (I will not be giving a year-by-year description of events, but I do ask you to check your own personal data banks for historical events and personages that you remember): in 1974 the cost of mailing a first-class letter was increased to ten cents and the first digital watches appeared; someday terms like "clockwise" and "counterclockwise" will be nonsensical phrases to the children of the digital age! (Or will they only mean something to racetrack fans? But that is an entirely different story: I am interested in "gaming" or "gambling" in America and the future of that!)

And the computer? History provides us with a 1943 statement attributed to Thomas J. Watson, Chairman of the Board of IBM: "I think there is a world market for about five computers." (In Christopher Cerf and Victor Navasky, 1984, The Experts Speak: The Definitive Compendium of Authoritative Misinformation, page 208). I was born in 1942, and I suppose that if my family had had the money to invest in stocks (they didn't), they might have followed the advice of this eminent individual and not invested in any "computer-related stock," and they (and he) would have been wrong! But they didn't have any money and they didn't invest, and so that historical track never occurred: but what of the people who might have followed those erroneous words? How many people might have made money if they had gambled on those new-fangled computer things?

In order to continue placing today into a somewhat skewed perspective and to make some "guesses" about inventing our futures: the year 1976 not only brought the bicentennial to this country, but shortly thereafter the "Personal Computer" appeared, with the introduction of the first Apple computers, then the Commodore PET computer, and then the Tandy/Radio Shack TRS-80. Twenty-one years ago, however, in 1977, Ken Olsen, President of the then Digital Equipment Corporation (or DEC), stated "There is no reason for any individual to have a computer in their home" (In Christopher Cerf and Victor Navasky, 1984, The Experts Speak: The Definitive Compendium of Authoritative Misinformation, page 208).

In 1995, a survey was conducted by the Electronic Industries Association which reported that personal computers were in 40 percent of American households, computers with CD-ROMs were in 19 percent of American households, and cordless telephones were in 59 percent of American households. Television penetration was at the 98 percent level. (In Peter F. Eder, 1997, The Emerging Interactive Society, The Futurist, Vol. 31, No. 3, pages 43-46).

According to the U.S. Bureau of the Census, the resident population of the United States, projected to January 1, 1998, was 268,925,519. On July 1, 1997, California had ~32,268,000 residents, and from July 1, 1996 to July 1, 1997, "California gained 410,000 people" (D. Westphal, State's Growth Rebound, The Sacramento Bee, Dec. 31, 1997, page 1). California has roughly 12 percent of the nation's population, and United States census projections for the year 2023 range from a low figure of 290,193,000 to a high figure of 371,423,000 residents. By July 1, 2025, California could have a population of 49,285,000. There is a population boom(let) approaching. In December 1997 a study appeared in Wired magazine (a truly digital publication) that pointed out that some 4,000,000 children alone have Internet access:

"Kids' bedrooms are now the nerve center for all things tech in the home. Of the 76 percent of children who have their own rooms, many are equipped with cable television, telephones, or VCRs - and some 4 million have Net access. More good news: 71 percent of those with both a TV and a computer said they would give up television if forced to choose. See ya, Barney." Michael Behar, 1997, Wired, Vol. 5.12, Page 108.

In February 1997, The Wall Street Journal provided an interesting perspective on our near future:

"A population burst unlike any since the heyday of the baby boom has entered the American system. And although its members are still children, their impact on business and society is already immense. ... The annual number of U.S. births started rising around 1980, ending the baby-bust years. In each of the years from 1989 to 1993, U.S. births exceeded four million for the first time since the early 1960s. Today there are roughly 57 million Americans under age 15--and more than 20 million in the peak years between four and eight. ... 'Technologically, this generation is going to make the Gen-Xers look like fuddy-duddies,' says Frank Gevorsky, a 41-year-old social historian at the Discovery Institute, a Seattle think tank. He predicts that within five years, members of Generation Y will be producing term papers with full-motion video. 'They're on fast-forward,' he says. Generation Y was born into a world so different from the one their parents entered that they could be on different planets [stress added]." Melinda Beck, 1997, Next Population Bulge Shows Its Might. The Wall Street Journal, February 3, 1997, pages B1-B2, page B1.

Perhaps this is why an anthropologist is so interested in technology: individuals, or children, from "different planets!" What an opportunity for cross-cultural research!

In 1998, Don Tapscott published an extremely interesting book that I think should be read by all who are interested in children, technology, and the future: Growing Up Digital: The Rise of the Net Generation, in which he introduces the term N-Gen to describe the children of the digital age (children we hope will choose Chico State as their institution of choice). One can also find information on the "Net Generation" on the World Wide Web. Children are maturing more quickly, both physiologically and possibly mentally, and it appears they are being exposed to more and more information at an ever earlier age (see, for example, Shirley R. Steinberg and Joe L. Kincheloe, Editors, 1997, Kinder-Culture: The Corporate Construction of Childhood), and the educational establishment should be prepared for them; for if we aren't prepared for them, others will be. Consider, if you will, the following about a hypothetical child born in May 1997 who could be entering Chico State in 2014:

"Like no generation before, Alyssa's enters a consumer culture, surrounded by logos, labels, and ads almost from the moment of birth. As an infant, Alyssa may wear Sesame Street diapers and miniature pro basketball jerseys. By the time she's 20 months old, she will start to recognize some of the thousands of brands flashed in front of her each day. At age 7 [in the year 2004], she will see some 20,000 TV commercials a year. By the time she's 12 [in the year 2009], she will have her own entry in the massive data banks of marketers. Multiply Alyssa by 30 million--the number of babies born in this country since 1990--and you have the largest generation to flood the market since their baby boom parents. More impressive than their numbers, though, is their wealth." (Business Week, June 30, 1997, page 62)

There will be a great deal of competition for that wealth and the educational experience that culture requires.


"Any sufficiently advanced technology is indistinguishable from magic."
Profiles of the Future: An Inquiry into the Limits of the Possible
Arthur C. Clarke, 1984, page 26.


"When this circuit learns your job,
what are you going to do?"
The Medium Is The Massage
Marshall McLuhan & Quentin Fiore, 1967, page 20

When the Intel 4004 microprocessor was introduced in a calculator in 1971, it could handle a whopping 400 instructions a second; by 1981 the IBM PC could handle 330,000 instructions a second. Once again, a Wall Street Journal article, this one from December 1997, reported that:

"Today's run-of-the-mill $1,500 PC can handle 200 million instructions a second. Digital's Alpha processor can tick off some one billion instructions per second. Computer power on silicon has increased a million-fold or so in just over a quarter-century [all stress added]" (Rich Karlgaard, Digital Warriors Want Baby Bells' Blood, The Wall Street Journal, December 8, 1997, page A24).
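The growth figures quoted above imply a startling compound rate; a rough back-of-the-envelope sketch (the instructions-per-second milestones are taken from the text):

```python
# Instructions-per-second milestones as quoted in the surrounding text.
ips_1971 = 400            # Intel 4004, in a calculator
ips_1981 = 330_000        # IBM PC
ips_1997 = 200_000_000    # a "run-of-the-mill" $1,500 PC

fold = ips_1997 / ips_1971          # 500,000-fold over 26 years
cagr = fold ** (1 / (1997 - 1971)) - 1

print(f"{fold:,.0f}-fold increase, about {cagr:.0%} per year compounded")
```

Half a million-fold in twenty-six years is consistent with Karlgaard's "million-fold or so" and works out to roughly 66 percent growth per year, every year, for a quarter-century.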

What will it be like in 2023, twenty-five years from now? A late 1997 report by the Semiconductor Industry Association attempted to predict a mere fifteen years into the future, to 2012:

"For example, the report predicts that chip makers will exhaust conventional lithography, the process of printing circuit designs on silicon wafers, as early as 2006. Right now, the smallest features on chips are 0.25 microns in width, or 1/400th as wide as a human hair. At the current rate of progress, the industry by 2006 will be approaching features that are 0.1 microns wide, beyond the reach of existing lithography tools. ... By 2012, the SIA's experts concluded, manufacturers should be able to put 1.4 billion transistors on a thumbnail-sized microprocessor, which will operate at a speed of 2,700 megahertz. Memory chips will hold as much as 275 billion bits of data. By contrast, Intel's current Pentium II microprocessors have 7.5 million transistors and run at [only!] 300 megahertz, while the most popular memory chips only store about 16 million bits of data [stress added]." Dean Takahashi, Chip Firms Face Technological Hurdles That May Curb Growth, Report Suggests. The Wall Street Journal, December 1, 1997, page B8.
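As a quick consistency check on the SIA's transistor projection (the transistor counts are from the quoted report; the roughly two-year doubling time is my inference, not the SIA's):

```python
import math

transistors_1997 = 7.5e6   # Pentium II, per the quoted report
transistors_2012 = 1.4e9   # SIA projection for 2012
span_years = 2012 - 1997

# How many doublings does the 15-year jump imply, and how often?
doublings = math.log2(transistors_2012 / transistors_1997)
print(f"{doublings:.1f} doublings in {span_years} years, "
      f"one roughly every {span_years / doublings:.1f} years")
```

That is one doubling about every two years: the classic Moore's-law pace, simply extended fifteen more years.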

Speed and the decreasing size of the technology are two variables to keep in perspective, especially when one places things into the all-important "context" (recalling IBM's Watson and his 1943 "computer" quote, previously cited):

"It is estimated that during World War II [1941-1945 for the United States, or 1939-1945 for Europeans, or 1933-1945 for China and Japan!], there were about 1 billion vacuum tubes in operation round the world, powering radios, radar systems, and any other form of what was then advanced electronics. These vacuum tubes put end to end would have spanned the globe approximately four times and weighed about 200,000 tons. These days, the same electronic capability embedded in microelectronic circuitry could be delivered by 182 Intel Pentium Pro chips, each carrying 5.5 million transistors, and all 182 chips would easily fit inside a single shoe box [stress added]" (William A. Sherden, 1998, The Fortune Sellers: The Big Business of Buying and Selling Predictions (John Wiley & Sons), page 159).
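Sherden's shoe-box comparison checks out against his own numbers; a trivial verification (figures as quoted):

```python
# Figures as quoted from Sherden's book.
wartime_tubes = 1_000_000_000   # ~1 billion vacuum tubes during World War II
chips = 182                     # Intel Pentium Pro chips
transistors_per_chip = 5_500_000

total = chips * transistors_per_chip
print(f"{chips} chips carry {total:,} transistors, "
      f"vs. {wartime_tubes:,} wartime tubes")
```

182 times 5.5 million is 1.001 billion transistors: one shoe box of 1997 silicon indeed matches the planet's entire wartime stock of tubes, with the 200,000 tons to spare.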

Based on the context of his times, perhaps Watson in 1943 was correct with his "prediction," but contexts change: on December 16, 1947, the transistor was invented (and the rest is historical hindsight), yet Ken Olsen was still wrong in 1977! And the future? IBM has invented a "copper chip," and one prediction is:

"Switching metals could speed a microprocessor up to 40 percent while cheapening its manufacturing cost by up to 30 percent, resulting in computers that think faster and store more information, the company said [stress added]" (Associated Press, San Francisco Chronicle, September 22, 1997).

And then? The age of the $500 personal computer, with circuits "as thin as .25 micron--1/400th the width of a human hair--on silicon..." (Dean Takahashi, 1997, Envisioning The Era of the $500 PC, The Wall Street Journal, pages B1 and B7)? Or, finally (for now), a Texas Instruments announcement in December 1997 of xerogel bubbles:

"These 'xerogel' bubbles, made from silicon dioxide, are really tiny--a mere 0.001 microns across. You would need 100,000 of the bubbles to span the stump of a human hair. They are small enough to coat circuit lines, which are expected to shrivel to 0.1 microns by 2010, enabling chips to be crammed with 500 million transistors--almost 100 times today's mightiest chips [stress added]" (Otis Port, 1997, The Secret in TI's Chips: Bubbles, Business Week, December 22, page 79).
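Reassuringly, the two hair-width comparisons in the quoted reports agree with each other; a quick check (treating a human hair as the implied 100 microns):

```python
# Both quoted reports implicitly treat a human hair as ~100 microns wide.
feature_width = 0.25                  # microns; "1/400th as wide as a human hair" (SIA)
hair_from_sia = feature_width * 400   # -> 100 microns

bubble_width = 0.001                  # microns; one xerogel bubble (TI)
hair_from_ti = 100_000 * bubble_width # "100,000 ... to span ... a hair" -> 100 microns

print(f"SIA implies {hair_from_sia:.0f} microns; TI implies {hair_from_ti:.0f} microns")
```

Both articles back out the same ~100-micron hair, so their size comparisons are at least internally consistent.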

But could there be more in the future? Deoxyribonucleic acid, or DNA, as computer, with "calculations [in] trillionths of a second, a thousand times faster than the fastest supercomputer" (Michael Stroh, 1997, The Next Frontier: DNA Computers, The Sacramento Bee, December 23, pages A1 and A14, page A14).

What next? Things are shrinking and getting faster and although there are still only 168 hours in a week, some individuals are doing something about that!

"A group of computer programmers at Tsinghua University in Beijing is writing software using Java technology. They work for IBM. At the end of each day, they send their work over the Internet to an IBM facility in Seattle. There, programmers build on it and use the Internet to zap it 5,222 miles to the Institute of Computer Science in Belarus and Software House Group in Latvia. From there, the work is sent east to India's Tata Group, which passes the software back to Tsinghua by morning in Beijing, back to Seattle and so on in a great global relay that never ceases until the project is done. 'We call it Java Around the Clock,' says John Patrick, vice president of Internet technology for IBM. 'It's like we've created a 48-hour day through the Internet [stress added].'" (USA Today, April 24, 1997, page B1).

Given the times we live in, I can think of no better statement that strongly encourages California State University, Chico, to increase involvement in various educational technology delivery systems than the following statement by Minoru Arakawa which appeared in USA Today of June 23, 1997. He was discussing the activities of Nintendo of America, started in 1979, and the following question and answer appeared:

"Q: How do those older games from the '80s compare with games for the current system, the Nintendo 64? Arakawa: It's like a university compared with elementary school. The graphics are so much better. The sound is much better. Everything is much better [stress in original]" (Mike Snider, Nintendo Plans Zelda 64 For Next Big Play, USA Today, June 23, 1997, page 10B).

"Everything is much better," and this is what our future (and current) students expect (and will expect) when it comes to educational delivery systems; these are the students referred to earlier in this paper: the members of Generation Y who will be producing term papers with full-motion video. "They're on fast-forward," and they were "born into a world so different from the one their parents entered that they could be on different planets [stress added]" (Melinda Beck, 1997, Next Population Bulge Shows Its Might, The Wall Street Journal, February 3, 1997, pages B1-B2, page B1). Perhaps "everything will also be different" is another phrase that we should consider!


"'We used to educate farmers to be farmers, factory workers to be factory workers, teachers to be teachers, men to be men, women to be women.' The future demands 'renaissance people. You can't be productive in the information age if you don't know how to talk to a diverse population, use a computer, understand a world view instead of a parochial view, write, speak.'" (In Byrd L. Jones and Robert W. Maloy, 1996, Schools For An Information Age: Reconstructing Foundations For Learning And Teaching, page 15).

In the Spring of 1997, my wife and I completed a 10,000-mile cross-country sabbatical trip and visited 26 institutions that were using various forms of educational technology in their classrooms. In the report submitted to the Provost, and in the original sabbatical proposal of 1994, I incorporated the following words of Marshall McLuhan (1911-1980), and I still stand by them:

"The speed of information movement in the global village means that every human action or event involves everybody in the village in the consequences of every event [if they wish to or if they can take part]. The new human settlement in terms of the contracted global village has to take into account the new factor of [potential] total involvement of each of us in the lives and actions of all. In the age of electricity and automation, the globe becomes a community of continuous learning, a single campus in which everybody irrespective of age, is involved in learning a living [stress added]." Marshall McLuhan, 1969, Counterblast, page 41.

Higher education, and all education (public and private and industrial), is changing and the various current electronic technologies are the latest attempt to convey information to the latest group of students. "On April 1, 1997, 69.3 mil[lion] Americans (26%) were under 18 years old" and they will be entering, or are already in, the school system (The World Almanac and Book of Facts, 1998 edition, 1997, page 376).

The main constituents of California State University, Chico, are our students and the secondary constituents are parents and legislators (and they can also be the same). It behooves every academic and non-academic unit on campus to continue to give the best to our students both in terms of personal contact and information technologies for their entrance into the 21st century; and we need the best possible information in order to make the best possible "guesses" about who is coming to college in the millennium; and we also need to get our best information out to our potential students!

Change is ever-present, from Marcus Aurelius, through the 18th and 19th centuries, and into the present. Institutions of higher education must change to keep up with the electronic environment. As cited in the sabbatical report:

"Colleges will not, of course, disappear--but over time they will be dramatically altered in nature as students and professors adopt cyberspace as their primary window into the laboratory of life. The distinctions between academic and applied research will become blurred as academic and commercial researchers begin to tap into the same sources of information and exchange in cyberspace [stress added]." David B. Whittle, 1997, Cyberspace: The Human Dimension, page 217.

"Dramatic" changes will come about because the very environment of the next twenty-five years will be radically altered as a result of the electronic revolution upon us; and although I remain positive, please consider the 1996 words of Winn Schwartau, a cyberspace expert:

"Colleges and universities will be replaced with a higher educational database that provides personally tailored interactive instruction and testing." Winn Schwartau, 1996, Information Warfare: Cyberterrorism: Protecting Your Personal Security in the Electronic Age (NY: Thunder's Mouth Press), page 660.

All of education is changing, and colleges and universities must be prepared to change, including 24-hour computer laboratories, multimedia classrooms, and totally "wired" dormitories. Information on K-12 schools on the World Wide Web, and on some 100 "wired" universities, is readily available online. Universities must change because K-12 education is changing: consider that in 1997 some 9,565 senior high schools in this country had network connections, up from 1,736 network connections in 1992 (The World Almanac and Book of Facts, 1995 edition, page 221; 1997 edition, page 251; and 1998 edition, page 217). The same network "growth rate" is evident for elementary schools and junior high schools over the same time period, and MODEM usage and CD-ROM usage in K-12 schools is also (understandably) increasing.

Just at the end of 1997, USA Today reported that the federal allocation for school districts in the United States in 1998 will be $423,000,000, with California's share being $46.5 million (or 10.99 percent), "to encourage computer literacy and connect schools to the Internet" (USA Today, December 10, 1997, page 10A). On July 21, 1997, USA Today reported that "U.S. school districts plan to spend about $5.2 billion on educational technology in the 1997-1998 school year, 21% more than the previous year....It's the first increase of this magnitude in about four years, says Quality Education Data (QED)...." (USA Today, July 21, 1997, page 6D). QED is also the source for the information in The World Almanac and Book of Facts on network, modem, and CD-ROM growth.

In addition to the children who are going to K-12 schools, both public and private, and being exposed to the new technologies, another driving force to consider is the students outside of the "traditional" K-12 classroom: the "home schoolers" in this country. Incidentally, the distinction is made between "public and private" since a Sacramento Bee article of December 29, 1997 pointed out that there are some 4,000 private schools in California that are "capturing a constant 10 percent of the mainstream kindergarten through 12th grade population" and are definitely a "growth industry" (Jon Engellenner, Private Schools on the Rise, pages A1 and A10).

Concerning "home schoolers," there was a provocative 1996 article by James Snider entitled "Education Wars: The Battle Over Information-Age Technology" in which the author pointed out that "New information technologies will transform education, but only after a battle royal with the education establishment" (The Futurist, Vol. 30, No. 3, May-June, pages 24-28). Snider also pointed out the following:

"Hard-core home schoolers, despite their reputation for anachronistic values, may be the only ones motivated enough to lead U.S. education into the Information Age. These home schoolers, who seek to completely bypass regional educators, have grown from 10,000 to over 500,000 in the last 20 years. They are already the leading users of educational technology in the United States."

Home-schoolers and "regular" elementary school students and junior high students and high school students are already utilizing the power of the technology and cyberspace to acquire some of their information. Consider, if you will, an article from Business Week of December 29, 1997, documenting the Virtual High School: some 500 students from 27 schools, in the 11th and 12th grades, are taking part in an Education Department technology grant utilizing the World Wide Web. While the program admittedly isn't for everyone, it is a provocative step towards distance education while still in high school! (Stephen H. Wildstrom, 1997, The World Wide Classroom, Business Week, page 18). If you can make the time, you may "visit" the VHS on the World Wide Web.

The future, both in and outside of the classroom, will be interesting, and I cite the 1978 Physics Nobel Laureate, Arno Penzias:

"Throughout the ages, technology has helped shape the facts we humans think about. As our knowledge has increased, so have our tools and the ways we employ them. Today, technology is so complex and pervasive that it dominates much of the environment in which human beings live and work. For this reason, I feel we need a better understanding of how technology affects the ways in which we now create and explore ideas [stress added]." Arno Penzias, 1989, Ideas And Information: Managing In A High-Tech World (NY: Simon & Schuster), pages 179-180.

Not only do we need a "better understanding" of how technology works but we also need to make the time to "play" with the new technology and explore its potentials (just as the children of today are playing and exploring on the Internet). We also need to realize the impact of past, current, and new technologies on our daily lives, and how much technology is now (somewhat unfortunately) driving our lives! We must stress this impact to the students of today and the students of the millennium, and this is why, in all of my classes, I stress the importance of the Office of Experiential Education on campus as well as the excellent facilities and services of the Career Planning and Placement Office. There is no such thing as "future shock," merely unawareness of the present and the resources that already exist in the environment about us. Finally, to end this section:

"The information revolution has changed the way we work, play, learn, shop, bank, retrieve information, and govern ourselves. In 1980, few people could purchase a VCR. They were too expensive, too complicated. Today VCRs are in 90 percent of American homes. As recently as 1979--less than two decades ago--there were no PCs, no fax machines, no cellular phones or CDs, no MTV or CNN. And no one yet had invoked the term 'information superhighway.'" Ken Auletta, 1997, The Highwaymen: Warriors of the Information Superhighway (NY: Random House), page ix.


"The unit of survival [or adaptation] is organism plus environment.
We are learning by bitter experience that the organism
which destroys its environment destroys itself."
(Gregory Bateson, Steps to an Ecology of Mind, 1972: 483)

From my limited perspective, one of the problems in attempting to make "predictions" about the future is our collective hubris about both the present and the past! I am of the opinion that the "experts" of every generation view the activities of that generation as the most exciting, important, and "progressive" events that ever occurred! In my classes I attempt to convey the importance of "the past" (or history) in understanding the present (or contemporary events) and stress that the "history" of the discipline, in my case Anthropology, is the most important course for the Anthropology major in the university curriculum. Consider, if you will, the following timely words:

"Nobody who has paid any attention to the peculiar features of our present era, will doubt for moment we are living at a period of most wonderful transition which tends rapidly to accomplish that great end, to which indeed, all history points--realization of the unity of mankind. . . . The distances which separated the different nations and parts of the globe are rapidly vanishing before the achievements of modern invention, and we can traverse them with incredible ease; the languages of all nations are known, and their acquirement placed within the reach of everybody; thought is communicated with the rapidity, and even by the power, of lightning. On the other hand, the great principle of the division of labor, which may be called the moving power of civilization, is being extended to all branches of science, industry, and art. . . . The products of all quarters of the globe are placed at our disposal, and we have only to choose which is the best and cheapest for our purposes, and the powers of production are entrusted to the stimulus of competition and capital [stress added]."

These words come not from 1997 or 1998 but from the Inaugural Address of the Prince Consort Albert (1819-1861), on May 1, 1851, on the occasion of the opening of the "Great Exhibition of the Works of Industry of all Nations" held in London at the Crystal Palace (in Michael Sorkin, 1992, Variations On A Theme Park: The New American City And The End of Public Space, page 209). Similar temporal-centric phrases can be found elsewhere in the literature, and one can also go to the World Wide Web ( and read similar phrases: created by two individuals who were "appalled by the modern desire to call every technical achievement 'the greatest thing since Columbus discovered America,'" one of the site-originators had this to say: "It's a 20th century disease to believe we're not mired in history like everyone else" [stress added] (Elizabeth Weise, 1997, "Dead Media List Tracks Forgotten Revolutions" in USA Today, December 31, page 4D). Every generation finds it difficult to believe that that which comes after the present generation might in fact be better than that which is occurring right now.

In my classes I constantly attempt to place various concepts (ideas, behavior, words, and things) into perspective and I utilize a "building block" approach through time and develop the ABCs of how I view Anthropology (and perhaps the world): "The Appreciation of Basic Cultural Diversity Everywhere" through an "evolutionary" and cumulative perspective. It is a perspective that stresses the environment and the "context" of the times.

I also use FGHI for the transitions we, as a species, have made as we evolved through the "Foraging, Gathering, Hunting, and Information" stages, and I am now expanding the framework to include "J & M" for the "Just Mining" phase we have entered as a result of the World Wide Web! My philosophy, or cyberphilosophy if you will, points out that we need to do a great deal of work to get something of value! Once a devotee of science fiction, I appreciate the words of the science fiction author Theodore Sturgeon (1918-1985): "Ninety percent of all science fiction is crud; but, on the other hand, ninety percent of everything is crud!" While the World Wide Web is not "crud" (I think the rephrased ratio might have to be 99.99 percent to .01), there is still a lot of information "out there" that must be handled with a good deal of discrimination!

Although it may not be a well-known (or readily accepted) fact, it was "sex" which led to the widespread development of the VCR industry. The first year that VCR tapes were introduced to the public, "adult videos" accounted for more than 90 percent of annual videotape sales. It may be debated whether the 1972 movie Deep Throat launched the United States adult film industry, but in 1980 there were approximately 1,000 adult movie houses in the USA and in 1997 there were about twenty: sex went from the public cinema to the privacy of the home:

"The sex industry has been a critical factor in past media revolutions: adult videos exploded the video market, and adult entertainment channels helped early cable ventures. The pattern on the Web is no different." Frank Rose, 1997, Sex Sells, Wired, December Vol. 5.12, pages 218-224 and pages 276-284, page 221.

There is what some call "pornography" in cyberspace, and the potential exposure to what is called "inappropriate material" for the children of the millennium is out there, but as a 15-year-old student wrote:

"I have never 'stumbled' into a [WWW] site I didn't want to see. Not like on TV where I have occasionally flicked the channel only to 'stumble' into some gruesome murder scene." Reanna Alder, in Don Tapscott, 1998, Growing Up Digital: The Rise of the Net Generation (McGraw Hill), page 239.

The world is changing, and some of us may not agree with the way it is changing; this makes it all the more important for parents and educators to stress critical thinking and decision-making, and the importance of gathering, weighing, and choosing what we use from the Internet (as well as from television, printed materials, newspapers, and....): critical thinking is forever!

I believe that we have entered the "mining" phase (although we have always been miners of information) because we have to "extract" or distill valuable information from the world of "facts" all about us! I try to make my classes, and my papers, new every time and I appreciate and think I understand the words of Scott Adams:

"Make sure your employees [or students!] are learning something every day. Ideally, they should learn things that directly help on the job, but learning anything at all should be encouraged. The more you know, the more connections form in your brain, and the easier every task becomes. Learning creates job satisfaction and supports a person's ego and energy level." Scott Adams, 1996, The Dilbert™ Principle: A Cubicle's-Eye View Of Bosses, Meetings, Management Fads & Other Workplace Afflictions, page 322.

Continuing to place things into perspective, various individuals (students, faculty, and staff) can often fail to realize the amount of WORK needed to accomplish anything. Consider, if you will, the need for refined gold (from jewelry to dentistry to high-technology uses): in order to acquire one ounce of gold, one must process a lot of ore, considering that only ~.12 ounce of gold is found in one ton of ore. The ratio of gold to ore is therefore ~.000375% (or .12/32,000 ounces), and one must do quite a bit of digging (and cleaning up, or processing) to get anything of value, be it gold or information, be it "physical" or in cyberspace. Incidentally, when I first used this example in August 1996, beginning an ANTH 198 CYBERSPACE course (still on the web at, gold was selling at ~$US389/ounce, definitely "down" from its record high of $875/ounce in January 1980. On January 5, 1998, gold was selling at ~$US288.70/ounce, and since the cost of extracting, or mining, the gold is ~$240/ounce, there have been some layoffs in the gold-mining industry (although the decrease in gold prices has yet to be passed on to the consumer).
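The ore arithmetic above can be checked in a few lines (a sketch using only the figures given in the text; the 32,000-ounce ton follows the text's own ".12/32,000 ounces" figure, i.e. 2,000 pounds times 16 ounces):

```python
# Gold-to-ore arithmetic, using the figures from the text.
OZ_GOLD_PER_TON = 0.12        # ounces of gold recovered from one ton of ore
OZ_PER_TON = 32_000           # ounces in a ton (2,000 lb x 16 oz), as in the text

ratio = OZ_GOLD_PER_TON / OZ_PER_TON      # fraction of the ore that is gold
ratio_percent = ratio * 100               # ~0.000375 percent

PRICE_JAN_1998 = 288.70       # $US per ounce on January 5, 1998 (from the text)
COST_TO_MINE = 240.00         # approximate extraction cost per ounce (from the text)
margin = PRICE_JAN_1998 - COST_TO_MINE    # the slim margin behind the mining layoffs

print(f"gold-to-ore ratio: {ratio_percent:.6f}%  |  margin per ounce: ${margin:.2f}")
```

The point survives the arithmetic: roughly four parts per million of the material handled is the valuable part, whether the "ore" is rock or raw information.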

Think of individuals who listened to "predictions" about the rise of gold and gambled heavily on them: where are they today, and do we hear from them? The value of gold (and information) changes over time, giving truth to the 1883 words of the French philologist and historian Ernest Renan (1823-1892): "The simplest schoolboy is now familiar with truths for which Archimedes would have sacrificed his life."

In an information age we deal with information, and I am an anthropologist of the information age. I began this concluding section with the words of Gregory Bateson (1904-1980), getting us to think about the role of the individual and the environment; I also believe in the ideas and words of Charles Darwin (1809-1882) concerning "natural selection" as well as "survival of the fittest" (a phrase first used in 1864 by the great sociologist Herbert Spencer [1820-1903]). Earlier in this paper I cited Carlson and Goldman on World War III (the "Cold War") and the current World War IV, and let me give their complete definition of WWIV: "The chief conflict among developed nations over the next decades will be economic in nature--and the U.S. has been losing its lead" (1994, Fast Forward: Where Technology, Demographics, And History Will Take America And The World In The Next 30 Years, page 7).

At California State University, Chico, although we may not yet be involved in a "global war," we are in competition (or perhaps waging modest "battles") for those students who will be "coming to college in the millennium," and I hope we don't lose the war. Using a warfare metaphor, in the context of the information age, is not inappropriate. Our potential consumers/customers (or students) are being bombarded with massive amounts of information about their futures, and they (and their parents/guardians) must also make their best "educated guesses" about what they will invent. If we consider the brilliantly translated 1832 posthumous words of Karl von Clausewitz (1780-1831) in the context of colleges and universities, the words are still appropriate for us to consider for the students of the millennium:

"By the word 'information' we denote all the knowledge which we have of the enemy and his country; therefore, in fact, the foundation of all our ideas and actions. Let us consider the nature of this foundation, its want of trustworthiness, its changefulness, and we shall soon see what a dangerous edifice War is, how easily it may fall to pieces and bury us in its ruins. For although it is a maxim in all books that we should trust only certain information, that we must always be suspicious, that is only a miserable book comfort, belonging to that description of knowledge in which writers of systems and compendiums take refuge for want of anything better to say. Great part of the information obtained in War is contradictory, a still greater part is false, and by far the greatest part is of a doubtful character. What is required of an officer is a certain power of discrimination, which only knowledge of men and things and good judgment can give. The law of probability must be his guide [stress added]." Carl von Clausewitz, On War [1968 edition with an "Introduction" by Anatol Rapoport] (Penguin Books), page 162.

Although we are not at "war" with the Internet, everyone needs a great deal of discriminatory power in determining what we choose from the Internet; we may not be at "war" with our 21 sister institutions of the CSU, nor the 9 campuses of the UC system, nor the myriad of private colleges and universities in California (nor the Community Colleges of California nor the educational institutions of the other 49 states), but we are definitely in "competition" with them in the Darwinian sense: we are competing for the students of the millennium. Various students of the millennium are getting information which can be contradictory, partly false, and partly of a questionable nature. California State University, Chico, must use all of the resources possible to get "our message" out to the students of the millennium, including face-to-face contact, the printed word, and the Internet. By increasing the number of information sources that individuals are exposed to (or by increasing the "sample size" of information that is considered in any decision-making process), in statistical terminology, the "error of the estimate" is decreased and (hopefully) better decisions can be made.
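The statistical point can be sketched with the standard error of the mean, which shrinks as the square root of the sample size (a sketch only; the spread value below is arbitrary, chosen purely for illustration):

```python
import math

def error_of_estimate(sigma, n):
    """Standard error of the mean: sigma / sqrt(n).

    sigma: spread (standard deviation) among individual information sources.
    n: number of independent sources consulted.
    """
    return sigma / math.sqrt(n)

sigma = 10.0  # arbitrary spread among individual sources, for illustration
for n in (1, 4, 16, 64):
    print(f"{n:2d} sources -> error of the estimate = {error_of_estimate(sigma, n):5.2f}")
```

Quadrupling the number of (independent) sources halves the error of the estimate, which is the sense in which consulting more sources supports better decisions.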

If I seem to be stressing the Internet as a data source, it is because I think it will have a profound effect on education in the next twenty-five years. I have an idea for an electronic distribution system entitled CHICO-L that can use the power of the Internet to get information off campus to appropriate (and interested) individuals and potential students of the millennium, and I am working on this with the Office of Admission and Records. If you can make the time, do consider going to something like Web66 ( and see what some of the students of the millennium are currently doing in Cyberspace; or go to and see what some contemporary "wired" colleges and universities are doing right now in Cyberspace. It is amazing!

Earlier I cited the 1997 words of David Whittle who wrote that colleges in the future won't disappear, but they will be altered; consider, however, the not so sanguine 1997 statement by Peter Drucker in Forbes magazine: "Thirty years from now big university campuses will be relics" (Peter Drucker, 1997, Forbes, March 10, pages 126-127). Don Tapscott has this to say about the article:

"Educators really took note when none other than Peter Drucker shocked the post-secondary world in the March 10, 1997 issue of Forbes magazine. Confirming leading educators' worst nightmare, he stated publicly: 'Thirty years from now big university campuses will be relics.' Referring to the impact of the digital revolution, Drucker said: 'It is as large a change as when we first got the printed book.' He continued: 'It took more than 200 years for the printed book to create the modern school. It won't take nearly that long for the big change. ... Already we are beginning to deliver more lectures and classes off campus via satellite or two-way video at a fraction of the cost. The college won't survive as a residential institution. Today's buildings are hopelessly unsuited and totally unneeded' [stress added]." Don Tapscott, 1998, Growing Up Digital: The Rise of the Net Generation (McGraw Hill), page 153; and see Robert Lenzner and Stephen H. Johnson, 1997, Seeing Things As They Really Are. Forbes, March 10, 1997, pages 122-128.

Perhaps this is too gloomy an assessment, but similar articles are appearing on a regular basis in the papers and various publications and on the Internet. As an old colleague of mine wrote in 1996:

"The Internet will not totally replace schools and universities, but these traditional institutions must transform themselves if they are to prepare tomorrow's students for lifelong learning. ... If the universities do not reform quickly, they will rapidly decline into irrelevance." Joseph Pelton, 1996, Cyberlearning vs. the University: An Irresistible Force Meets an Immovable Object, The Futurist (Vol. 30, No. 6, December), pages 17-20, page 17.

Finally, higher education and universities aside, my best guess about the future is that it cannot be predicted: too many things can happen, including totally random events that no one has control over! The future can, however, be invented! In a very interesting 1998 publication dealing with the field of "futures" by William A. Sherden, entitled The Fortune Sellers: The Big Business Of Buying And Selling Predictions, the author points out the following:

"Analysis of the track record of forecasters over the past several decades shows that long-term technology predictions have been wrong about 80 percent of the time. Such was the finding of Steven Schnaars, [1989] author of Megamistakes: Forecasting and the Myth of Rapid Technological Change and an associate professor of marketing at Baruch College, in a study of forecasts published between 1959 and 1989 in the Wall Street Journal, the New York Times, Business Week, Fortune, Forbes, Time, and Newsweek. Schnaars also evaluated technology forecasts made by the high-tech manufacturer TRW, Herman Kahn, and Industrial Research magazine. Evaluating the predictions in TRW's 1966 study, Probe of the Future, he found that 'nearly every prediction was wrong.' Herman Kahn, a popular futurist in the 1960s and 1970s, and director of the Hudson Institute, another think tank, included an entire chapter of predictions in his 1967 book, The Year 2000: One Hundred Technical Innovations Very Likely in the Last Third of the Twentieth Century. According to Schnaars, Kahn's predictions were somewhere between 75 percent and 85 percent wrong, depending on the generosity of the grader. In 1969, Industrial Research polled the research directors at most major industrial firms in the United States, asking them to identify the technologies that would emerge over the next ten years. Schnaars found that of the twenty-two predictions, nineteen (86 percent) were wrong. One of the most startling of Industrial Research's predictions for 1979 was that human life spans would reach 150 to 200 years [stress added]." William A. Sherden, 1998, The Fortune Sellers: The Big Business Of Buying And Selling Predictions (John Wiley & Sons), pages 169-170.

Schnaars's 1989 book, however, did point out that "not all forecasts failed. Some successfully foresaw events that produced vibrant market growths" (Steven P. Schnaars, 1989, Megamistakes: Forecasting and the Myth of Rapid Technological Change, page 107), and he came up with three general guidelines, the first of which places Drucker into some perspective (and also places my own technological biases into perspective):

"The most obvious advice to be gleaned from a study of past forecasts is to avoid falling in love with the underlying technology. The most outlandish errors uncovered in this study failed for this reason. [and] Ask Fundamental Questions About Markets [and] Stress Cost-Benefit Analysis: The most fundamental question to ask of a growth market forecast is whether or not the product upon which it is based provides customers with something special and does so at a price that both the customer and the manufacturer will accept [stress added]." Steven P. Schnaars, 1989, Megamistakes: Forecasting and the Myth of Rapid Technological Change, pages 143-147.

Similar "warning words" appeared in the Enterprise-Record of December 27, 1997 (page 9B), when Laura Urseny presented the words of Neil Postman, a contemporary researcher and author who has written about technology: it would be "stupid," Postman wrote, "for us not to give the most careful (and even skeptical) attention to any technology," and while his view may be more negative than mine, it is well worth reading and considering: we must all make our own individual choices and decide for ourselves. Finally, an excellent balanced presentation, I believe, appeared in a "Special Section" of The Wall Street Journal on November 17, 1997: too lengthy to summarize (it was 36 pages in length), but one major point was stressed:

"But things aren't as gloomy as they look. Amid all the dissatisfaction and rancor, educators have picked up some concrete lessons about high tech. Chief among them: Computers can improve education, but not without serious planning from schools and teachers [stress added]." William M. Bulkeley, 1997, Hard Lessons. The Wall Street Journal, Technology, pages 1-36, page 1.

Serious planning and serious decisions must be made concerning technology and higher education, and I might as well go on record as stating that I personally think that the proposed "public/private" technology "partnership" that has been extensively discussed lately for the California State University System is wrong (see Jean Wood, 1997, Ready for CETI? Chico News & Review, December 18, pages 13-15). The celebrated California Educational Technology Initiative (or CETI), calling for a "partnership" between the CSU and Fujitsu, GTE, the Hughes Corporation, and Microsoft, is an idea which might have been appropriate five years ago but is not appropriate for the relatively healthy and prosperous California (and national) economy of 1998 (and beyond?). When I first heard of the "partnership" plans in Fall 1997, I thought cui bono? (Who benefits by it?) The more I have read about it, the more it seems to me that it is not the CSU, and it is not California State University, Chico, who benefits by the "partnership." Corporations are charged with watching the "bottom line" of their investments: they must make a "profit" for their constituents. The university is not charged with making a "profit" on educational endeavors. The university, however, like the corporation, must be efficient; but this does not mean that the university must embrace the corporate model; and the more I read about the fabled partnership, the more I was reminded of the following: Sed quis custodiet ipsos custodes? (But who is to guard the guards themselves? Decimus Junius Juvenalis, c. 55 A.D.-c. 130 A.D.) The "bottom line" (as I interpret it) is that corporations, for the most part, deal with a product; education, however, deals with a process: the process of inquiry. Finally, the corporate world today is extremely volatile, and in Business Week of January 12, 1998, one may read the following:

"Predicting what's going to happen in the telecommunications industry these days is something of a fool's errand. ... The only certainty in 1998 is that the seismic change will continue. With the industry's urge to merge, expect at least one major long-distance or local telecom company to disappear by the end of the year through merger or acquisition. The most likely candidates are U S West, BellSouth, Ameritech, and GTE [stress added]." Peter Elstrom, 1998, Telecommunications Prognosis 1998. Business Week, January 12, 1998, pages 92-93, page 92.

Aside from the Microsoft/Department of Justice activities currently going on, if GTE, one of the proposed CETI "partners," disappears, will the interests of the CSU (and the public) be protected?

I began my conclusion with Bateson, for we all need to be aware of the environment that is rapidly changing around us as a result of the technology of the day; with von Clausewitz in mind, I stress that we must realize the contradictory nature of information and the need for discrimination; and with technological growth in mind, we must realize that our children and our children's children will look back on the "quaint" and archaic ways of the 1990s and say: "You couldn't talk to your household computer? You couldn't take a maglev transport to the city? You didn't realize the implications of....? You didn't know your P-53 DNA sequence potential? You didn't have....?" See my point again? Everything is relative to the year in which you were born and to the years in which you grew up and became adjusted to "your" culture, and the all-important attitude you have about life. As we all know, on December 19, 1997 Harlen Adams (1904-1997) died; he was, indeed, a part of the Chico community, and his following words are very appropriate today (and always):

"The most important word in the English language is attitude. Love and hate, work and play, hope and fear, our attitudinal response to all these situations, impresses me as being the guide." In Memory of Dr. Harlen Adams. Senior Lifestyle, Vol. 7, No. 3, Oroville, CA, page 2.

So what sort of students will be coming to the college in the millennium? In my opinion (and the opinion of others), we will probably be seeing greater ethnic and cultural diversity in our classes and more students who are physically and intellectually challenged in various ways, and we should be prepared for all of them. Many of the students of the millennium will also be potentially brighter and sharper than our current students, and they will have been exposed to much more technology and information than we were at their age. We will see "average" students and "slow" students who probably shouldn't consider going to college, but technology may be the great equalizer in getting the slower students up to "average" speed and getting the average students beyond the average! Perhaps educators will even come to accept the idea of Howard Gardner concerning the "intelligence" (or abilities) of students in the millennium: intelligence is not a single "thing" but multiple processes, and maybe "technology" will be able to assist all students! (See Howard Gardner, 1993, Frames Of Mind: The Theory of Multiple Intelligences.) And what will the students of 2023 know about our world of 1998? Probably the same sort of things that the 12,876.90 full-time equivalent students of 1997-1998 know about the world of 1973!

TO END: I recently came across a wonderful phrase that I would like to end with: "the world is full of butterflies" (Ivars Peterson, 1998, The Jungles of Randomness: A Mathematical Safari, page 145). It refers to the mathematics of "Chaos Theory" and that darn butterfly which became so popular in a 1972 paper by Edward Lorenz: "Does the flap of a butterfly's wings in Brazil set off a tornado in Texas?" In brief, little things can have a tremendous impact, and we will never know...until we experience it. If professional forecasters can't predict the future, what makes anybody think anyone can? We all make educated guesses and that is the best we can do: we deal with the future by inventing it and by building on the known as one moves into the unknown: thank goodness that the world is full of some wonderful and interesting people and beautiful butterflies!
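Lorenz's butterfly can be seen in a few lines with the logistic map, a standard textbook toy model of chaos (my own sketch, not drawn from Peterson's book): two starting values that differ by one part in a million soon bear no resemblance to one another.

```python
def logistic(x, r=4.0):
    """One step of the logistic map x -> r * x * (1 - x); r = 4.0 is the chaotic regime."""
    return r * x * (1.0 - x)

a, b = 0.400000, 0.400001   # a "butterfly-sized" difference in starting conditions
max_gap = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))   # track how far the two trajectories drift apart

print(f"largest gap over 60 steps: {max_gap:.3f}")
```

The millionth-part difference roughly doubles each step, so within a few dozen iterations the two futures have nothing to do with each other: precisely why the "little things" make long-range prediction so hard.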

# # #


Was the first commercial microwave sold to the general public? 1967

Did the Apollo 10 Astronauts walk on the moon? 1969

Were VHS videotapes introduced for sale to the general public? 1976

Did the Apple II computer appear at the First West Coast Computer Faire? 1977

Did the World Health Organization announce the eradication of smallpox? 1979

Did CNN (Cable News Network) begin broadcasting? 1980

Was the first woman, Sandra Day O'Connor, appointed to the US Supreme Court? 1981

Was AT&T "broken up" into the "Baby Bells"? 1984

Did the Space Shuttle Challenger explode? 1986

Did the Exxon Valdez oil spill in Alaska occur? 1989

Did Jim Henson, of Muppet fame, die? 1990

Did a bomb explode in the World Trade Center in New York City? 1993

The above twelve items were chosen from various sources, including the aforementioned World Almanac as well as Grun (1991), Hellemans and Bunch (1988), and Kleinfelder (1993) (please see REFERENCES CITED below).


Will we have entertainment on demand? 2003

Will we have computerized self-care? 2007

Will half of all household wastes be recycled? 2008

Will electric cars become common? 2011

Will chemical usage on farms drop by one-half? 2012

Will we have optical computers? 2014

Will we have nanotechnology? 2016

Will we have biochips? 2017

Will one-half of goods be sold electronically? 2018

Will we have hydrogen energy? 2020

Will we have a permanent moon base? 2028

Will we send humans on a mission to Mars? 2037

The above twelve items were chosen from the article by William A. Halal, Michael D. Kull, and Ann Leffman entitled "Emerging Technologies: What's Ahead for 2001-2030" in the December 1997 issue (Vol. 31, No. 6) of The Futurist: A Magazine of Forecasts, Trends, and Ideas About the Future (pages 20-28). The Futurist is the publication of the Washington, D.C.-based World Future Society, and the forecast is based on ongoing work by "about 45 well-known futurists and technical experts [who] have participated in George Washington University's Forecast of Emerging Technologies" in Washington, D.C., since 1990.

# # #


Anon., USA Today, July 21, page 6D.

Anon., Business Week, June 30, 1997, page 62.

Anon., [Associated Press], 1997, San Francisco Chronicle, September 22.

Anon., 1997, USA Today, December 10, page 10A.

Anon., 1998, In Memory of Dr. Harlen Adams, Senior Lifestyle, Vol. 7, No. 3 (Oroville, CA), page 2.

Adams, S., 1996, The Dilbert™ Principle: A Cubicle's-Eye View Of Bosses, Meetings, Management Fads & Other Workplace Afflictions (Harper Collins).

Auletta, K., 1997, The Highwaymen: Warriors of the Information Superhighway (NY: Random House).

Bateson, G., 1972, Steps to an Ecology of Mind.

Beck, M., 1997, Next Population Bulge Shows Its Might. The Wall Street Journal, February 3, pages B1-B2.

Behar, M. 1997, Wired, Vol. 5.12, Page 108.

Bulkeley, W.M., 1997, Hard Lessons. The Wall Street Journal, Technology, November 17, pages 1-36.

Carlson, R. and Goldman, B., 1994, Fast Forward: Where Technology, Demographics, And History Will Take America And The World In The Next 30 Years (Harper Collins).

Cerf, C. and Navasky, V., 1984, The Experts Speak: The Definitive Compendium of Authoritative Misinformation (Pantheon).

Clarke, A.C., 1984, Profiles of the Future: An Inquiry into the Limits of the Possible (Warner Books).

Clausewitz, Carl von, 1832, On War [1968 edition with an "Introduction" by Anatol Rapoport] (Penguin Books).

Dickson, P., 1990, Timelines: Day by Day and Trend by Trend from the Dawn of the Atomic Age to the Close of the Cold War (Addison-Wesley).

Eder, P.F., 1997, The Emerging Interactive Society. The Futurist, Vol. 31, No. 3, pages 43-46.

Engellenner, J., 1997, Private Schools on the Rise. The Sacramento Bee, December 29, pages A1 and A10.

Famighetti, R. [Editorial Director], 1997, The World Almanac And Book Of Facts 1998 (NJ: World Almanac Books).

Fletcher, C., 1968, The Man Who Walked Through Time.

Gardner, H., 1993, Frames of Mind: The Theory of Multiple Intelligences (10th Anniversary Edition) (NY: Basic Books).

Grun, B., 1991, The Timetables of History (Simon and Schuster).

Halal, W.A., Kull, M.D., and Leffman, A., 1997, Emerging Technologies: What's Ahead for 2001-2030. The Futurist, Vol. 31, No. 6, December, pages 20-28.

Hellemans, A. and Bunch, B., 1988, The Timetables of Science: A Chronology of the Most Important People and Events in the History of Science (Simon and Schuster).

Jones, C. and Maloy, R.W., 1996, Schools For An Information Age: Reconstructing Foundations For Learning And Teaching.

Karlgaard, R., 1997, Digital Warriors Want Baby Bells' Blood. The Wall Street Journal, December 8, page A24.

Katz, J., 1997, How the Emu Turned Into a Giant Turkey. San Francisco Chronicle, December 27, page A4.

Kleinfelder, R.L., 1993, When We Were Young: A Baby-Boomer Yearbook (Prentice Hall).

Lenzner, R., and Johnson, S.S., 1997, Seeing Things As They Really Are [on Peter Drucker]. Forbes, March 10, Vol. 159, No. 5, pages 122-128.

McLuhan, M. and Fiore, Q., 1967, The Medium is the Massage (NY: Bantam).

McLuhan, M., 1969, Counterblast (NY: Bantam).

Patrick, J., 1997, In USA Today, page B1.

Pelton, J., 1996, Cyberlearning vs. the University: An Irresistible Force Meets an Immovable Object, The Futurist (Vol. 30, No. 6, December), pages 17-20.

Penzias, A., 1989, Ideas And Information: Managing In A High-Tech World (NY: Simon & Schuster).

Peterson, I., 1998, The Jungles of Randomness: A Mathematical Safari (John Wiley & Sons).

Port, O., 1997, The Secret in TI's Chips: Bubbles, Business Week, December 22, page 79.

Rose, F., 1997, Sex Sells. Wired, Vol. 5.12, December, pages 218-224 and 276-284.

Schnaars, S.P., 1989, Megamistakes: Forecasting and the Myth of Rapid Technological Change (Free Press).

Schwartau, W., 1996, Information Warfare: Cyberterrorism: Protecting Your Personal Security in the Electronic Age (NY: Thunder's Mouth Press).

Sherden, W.A., 1998, The Fortune Sellers: The Big Business Of Buying And Selling Predictions (John Wiley & Sons).

Snider, J., 1996, Education Wars: The Battle Over Information-Age Technology. The Futurist, Vol. 30, No. 3, May-June, pages 24-28.

Snider, M., 1997, Nintendo Plans Zelda 64 For Next Big Play. USA Today, June 23, page 10B.

Sorkin, M. (Editor), 1992, Variations On A Theme Park: The New American City And The End of Public Space (NY: Hill and Wang).

Steinberg, S.R. and Kincheloe, J.L. [Editors], 1997, Kinder-Culture: The Corporate Construction of Childhood (Westview Press).

Stroh, M., 1997, The Next Frontier: DNA Computers. The Sacramento Bee, December 23, pages A1 and A14.

Takahashi, D., 1997a, Envisioning The Era of the $500 PC. The Wall Street Journal, November 18, pages B1 and B7.

Takahashi, D., 1997b, Chip Firms Face Technological Hurdles That May Curb Growth, Report Suggests. The Wall Street Journal, December 1, page B8.

Tapscott, D., 1998, Growing Up Digital: The Rise of the Net Generation (McGraw-Hill).

Urseny, L., 1997, Biz Bits. The Chico Enterprise-Record, December 27, page 9B.

Weise, E., 1997, Dead Media List Tracks Forgotten Revolutions. USA Today, December 31, page 4D.

Westphal, D., 1997, State's Growth Rebound. The Sacramento Bee, December 31, pages 1 and 14.

Whittle, D.B., 1997, Cyberspace: The Human Dimension (NY: W.H. Freeman & Company).

Wildstrom, S., 1997, The World Wide Classroom. Business Week, December 29, page 18.

Wood, J., 1997, Ready for CETI? Chico News & Review, December 18, pages 13-15.

Wright, D., 1995, America In The 20th Century: 1980-1990 (NY/London: Marshall Cavendish).

# # #

1. © This WWW paper was completed January 5, 1998, for the Professional Development Committee Meeting of PAUSE'98 (with the theme of "Guess Who's Coming To College...In The Millennium?"), Clear Lake, California, January 8-9, 1998. A Professor of Anthropology at California State University, Chico, I received my B.A. (1967) from Western Washington University and the M.A. (1969) and Ph.D. (1972) from the University of Oregon. I have been at CSU, Chico since 1973, teach various Cultural Anthropology courses, and was recently chosen as one of the five "Master Teachers" for 1997-1999 at CSU, Chico. To return to the beginning of this paper, please click here. To go to the Home Page of Urbanowicz, please click here, and to go to the Department of Anthropology Home Page, please click here. To go to the Home Page for the College of Behavioral and Social Sciences, please click here, and to go to the Home Page of California State University, Chico, please click here. Thank you for all of your "clicking" patience, and I hope you enjoy the information and links! ALSO, please note that at the time this web document was created (January 1, 1998), all of the above links were active; considering potential "future changes" that I might be making to the document, you might wish to consider something like a "URL Minder" (information available at ) to "bookmark" this page.

# # #