Worrying for a few minutes about what to serve for dinner in order to please one's guests may be a sound investment of resources. Worrying about what will happen to your soul after death is a total waste. Human ancestors and other animals with foresight may have worried only about genuine and pressing problems such as not finding food or being eaten. Ever since they became much more imaginative and began feeding their imagination with rich cultural inputs, that is, for at least 40,000 years (possibly much more), humans have also worried about improving their lot individually and collectively—sensible worries—and about the evil eye, the displeasure of dead ancestors, the purity of their blood—misplaced worries.
A new kind of misplaced worry is likely to become more and more common. The ever-accelerating scientific and technological revolution results in a flow of problems and opportunities that presents unprecedented cognitive and decisional challenges. Our capacity to anticipate these problems and opportunities is swamped by their number, novelty, speed of arrival, and complexity.
Every day, for instance, we have reasons to rejoice in the new opportunities afforded by the Internet. The worry of fifteen years ago that it would create yet another major social divide between those with access to the Internet and those without is so last century! Actually, no technology in human history has ever spread so far, so fast, so deep. But what about the worry that, by making detailed information about every user available to companies, agencies, and governments, the Internet destroys privacy and threatens freedom in much subtler ways than Orwell's Big Brother? Is this what we should worry about? Or should we focus on making sure that as much information as possible is freely accessible as widely as possible, forsaking old ideas of secrecy and even privacy? That would mean trusting that genuine information will overcome misinformation and that well-informed people will be less vulnerable to manipulation and control—in other words, that with much freer access to information, a more radical kind of democracy is becoming possible.
Genetic engineering promises new crops, new cures, improvement of the human genome. How much should we be thrilled, how much frightened? How much and how should the development of genetic engineering itself be controlled, and by whom?
New weapons of destruction—atomic, chemical, biological—are becoming more and more powerful and more and more accessible. Terrorist acts and local wars of new magnitude are likely to occur. When they do, the argument will be made even more forcefully than it was in the US after 9/11 that powerful states should be given the means to try to prevent them, including in ways that curtail democratic rights. What should we worry about more: terrorism and wars, or further limitations of rights?
Looking further into the future, humans will soon be living with, and depending on, intelligent robots. Will this develop into a new kind of master-servant dialectic, with the masters being alienated by their servants? Will the robots themselves in fact evolve into masters, or even into intelligent, purposeful beings with no use for humans? Are such worries sound or silly?
These are just some examples. Scientific and technical developments introduce, at a faster and faster pace, novel opportunities and risks that we had not even imagined. Of course, in most cases, you and I form opinions as to what we should really worry about. But how confidently can we hold these opinions, pursue these worries?
What I am particularly worried about is that humans will be less and less able to appreciate what they should really be worrying about, and that their worries will do more harm than good. Maybe, just as on a boat in rapids, one should try not to slow anything down but simply to optimize a trajectory one does not really control—not because safety is guaranteed and optimism is justified (the worst could happen), but because there is no better option than hope.