My early years were marked by scientific advances. My mom told stories about her family’s first radio. In the middle of the war, the first jet planes showed up. Followed by atomic weapons. Four years later a television set arrived at a neighbor’s house. All the kids on the street gathered there regularly to watch wrestling, The Lone Ranger, and Tom Corbett.
During my high school years, I watched on TV while we tested hydrogen bombs. In 1957 the Russians put Sputnik in orbit. I remember seeing the satellite one evening cross the sky over Philadelphia. Meantime, cars appeared that didn’t need stick shifts. We switched from a coal-fired furnace to one powered by electricity. It was a major breakthrough, saving my father from having to haul cans full of coal ashes up a flight of stairs for the collector. And in 1961 JFK promised the U.S. would put a man on the Moon by the end of the decade. We made it. Just barely, but nobody complained.
People in earlier centuries had spent their years watching nothing change. They had no electrical power. The first flush toilet was invented in 1596. Other than that, things were quiet. But by the mid-1950’s I expected to live to see the first Martian landing, which would probably happen in the 1970’s. So I could expect to be around long enough to find out whether we’re alone or not. That conviction came seriously home in 1984 when the SETI Institute was founded.
Gradually, during those years, the impression that civilization was moving ahead got derailed. Technology was getting better, but life was getting scarier. The Cold War went full throttle, but I thought we’d be smart enough eventually to make it go away. Which we did.
But there were other issues. We were driving various species to extinction. So okay, I could live with that if necessary, until I discovered that we’d pay a steep price if we killed off the bees, which manage so much of the pollination. Or the bats. (And I sense eyes widening, but if we lose the bat population, the vast numbers of insects that are their major food source would create some serious problems.) Butterflies are important. There are others. And we are losing them all. And unfortunately no one really seems to care.
Technology also provides more effective weaponry. Tad Daley, in Apocalypse Never, argues that unless we get rid of nuclear weapons, they will eventually get rid of us. The Hawaiian false alarm Saturday underscores that point. We are also beginning to concern ourselves with the old science fiction AI hypothesis.
There are other dangers. As technology improves, it can be used to devise and spread a hyperepidemic. Or, after making us all dependent on the world-wide web, to shut it down, and transportation systems along with it. Leaving us to figure out how to get food when supermarkets collapse. And so on.
There’s another danger that may be the most difficult of all to manage: population growth. The January issue of Mike Resnick’s Galaxy’s Edge carries a chilling reprint of an article by Gregory Benford from 1994 on the topic. The global population at the beginning of the twentieth century was about 1.7 billion. Today it’s more than four times as high. It doesn’t take Stephen Hawking to figure out that if it continues to multiply at that rate, we are going to have a serious problem.
When SETI first began scanning the skies, I thought it wouldn’t take long for them to pick up something. It’s never happened. How could that be? Scientists are now arguing that life began on Earth pretty much the same afternoon that the appropriate conditions made it possible. If that’s true, the start-up can’t be as complicated as we’d always assumed. That means worlds in goldilocks zones everywhere should be home to living creatures. Which suggests there should be some worlds with intelligent beings. With radios. But the sky remains uncomfortably silent.
There are a couple of possible explanations. We might be wrong about the difficulty of life’s getting started, and we’re just an incredibly lucky shot. Or maybe intelligent beings, wherever they appear, tend to waste little time developing the technology to kill themselves off.