The Ultimate In Mobile Computing

Dec. 21, 2004

PalmPilots. Windows CE. Car phones. Cell phones. Armband computers for warehouse management. Bar-code readers. Pagers. Global positioning devices. Where will it all end? Today many of us are connected day and night to the office, the plant, or the home. And that connectedness is only going to increase. In fact, judging from the direction technology is headed, personal computing is likely to get both more personal and more mobile.

Speaking at a conference last month on corporate information portals for the Internet, Ray Kurzweil, creator of optical-character-recognition and speech-recognition technologies, predicted that sometime in the next two or three decades the implantation of computer chips in the brain will begin. The idea will be to augment the power of the human brain, which, despite its flexibility and incredible power, has already been far outdistanced by computers in some measures.

Even so, today's computers are a long way from matching the complexity of the human brain. So far, there is no way to measure consciousness. Likewise for emotions such as love and compassion, or envy and jealousy. Ethical judgment remains something computers have yet to master. (It appears some humans have yet to master it as well.)

"Computers are inherently faster and much more accurate, but they are still one million times less complex than the human brain," said Kurzweil, author of The Age of Spiritual Machines (Viking, 1999). But they are catching up. "I believe that gap will be closed over the next several decades. Software will eventually achieve the level of human intelligence and will surpass it."

Computer-chip implants will be capable of boosting human memory by one million times, the inventor-author claimed. He sees this development as a step forward that will enhance the quality of life. "This will, in a very intimate way, expand human potential," said Kurzweil.
For one thing, he points to the fact that humans are the only animals known to pass on an accumulated species-wide knowledge base to their offspring. The ability to implant the totality of human knowledge into a child at birth could mean that we'd no longer need schools and that everyone would be a genius. I imagine there'd be an urgent rush to buy the latest Intel chip to get an "upgrade." No one would want to be caught at a cocktail party relying on last year's microprocessor to keep up his or her end of a conversation.

If this scenario sounds a bit frightening, better reach for your seat belt, because Kurzweil's world gets only more so -- and fast. He foresees by the middle of the 21st century a world in which humans who lack these computer implants in the brain will be considered hopelessly backward.

This scenario raises other troubling questions. Will there be an attempt to create a "master" race? A "worker" race? Neither sounds too far-fetched. After all, our lexicon today includes "knowledge" workers and "blue-collar" workers. Will the government seek to control the software in these chips to the extent that it will tell people how they should behave? Could a computer-driven brain really be "free"?

Asked whether he thought the notion of implanting computers in people's brains posed social concerns that might ultimately put the kibosh on such efforts, Kurzweil seemed doubtful, yet optimistic that man could effectively manage his own cyber-evolution. "I don't think it's a process we can stop," he said. "It's not one project. It's hundreds of thousands of projects around the globe. There is tremendous economic incentive to create more powerful software and more powerful computers. But we can guide the process."

For better or worse, it's probably too late to turn back the technology clock. "If all computers stopped today, everything in our lives would come to a halt," Kurzweil observed. "Computers are very intimately integrated into our civilization."

At one point, while discussing biotechnology research, Kurzweil noted that a potentially frightening scenario already exists in the area of recombinant DNA research. "In a bioengineering laboratory at a university today, there are the means to create a pathogen that would be more powerful and dangerous than an atomic bomb," he observed.

Of course, just because we've let one technology genie out of the bottle, does that mean we may as well let them all loose? At the dawn of the age of nuclear weapons, a global test-ban treaty would have seemed impossible, yet today it's a reality. Perhaps guiding and controlling the use of technology in human development is all we can hope for.

Still, the question looms large: Would having a chip in our brain make us more than human or less than human? Now you know why they call stored data "memory."
