Using Humans as a Computer Model
By STEVE LOHR
http://www.nytimes.com/2001/10/15/technology/ebusiness/15NECO.html
THINK of it as computing's crisis of complexity, revisited. For more than three decades, each big advance in computing has soon brought new headaches. The steps forward typically come in hardware - processors, storage and networks - and the headaches show up in software. Software is the medium for doing all the new things in computing that hardware makes possible - whether simple numeric calculations or increasingly sophisticated functions like symbolic processing, graphics, simulations, artificial intelligence and so on. In computing, opportunity breeds complexity. And complexity begets systems that can be buggy, unreliable and difficult to manage.
This cycle became apparent shortly after the I.B.M. 360 mainframe, introduced in 1964, brought computing into the mainstream of corporate and government life. In 1968, NATO sponsored a conference prompted by concerns that the "software crisis" of the time posed a threat to the economic health and military readiness of the West.

Training, engineering and new software and hardware tools helped the computing community cope with the crisis of the late 1960's. But as Frederick P. Brooks Jr., one of the architects of the I.B.M. 360, observed, "Complexity is the business we are in, and complexity is what limits us."

Paul M. Horn, a senior vice president who oversees the research labs at I.B.M., says the time is ripe for an assault on the ever-increasing complexity of computing in the Internet era, with its global networks and proliferation of digital devices. "We have a growing crisis on our hands," he said.

Mr. Horn hopes to do something about it. Starting today, at the Agenda conference in Scottsdale, Ariz., I.B.M. will begin distributing 75,000 copies of a 39-page paper written by Mr. Horn, in which he calls the current version of the complexity problem the industry's "next grand challenge." The paper will be distributed to computer science researchers in universities, national labs and companies worldwide. I.B.M. is also committing millions of dollars in research grants to underwrite 50 research projects at universities over the next three to five years to take on the complexity challenge.

Mr. Horn's paper is intended partly as a call to action for researchers and the industry, but it also points toward a path for solving the problem. He calls it "autonomic computing" - a biological metaphor suggesting a systemic approach to attaining a higher level of automation in computing.
Just as a person's autonomic nervous system automatically handles all kinds of basic functions - the heart rate, breathing and digestion, for example - in response to changing conditions, so, too, should computer systems, according to Mr. Horn. The human body "does all this without any conscious recognition or effort on your part," he writes. "This allows you to think about what you want to do and not how you'll do it: you can make a mad dash for the train without having to calculate how much faster to breathe and pump your heart." Similarly, Mr. Horn says, the way to handle the complexity problem is to create computer systems and software that can respond to changes in the digital environment, so the systems can adapt, heal themselves and protect themselves. Only then, he adds, will the need be reduced for constant human maintenance, fixing and debugging of computer systems.
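The idea can be made concrete with a toy sketch in Python. Everything here - the names, the health flag, the repair action - is an illustrative assumption, not anything drawn from Mr. Horn's paper: a supervisory loop watches a service's health and repairs it without a human in the loop.

```python
# Toy illustration of a self-healing loop (hypothetical design, not
# I.B.M.'s): a service reports its own health, and a supervisory step
# detects failure and "heals" it by restarting - no human intervention.

class Service:
    """A hypothetical service that can fail and be restarted."""

    def __init__(self):
        self.healthy = True
        self.restarts = 0

    def fail(self):
        self.healthy = False

    def restart(self):
        self.healthy = True
        self.restarts += 1


def autonomic_step(service):
    """One monitor-and-repair cycle: check health, heal if needed."""
    if not service.healthy:
        service.restart()
        return "healed"
    return "ok"


svc = Service()
svc.fail()                  # simulate a crash
print(autonomic_step(svc))  # prints "healed" - the loop repaired it
print(autonomic_step(svc))  # prints "ok" - nothing to do
```

In a real system the health check would be a probe (latency, error rate, heartbeat) and the repair a restart, failover or reconfiguration, but the shape is the same: the human effort moves from performing the maintenance to writing the policy.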
His work with I.B.M.'s fast-growing services business, Mr. Horn said, set him to thinking deeply about the complexity issue. The services arm of I.B.M., he notes, has been growing by about 15,000 people a year for the last five years. "They're all managing the complexity we've created in the information technology industry," he said in an interview. "The only way to get efficiency gains in information technology is to take some of the people out." Even with the industry's current slowdown, the demand for information technology workers is expected to grow by more than 100 percent over the next six years. "There just aren't enough skilled people," Mr. Horn said. "If we don't do something, we'll be a services industry, and the industry won't grow. We're already headed in that direction."
Mr. Horn insists that no one company can handle the complexity challenge on its own, and that doing so will take research efforts over the next 5 to 10 years in areas like adaptive algorithms for software agents, self-healing server computers, artificial intelligence and control theory. Still, many of these are fields in which I.B.M. has its own research projects under way. "We're certainly trying to push academic research in this direction," Mr. Horn said, "because we think it's an important direction."
I.B.M. will begin showing its autonomic computing vision to the rest of the industry this week. A handful of academic computer scientists have already seen Mr. Horn's paper. They see I.B.M.'s initiative as an endorsement of ambitious computer science research in a variety of fields.
Ben Kuipers, a computer scientist at the University of Texas, has been conducting research aimed at building "common sense knowledge" into software agents - one of the technologies needed if autonomic computing is to become a reality someday.
At the University of California at Berkeley, John Kubiatowicz has been researching "introspective computing," or systems that constantly monitor themselves and adapt to changes in the computing environment. Mr. Kubiatowicz says that the time is right for research efforts of the kind I.B.M. is championing, both because complexity is indeed a growing problem and because, as in the past, steady gains in hardware - processing power and storage - have opened the door to addressing the challenge. "We suddenly have the resources to do this," he said.