Mad Scientist -- Day
Mad_Scientist

The New York Times : Technology [18 Oct 2001|05:04pm]
Using Humans as a Computer Model
By Steve Lohr
http://www.nytimes.com/2001/10/15/technology/ebusiness/15NECO.html

THINK of it as computing's crisis of complexity, revisited. For more than three decades, the big advances in computing have soon brought new headaches. The initial steps ahead are typically in hardware - processors, storage and networks - and the headaches are manifested in software. It is software that is the medium for doing all the new things in computing that hardware makes possible - whether simple numeric calculations or increasingly sophisticated functions like symbolic processing, graphics, simulations, artificial intelligence and so on. In computing, opportunity breeds complexity. And complexity begets systems that can be buggy, unreliable and difficult to manage.

This cycle of challenge became apparent shortly after the I.B.M. 360 mainframe, introduced in 1964, brought computing into the mainstream of corporate and government life. In 1968, NATO sponsored a conference prompted by concerns that the "software crisis" at the time posed a threat to the economic health and military readiness of the West. Training, engineering and new software and hardware tools helped the computing community cope with the crisis of the late 1960's. But as Frederick P. Brooks Jr., one of the architects of the I.B.M. 360, observed, "Complexity is the business we are in, and complexity is what limits us."

Paul M. Horn, a senior vice president who oversees the research labs at I.B.M., says the time is ripe for an assault on the ever-increasing complexity of computing in the Internet era, with its global networks and proliferation of digital devices. "We have a growing crisis on our hands," he said.

Mr. Horn hopes to do something about it. Starting today, at the Agenda conference in Scottsdale, Ariz., I.B.M. will begin distributing 75,000 copies of a 39-page paper written by Mr. Horn, in which he calls the current version of the complexity problem the industry's "next grand challenge." The paper will be distributed to computer science researchers in universities, national labs and companies worldwide. I.B.M. is also making a commitment to underwrite 50 research projects at universities over the next three to five years to take on the complexity challenge - millions of dollars of research grants.

Mr. Horn's paper is intended partly as a call to action for researchers and the industry, but it also points toward a path for solving the problem. He calls it "autonomic computing." It is a biological metaphor suggesting a systemic approach to attaining a higher level of automation in computing.

Just as a person's autonomic nervous system automatically handles all kinds of basic functions - the heart rate, breathing and digestion, for example - in response to changing conditions, so, too, should computer systems, according to Mr. Horn. The human body "does all this without any conscious recognition or effort on your part," he writes. "This allows you to think about what you want to do and not how you'll do it: you can make a mad dash for the train without having to calculate how much faster to breathe and pump your heart." Similarly, Mr. Horn says, the way to handle the complexity problem is to create computer systems and software that can respond to changes in the digital environment, so the systems can adapt, heal themselves and protect themselves. Only then, he adds, will the need for constant human maintenance, fixing and debugging of computer systems be reduced.
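
The article describes the desired behavior - systems that watch themselves, adapt to changes and repair faults without a human in the loop - but gives no concrete mechanism. The sketch below is a minimal illustration of that monitor-and-heal cycle; the Service class and its check_health/restart interface are hypothetical stand-ins invented here for illustration, not anything from Mr. Horn's paper.

```python
import random
import time


class Service:
    """Hypothetical managed component standing in for any real service."""

    def __init__(self, name):
        self.name = name
        self.healthy = True

    def check_health(self):
        # Simulate occasional, unpredictable faults.
        if random.random() < 0.1:
            self.healthy = False
        return self.healthy

    def restart(self):
        # The "self-healing" action: restore a known-good state.
        self.healthy = True


def autonomic_loop(services, cycles=20, interval=0.1):
    """Monitor, detect, repair - with no operator involvement."""
    for _ in range(cycles):
        for svc in services:
            if not svc.check_health():
                print(f"{svc.name}: fault detected, restarting automatically")
                svc.restart()
        time.sleep(interval)


if __name__ == "__main__":
    autonomic_loop([Service("web-frontend"), Service("database")])
```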

His work with I.B.M.'s fast-growing services business, Mr. Horn said, set him to thinking deeply about the complexity issue. The services arm of I.B.M., he notes, has been growing by about 15,000 people a year for the last five years. "They're all managing the complexity we've created in the information technology industry," he said in an interview. "The only way to get efficiency gains in information technology is to take some of the people out." Even with the industry's current slowdown, the demand for information technology workers is expected to grow by more than 100 percent over the next six years. "There just aren't enough skilled people," Mr. Horn said. "If we don't do something, we'll be a services industry, and the industry won't grow. We're already headed in that direction."

Mr. Horn insists that no one company can handle the complexity challenge on its own, and that doing so will take research efforts over the next 5 to 10 years in areas like adaptive algorithms for software agents, self-healing server computers, artificial intelligence and control theory. Still, many of these are fields in which I.B.M. has its own research projects under way. "We're certainly trying to push academic research in this direction," Mr. Horn said, "because we think it's an important direction."

I.B.M. will begin showing its autonomic computing vision to the rest of the industry this week. A handful of academic computer scientists have already seen Mr. Horn's paper. They see I.B.M.'s initiative as an endorsement of ambitious computer science research in a variety of fields.

Ben Kuipers, a computer scientist at the University of Texas, has been conducting research aimed at building "common sense knowledge" into software agents - one of the technologies needed if autonomic computing is to become a reality someday.

At the University of California at Berkeley, John Kubiatowicz has been researching "introspective computing," or systems that constantly monitor themselves and adapt to changes in the computing environment. Mr. Kubiatowicz says the time is right for research efforts of the kind I.B.M. is championing, both because complexity is indeed a growing problem and because, as in the past, steady gains in hardware - processing power and storage - have opened the door to addressing the challenge. "We suddenly have the resources to do this," he said.

Symbiotic Intelligence [18 Oct 2001|07:04pm]
http://ishi.lanl.gov/symintel.html

Date: Thu, 18 Oct 2001 14:46:47 -0700
Reply-to: Hyperplexity@yahoogroups.com
Subject: [Hyperplexity] The Symbiotic Intelligence Project

This website contains information from Los Alamos reports:
LA-UR 97-1200, 98-489, 98-2227, 98-1150, 98-2549

Self-Organizing Knowledge on Distributed Networks Driven by Human Interaction

The following is from "New Frontiers in Collective Problem Solving," by N. Johnson, S. Rasmussen, M. Kantor; LA-UR-98-1150.

The goal is to analyze and facilitate how people, in the process of accessing and using information on networks, create new knowledge without premeditation. We argue that the symbiotic combination of humans and smart networks will result in a previously unrealized capability of collective problem identification and solution. This capability is based on the pre-existing self-organizing dynamics of social evolution. This symbiotic intelligence will greatly increase the success of organizations in achieving their goals, better utilizing their resources, and preparing for the future. For human society as a whole, this new resource will improve our quality of life and vitality as a species.

From J.R.Molloy [18 Oct 2001|07:07pm]
Reply-to: Hyperplexity@yahoogroups.com
Subject: [Hyperplexity] Glossary

Methodologies - Identifiable techniques that can be widely used in the area of study. Assumed to be of broad use; a technique of limited use is not a methodology. This would usually not refer to a theory or to an approach to a general problem.

Self-organizing - (preferred over "emerging") the ability of a distributed system to exhibit global structures or dynamics from local rules or interactions.

Emergent - as in an emergent property: one which cannot be observed locally in the subsystems, but only as a global structure or dynamic. We limit the usage to an emergent property or structure, not to an emergent system.
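
As a concrete illustration of these two definitions (the example is mine, not part of the glossary), the toy run below applies a purely local rule - each cell adopts the majority value of itself and its two neighbors on a ring - and a global structure, contiguous domains of agreement, emerges that no single cell can observe locally.

```python
import random


def step(cells):
    """Apply a local majority rule to every cell of a ring of 0/1 cells."""
    n = len(cells)
    return [
        1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]


def run(n=60, steps=20, seed=1):
    random.seed(seed)
    cells = [random.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        # Only local interactions, yet blocks of agreement
        # (a global structure) coarsen out of the initial noise.
        cells = step(cells)


if __name__ == "__main__":
    run()
```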

Complexity - Generally avoided as an overused and poorly defined word, except in specific systems.

Experiments - activities that produce data or test concepts. Simulations are a subset of experiments, abstracted from the system that they model (or simulate).

Agents - entities that move through a system of interest; each agent has processing capability and memory (also see nodal). Agents include people as well as "artificial" agents. Agents have the perspective of moving through informational space rather than being restricted to a specific location in space. In numerical methods this concept is analogous to a Lagrangian treatment or approach. Hence, usage includes: agent approach, agent behavior, agent perspective, agent methods, etc. Note that an agent can be defined in a discrete space (a graph) in a nodal representation without loss of functionality, but only if all nodes contain the agent capability.

Nodes/nodal - (opposite of agent). Nodes have the perspective of being restricted to a specific location in space and not moving through informational space. In numerical methods this concept is analogous to an Eulerian treatment or approach. Hence, useful usage includes: nodal approach, nodal perspective, nodal methods, etc.
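
To make the agent/nodal (Lagrangian/Eulerian) contrast concrete, here is a small sketch - my own example, not from the report - of the same process, random walkers on a ring of sites, written from both perspectives. The agent version follows each walker as it moves through the space; the nodal version keeps the sites fixed and updates the occupancy count held at each one. The two views describe the same dynamics; individual runs differ only in random sampling.

```python
import random

N_SITES = 10
N_WALKERS = 100
STEPS = 50


def agent_view(seed=0):
    """Lagrangian/agent perspective: each entity carries its own position."""
    random.seed(seed)
    positions = [random.randrange(N_SITES) for _ in range(N_WALKERS)]
    for _ in range(STEPS):
        positions = [(p + random.choice((-1, 1))) % N_SITES for p in positions]
    # Recover the site-by-site picture from the individual agents.
    return [positions.count(site) for site in range(N_SITES)]


def nodal_view(seed=0):
    """Eulerian/nodal perspective: fixed sites hold counts that flow to neighbors."""
    random.seed(seed)
    counts = [0] * N_SITES
    for _ in range(N_WALKERS):
        counts[random.randrange(N_SITES)] += 1
    for _ in range(STEPS):
        new = [0] * N_SITES
        for site, c in enumerate(counts):
            for _ in range(c):  # each unit of occupancy hops left or right
                new[(site + random.choice((-1, 1))) % N_SITES] += 1
        counts = new
    return counts


if __name__ == "__main__":
    print("agent view:", agent_view())
    print("nodal view:", nodal_view())
```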

Problem solving - used both in the traditional sense of the act of a centralized problem solver (a human, an organization, or a computing system) and in the less accepted sense of a "solution" found by a self-organizing system. If the solution is a global property that is not observable locally, then it is an emergent property.

Collective - a fuzzier word for self-organizing.

--- --- --- --- ---
