Intelligence without Representation or Reasoning
Last Update: 3 December 2008
Questions for KRR:
- What needs to be represented?
- How should it be represented? (E.g., logic? no logic?)
Highlights of Brooks's theory:
- Classical AI's top-down approach: intelligence via thought & reasoning over an internal model of the world,
  with a heavy computational load
  (and only a partial model, at that!):
  the system plans its actions using the model, thereby ignoring the actual world!
- Brooks: instead, act directly in the world (cf. the bottom-up approach, next)
Bottom-up approach to AI:
- don't decompose "human-level intelligence" into separate faculties
  (vision, language, etc.) and build each in isolation;
- ... start with simpler creatures
  that are "complete",
  i.e., that can operate in the real world with appropriate sensory & acting abilities,
  e.g., robotic insects;
- i.e., follow the evolutionary path:
  build mobility, vision, & survival tasks as the foundation for "true intelligence"
- "these are the hard tasks"...
  most of intelligence "is routine activity in a...benign but...dynamic world";
  the hard tasks "constrain" solutions to the routine problems,
  viz., the classic AI problem-solving & planning problems (e.g., games, math, logic)
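The layered, bottom-up approach can be sketched in code. The following is a minimal illustrative sketch, not Brooks's actual subsumption-architecture implementation: each layer is a simple sense-act rule, and higher-priority layers subsume (override) lower ones when they fire. The layer names and the sensor keys are hypothetical.

```python
# Illustrative sketch of a layered, behavior-based controller in the spirit
# of Brooks's bottom-up approach (NOT his actual subsumption implementation).
# Each layer is a simple sense-act rule; a higher-priority layer subsumes
# (overrides) lower ones whenever it has something to say.

def avoid(sensors):
    """Lowest, survival layer: turn away from obstacles."""
    if sensors.get("obstacle_ahead"):
        return "turn-away"
    return None  # nothing to do; defer to other layers

def seek_light(sensors):
    """Higher layer: head toward a goal (here, a hypothetical light source)."""
    if sensors.get("light_visible"):
        return "move-toward-light"
    return None

def wander(sensors):
    """Default layer: just move around."""
    return "move-forward"

# Layers in priority order: survival first, then goal-seeking, then wandering.
LAYERS = [avoid, seek_light, wander]

def act(sensors):
    """Return the action of the highest-priority layer that fires.
    No world model is built or consulted: behavior comes straight from sensing."""
    for layer in LAYERS:
        action = layer(sensors)
        if action is not None:
            return action

print(act({"obstacle_ahead": True}))   # turn-away
print(act({"light_visible": True}))    # move-toward-light
print(act({}))                         # move-forward
```

Note that "mobility & survival first" shows up as layer ordering: the avoid layer always gets first say, and "true intelligence" would be built as further layers on top.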
Cf. the basic trade-off paradox of AI:
- Tasks that are hard for people to do but easy to teach (games, math, logic, etc.)
  are easy to develop computational theories of,
- but tasks that are easy for people to do but hard to teach (e.g., vision, mobility, survival)
  are hard to develop computational theories of.
"Thought & consciousness don't need to be programmed in; they will emerge"
"The world is its own best model":
i.e., use the world as its own model, by continually monitoring & sensing it
"explicit representations & models of the world...get in the way"
"internal world models...are...impossible to obtain [&] not...necessary" for action
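The "world as its own best model" slogan can be sketched as a control loop that keeps no stored map and instead re-senses the world on every cycle. The toy one-dimensional corridor and the function names below are my own illustrative stand-ins for real sensors, not anything from Brooks.

```python
# Sketch of "the world is its own best model": the agent stores no internal
# map; each sense-act cycle it reads the world directly and reacts to what is
# actually there.  The world here is a toy 1-D corridor (a hypothetical
# stand-in for real sensors and a real environment).

corridor = ["open", "open", "wall", "open"]   # the world itself: mutable, dynamic

def sense(position):
    """Read the world as it is right now -- no cached model to go stale."""
    return corridor[position + 1] if position + 1 < len(corridor) else "wall"

def step(position):
    """One sense-act cycle: move forward unless sensing a wall ahead."""
    return position + 1 if sense(position) == "open" else position

pos = 0
pos = step(pos)        # senses "open" ahead -> moves to cell 1
corridor[2] = "open"   # the world changes; no internal model needs updating
pos = step(pos)        # re-senses, now finds cell 2 open -> moves to 2
print(pos)             # 2
```

The point of the sketch: when the corridor changes, nothing has to be "updated" inside the agent, because there is no internal world model that could become partial, stale, or wrong.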
"Representation is the wrong unit of abstraction in building the bulkiest parts of intelligent systems"
- so, possibly, representation is OK for some (non-bulky) things:
"Representations...appear only in the eye or mind of the observer"
Dennett's "intentional stance"
possibly, representations are just ways for an observer to describe
what a robot does & how it works:
"[one] can...talk about an agent's beliefs & goals, even though the [agent itself]
need not manipulate symbolic data structures"
But cf. Brian Cantwell Smith's Knowledge Representation Hypothesis
Brooks's "key aspects":
- situatedness: "robots are situated in the world"; they "experience the world directly"
- embodiment: physical grounding gives meaning to the system's processing
- a system must be situated & embodied in order "to be intelligent"
- emergence: "intelligence...emerges from system's interaction with the world"
What is the role of representation, as opposed to direct interaction with the world?
- memory, reasoning, advice taking (learning by being told), etc., all seem to require representations
- at what level do representations "emerge" or become necessary?
Brooks: "partial models of the world" that are "relevant" to the current task (e.g., maps)
Agre & Chapman: there are only deictic
representations, for specific purposes:
e.g., the-bee-chasing-me, not: Bee-21
(which may or may not be the same bee that was chasing me before)
so, are deictic representations intensional??
(Cf. memory not as storage but as re-firing of neurons)
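The deictic idea can be sketched as follows. This is an illustrative sketch in the spirit of Agre & Chapman, not their actual Pengi implementation: a deictic term like "the-bee-chasing-me" names a role in the agent's current situation, resolved afresh against current percepts, rather than a particular, objectively identified individual like Bee-21. The percept format and function name are hypothetical.

```python
# Sketch of a deictic representation (illustrative; not Agre & Chapman's
# actual system).  "the-bee-chasing-me" denotes whichever bee is chasing me
# *now* -- a role relative to the agent -- not a tracked individual.

def the_bee_chasing_me(percepts):
    """Resolve the deictic term against the current situation."""
    for bee in percepts:
        if bee["chasing_me"]:
            return bee
    return None

# Two successive moments; different individuals fill the same deictic role.
moment_1 = [{"id": "Bee-21", "chasing_me": True},  {"id": "Bee-7", "chasing_me": False}]
moment_2 = [{"id": "Bee-21", "chasing_me": False}, {"id": "Bee-7", "chasing_me": True}]

print(the_bee_chasing_me(moment_1)["id"])   # Bee-21
print(the_bee_chasing_me(moment_2)["id"])   # Bee-7
# The agent can flee "the-bee-chasing-me" without ever settling whether it is
# the same bee as before -- which is why deictic terms look intensional.
```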
What about reasoning?
Brooks: not needed (but he doesn't give much of an argument)
There are no permanent representations;
therefore, there is nothing to reason about or with?
Intelligence may be less a function of reasoning than of
interaction of system components with themselves and with the world
But: Gentner 2003: We're smart because of our analogical reasoning abilities operating on KRR systems.
Dennett, Daniel C. (1971), "Intentional Systems",
Journal of Philosophy
68(4): 87-106; reprinted in Daniel C. Dennett,
Brainstorms: Philosophical Essays on Mind and Psychology
(Montgomery, VT: Bradford Books, 1978): 3-22.
For some interesting followups to Dennett's paper, see:
Miller, Christopher A. (guest ed.) (2004),
"Human-Computer Etiquette: Managing Expectations with Intentional Agents",
Communications of the ACM
47(4) (April): 31-34.
Brooks, Rodney A. (1991),
"Intelligence without Representation",
Artificial Intelligence
47(1-3): 139-159.
Brooks, Rodney A. (1991),
"Intelligence without Reason",
Proceedings of IJCAI-91 (San Mateo, CA: Morgan Kaufmann): 569-595.
Kirsh, David (1991),
"Today the Earwig, Tomorrow Man?",
Artificial Intelligence
47(1-3): 161-184.
Brooks, Rodney A.,
"From Earwigs to Humans",
Proceedings IIAS: The Third Brain and Mind International Symposium on Concept Formation, Thinking and Their Development
(Kyoto, Japan): 59-66.
Gentner, Dedre (2003),
"Why We're So Smart",
in Dedre Gentner and Susan Goldin-Meadow (eds.),
Language in Mind: Advances in the Study of Language and Thought
(Cambridge, MA: MIT Press): 195-235.
Copyright © 2003-2008 by
William J. Rapaport