User:CharlesGillingham/More/Moravec's paradox
- FIND A SECOND SOURCE FOR THIS INSIGHT
Moravec's paradox is a principle in artificial intelligence and robotics. It states that reasoning (conscious, intelligent, rational thought) is comparatively easy for machines to imitate, while unconscious sensorimotor skills and instincts are far more difficult. This contradicts the tradition in Western thought that holds consciousness and reason to be the "highest" and "noblest" human faculties. The principle was first articulated by Hans Moravec and others in the early 1980s to help explain why "it is comparatively easy to make computers exhibit adult level performance on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old when it comes to perception and mobility."[1] As Moravec put it, "Computers are at their worst trying to do the things most natural to humans."[2]
The hierarchy of human faculties
Hamlet: "how noble in reason ..."
Embodiment in cognitive science
Cognitive scientists George Lakoff, Mark Turner and others believe that abstract reasoning is "embodied": abstract reasoning requires us to use knowledge and skills that we gain from "the body", including our half-billion-year-old sensorimotor and perceptual skills.[3] Lakoff and Rafael Núñez argued in YEAR! that we understand abstract concepts by thinking of simple physical situations (sometimes called "image schemas"). For example, to understand the mathematical concept of imaginary numbers, we picture an arrow rotating across a mental image of the complex plane. In doing this, we are using our visual cortex and our skills in reasoning about space.[4]
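As a concrete illustration of the spatial picture described above (this sketch is not part of the original article and is only an illustrative aside), the short Python snippet below shows the standard arithmetic fact that multiplying a complex number by the imaginary unit i rotates its "arrow" a quarter turn in the complex plane while leaving its length unchanged.

<syntaxhighlight lang="python">
# Illustrative sketch: multiplication by i = e^{i*pi/2} rotates an arrow
# in the complex plane by 90 degrees, the mental picture described above.
import cmath
import math

z = 1 + 0j                                  # an arrow pointing along the positive real axis
quarter_turn = cmath.exp(1j * math.pi / 2)  # e^{i*pi/2}, i.e. the imaginary unit i

rotated = z * quarter_turn                  # multiplying by i swings the arrow 90 degrees
print(rotated)                              # approximately 0+1j: now pointing up the imaginary axis
print(abs(rotated))                         # length is still 1.0; only the direction changed
</syntaxhighlight>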
Also mention Dreyfus (?)
Building intelligent machines
Moravec believes this paradox is "a giant clue to how we should proceed in building an intelligent machine."[5] Artificial intelligence researchers such as David Marr, Rodney Brooks and Hans Moravec have argued forcefully that artificial intelligence must begin with our sensorimotor skills, because our abstract skills depend on them. We must do the hard part first, building intelligence from the "bottom up".[6]