University of Southampton Institutional Repository

Does the Mind Piggy-Back on Robotic and Symbolic Capacity?


Harnad, Stevan (1995) Does the Mind Piggy-Back on Robotic and Symbolic Capacity? In Morowitz, H. (ed.) The Mind, the Brain, and Complex Adaptive Systems. pp. 204-220.

Record type: Conference or Workshop Item (Other)

Abstract

Cognitive science is a form of "reverse engineering" (as Dennett has dubbed it). We are trying to explain the mind by building (or explaining the functional principles of) systems that have minds. A "Turing" hierarchy of empirical constraints can be applied to this task, from t1, toy models that capture only an arbitrary fragment of our performance capacity, to T2, the standard "pen-pal" Turing Test (total symbolic capacity), to T3, the Total Turing Test (total symbolic plus robotic capacity), to T4 (T3 plus internal [neuromolecular] indistinguishability). All scientific theories are underdetermined by data. What is the right level of empirical constraint for cognitive theory? I will argue that T2 is underconstrained (because of the Symbol Grounding Problem and Searle's Chinese Room Argument) and that T4 is overconstrained (because we don't know what neural data, if any, are relevant). T3 is the level at which we solve the "other minds" problem in everyday life, the one at which evolution operates (the Blind Watchmaker is no mind-reader either) and the one at which symbol systems can be grounded in the robotic capacity to name and manipulate the objects their symbols are about. I will illustrate this with a toy model for an important component of T3 -- categorization -- using neural nets that learn category invariance by "warping" similarity space the way it is warped in human categorical perception: within-category similarities are amplified and between-category similarities are attenuated. This analog "shape" constraint is the grounding inherited by the arbitrarily shaped symbol that names the category and by all the symbol combinations it enters into. No matter how tightly one constrains any such model, however, it will always be more underdetermined than normal scientific and engineering theory. This will remain the ineliminable legacy of the mind/body problem.
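The "warping" of similarity space described in the abstract can be sketched with a toy backpropagation net. This is a minimal illustration, not Harnad's original simulations: the architecture (one hidden layer of four sigmoid units), learning rate, and one-dimensional stimulus continuum are all assumptions. The point it demonstrates is the categorical-perception effect the abstract names: after learning a category boundary, within-category distances in the net's internal (hidden-layer) representation shrink relative to between-category distances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stimulus continuum: 10 points on [0, 1]; category boundary at 0.5.
X = np.linspace(0.05, 0.95, 10).reshape(-1, 1)
y = (X > 0.5).astype(float)

# One hidden layer of 4 sigmoid units, trained by plain full-batch backprop.
W1 = rng.normal(0, 1, (1, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sig(z):
    return 1.0 / (1.0 + np.exp(-z))

def hidden(X):
    """Internal (hidden-layer) representation of the stimuli."""
    return sig(X @ W1 + b1)

def warp_ratio(H):
    """Mean between-category distance / mean within-category distance
    in the representation H: larger values mean more CP-style warping."""
    within, between = [], []
    for i in range(len(H)):
        for j in range(i + 1, len(H)):
            dist = np.linalg.norm(H[i] - H[j])
            (within if y[i, 0] == y[j, 0] else between).append(dist)
    return np.mean(between) / np.mean(within)

ratio_before = warp_ratio(hidden(X))  # untrained (random) representation

for _ in range(5000):
    H = hidden(X)
    out = sig(H @ W2 + b2)
    g_out = out - y                      # cross-entropy + sigmoid gradient
    g_H = (g_out @ W2.T) * H * (1 - H)   # backprop through hidden sigmoids
    W2 -= H.T @ g_out;  b2 -= g_out.sum(0)
    W1 -= X.T @ g_H;    b1 -= g_H.sum(0)

ratio_after = warp_ratio(hidden(X))      # trained representation
```

After training, `ratio_after` exceeds `ratio_before`: the hidden units saturate on either side of the boundary, compressing within-category similarities and expanding between-category ones, which is the analog "shape" constraint the abstract says grounds the category's arbitrary symbol.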

HTML harnad95.mind.robot.html - Other (55kB)

More information

Published date: 1995
Venue - Dates: The Mind, the Brain, and Complex Adaptive Systems, 1995-01-01
Organisations: Web & Internet Science

Identifiers

Local EPrints ID: 253359
URI: https://eprints.soton.ac.uk/id/eprint/253359
PURE UUID: b203009c-7791-4faf-a415-d0fc5d272f63
ORCID for Stevan Harnad: orcid.org/0000-0001-6153-1129

Catalogue record

Date deposited: 26 May 2000
Last modified: 06 Jun 2018 13:05


Contributors

Author: Stevan Harnad
Editor: H. Morowitz


