This is Searle's Chinese Room taken from his chapter "Minds and Brains
without Programs" in "Mindwaves" edited by Colin Blakemore and Susan
Greenfield, Blackwells, 1987. I forgot to take the photocopy, provided
by Wendy, home. I hope this is appropriate!
The Gap: Searle says that although we are reasonably confident in using
high-level, intentionalistic "grandmother" psychology to explain
behaviour, we simply suppose that there are underlying hard-science
levels, because we haven't the faintest idea of how the lower
(neurophysiological) levels work. The bit between the high and low
levels is the gap. The main candidate to fill the gap is the mind/brain
= program/hardware (strong AI) explanation. The consequence of this
analogy, for some, is that "intelligence is purely a matter of physical
symbol manipulation".
The Chinese Room Revisited: Searle uses an example based on the Turing
test to dispute the idea that computers can think in the same way that
humans can. A person can pretend to be a computer by receiving a
question in Chinese and giving the correct answer in Chinese simply by
manipulating symbols, using a manual which says what response to make
for what input (even though they don't understand Chinese). If the
person now receives the question in English, they can understand it in
a way that the computer cannot. The analogy therefore breaks down
because we have semantics while computers only have syntax; thus strong
AI fails to distinguish between syntax and semantics.
[I am unhappy with this; it seems to me that computers could have as
much semantic information as we have. Anything we know about Chinese,
they could know too. The difference between us and machines just
appears to be that machines haven't been programmed to experience
understanding, and if they have, then they haven't been programmed (or
rather built) to say that they understand. Searle seems to have a
homunculus that understands, but what is the understanding experience?
Surely it can only be an experience of the fit of input with memory.]
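To make the "manipulating symbols using a manual" idea concrete, here is a
minimal sketch of my own (not Searle's, and the entries are invented
placeholders) of a purely syntactic responder in Python. It maps input
strings to output strings without representing anything about what they
mean, which is all the person in the room is doing:

    # Toy "rule book" for the Chinese Room: a lookup table from input
    # symbol strings to output symbol strings. The entries are invented;
    # the point is that the procedure never consults any meaning, only
    # the shape of the symbols.
    RULE_BOOK = {
        "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I am fine, thanks."
        "你会说中文吗？": "会。",          # "Do you speak Chinese?" -> "Yes."
    }

    def room(question: str) -> str:
        # Pure syntax: match the input against stored patterns and emit
        # the associated output. No semantics is represented anywhere.
        return RULE_BOOK.get(question, "对不起，我不明白。")  # "Sorry, I don't understand."

    if __name__ == "__main__":
        print(room("你好吗？"))

From the outside such a responder can look competent for the questions it
covers, which is exactly why Searle thinks passing the test says nothing
about understanding.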
The Brain and its Mind: Searle reckons that brain function is all about
"variable" non-digital rates of neuronal firings, as opposed to
all-or-nothing firings, and that understanding brain function is all
about understanding systems of neurons in networks, circuits, slabs,
columns, cylinders etc. All mental phenomena are caused by processes in
the brain, and all the causal processes are internal to the brain: even
though mental events mediate between external stimuli and motor
responses, "there is no essential connection". Mental phenomena, like
pain, are features of the brain.
[This seems a bit extreme; I would have thought that most behaviour is
a result of a mixture of ongoing mental processes and mental processes
arising directly out of external stimuli.]
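As a side note on the firing-rate point, here is a toy illustration of my
own (not Searle's; the spike train and numbers are invented) of the
difference between a single all-or-nothing spike and the variable firing
rate that a window of spikes defines:

    # Individual spikes are all-or-nothing (0 or 1), but the quantity
    # usually taken to carry information is the variable firing rate,
    # i.e. how many spikes occur per unit time. The spike train is made up.
    spike_train = [1, 0, 0, 1, 0, 1, 1, 0, 0, 1]   # 1 = spike in a 10 ms bin

    bin_ms = 10
    window_ms = len(spike_train) * bin_ms           # 100 ms of activity
    rate_hz = sum(spike_train) / (window_ms / 1000.0)

    print(f"spikes: {sum(spike_train)}, firing rate: {rate_hz:.0f} Hz")  # -> 50 Hz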
Macro- and Micro-properties: This comes down to the claim that brains
experience pain but neurons don't, although the latter cause or realise
pain in the former; just as water freezes but molecules don't, with the
micro-level, lower-level features causing or realising the higher-level
features of the whole.
The Possibility of Mental Phenomena: Consciousness now is like "life"
used to be when we didn't understand the biology of life. We simply
need to wait until we understand the characteristics of brain processes
and how the micro-macro analogy applies. This is also true of
intentionality, subjectivity, and intentional causation. Mental states
are physical states of the brain.
Consequences for the Philosophy of Mind: Searle presents the principle
of neurophysiological sufficiency - "what goes on in the head must be
causally sufficient for any mental state whatsoever". He disputes
Wittgenstein's "an inner process stands in need of an outward
criterion".
[While it is true that mental events can take place with no observable
behaviour/output, and mental events can take place with no observable
stimulation/input, it still seems a polarised view when you consider
that most of our everyday life is spent receiving input and producing
output via mental mediation. Searle seems to over-concentrate on
programs and semantics. While programs are transportable between
machines, they are not independent of them, directly or indirectly. Try
running Windows 95 on a Sinclair Spectrum. It seems conceivable that
computer architecture will become more complex, and will have more
special-function modules. The main difference between computers and
brains is that brains are alive, and can change or adapt their
architecture both phylogenetically and ontogenetically.]