Regarding the Turing test:
Isn't it a bit on the hard side? Why must the attribution
of consciousness to a computer be limited to the situation
where that computer flawlessly emulates a human being?
Hypothetically, is it not possible to imagine a situation
where one might form the impression of consciousness
existing in a computer, without it necessarily being
indistinguishably human in nature?
If this has already been discussed in some
forum I'm not privy to, I apologise, but as an
intercalating medic (hence on the 1st-year
list) who is fairly interested in what is being
discussed here, I'm not doing the lectures -
could someone reply for my benefit please?!
Cheers,
Smith, Damian M
dms93fm@soton.ac.uk