> From: Hawkins, Sean <swh196@soton.ac.uk>
>
> On Thu, 22 May 97 18:28:00 +0100 Harnad, Stevan wrote:
>
> > From: Harnad, Stevan <harnad@cogsci.soton.ac.uk>
> >
> > (25) What is information?
> >
> > a. any message that makes you feel informed
> >
> > b. any message that makes you feel informed
> > when the sender meant you to feel informed
> >
> > c. ***anything that reduces your uncertainty
> >
> > d. anything that reduces your uncertainty when the
> > sender meant to reduce your uncertainty
> >
> > e. none of the above
> >
> > > From: Hawkins, Sean <swh196@soton.ac.uk>
> > > I do not believe c (anything which reduces your uncertainty)
> > > is the correct answer.
> >
> > "I have used the food dispenser example before to explain what
> > information and communication are about: If there are six numbers and I
> > can only choose one number each day, then my chance of getting lunch is
> > 1/6, but if someone tells me that the number is odd today (and that is
> > true) then my probability of lunch has gone up to 1/3."
> >
> > From Lecture Notes on Chapter 9
> >
> > "Information = the reduction of uncertainty between alternatives
> > that MATTER to you.
> >
> > "The 6-choice lunch machine is an example of how any data that could
> > reduce the uncertainty about which of the 6 windows contains the lunch
> > is informative."
>
> Regardless of whether you omit this question or keep it in, I
> still challenge the answer.
>
> Basically, you have just used a contextual example to show
> where information reduces uncertainty.
Every example of information has to be contextual, because information
is always defined relative to a previous state of uncertainty
between N alternatives, and the reduction of uncertainty is likewise
contextual rather than absolute (whatever "absolute" might mean).
Information has also been called "negentropy," where entropy is
disorder and negentropy (= information) is something that brings some
order into the disorder.
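Just to make the arithmetic of the lunch-machine example concrete,
here is a small sketch (in Python, purely illustrative) that measures
the uncertainty in bits before and after you are told that today's
number is odd:

    from math import log2

    def entropy_bits(n_equally_likely):
        # Uncertainty, in bits, over n equally likely alternatives
        return log2(n_equally_likely)

    before = entropy_bits(6)   # six possible windows
    after = entropy_bits(3)    # told the number is odd: three remain
    print(f"uncertainty before: {before:.3f} bits")
    print(f"uncertainty after:  {after:.3f} bits")
    print(f"information gained: {before - after:.3f} bits")

The one bit gained is just the reduction of uncertainty between the
alternatives; nothing mental needs to be mentioned to measure it.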
But "information" does not have a mental meaning; nor does information
mean "meaning." It should (in fact it must) be possible to quantify
the alternatives and the uncertainty objectively (as you would if
you were sending Morse code messages and reckoning the probability
of getting a dash when a dot was sent, and vice versa).
So the classical Shannon/Weaver definition of information is not a
mentalistic definition; it has nothing directly to do with the mind at
all. See http://cord.iupui.edu/~lchen1/who3.html
and
http://www.wfu.edu/Academic-departments/Speech-Communication/infot/infop1.html
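And to make "quantifying objectively" a bit more concrete, here is a
rough sketch (the flip probability is invented) of a Morse-like
channel in which a dot is sometimes received as a dash and vice
versa; with dots and dashes sent equally often, the information
actually carried per symbol is one bit minus the entropy of the
noise:

    from math import log2

    def binary_entropy(p):
        # Entropy in bits of a two-outcome event with probability p
        if p in (0.0, 1.0):
            return 0.0
        return -(p * log2(p) + (1 - p) * log2(1 - p))

    # Hypothetical channel: a symbol is received wrongly (dot as dash,
    # or dash as dot) with probability p_flip -- an invented figure.
    p_flip = 0.1
    info_per_symbol = 1.0 - binary_entropy(p_flip)
    print(f"information carried per symbol: {info_per_symbol:.3f} bits")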
There IS a version of probability theory in which "subjective
probability" is used. That is an alternative to the more standard
theory of probability, which is based on relative frequency (marbles in
urns), rather than a thinker's subjective feeling about how likely some
outcome (say, Newcastle's becoming champions) is. This is the area in
which Bayes' theorem is used to calculate the conditional probability
of an outcome given some new evidence. The "prior probability"
corresponds to how likely the subject thought the outcome was before
the new evidence arrived; the new evidence is then used to calculate
the updated (posterior) probability of that outcome:
This OPERATES on a subjective prior probability to begin with, but
that's the end of subjectivity, because what that prior subjective
probability should become in the face of the new evidence is no longer
a subjective matter.
For example, suppose it was the beginning of the new season, in which
Newcastle had done well in the previous year; in fact, let's say they came
first last year, and did not lose a game. So you start the next season
with the subjective feeling that Newcastle is very likely to win again.
Let's say you would be willing to make a $100 bet on it. Unbeknownst
to you (and to everyone except those particular players and their
doctors), several Newcastle players have taken a lot of steroids
between seasons, and it has caused them some physical damage, so they
are playing somewhat worse this year.
After every game you have a chance to change your bet (lowering the
amount you are willing to bet on Newcastle's coming out first at the
end of the season).
After every game they lose, you should revise downward your estimate of
their probability of winning, no matter what your initial feeling had
been. It can be shown
that if you do not revise your prediction (by lowering the amount you
are willing to bet that they will come first again this year),
then you will lose a lot of money (especially if you imagine doing
this year after year, each time betting initially on last year's
winner).
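Here is a toy version of that betting story (every figure in it is
invented, just for illustration): two hypotheses about Newcastle's
strength, a subjective prior, and Bayes' theorem grinding that prior
down as the losses come in:

    # Two hypotheses about Newcastle (all figures invented):
    #   strong:   they win any given game with probability 0.8
    #   declined: they win any given game with probability 0.4
    p_win = {"strong": 0.8, "declined": 0.4}

    p_strong = 0.9   # your subjective prior that they are still strong

    results = ["loss", "loss", "win", "loss", "loss"]  # invented season start

    for game in results:
        # Likelihood of this result under each hypothesis
        like_strong = p_win["strong"] if game == "win" else 1 - p_win["strong"]
        like_declined = p_win["declined"] if game == "win" else 1 - p_win["declined"]
        # Bayes' theorem: posterior is proportional to prior times likelihood
        numerator = p_strong * like_strong
        p_strong = numerator / (numerator + (1 - p_strong) * like_declined)
        print(f"after a {game}: P(still strong) = {p_strong:.3f}")

Whatever subjective prior you start with (short of 0 or 1), the losses
push the posterior the same way.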
So even with subjective probability, it is the objective odds that
matter, once you start getting new data. And the data are the carriers
of the information.
You can think of the new data as setting the odds that Newcastle will
win; no matter what your initial subjective belief was, it should be
revised to conform to the objective odds, or you lose. And to lose in
this situation is simply to have failed to process potential
information; it is not a case of having your uncertainty increased.
> There are an
> infinite number of examples where information increases
> uncertainty.
There are an infinite number of ways that people can be unsure, become
unsure or stay unsure, despite data that would reduce their
unsureness (or increase their sureness). Let's use "sureness" to refer
to the subjective feeling, and "certainty" for the objective odds, and
how new information should (if you use it) change them.
I think you are conflating sureness (which is subjective) and certainty
(which is objective).
> Taking your 6-choice lunch machine as an example. What if
> you were provided with another piece of information which
> says that lunch may or may not be even present ? Would that
> not increase your uncertainty as to the probability of
> eating lunch on that day ? You say that information is the
> reduction of uncertainty between alternatives that matter to
> you. How would you describe a subsequent statement
> along the lines of "lunch may or may not be even present" ?
> If that is not information, what is it ? It doesn't reduce
> uncertainty, it increases it - it is not a directive or a
> question but it has relevance and it matters to you.
"Information" is not a mental concept. In the formal theory of
information, there is "information" and "entropy" (noise, chaos).
If you received "information" that lunch may or may not be present,
and in reality the odds had not changed at all, then that simply
would not be information in the technical sense; rather, it would be
noise.
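In the same spirit as the earlier sketches (again with made-up
figures): if a message leaves the odds over the six windows exactly
where they were, the measured reduction in uncertainty is zero bits,
i.e. no information at all has been conveyed:

    from math import log2

    def entropy_bits(probs):
        # Shannon uncertainty, in bits, of a probability distribution
        return -sum(p * log2(p) for p in probs if p > 0)

    six_windows = [1/6] * 6          # odds before the "message"
    still_six_windows = [1/6] * 6    # odds after it, unchanged in reality

    gained = entropy_bits(six_windows) - entropy_bits(still_six_windows)
    print(f"information conveyed: {gained:.3f} bits")   # 0.000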
> If you had made clear in the question that you expected the
> answer to come from the theory of information then I would
> have obviously selected (c), but the question did not have
> an explicit theoretical basis. This issue is one which has
> commonalities with the "would you mind getting off my foot
> ?" question.
How about if I settle for the answer coming from the text, lectures,
lecture notes, or discussion?
> I would submit that information which is attended to - ie
> you understand that it has reduced (or indeed increased)
> your uncertainty, is no longer information but rather
> intelligence - information is purely explicit, intelligence
> is an implicit derivation of information, and is therefore
> subjective because not all information is attended to.
>
> Sean Hawkins
We are all free to define our own terms, but a quiz question on what you
have learnt in this course is not the best place to introduce new
definitions. If I had been you, I would have skywritten about my
views on the inadequacy of the Shannon/Weaver definition of information
for explaining the mind. But not having challenged the concept when it
was taught to you, challenging it on a multiple-choice quiz is not very
practical: challenges to questions on a multiple-choice quiz are supposed
to be challenges to the clarity of the question or the existence and
uniqueness of the answer (assuming the answer has been given during the
course).
But it's a pity to raise such interesting questions so late, when
everyone is just trying to master the material that went unchallenged
when it was taught...