WHAT IS COMPUTATION?

First, what was there before computation? Differential Equations and Dynamical Systems.

The Turing Machine is a hypothetical mechanical device that reads and writes symbols on a tape according to internal states that causally dictate what it does. An example would be:

The machine is in state I, which causes it to read the symbol at its current tape position.

(Read input.)

(Input is 0.)

State I is such that if the input is 1, the machine goes into state J, which is Halt; if the input is 0, the machine goes into state K, which is: advance the tape, write 1, and Halt.
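Here is a minimal sketch, in Python, of the little machine just described. (The function name, and the representation of the tape as a list of 0s and 1s, are just illustrative choices, not part of any standard formalism.)

    # A sketch of the example machine: in state I it reads the current cell;
    # on a 1 it halts (state J); on a 0 it advances the tape, writes 1, and halts (state K).
    def run_example_machine(tape, position=0):
        state = "I"
        while state != "HALT":
            if state == "I":
                symbol = tape[position]      # read the input
                if symbol == 1:
                    state = "HALT"           # state J: Halt
                else:                        # input is 0
                    position += 1            # state K: advance the tape,
                    tape[position] = 1       # write 1,
                    state = "HALT"           # and Halt
        return tape

    print(run_example_machine([0, 0, 0]))    # -> [0, 1, 0]
    print(run_example_machine([1, 0, 0]))    # -> [1, 0, 0]

Nothing here depends on what the 0s and 1s "mean": the machine's behaviour is fixed entirely by its state table.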

For a more formal definition of a Turing Machine, see:
http://i12www.ira.uka.de/~pape/papers/puzzle/node4.html
or
http://obiwan.uvi.edu/computing/turing/ture.htm
or
http://aleph0.clarku.edu/~anil/math105/machine.html
These simple mechanical operations are what computation consists of. They are things any mindless device can do. They are based on making all operations explicit and automatic. No "thinking" is required. (That is the point!)

The Turing Machine is only a hypothetical device. (A digital computer is a finite physical approximation to it, differing in that its tape is not infinitely long.)

Implementation-Independence: The physical device that actually implements the Turing Machine is irrelevant (except of course that it has to be implemented by a physical device). This fact is critical. It is simple, but often misunderstood or forgotten, yet, as you will see, it is essential to the definition of computation. It is also the basis of the hardware/software distinction.

Let's call the marks on the Turing machine's tape "symbols" (actually "symbol tokens," because "symbol" really refers to a symbol-type, a kind of generic pattern, like "A," whereas a symbol token is an actual instance of A; but we will use "symbol" for both symbol types and symbol tokens except where the difference matters).
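As a quick illustration of the type/token distinction (the string below is just an invented example):

    tape = "ABAA"                  # a tape with two symbol types and four tokens
    print(tape.count("A"))         # 3 tokens of the symbol type "A"
    print(len(set(tape)))          # 2 symbol types ("A" and "B")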

The implementation-independence or hardware-independence of computation is related to the notation-independence of a formal system: Arithmetic is arithmetic regardless of what symbol or notational system I use, as long as the system has the right formal properties (something corresponding systematically to "0," "+" "=" etc.). This is exactly the same as the implementation independence of computation: A Turing Machine is performing a particular computation if it implements the right formal properties. The physical details matter no more than the details of the shapes of the symbols in a notational system.

The counterpart of the hardware-independence of computation is the shape-independence and arbitrariness of symbols and symbol systems: It does not matter whether I designate "add" by "add," "+," "&," or "PLUS" -- as long as I use it consistently and systematically to designate adding.

A symbol cannot really be defined in isolation. Or rather, a single symbol, unrelated to any symbol system, is trivial. (There is a joke about a wonder-rabbi at his death-bed, with all his disciples gathered together to hear his last words. The wonder-rabbi murmurs "Life.... is like.... a bagel." All his disciples are abuzz with the message: "Pass it on: The wonder-rabbi says life is like a bagel!" The word is passed on till it reaches the synagogue-sweeper, the lowliest of the flock. He asks: "Life is like a bagel? How is life like a bagel?" The buzz starts again as the question propagates back to the deathbed of the wonder-rabbi: "Wonder-Rabbi, how is life like a bagel?"

The wonder-rabbi pauses for a moment and then says "Okay, so life's not like a bagel.")

The point is that you can read anything and everything into a point-symbol: It only becomes nontrivial if the symbol is part of a symbol system, with formal relations between the symbols. And, most important, the symbol system must be semantically interpretable. That is, it must mean something; it must make sense -- systematic sense.
 

For example, it doesn't matter what symbol you use for "addition" in arithmetic, but then whenever you refer to addition, you must use that (arbitrary) symbol, and the strings of symbols in which it occurs must be systematically interpretable as denoting addition. In particular, "1 + 1" must equal "2" no matter what notation you use for "1" "+" "=" and "2".
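As a sketch of this arbitrariness (the shapes and names below are invented purely for illustration): as long as a notation is interpreted systematically, "1 + 1 = 2" comes out true no matter what shapes we pick.

    # The same addition fact in two arbitrary notations. Only the systematic
    # mapping from shapes to numbers matters, not the shapes themselves.
    def check(tokens, meaning):
        a, _plus, b, _equals, c = tokens
        return meaning[a] + meaning[b] == meaning[c]

    print(check(["1", "+", "1", "=", "2"], {"1": 1, "2": 2}))        # True
    print(check(["I", "PLUS", "I", "IS", "II"], {"I": 1, "II": 2}))  # True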

Now we are ready to define computation (the Turing Machine was more an example than a definition): Computation is symbol manipulation. Nontrivial computations are systems of symbols with formal rules for manipulating them. The shape of the symbols is arbitrary. That is just part of a notational system. But the symbols and the manipulations must be semantically interpretable: It must be possible to interpret them (systematically, not point-wise, like the bagel) as meaning something.
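To make "symbol manipulation" concrete, here is a toy sketch (the rewrite rules are invented for illustration): the rules operate purely on the shapes of the symbols, yet every string they produce can be systematically interpreted, here as unary addition.

    # Purely formal rewrite rules on strings of "|", "+" and "=".
    # "||+|||=" rewrites, step by step, to "|||||" -- interpretable as 2 + 3 = 5.
    def rewrite(s):
        if "+|" in s:
            return s.replace("+|", "|+", 1)   # carry one stroke across the "+"
        if "+=" in s:
            return s.replace("+=", "", 1)     # nothing left to add: erase "+" and "="
        return s                              # no rule applies: done

    s = "||+|||="
    while rewrite(s) != s:
        s = rewrite(s)
    print(s)   # "|||||"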

For example, arithmetic is a formal symbol system, consisting of its primitive symbols (0, +, =, etc.), strings of symbols (axioms), and manipulation rules (rules for forming well-formed formulas, for making logical inferences, and for making arithmetic calculations).

See:

http://www.csc.liv.ac.uk/~frans/dGKBIS/peano.html

 http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Peano.html
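As a rough sketch of the flavour of such a system (a toy encoding, not Peano's actual axioms): numerals can be built from a zero symbol and a successor symbol, with addition defined by two purely formal rules.

    # Toy Peano-style numerals: a numeral is either ZERO or S(another numeral).
    # Addition is defined formally by:  a + 0 = a   and   a + S(b) = S(a + b).
    ZERO = ("0",)

    def S(n):
        return ("S", n)

    def add(a, b):
        if b == ZERO:
            return a                # a + 0 = a
        return S(add(a, b[1]))      # a + S(b') = S(a + b')

    def interpret(n):               # the intended interpretation: ordinary numbers
        return 0 if n == ZERO else 1 + interpret(n[1])

    two, three = S(S(ZERO)), S(S(S(ZERO)))
    print(interpret(add(two, three)))   # 5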
 
Now there is one and only one way to interpret all the axioms, theorems and calculations of arithmetic: as referring to numbers and their properties. (There are also so-called "nonstandard" interpretations, but these only apply to things very much like numbers and isomorphic to them in critical respects.) Unlike "Life is like a bagel," Peano's arithmetic symbol system has, to all intents and purposes, only one coherent interpretation. It doesn't make sense if interpreted as a military manual, a planetary map, or a Shakespeare play (and vice versa). This systematic mappability into meanings and vice versa is the central property of symbol systems.

Symbol systems are also compositional: They consist of elementary symbols that are combined and recombined according to the symbol manipulation rules. Yet all the (well-formed) combinations are semantically interpretable too, and all the interpretations cohere. This is not a trivial property. It is easy to invent an arbitrary code, consisting of symbols and symbol manipulation rules. It is much harder to invent one in which the symbols and symbol combinations all make sense.
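As a sketch of this compositionality (the toy grammar below is invented for illustration): it is easy to check, purely formally, which strings are well formed; what makes the system interesting is that every well-formed string is also interpretable, and the interpretations cohere.

    import re

    # A toy grammar: a well-formed string is digits joined by "+" or "*".
    WELL_FORMED = re.compile(r"^\d+(\s*[+*]\s*\d+)*$")

    for s in ["1 + 2 * 3", "+ 1 2", "1 + + 2"]:
        if WELL_FORMED.match(s):
            print(s, "=", eval(s))           # eval stands in for the interpretation
        else:
            print(s, "is not well formed")   # no coherent interpretation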

And, conversely, it is hard to take an undeciphered symbol system that does have a unique, nontrivial interpretation, and decipher it so as to find that interpretation.

All formal or artificial symbol systems (including all of mathematics and logic, computer programmes, and artificial "languages") are subsets of natural language: We don't change languages when we begin to talk "geometry," "boolean algebra," or "C," we simply use a specialized subset of the vocabulary of English.

So all formulas are really sentences in English (or any other natural language). This means that natural language is the "mother of all symbol systems."

THE CHURCH/TURING THESIS