
Computation is an entirely abstract thing. Nowhere does the definition of Turing machines or the lambda calculus make any reference to actual physical things.



Church doesn't really define computation[1], and Turing most certainly bases his entire definition (the first ever definition of computation) on multiple references to physical things (finite "states of mind", limits on what can be read or manipulated at any one time, etc.). Brouwer also bases intuitionism on the physical when he appeals to the limited capability of a finite (idealized) mind. Computation carried out by an infinitely fast (in terms of physical time) and/or infinitely large computer is not Turing computation.

[1]: He conjectures that LC coincides with what he calls "a vague intuitive notion" of an algorithm, and when he tries to make a rigorous argument he finds that he needs to rely on circularity, and writes this: "if this [circular] interpretation or some similar one is not allowed, it is difficult to see how the notion of an algorithm can be given any exact meaning at all."


Can you link to some of the references you cite? That seems like an interesting read.

As for computability being a physical notion, I can't speak to the motivation of Church and his students, but I do know that there are characterizations of total computable functions in domain theory which make no mention of physics. The intuition about computable functions is that they are precisely the functions whose output for any given input depends only on a finite part of the input. The really interesting thing about the Church-Turing thesis is that there are so many different models of computation that capture precisely this idea (the surprising part is that these very simple models are expressive enough to cover all computable functions).
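A toy sketch of that finite-dependence intuition (my illustration, not from the thread): a function on infinite input streams that terminates can only ever have inspected a finite prefix of its input. The function name and the choice of "first 3 bits" are arbitrary.

```python
from itertools import count

def take(stream, n):
    """Read only the first n elements of a (possibly infinite) stream."""
    return [next(stream) for _ in range(n)]

def third_bit(stream):
    """A computable function on infinite bit streams: its output depends
    only on a finite part of the input (here, the first 3 bits)."""
    return take(stream, 3)[2]

# An "infinite" input: the constant-1 stream. The function still
# terminates, because it consults only finitely much of it.
ones = (1 for _ in count())
print(third_bit(ones))  # 1
```

In domain-theoretic terms this is the continuity condition: finite information about the output requires only finite information about the input.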


> Can you link to some of the references you cite? That seems like an interesting read.

Sure.

It's best to start with Jean van Heijenoort's seminal From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931[1]. It includes the original texts by Hilbert, Brouwer and many others, but the text I personally found most interesting was Hermann Weyl's, which I quoted here[2] nearly in full (in the parent comment I quoted some relevant passages from Brouwer and Hilbert; in fact, it was as part of that Reddit discussion that I found the references and learned what little I know of the philosophy of mathematics).

I've also found Juliet Floyd's discussion of Turing's (and Wittgenstein's) mathematical philosophy[3] fascinating. She pinpoints how and why Turing viewed the philosophy of mathematics the way he did, how that view led him to the discovery of computation, and why people like Gödel and others found that to be such a profound philosophical breakthrough.

The Stanford Encyclopedia of Philosophy's entry on the philosophy of mathematics gives a good overview[4].

> The intuition about computable functions is that they are precisely the functions whose output for any given input depends only on a finite part of the input.

Once you know what computation is, you can define it in many ways. But why would that definition coincide with what we call computation? You can define foo to be such functions, but why are you calling them computable? And, BTW, even the very definition of finiteness may (I'm not sure about this point) require reliance on the physical, due to the multiple models of first-order logic and the problems with second-order logic.

> The really interesting thing about the Church Turing thesis is that there are so many different models of computation that capture precisely this idea

It is absolutely trivial to come up with a super-Turing mathematical model (e.g. just use reals, as in "pick the supremum of this bounded set of reals", or use a Turing machine with an infinite number of heads). All those other models are somehow based on capturing an intuitive notion of the human thought process, which is completely physical, and therefore, it is not surprising that they coincide. Turing was just the first who was able to give the notion a precise mathematical meaning. If you go back to the inception of those ideas, from Brouwer's intuitionism, Hilbert's formalism and even to the far older notion of the algorithm, you see that they're all tied to the capabilities of a physical human mind.
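To make the "just use reals" remark concrete, here is a toy sketch (my illustration, with made-up oracle digits): a machine allowed to consult an infinite real whose n-th binary digit records whether program n halts would decide the halting problem by a single lookup. The super-Turing power lies entirely in the oracle, which no Turing machine can compute.

```python
def halts_with_oracle(program_index, halting_oracle):
    """Decide halting by consulting an oracle: a (notionally infinite)
    bit sequence whose n-th digit says whether program n halts.
    The lookup is trivial; producing the oracle is the uncomputable part."""
    return halting_oracle[program_index] == 1

# Toy "oracle": a finite, made-up prefix of the hypothetical halting real.
oracle = [1, 0, 1, 1, 0]
print(halts_with_oracle(2, oracle))  # True
```

The point of the example: adding an uncomputable real as a resource trivially yields a super-Turing model, which is why the mere existence of many equivalent models is less interesting than the fact that they all converge on the same intuitive notion.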

It's true that not all of those ideas actually mention physics because not all thinkers necessarily believed the mind to be completely physical, but they're all based on the idea of a limited mind, which I think we can safely call physical.

[1]: http://www.hup.harvard.edu/catalog.php?isbn=9780674324497

[2]: https://www.reddit.com/r/programming/comments/5k1v04/is_math...

[3]: https://mdetlefsen.nd.edu/assets/201037/jf.turing.pdf

[4]: https://plato.stanford.edu/entries/philosophy-mathematics/


P.S.

I also strongly recommend the excellent 1988 paper by Robin Gandy, The Confluence of Ideas in 1936, published in The Universal Turing Machine: A Half-Century Survey (ed. Rolf Herken), 1995[1] (pp. 51-102).

Gandy gives a full historical background of who knew what in 1936, what the general state of knowledge was at the time, and the reception of both Church's and Turing's papers.

[1]: https://www.amazon.com/Universal-Turing-Machine-Half-Century...


The lambda calculus preceded Turing machines, in fact Turing worked on the lambda calculus before he published his work on Turing machines. Where on Earth did you pick up this rubbish?

Oh, I picked it up from Church's original 1936 paper. Those were direct quotes. Did you actually read Church's An Unsolvable Problem of Elementary Number Theory and Turing's On Computable Numbers? If so, could you point out Church's rigorous definition of computation?

BTW, while the lambda calculus does indeed precede Turing's work, its recognition as a universal expression of what is calculable happened at exactly the same time. But regardless, Church does not define the notion of computation, while Turing does.

But you don't have to take my word for it. You can find such rubbish from Gandy (who called Turing's work a "paradigm of philosophical analysis"), Davis, Gödel and many others. Gödel was enthusiastic in his praise, writing that Turing gave "the precise and unquestionably adequate definition of the general concept of formal system" (Floyd, https://mdetlefsen.nd.edu/assets/201037/jf.turing.pdf)

The belief that Church rigorously defined computation when he solved the Entscheidungsproblem using the lambda calculus, rather than merely making an imprecise conjecture, is a common mistake and historical revisionism. If anyone came close to defining computation before Turing, it would have been Brouwer.

Also, I don't know that Turing worked on the lambda calculus before he wrote On Computable Numbers. After completing the manuscript, he saw Church's new paper and addressed it in an appendix. Do you have any reference for that?


> The belief that Church rigorously defined computation when he solved the Entscheidungsproblem using the lambda calculus, rather than merely making an imprecise conjecture, is a common mistake and historical revisionism.

Ok I'm out, you do you man.


You're free to believe what you want, or you can go read the abundant material, starting with Church's paper. If you find a thorough treatment of the concept of computation, you can write a paper about it. Church himself explicitly writes that he's unable to give a precise definition. Church was the first (scooping Turing by a few months) to claim that a certain formalism is what we now call Turing complete, i.e., universal with respect to the "vague intuitive notion" of the algorithm, but he was unable to provide a satisfactory justification for his claim (he made a circular argument and then gave up, writing the sentence I quoted above).

Turing, on the other hand, gives a fundamental treatment of what computation is, independent of any particular formalism (in his review of Turing's paper, Church described it as explaining how to construct "arbitrary" computing systems, and Gödel said, in the quote I've given, that Turing was able to give a general, universal, treatment of what a formal system is).

As for your claim that Turing had worked with the lambda calculus prior to writing On Computable Numbers at age 24: I have seen no mention of this, but would be happy to see a reference.

