However, if that strategy’s implementation requires actual infinities – e.g. each prisoner having infinite memory – then that is exactly why you find it intuitively objectionable.

It is useful here to think of computer algorithms and not just math. While mathematical arguments have no problem positing infinitely many actors, the next question is whether each actor can have infinite memory.

In mathematics, infinity can be thought of as a property of a *set*. It can also be thought of as some limit of an infinite sequence of operations on sets, which is a statement that is simultaneously true about each member of that sequence.

This is useful because it ties constructions we observe in the real world to patterns that approximate and converge to the limit of this infinite sequence. And then the question is how the computational complexity grows.

So in your example here, no FINITE set of prisoners can coordinate such a strategy. There is no “approaching a limit” – the thing only starts working with an infinite set of prisoners, each of whom has infinite memory, etc. And that is why your intuition’s alarm bells go off 🙂
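To make the “no finite approximation” point concrete, here is a toy sketch of my own (the finite model and names are illustrative assumptions, not part of the original puzzle). For length-N bit sequences, “differing in finitely many places” relates every pair of sequences, so the whole space collapses into a single equivalence class, and a strategy based on one chosen representative guarantees nothing:

```python
# Toy finite analogue of the hat strategy. With length-N sequences,
# "differs in only finitely many places" holds for EVERY pair, so
# there is exactly one equivalence class and one representative.

N = 10
representative = [0] * N          # the single "chosen" representative

def guesses(rep, n):
    # Each prisoner guesses the representative's value at their index.
    return rep[:n]

actual = [1] * N                  # an adversarial hat assignment
wrong = sum(1 for g, a in zip(guesses(representative, N), actual)
            if g != a)
print(wrong)                      # 10: every prisoner is wrong
```

Making N larger only makes this worse: the number of wrong guesses can grow without bound, so there is no sequence of finite instances converging to the infinite solution.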

But it is even more than that. Your construction requires each prisoner to *use the axiom of choice in order to take an action based on the NAME of the chosen member*, which is then used to demonstrate the existence of a sequence of *actions* satisfying a certain property. However, when the axiom of choice is used normally, it is not used to actually NAME the chosen element, but merely to work with it like a black box. By NAME, I mean an id that distinguishes it from all other elements, and lets you pick it out and examine the properties that set it apart from every other element in that set.

In other words: sure, you can assume that the chosen “representative” sequence has the same property as any other in the equivalence class – namely, that all but finitely many terms are equal. BUT the part where you “cheat” is having the prisoner “find out” more than that about the representative sequence, in particular its initial values up to an arbitrary depth.

1. “In SIA the differential calculus is reduced to simple algebra.”

Nonstandard Analysis does the same, but in fact, as has been pointed out, the question of whether this is an “advantage” depends upon whether one WANTS students to learn the limit concept or not. This is essentially a matter of taste, not a matter of rigor.
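For readers unfamiliar with the computation being alluded to, the standard SIA derivation of the derivative of x² runs as follows (ε ranging over the nilsquare infinitesimals):

```latex
% In SIA, for nilsquare \varepsilon (that is, \varepsilon^2 = 0):
(x + \varepsilon)^2 - x^2 \;=\; 2x\varepsilon + \varepsilon^2 \;=\; 2x\varepsilon .
% Microcancellation of \varepsilon then yields f'(x) = 2x,
% with no limit concept invoked anywhere.
```

Whether avoiding the limit here counts as an advantage is, as noted above, a matter of taste.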

2. “SIA does not lead to contradictions such as the Banach-Tarski paradox as does Limit theory/NSA.”

The Banach-Tarski Paradox is merely a paradox. It is NOT a contradiction. The fact that some people call it a contradiction is not evidence that it is a contradiction. It seems counterintuitive, but ONLY UNTIL one realizes that the paradoxical decomposition of the unit ball that allows recombination via rigid transformations into a ball of radius greater than one REQUIRES that the pieces into which it is decomposed be non-measurable. Thus there is no formal contradiction in this result at all.

3. “The method of microadditivity found in physical derivations is a natural application of SIA but not of Limit theory/NSA.”

I am not familiar with “microadditivity” found in “physical derivations”, so I will look it up before completing my comment. First, though, I doubt that “physical derivations” makes sense. If I cannot find a definition of that term, I’m likely to reject this as a non-issue, based upon a vagueness in your terminology.

4. “The ‘taking the standard part’ fraud of NSA is unnecessary in SIA because the infinitesimals cancel each other out.”

I do not see how “the infinitesimals cancel each other out” in SIA. If all infinitesimals are real multiples of only one unit infinitesimal, then for any two infinitesimals dx and dy, the product dxdy is zero, I’ll grant you. However, if one’s SIA theory allows for nilcube infinitesimals (elements whose cube, but not necessarily whose square, is zero), then there will be some infinitesimals dx and other infinitesimals dy such that dxdy is nonzero. In this case, you need a “standard part map”, and referring to the standard part map in NSA as a “fraud” is disingenuous at best.
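To illustrate the point about products of infinitesimals, here is a small toy ring of my own devising (not Bell’s formalism): two generators dx and dy with dx² = dy² = 0 but dx·dy ≠ 0, so that dx + dy is nilcube without being nilsquare.

```python
# Toy model (my own construction, for illustration only) of a ring
# with two nilsquare generators whose product does NOT vanish.
from itertools import product

class Nilsquare:
    """Elements are sums coeff * dx**i * dy**j with i, j in {0, 1}."""
    def __init__(self, coeffs=None):
        coeffs = coeffs or {}
        self.c = {m: coeffs.get(m, 0.0) for m in product((0, 1), repeat=2)}

    def __add__(self, other):
        return Nilsquare({m: self.c[m] + other.c[m] for m in self.c})

    def __mul__(self, other):
        out = {m: 0.0 for m in self.c}
        for (i, j), a in self.c.items():
            for (k, l), b in other.c.items():
                if i + k <= 1 and j + l <= 1:   # impose dx**2 = dy**2 = 0
                    out[(i + k, j + l)] += a * b
        return Nilsquare(out)

    def __eq__(self, other):
        return self.c == other.c

dx, dy = Nilsquare({(1, 0): 1.0}), Nilsquare({(0, 1): 1.0})
zero = Nilsquare()

print(dx * dx == zero)        # True: each generator is nilsquare
print(dx * dy == zero)        # False: the cross term survives
s = dx + dy
print(s * s == zero)          # False: s**2 = 2*dx*dy
print(s * s * s == zero)      # True: s is nilcube, not nilsquare
```

The surviving dx·dy term is exactly why a one-infinitesimal picture, in which all such products vanish, does not tell the whole story.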

Your last paragraph contains several claims:

a. “The best book on SIA is A Primer of Infinitesimal Analysis by J L Bell.”

I’ve seen it and read parts of it, but I’m not sure I could say it’s the best book on SIA. Can you list some others, so that interested readers can look them up and compare for themselves?

b. “Many of the criticisms of your approach given above are addressed in his book.”

Does John L. Bell use the term “fraud” in his book, in his discussion of NSA?

c. “I personally found it useful to compare Bell’s book with The Foundations of Mathematics by Stewart and Tall. In their book Stewart and Tall use the quest to explain calculus to justify the Completeness axiom and classical Real analysis – a justification which falls apart with SIA.”

I would have to look over the book by Stewart and Tall myself, but it seems to me to be a matter of the historical record that the quest to explain or justify analysis (for which the common term is “calculus”, even though the calculus to which that term is usually applied is not the only calculus) led to many discoveries, such as the Completeness Axiom, and to many of the developments in classical Real Analysis. I’m not sure why this is a criticism…

d. “They also use the Axiom of Choice in their coverage of Cardinal numbers; but this axiom also implies the LEM thereby disallowing nilsquare (that is, genuine) infinitesimals.”

It is patently false that the Axiom of Choice disallows nilsquare infinitesimals. One can easily embed the ring of real numbers into a ring that includes nilsquare infinitesimals. Then you can use those nilsquare infinitesimals to your heart’s content, never running afoul of the Axiom of Choice or the Law of the Excluded Middle. Now, I have never before heard the claim that AC implies LEM, so please demonstrate, or give a reference. I want to see the proof that you claim exists. I expect that one reason I have never heard such a claim before is that it is rather strange: the LEM is a Law of one’s Logic System, while the Axiom of Choice is an Axiom of ZFC, in the language of Set Theory. One can use Intuitionistic Logic (or various other Logics without the LEM) and write down the axioms of ZFC. One can then study the Intuitionistic consequences of the Axiom of Choice.
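The embedding claimed above can be sketched in a few lines (a toy model of my own; the class name is an illustrative assumption): the dual numbers R[ε]/(ε² = 0), built with ordinary classical ring arithmetic and no appeal to Choice or the LEM.

```python
# Dual numbers: a + b*eps with eps**2 = 0. The map x -> Dual(x)
# embeds the reals; Dual(0, 1) is a genuine nilsquare infinitesimal.

class Dual:
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b       # represents a + b*eps

    def __add__(self, o):
        return Dual(self.a + o.a, self.b + o.b)

    def __mul__(self, o):
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 = 0
        return Dual(self.a * o.a, self.a * o.b + self.b * o.a)

eps = Dual(0.0, 1.0)
x = Dual(3.0)

square = (x + eps) * (x + eps)      # equals x**2 + 2x*eps exactly
print(square.a, square.b)           # 9.0 6.0
```

The ε-coefficient of (x + ε)² is 2x, i.e. the derivative of x² at x = 3 falls out of plain ring arithmetic.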

e. “Consequently, you can believe in infinite numbers or infinitesimals but not both, or at least not both at the same time.”

Nonsense. NSA does exactly that, and the approach of NSA can be carried out in an Intuitionistic framework as well.

f. “This may explain Cantor’s objection to infinitesimals!”

It seems to me a bit precarious to claim to know what was going on in Cantor’s mind unless he told us what it was, so I won’t venture a detailed comment on this here. It is a matter of record, however, that people such as Bishop Berkeley objected to infinitesimals, nilsquare ones included, on the grounds that the theory of analysis using infinitesimals as presented up until then was ACTUALLY inconsistent. If I had to guess, I would expect that Cantor was trying to avoid including blatant inconsistencies in his work.

g. “It is interesting to note that Fuzzy Logic also depends on the repudiation of the LEM.”

Fuzzy Logic can be presented as a special case of classical logic, and in fact many presentations of Zadeh’s version of Fuzzy Logic in applied settings (a version which arrived late, historically, in Logic and Philosophy) are based upon very sophomoric views of Logic.

One can observe in the Logic and Model Theory community and literature of the early twentieth century that “fuzziness” of truth values was already prevalent. The notion of a logic system whose truth values are taken in a Boolean Algebra appears, and plays a reasonably significant role; you can see some of this in the text by John L. Bell and Alan Slomson called Models and Ultraproducts: An Introduction. This “Boolean fuzzification” of Logic is easily seen as a kind of overarching philosophy behind the development of notions like the Lindenbaum-Tarski Algebra.

Also in the early twentieth century, Quantum Logics were developed without the help of Lotfi Zadeh; those are Logics whose truth values are taken in (often non-distributive) lattices, and so they often do not satisfy the LEM. Long before the twentieth century, philosophers discussed “modes”, and the resulting reasoning systems (often devised for argumentation in matters close to law and politics) are now called Modal Logics; they form another family of “multi-valued logics”. “Fuzzy Logic”, depending upon what you mean by it, has been around for a very long time.

By the way, it is known that using the interval [0,1] for your set of truth values, with the most common fuzzy interpretations of conjunction, disjunction, and negation as in Lotfi Zadeh’s version of Fuzzy Logic, is overkill, because a three-element logic will suffice. (See http://www.worldscientific.com/doi/abs/10.1142/S021848859700021)
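The relationship to the LEM is easy to exhibit with Zadeh’s standard connectives (a minimal sketch; the function names are mine):

```python
# Zadeh's connectives on truth values in [0, 1]:
# NOT x = 1 - x, OR = max. Then "p or not p" need not equal 1.

def NOT(x):
    return 1.0 - x

def OR(x, y):
    return max(x, y)

p = 0.5
lem = OR(p, NOT(p))
print(lem)   # 0.5, not 1: excluded middle fails at intermediate values
```

At the classical values 0 and 1 the connectives agree with ordinary Boolean logic, which is why the failure only shows up for intermediate truth values.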

h. “Calculus and Fuzzy Logic are perhaps the two branches of mathematics which are the most useful in modelling reality.”

You’re leaving out combinatorics, probability, finite fields, finite groups, graphs, etc… Why? You might find it useful to look into the history of these subjects, which are not entirely bound up in “calculus”, and which use classical logic.

i. “Perhaps Cardinals, ‘Real’ analysis, and the LEM should be banished to the fringes of philosophy.”

No.