Nature and the universe are all about continuous quantities; integral quantities and whole numbers are an abstraction. At the micro level this is less true -- elementary particles specifically are a (mostly) discrete phenomenon -- but representing the state of even a very simple system involves continuous quantities.
But the Cantor vision of the real numbers is just wrong and completely unphysical. The idea of arbitrary precision is intrinsically broken in physical reality. Instead I am of the opinion that computation is the relevant process in the physical universe, so approximations to continuous quantities are where the "Eternal Nature" line lies, and the abstraction of the continuum is just that -- an abstraction of the idea of having perfect knowledge of the state of anything in the universe.
They're unphysical, and yet the very physical human mind can work with them just fine. They're a perfectly logical construction from perfectly reasonable axioms. There are lots of objects in math which aren't physically realizable. Plato would have said that those sorts of objects are more real than anything which actually exists in "reality".
On the one hand, this article is talking about the hierarchy of "physicality" of various mathematical concepts, and they put Cantor's real numbers at the floor. I disagree with that specifically; two quantities are interestingly "unequal" only at the precision where an underlying process can distinguish them. Turing tells us that any underlying process must represent a computation, and that the power of computation is a law of the underlying reality of the universe (this is my view of the Universal Church-Turing Thesis, not necessarily the generally accepted variant).
The other question is whether Cantor's conception of infinity is a useful one in mathematics. Here I think the answer is no. It leads to rabbit holes that are just uninteresting: trying to distinguish infinities (the continuum hypothesis) and arriving at counterintuitive and useless results. Fun to play with, like writing programs that can invoke a HaltingFunction oracle, but it does not tell us anything that we can map back to reality. For example, the idea that there are the same number of integers as even integers is a stupid one that in the end does not lead anywhere useful.
A skeptic in what way? He said a lot.
Can it? We can only work with things we can name and the real numbers we can name are an infinitesimal fraction of the real numbers. (The nameable reals and sets of reals have the same cardinality as integers while the rest are a higher cardinality.)
I didn't mean to suggest that the reals are the floor of reality, rather that they are more floorlike than the integers.
> The other question is whether Cantor's conception of infinity is a useful one in mathematics. Here I think the answer is no.
Tools are created by transforming nature into something useful to humans. Is Cantor's conception of infinity more natural? I can't really say, but the uselessness and confusion seems more like nature than technology.
it leads to the idea that measuring 2 sets via a bijection is a better idea than measuring via containment
Nah, you're likely thinking of the rationals, which are basically just two integers in a halloween costume. Ooh a third, big deal. The overwhelming majority of the reals are completely batshit and you're not working with them "just fine" except in some very hand wavy sense.
Integers come into existence long before god - as the only presumption required is a difference between one thing and another (or nothing). The integers also create infinite gaps. The primes.
So no - I do not think reals are closer to the divine. They require we import infinity twice to be defined, and I'm undecided on whether our reality has unbounded 'precision' like that - or if 'just' an infinite number of discrete units.
you said a lot and i probably don't understand, but doesn't pi contradict this? pi definitely exists in physical reality, wherever there is a circle, and seems to have a never-ending supply of decimal digits.
Is there a circle in physical reality? Or only approximate circles, or things we model as circles?
In any case, a believer in computation as reality would say that any digit of π has the potential to exist, as the result of a definite computation, but that the entirety does not actually exist apart from the process used to compute it.
One could argue that nature always deals in discrete quantities and we have models that accurately predict these quantities. Then we use math that humans clearly created (limits) to produce similar models, except they imagine continuous inputs.
(And the discrepancy might not be in the physical continuum being simpler than the mathematical reals, as some here postulate, but rather in the continuum being far stranger than the reals, in ways we may never observe nor comprehend.)
The standard construction for the real numbers is to start with the rationals and "fill in all the holes". So why even bother with filling in the holes and instead just declare God created the rationals?
The universe requires infinite divisibility, i.e. a dense set. It doesn't require infinite precision, i.e. a complete set. Our equations for the universe require a complete set, but that would be confusing the map with the territory. There is no physical evidence for uncountable infinities, those are purely in the imagination of man.
In fact, if you are to argue that we cannot know a “raw” real number, I would point out that we can’t know a natural number either! Take 2: you can picture two apples, you can imagine second place, you can visualize its decimal representation in Arabic numerals, you can tell me all its arithmetical properties, you can write down its construction as a set in ZFC set theory… but can you really know the number – not a representation of the number, not its properties, but the number itself? Of course not: mathematical objects are their properties and nothing more. It doesn’t even make sense to consider the idea of a “raw” object.
Math is math, if you start with ZFC axioms you get uncountable infinities.
Maybe you don't start with those axioms. But that has nothing to do with truth, it's just a different mathematical setting.
There have been attempts to create discrete models of time and space, but nothing useful has resulted from those attempts.
Most quantities encountered in nature include some dependency on work/energy, time or space, so nature deals mostly in continuous quantities. Or, more precisely, the models that we can use to predict what happens in nature are still based mostly on continuous quantities, even though about a century and a half has passed since the discreteness of matter and electricity was confirmed.
What does it mean to "exist in physical reality"?
If you mean there are objects that have physical characteristics that involve pi to infinite precision, I think the truth is we have not a darn clue. Take a circle: that would have to be a perfect circle. Even our most accurate and precise physical theories only measure and predict things to 10s of decimal places. We do not possess the technology to verify that it's a real, true circle to infinite precision, and there are many reasons to think that such a measurement would be impossible.
Citation needed.
Especially since there are well-established math proofs of irrational numbers.
I am not sure what you are arguing here. We’ve been teaching this to all undergraduate mathematicians for the last century; are you trying to make the point that this part of the curriculum is unnecessary, or that mathematics has not contributed to the wellbeing of society in the last hundred years? Both of these seem like rather difficult positions to defend.
This is a Jewish and Christian conception of God. How can this be true when so many things that give us comfort in the natural world: fresh fruit, shade trees, sunshine and warm sand between our toes, etc., were not created by man?
Even in mathematics itself: how improbable, how ludicrous, how miraculous is it that the 3rd, 4th, and 5th natural numbers -- numbers you could discover by looking at your own hands -- have the amazing property of demonstrating the Pythagorean theorem?
The Islamic ideal of God (Allah) is so much more balanced. God created both the integers AND the reals. He created everything, some things for our comfort and rest, some things to drive us close to madness, and a lot of stuff in between. Peel back enough layers of causality and all of creation has the stamp of the divine.
This is a problem of modeling optimization. The models based on uncountable "real" numbers are logically consistent and simple to use, so they are adequate for predicting what happens in natural or artificial systems.
All attempts to avoid the uncountable infinities produce models that are both more complicated and also incomplete, as they do not cover all the applications of traditional infinitesimal calculus, topology and geometry.
Unless someone succeeds in presenting a theory that avoids uncountable infinities while being as simple as the classic theory and applicable to all the former uses, I see such attempts as interesting, but totally impractical.
Caveat: former Catholic; 50+ years of fervent atheism.
I'm so used to thinking this way that I don't understand what all the fuss is about, mathematical objects being "real". Ideas are real but they're not real in the way that rocks are.
Whenever there's a mysterious pattern in nature, people have felt the need to assert that some immaterial "thing" makes it so. But this just creates another mystery: what is the relationship between the material and the immaterial realm? What governs that? (Calling one or more of the immaterial entities "God" doesn't really make it any less mysterious.)
If we add entities to our model of reality to answer questions and all it does is create more and more esoteric questions, we should take some advice from Occam's Shovel: when you're in a hole, stop digging.
ps. Various numerology phenomena have a similar vibe, and no wonder so many people who go off the deep end tend to get trapped by them. Maybe I will be one of them as I become old and senile :-D
So yes, generally not starting with ZFC.
I can't speak to "truth" in that sense. The skepticism here is skepticism of the utility of the ideas stemming from Cantor's Paradise. It ends up in a very naval-gazing place where you prove obviously false things (like Banach-Tarski) from the axioms but have no way to map these wildly non-constructive ideas back into the real world. Or where you construct a version of the reals where the reals that we can produce via any computation is a set of measure 0 in the reals.
Well you can be skeptical of anything and everything, and I would argue should be.
Addressing your issue directly, the Axiom of Choice is actively debated: https://en.wikipedia.org/wiki/Axiom_of_choice#Criticism_and_...
I understand the construction and the argument, but personally I find that the diagonalization argument should be criticized for using finite means to prove statements about infinities.
You must first accept that an infinity can have any enumeration before proving its enumerations lack the specified enumeration you have constructed.
https://en.wikipedia.org/wiki/Cantor%27s_diagonal_argument
> Math is math, if you start with ZFC axioms
This always bothers me. "Math is math" speaks little to the "truth" of a statement. Math is not so much objective as it is rigorous about defining its subjectivities.
My point was that it is possible that all values in our universe are rational, and it wouldn't be possible for us to tell the difference between this and a universe that has irrational numbers. This fact feels pretty cursed, so I wanted to point it out.
The idea that a quantity like 1/3 is meaningfully different than 333/1000 or 3333333/10000000 is not really that interesting on its own; only in the course of a physical process (a computation) would these quantities be interestingly different, and then only in the sense of the degree of approximation that is required for the computation.
The real numbers in the intuitionist sense are the ground truth here in my opinion; the Cantorian real numbers are busted, and the rationals are too abstract.
The continuum is the reality that we have to hold to. Not the continuum in the Cantor sense, but in the intuitionist or constructivist sense: continuously varying quantities that can be approximated as necessary.
The axiom of choice is not required to prove Cantor’s theorem, that any set has strictly smaller cardinality than its powerset.
Actually, I can recount the proof here: Suppose there is an injection f: Powerset(A) ↪ A from the powerset of a set A to the set A. Now consider the set S = {x ∈ A | ∃ s ⊆ A, f(s) = x and x ∉ s}, i.e. the subset of A that is both mapped to by f and not included in the set that maps to it. We know that f(S) ∉ S: suppose f(S) ∈ S; then there would exist an s ⊆ A such that f(s) = f(S) and f(S) ∉ s; by injectivity, of course s = S and therefore f(S) ∉ S, which contradicts our premise. However, we can now easily prove that there exists an s ⊆ A satisfying f(s) = f(S) and f(S) ∉ s (of course, by setting s = S), thereby showing that f(S) ∈ S, a contradiction.
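For readability, the same argument in display form (nothing new, just the proof above restated in LaTeX):

    % The diagonal set and the two steps of the contradiction, restated.
    \[
      S = \{\, x \in A \mid \exists\, s \subseteq A :\ f(s) = x \ \wedge\ x \notin s \,\}
    \]
    \[
      \text{Step 1 } (f(S) \notin S):\quad
      f(S) \in S \;\Rightarrow\; \exists\, s :\ f(s) = f(S) \wedge f(S) \notin s
      \;\Rightarrow\; s = S \ \text{(injectivity)} \;\Rightarrow\; f(S) \notin S .
    \]
    \[
      \text{Step 2: taking } s = S \text{ gives } f(s) = f(S) \wedge f(S) \notin s,
      \ \text{so } f(S) \in S \ \text{by definition of } S\text{: contradiction.}
    \]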
The real numbers exist and are approximable, either by rationals or by decimal expansion. The idea of approximability and computability are the critical things, not the specific representation.
...
> The Jewish [Christian] ideal of God (YHVH) is so much more balanced.
There's enough bigotry out there. Let's not make assumptions about people's beliefs.
I think the conceit is supposed to be that analysis—and therefore the reals—is the “language of nature” more so than that we can actually find the reals using scientific instruments.
To illustrate the point, using the rationals is just one way of constructing the reals. Try arguing that numbers with a finite decimal representation are the divine language of nature, for example.
Plus, maybe a hot take, but really I think there's nothing natural about the rationals. Try using them for anything practical. If we used base-60 more, instead of base-10, we could probably forget about them entirely.
[0] Jesus being human changes the calculus quite a lot, of course, as elaborated in e.g. Hebrews 4:14–16. God, who was fully transcendent, became human, hence why Jesus is also called Immanuel/Emmanuel (lit. “God with us”) in the Bible.
Egg cartons might sound contrived but the reals don't necessarily make sense without reference to rulers, scales, etc. And in fact the defining completeness / Dedekind cut conditions for the reals are necessary for doing calculus but any physical interpretation is both pretty abstract and probably false in reality.
Let me add that we have no clue how to do a measurement that doesn't involve a photon somewhere, which means that it's pure science fiction to think of infinite precision for anything small enough to be disturbed by a low-energy photon.
Any rotating/spinning black hole will no longer be a perfect sphere.
Is sqrt(2) computable?
Is BB(777) computable?
Is [the integer that happens to be equal to BB(777), not that I can prove it, written out in normal decimal notation] computable?
So yes sqrt(2) is computable.
Every BB(n) is computable, since every natural number can be computed. It's the BB function itself that is not computable in general, not the specific output of that function for a given input.
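To make "sqrt(2) is computable" concrete, here is a minimal sketch (mine, not from the thread): one finite program that returns a rational approximation for any requested precision.

    # Sketch: sqrt(2) is computable because a single finite program can produce
    # a rational within 10**-digits of it, for any requested number of digits.
    from fractions import Fraction
    from math import isqrt

    def sqrt2_approx(digits: int) -> Fraction:
        """Return a rational q with 0 <= sqrt(2) - q < 10**-digits."""
        scale = 10 ** digits
        # isqrt(2 * scale**2) = floor(sqrt(2) * scale), so the truncation error is < 1/scale.
        return Fraction(isqrt(2 * scale * scale), scale)

    print(sqrt2_approx(30))  # numerator/10**30, agreeing with sqrt(2) to 30 decimal places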
the first 2 naturals form an integer.
that integer and a 3rd natural constitute a real (but this 3rd natural best be bigger than zero, else we're in trouble)
what I choose to focus on, after observing the "unphysical" nature of numbers, is the sense of natural opposition (bordering on alternation) between "mathematically true" and "physically true". both are claiming to be really real Reality.
in the mathematical realm, finite things are "impossible", they become "zero", negligible in the presence of infinities. it's impossible for the primes to be finite (by contradiction). it's impossible for things (numbers or functions of mathematical objects) to be finite.
but in the physical reality, it's the "infinite things" which become impossible.
the "decimal point" (i.e. scientific notation i.e. positional numeral systems) is truly THE wonder of the world. for some reason I want something better than such a system... so I'm still learning about categories
I'm under the impression that all our theories of time and space (and thus work) break down at the scale of 1 plank unit and smaller. Which isn't proof that they aren't continuous, but I don't see how you could assert that they are either.
then maths is really THE absolute best description available of language and nature.
but non-mathematical minds will simply wonder and be amazed at how "maths explains the world", a clear indication that somebody is not thinking like a mathematician.
> Whenever there's a mysterious pattern in nature, people have felt the need to assert that some immaterial "thing" makes it so. But this just creates another mystery: what is the relationship between the material and the immaterial realm?
the relationship between the material and the immaterial pattern beholden by some mind can only be governed by the brain (hardware) wherein said mind stores its knowledge. is that conscious agency "God"? the answer depends on your personally held theological beliefs. I call that agent "me" and understand that "me" is variable, replaceable by "you" or "them" or whomever...
oh, and I love (this kind of figurative) digging. but I use my hands no shovels.
yes, I also enjoy trying to answer this question.
what does such a structure even mean? how could it be that simply defining numbers, observing addition, and generalizing it away into multiplication would yield this natural structure?
It all begins with zero. the predecessor of One, the best known number.
zero can be assumed by anyone. the surprise is how all zeros are the same zero. (by uniqueness of emptyset; but as I hope you can see, I'm a crank. a nutjob. I'll stop
> A busy beaver hunter who goes by Racheline has shown that the question of whether Antihydra halts is closely related to a famous unsolved problem in mathematics called the Collatz conjecture. Since then, the team has discovered many other six-rule machines with similar characteristics. Slaying the Antihydra and its brethren will require conceptual breakthroughs in pure mathematics.
https://www.quantamagazine.org/busy-beaver-hunters-reach-num...
I've solved multiple continuous-value problems by discretizing, applying combinatorial techniques, and then taking the limit of the result - you of course get the same result as if you had simply used regular integration/differentiation, and it's a lot easier to use calculus than combinatorics.
But the point is the "rational", discretized approach will get you arbitrarily close to the answer.
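As a toy illustration of that discretize-then-take-the-limit workflow (my own example, not one of the problems referred to above):

    # Discretize the area under y = x^2 on [0, 1] using only rationals,
    # and watch the Riemann sums close in on the calculus answer 1/3.
    from fractions import Fraction

    def riemann_sum(n: int) -> Fraction:
        """Right-endpoint Riemann sum of x^2 on [0, 1] with n equal steps, computed exactly."""
        h = Fraction(1, n)
        return sum((Fraction(k, n) ** 2) * h for k in range(1, n + 1))

    for n in (10, 100, 1000):
        gap = riemann_sum(n) - Fraction(1, 3)
        print(n, float(gap))  # the gap shrinks like 1/(2n)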
It's why many analysis textbooks define a (given) real number as "a converging sequence of rational numbers" (before even defining what a limit is).
The logic is circular, simply because mathematicians are the ones who invented irrationals. Of course they have proofs on them. They also have proofs on lots of things that don't exist in this universe.
And as I pointed out elsewhere, many analysis textbooks define a real number to be "a (converging) sequence of rationals". The notion of convergence is defined before reals even enter into the picture, and a real number is merely the identifier for a given converging sequence of rationals.
But given the answer, I suppose you could write a program that just returns it. This seems to hinge on the definition of “computable.” It’s an integer, so that fits the definition of a computable number.
My mistake.
So as you noticed, it only makes sense to talk about whether a function is computable, we can't meaningfully talk of computable numbers.
So, here's to math keeping our imagination limber and extending our ideas of what's real.
Then HH, the function itself, is not computable, but the numbers 0 and 1, which are the only two outputs of HH, are computable.
Integers themselves are always computable, even if they are the output of functions that are themselves uncomputable.
Just models, useful but flawed abstractions.
At least at this stage I think it relates to whether you believe "the universe"/reality is a sort of momentary collection of the currently-existing things. Vs seeing reality as the set of all things that might obstruct "me" or any entity from doing something.
To me, even if the wall is invisible it's still a wall
It’s fairly easy to go from integers to many subsets of the reals (rationals are straightforward, constructible numbers not too hard, algebraic numbers more of a challenge), but the idea that the reals are, well real, depends on a continuity of spacetime that we can’t prove exists.
As a young math researcher, my mentor definitely did not believe that Math was the absolute descriptor of the universe.
You can definitely imagine a scenario where the world does not operate in a perfectly mathematically correct way though Math still exists - as an abstract, separate entity.
You can do this such that every time you recognize a new quirk in the world, you can invent some new math/logical framework to match/approximate the current understanding. I don't know if this is the reality of this world, but when you look at things like complexity theory you have to wonder "okay... maybe we designed a useful system rather than discovering a true law of reality"
The real numbers are a useful mathematical trick that makes it possible to prove results in calculus. What you surrender in return for being able to prove statements is the ability to compute expressions. This may be a worthwhile trade-off for physicists, but for the universe (which does many computations and zero proofs) it's quite a burden.
Perhaps our theories of time and space would break down at some extremely small scale, but for now there is no evidence about this and nobody has any idea which that scale may be.
In the 19th century, both George Johnstone Stoney and Max Planck made the same mistake. Each of them computed, for the first time, some universal constants: Stoney computed the elementary electric charge in 1874, and Planck computed the two constants that are now named "Boltzmann's constant" and "Planck's constant", in several variants, in 1899, 1900 and 1901. (Ludwig Boltzmann had predicted the existence of the constant that bears his name, but he never used it for anything and he did not compute its value.)
Both of them realized that new universal constants allow the use of additional natural units in the system of fundamental units of measurement, and they attempted to exploit their findings for this purpose.
However, both bet on the wrong horse. Before them, James Clerk Maxwell had proposed two alternatives for choosing a good unit of mass. The first was to choose as the unit of mass the mass of some molecule. The second was to give an exact value to the Newtonian constant of gravity. The first Maxwell proposal was good: when analyzed at the 2018 revision of the SI, it was only very slightly worse than the final choice (which preferred to use two properties of photons, instead of choosing an arbitrary molecule in addition to using one property of photons).
The second Maxwell proposal was extremely bad, though to be fair it was difficult for Maxwell to predict that during the next century the precision of measuring many quantities would increase by many orders of magnitude, while the precision of measuring the Newtonian constant of gravity would improve only barely in comparison with the others.
Both Stoney and Planck chose to base their proposals for systems of fundamental units on the second Maxwell variant, and this mistake made their systems completely impractical. The value of Newton's constant has a huge uncertainty in comparison with the other universal constants. Declaring its value to be exact does not make that uncertainty disappear; it merely moves the uncertainty into the values of almost all other physical quantities.
The consequence is that when using the systems of fundamental units of George Johnstone Stoney or of Max Planck, almost no absolute value of any quantity can be known accurately. Only the ratios between two quantities of the same kind, and velocities, can be known accurately.
Thus the Max Planck system of units is a historical curiosity that is irrelevant in practice. The right way to use Planck's constant in a system of units became possible only 60 years later, when the Josephson effect was predicted in 1962, and the SI was modified to use it only after another 60 years, in 2019.
The units of measurement that are chosen as fundamental have no bearing on the validity of physical laws at different scales. Even if the Planck units were practical, that would give no information about the structure of space and time. The definition of the Planck units is based on continuous models of time, space and forces.
Every now and then there are texts in the popular literature that mention the Planck units as if they had some special meaning. All such texts are based on hearsay, repeating claims from sources who have no idea how the Planck units were defined in 1899, or how systems of fundamental units of measurement are defined and what they mean. Apparently the only reason the Planck units have been picked for this purpose is that in this system the unit of length happens to be much smaller than an atom or its nucleus, so people imagine that if the current model of space breaks down at some scale, that scale might be this small.
But even then, the biggest black hole we think is possible, measured down to the Planck length, gives you a number with 50 digits. And the entire observable universe measured in Planck lengths is about 60 digits.
So how are you going to get a physical pi of even a hundred digits on the path toward arbitrary precision?
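A rough order-of-magnitude check (the constants below are approximate values I'm supplying, not from the comment):

    # Back-of-envelope: how many Planck lengths span the observable universe?
    planck_length = 1.616e-35        # metres (approximate)
    observable_universe = 8.8e26     # metres, approximate diameter
    ratio = observable_universe / planck_length
    print(f"{ratio:.1e}")            # ~5.4e61, i.e. a number of roughly 60-odd digits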
I don't think it's debated on the grounds of whether it's true or not.
And I was imprecise with language, but by saying "math is math" I meant that there are things that logically follow from the ZFC axioms. That is hard to debate or be skeptical of. The point I was driving was that it's strange to be skeptical of an axiom. You either accept it or not. Same as the parallel postulate in geometry, where you get flat geometry if you take it, and you get other geometries if you don't, like spherical or hyperbolic ones...
To give what I would consider to be a good counterargument, if one could produce an actual inconsistency with ZFC set theory that would be strong evidence that it is "wrong" to accept it.
Compare Plato with Aristotle. Plato held that all forms exist in some third realm, so numbers would be counted among them. Aristotle, however, held that forms exist in particular instantiations or in minds that abstracted them from reality. (Aquinas could be said to synthesize both views in the sense that forms exist in particulars and in minds, but also have their origin in God, thus making God a sort of third realm, in a way. Neo-Platonists would view the "mind of God" similarly.)
Now, in the Aristotelian view, numbers are quantities abstracted from concrete reality (indeed, quantity is one of the categories), but they are not substantial forms, as you will not see instances of numbers as substances in the world. They're abstractions of accidental forms. Furthermore, a form needn't be instantiated actually, but can exist potentially. This is how he resolves Zeno's paradoxes. You can divide a length an infinite number of times - or in a CS context, you can apply the successor function indefinitely - but only potentially; as a matter of actuality, you have not divided a length an infinite number of times.
So, for Aristotle, you have a finite plurality of things that are potentially infinitely divisible, or a finite series of actions that can be potentially infinitely repeated or whatever.
For a contemporary realist, Aristotelian treatment of math, James Franklin is worth checking out [0].
Well, there are the same number. So, uh, sorry?
Therefore using the Planck length for any purpose is meaningless.
For now, nobody can say anything about the value of a Schwarzschild radius in this range, because so far nobody has succeeded in creating a theory of gravity that is valid at these scales.
We are not even certain whether Einstein's theory of gravity is correct at galactic scales (due to the discrepancies not explained by "dark" things), much less whether it applies at elementary-particle scales.
The Heisenberg uncertainty relations must always be applied with extreme caution, because they are valid only in limited circumstances. As we do not know any physical system that could have dimensions comparable with the Planck length, we cannot say whether it might have any stationary states that could be characterized by the momentum-position Heisenberg uncertainty, or by any kind of momentum. (My personal opinion is that the so-called elementary particles, i.e. the leptons and the quarks, are not point-like, but have a spatial extension that explains their spin and the generations of particles with different masses, and their size is likely to be greater than the Planck length.)
So attempting to say anything about what happens at the Planck length, or at much greater or much smaller scales that are still far below what can be tested experimentally, is not productive, because it cannot reach any conclusion.
In any case, using "Planck length" is definitely wrong, because it gives the impression that there are things that can be said about a specific length value, while everything that has ever been said about the Planck length could be said about any length smaller than we can reach by experiments.
But honestly, the whole question is akin to asking how many angels can dance on the head of a pin.
On a separate subject, that site you linked does something strange with scrolling on Firefox mobile. Hey web devs - stop screwing around with things like scrolling! Browser devs implement scrolling in a consistent manner. You aren't making the user experience better with your silly JavaScript tricks!
Obviously this does not exclude the possibility that in the future some experiments using much higher energies per particle, allowing the testing of what happens at much smaller distances, might show evidence that there exists a discrete structure of time and space, like the one we know for matter.
However, that has not happened yet and there are no reasons to believe that it will happen soon. The theory of the existence of atoms is more than two millennia old; it was then abandoned for lack of evidence, revived at the beginning of the 19th century due to accumulated evidence from chemistry, and eventually confirmed beyond doubt in 1865, when Johann Josef Loschmidt became the first to count atoms and molecules, after determining their masses.
So the discreteness of matter had a very long history of accumulating evidence in favor of it.
Nothing similar applies to the discreteness of time and space, for which there has never been any kind of evidence. The only reason for the speculation about this is the analogy with matter and electricity, which had been believed to be continuous but were eventually discovered to be discrete.
Such an analogy should make us keep an open mind about the possibility of work, time and space being discrete, but we should not waste time speculating about this when there are huge problems in physics that do not have a solution yet. In modern physics there is a huge number of quantities that should be computable from theory, but in fact they cannot be computed and must be measured experimentally. Therefore the existing theories are clearly not good enough.
Computation can only use rationals, and of course can get arbitrarily close to an answer because they are dense in the reals.
However, the entire edifice of analysis rests on the completeness axiom of the reals. The extreme value theorem, for example, is equivalent to the completeness axiom; the useful properties of continuous functions break down without it; the fundamental theorem of calculus doesn't work without it; Etc. So if the maths used in your physics (the structure of the theory, not just the calculations you perform with it) relies on these things at all, you're relying on the reals for confidence that the maths is sound.
Now you could argue that we don't need mathematical rigour for physics, that real analysis is a preoccupation of mathematicians, while physicists should be fine with informal calculus. I'm not going to argue that point. I'm just pointing out what the real numbers bring to the table.
Here's Tim Gowers on the subject: https://www.dpmms.cam.ac.uk/~wtg10/reals.html
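To make concrete what completeness buys you (my example, not from the linked note): over the rationals alone, a bounded set need not have a least upper bound, and the extreme value theorem fails.

    % A bounded set of rationals with no rational least upper bound, and a continuous
    % function on a "closed" rational interval that attains no maximum.
    \[
      B = \{\, q \in \mathbb{Q} \mid q^2 < 2 \,\}
      \quad\text{is bounded above but has no least upper bound in } \mathbb{Q};
    \]
    \[
      f : \mathbb{Q} \cap [1,2] \to \mathbb{Q}, \qquad f(x) = \frac{1}{x^2 - 2},
      \quad\text{is continuous yet unbounded, so it attains no maximum.}
    \]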
Beware, it’s not always useful to work in complex numbers, you sometimes want to do something different for reals and complex numbers. The prime example here is complex analysis. Defining differentiation is based on limits, on the complex plane there are a lot more directions to approach a limit vs just two on the real line. This has some interesting implications. For example, any function differentiable on the complex plane is infinitely differentiable.
I don't quite get what the text is about.
entia non sunt multiplicanda praeter necessitatem.
Thou shalt not multiply entities beyond necessity.
https://youtu.be/GL77oOnrPzY?si=nllkY_E8WotARwUM
Also, Bell's Theorem implies non-locality or non-realism, which to me is another nail in the coffin of spacetime.
I’m not convinced that we could have our current universe without irrationals - wouldn’t things like electromagnetism and gravity work differently if forced to be quantized between rationals? Saying ‘meh it would be close enough’ might be correct but wouldn’t be enough to convince me a priori.
For AC and CH, the answer is provably “no” as these axioms have been shown to say nothing about the behavior of halting problems, which any question about the manipulation of symbols can be phrased in terms of (well, any specific question—more general cases move up the arithmetical hierarchy).
If it’s not reflective in this precise sense, then the derivation of, e.g., a set-theoretic ∃ in some instances has no effect on any prediction of known physics (i.e., we are aware of no method of falsification).
Chaitin has a great paper on this and shows how Cantor's constructions were reflected a half-century later by Turing. https://arxiv.org/abs/math/0411418
Except of course, while "hyper-Turing" machines that can do magic "post-Turing", "post-Halting" computation are seen as absurd fictions, real numbers are seen as "normal" and "obvious" and "common-sensical"! It was amusing some time back to see people pooh-pooh the likes of Hava Siegelmann for being funded for their "super-Turing" machines with "real-number" computation, without realizing that the core issue is the "real" number itself!
I've always found this quite strange, but I've realized that this is almost blasphemy (people in STEM, and esp. their "allies", aren't as enlightened etc. as they pretend to be tbh).
Some historians of mathematics (C. K. Raju, for example) claim that this comes from the insertion of a Greek-Christian theological bent into the development of modern mathematics.
Anyone who has taken measure theory and the like, then gone on to do "practical" numerical work and realized the pointlessness of much of that hard, abstract construction dealing with "scary" monsters that can't even be computed, would perhaps wholeheartedly agree.
edit: The post has a great link to a note on Cantor's theology,
It is funny you say that when Turing defined Turing machines to compute real numbers (like π for example). In its original definition, a number was computable if its Turing machine did not stop. Which makes sense since π does not have a finite decimal expansion.
Today, we usually define Turing machines to decide problems, and a problem is decidable if for every input its Turing machine stops with a "yes" or "no" answer. I guess this is what makes people think what you said in the quote above. Maybe this definition is more intuitive, but this conclusion from it could not be more wrong.
Think about it for a second: if the computable numbers could be effectively enumerated, there would be no uncomputable problem (Turing actually used the classic Cantor diagonal argument to prove that there were uncomputable numbers).
So every subset that allows you to do your daily calculations contains the rationals.
Because energy is action per time, it inherits the continuity of time. Action is also continuous, though its nature is much less well understood. (Many people confuse action with angular momentum, speaking about a "quantum of action". There is no such thing as a quantum of action, because action is a quantity that increases monotonically in time for any physical system, so it cannot have constant values, much less quantized values. Angular momentum, which is the ratio of action per phase in a rotation motion, is frequently a constant quantity and a quantized quantity. In more than 99% of the cases when people write Planck's constant they mean an angular momentum, but there are also a few cases when people write Planck's constant meaning an action, typically in relation to some magnetic fluxes, e.g. in the formula of the magnetic flux quantum.)
Perhaps when you said that energy is discrete you thought about light being discrete, but light is not energy. Energy is a property of light, like also momentum, frequency, wavenumber and others.
Moreover, the nature of the photon is still debated. Some people are not convinced yet that light travels in discrete packets, instead of the alternative where only the exchange of energy and momentum between light and electrons or other leptons and quarks is quantized.
There are certain stationary systems, like isolated atoms or molecules, which may have a discrete set of states, where each state has a certain energy.
Unlike for a discrete quantity like the electric charge, such sets of energy values can contain arbitrary values of energy and between the sets of different systems there are no rational relationships between the energy values. Moreover, all such systems have not only discrete energy values but also continuous intervals of possible energies, usually towards higher energies, e.g. corresponding to high temperatures or to the ionization of atoms or molecules.
There are already several decades of such discussions, but no usable results.
Time and space are primitive quantities in any current theory of physics, i.e. quantities that are assumed to exist and have certain properties, and which are used to define derived quantities.
Any alternative theory must start by enumerating exactly which are its primitive quantities and which are their properties. Anything else is just gibberish, not better than Star Trek talk.
However, the units of measurement for time and length are not fundamental units a.k.a. base units, because it is impossible to make any physical system characterized by values of time or length that are stable enough and reproducible enough.
Because of that, the units of time and length are derived from fundamental units that are units of some derived quantities, currently from the units of work and velocity (i.e. the unit of work is the work required to transition a certain atom, currently cesium 133, from a certain state to a certain other state, which is equal to the difference between the energies of the two states, while the unit of velocity is the velocity of light in vacuum).
This is the whole point of the un-reality of "real" numbers: "all" of it (= measure 1) is uncomputable except a "tiny" measure-0 set.
Let's all take a minute to ask ourselves what we mean by "real" every time we use that word. It may be that everyone's talking about a different thing.
Doesn't it though?
What happens when three bodies in a gravitationally bound system orbit each other? Our computers can't precisely compute their interaction because our computers have limited precision and discrete timesteps. Even when we discard such complicated things as relativity, what with its Lorentz factors and whatnot.
Nature can perfectly compute their interactions because it has smooth time and infinite precision.
A better way to dispute the unit square diagonal argument for the existence of sqrt(2) would be to argue that squares themselves are unphysical, since all measurements are imprecise and so we can't be sure that any two physical lengths or angles are exactly the same.
But actually, this argument can also be applied to 1 and other discrete quantities. Sure, if I choose the length of some specific ruler as my unit length, then I can be sure that ruler has length 1. But if I look at any other object in the world, I can never say that other object has length exactly 1, due to the imprecision of measurements. Which makes this concept of "length exactly 1" rather limited in usefulness---in that sense, it would be fair to say the exact value of 1 doesn't exist.
Overall I think 1, and the other integers, and even rational numbers via the argument of AIPendant about egg cartons, are straightforwardly physically real as measurements of discrete quantities, but for measurements of continuous quantities I think the argument about the unit square diagonal works to show that rational numbers are no more and no less physically real than sqrt(2).
To me at least, if you can write down a finite procedure that can produce a number to arbitrary precision, I think it is fair to say the number at that limit exists.
This made me think of a possible numerical library where rather than storing numbers as arbitrary precision rationals, you could store them as the combination of inputs and functions that generate that number, and compute values to arbitrary precision.
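A rough sketch of what that might look like (all names here are hypothetical, my own): a number is stored as a procedure mapping a requested precision to a rational within that tolerance, and new numbers are built by composing procedures.

    # Hypothetical mini-library: numbers as precision -> rational procedures.
    from fractions import Fraction
    from math import isqrt
    from typing import Callable

    Approx = Callable[[int], Fraction]   # digits -> rational within 10**-digits of the value

    def const(q: Fraction) -> Approx:
        return lambda digits: q

    def sqrt2() -> Approx:
        def approx(digits: int) -> Fraction:
            scale = 10 ** digits
            return Fraction(isqrt(2 * scale * scale), scale)
        return approx

    def add(a: Approx, b: Approx) -> Approx:
        # Ask each operand for one extra digit so the two errors together
        # still fit inside the caller's tolerance.
        return lambda digits: a(digits + 1) + b(digits + 1)

    x = add(sqrt2(), const(Fraction(1, 3)))
    print(x(20))   # a rational within 10**-20 of sqrt(2) + 1/3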
I think teachers lie to children and say that decimals are just another way of representing rationals, rather than the approximation of real numbers that they are (and introduce somewhat silly things like repeating decimals to do it), which makes rationals feel central and natural. That’s certainly how it was for me until I started wondering why no programming languages come with rational number packages.
But take one thing and then another: you have two things. That’s true whether or not anyone notices. Some mathematics is a human system of ideas, but some of it isn’t. Arithmetic reflects real patterns in the world. Logic, too, is not merely invention, it formalizes cause and effect. Numbers, in the Pythagorean sense, aren’t just marks on paper or symbols of order; they are the order inherent in reality, the ratios and structures through which the world exists at all.
At bottom, this debate is about the logos: what makes the universe intelligible at all, and why it isn’t simply chaos. When people say “math is real,” they mean it in the Platonic sense, not that numbers are rocks, but that they belong to the intelligible structure underlying reality.
God enters the picture not as a bolt-on explanation, but as the consequence of taking mathematical order seriously. If numbers and geometry are woven into reality itself, then the question isn’t whether math is real, it’s why the universe is structured so that it can be read mathematically at all. Call that intelligible ground the logos, or call it God; either way, it’s not an extra mystery but the recognition that reason and order are built into the world.
Calling math “just useful” misses the point. Why is the universe so cooperative with our inventions in the first place? The deeper issue is the logos: that the world is intelligible rather than chaos. That’s what people mean when they say math is real, not that numbers are physical things, but that the order they reveal is woven into reality itself.
That doesn't follow. Nature can perfectly compute them because they are nature. Nowhere is it required to have infinite precision, spatial or temporal.
Certainly Turing and Gödel showed that computation is not universal (complete). Looking at one element of number theory in isolation seems unproductive? Arithmetic space is staggeringly complex, but structured, layered. IMHO number theory is like a hall of mirrors without a defined interface with physics. See the Yang-Mills mass gap.
Hard disagree. This is the problem with math disconnected from physics. The real world is composed of quanta and spectra, i.e. reality is NOT continuous!
Pretty much:
Even for elementary particles, we can't be sure that all electrons, say, are exactly alike. They appear to be, and so we have no reason yet to treat them differently, but because of the imprecision of our measurements it could be that they have minutely different masses or charges. I'm not saying that's plausible, only that we don't know with certainty.
This isn't true in general: for example, if you take two equal volumes of a material and put them together, you will have less than two times the volume because of gravity. The mathematical statement that 1+1=2 follows by definition, and it's useful in applications only when the conditions are met that make it accurate, or accurate enough for the given purposes.
Mathematics is useful because the physical world exhibits regularities in its structure. Talking about logos or God adds an air of mystery to that but I don't know what more it adds
But how did you come to this conclusion unless by assuming that there are infinitely many natural numbers?
However, at each finite n we are still dealing with discrete quantities, i.e. integers and rationals. Even algebraic irrationals like sqrt(2) are ultimately a limit, and in my view the physicality of this limit doesn’t follow from the physicality of each individual element in the sequence. (Worse, quantum mechanics strongly suggests the sequence itself is unphysical below the Planck scale. But that’s not actually relevant - the physicality of sqrt(2) ultimately assumes a stronger view about reality than the physicality of 2 or 1/2.)
> They were both put in a room and at the other end were $100 and a free A on a test. The experimenter said that every 30 seconds they could travel half the distance between themselves and the prize. The mathematician stormed off, calling it pointless. The engineer was still in. The mathematician said “Don’t you see? You’ll never get close enough to actually reach it.” The engineer replied, “So? I’ll be close enough for all practical purposes.”
While you nod your head OR wag your finger, you continuously pass by that arbitrary epsilon you set around your self-disappointment regarding the ineffability of the limit; yet, the square root of two is both well defined and exists in the universe despite our limits to our ability to measure it.
Thankfully, it exists in nature anyhow -- just find a right angle!
One could simply define it as the ratio of the average distance between neighboring fluoride atoms and the average distance of fluoride to xenon in xenon tetrafluoride.
Personally, I’d go with the sideline cut definition.
So, like, I’m saying that if Einstein’s model of gravity is applicable at very tiny scales, and if the [p,x] relation continues to hold at those scales, then stuff gets weird (either by “measurement of any position to within that amount of precision results in black-hole-ish stuff”, OR “the models we have don’t correctly predict what would happen”)
Now, it might be that our current models stop being approximately accurate at scales much larger than the Planck scale (so, much before reaching it), but either they stop being accurate at or before (perhaps much before) that scale, or things get weird at around that scale.
Edit: the spins of fermions don’t make sense to attribute to something with extent spinning. The values of angular momentum that you get for an actual spinning thing, and what you get for the spin angular momentum for fermions, are offset by like, hbar/2.
You cannot justify this statement without equally justifying my position.
Say you conceive of a counterfactual world without any humans in it. You know that within this world there could be a rock and another rock, you understand that this would be two rocks, and so you are reassured that one and one is two, even though no one is watching within this counterfactual world.
All of this happened in your mind. All along, you were the observer of the supposedly unobserved world you conceived of.
You are the unavoidable human observer of any counterfactual world you conceive of. You intend the world to have no human observers, but your intention fails. It is impossible. The properties of a truly unobserved world are unknowable to you.
This is why the Enlightenment left Platonism behind centuries ago. We can't say what the world would be without us, because any attempt is not only constructed within the mind, but also contemplated and observed through the mind. You can't escape projecting your systems of ideas onto everything you think about.
Once this is taken into account, Platonism has no explanatory power and is nothing more than superfluous metaphysical mystification.
The set of real numbers is almost all extraneous junk that the universe definitely doesn't care about but is very important to mathematicians.
So Einstein's theory depends in an essential way on matter being continuous. This is fine at human and astronomic scales, but it is not applicable at molecular or elementary particle scales, where you cannot approximate well the particles by an averaged density of their energy and momentum.
Any attempt to compute a gravitational escape velocity at scales many orders of magnitude smaller than the radius of a nucleus is for now invalid and purposeless.
The contradiction between the continuity of matter supposed by Einstein's gravity model and the discreteness of matter used in quantum physics is great enough that during more than a century of attempts they have not been reconciled in an acceptable way.
The offset of the spin is likely to be caused by the fact that for particles of non-null spin their movement is not a simple spinning, but one affected by some kind of precession, and the "spin" is actually the ratio between the frequencies of the 2 rotation movements, which is why it is quantized.
The "action" is likely to be the phase of the intrinsic rotation that affects even the particles with null spin (and whose frequency is proportional with their energy), while those with non-null spin have also some kind of precession superposed on the other rotation.
The accuracy of their volume and radius did not reach the level of a 64-bit float, but it was several orders of magnitude better than that of 32-bit FP numbers.
While you cannot build a thing made of molecules with an accuracy better than that of an FP64 number, you can have a standing wave in a resonator, sitting in a cryostat, where the accuracy of its wavelength is 4 orders of magnitude better than the accuracy of an FP64 number, and where the resonator is actively tuned, typically with piezoelectric actuators, so that its length stays at a precise multiple of the wavelength, i.e. with the same accuracy. Only the average length of the resonator has that accuracy; the thermal movements of the atoms cause variations in length, superposed on the average length, that are big in comparison with the desired precision, which is why the resonator must be cooled for the best results.
However, it does not really matter whether we can build a perfect sphere or circle. What matters is that, when modelling everything with a geometry that supposes the existence of perfect circles, we have never seen errors that could be explained by the falseness of this supposition.
The alternative of supposing that there are no perfect circles is not simpler, but much more complicated, so why bother with it?
I agree though, that you have come up with a contradiction. Specifically, because you seem to believe these two statements:
There are finitely many natural numbers.
Given any finite list of natural numbers, we can always produce another natural number not on that list.
But taking the limit of a sequence of rationals isn’t guaranteed to remain in the rationals (classic example: https://en.wikipedia.org/wiki/Basel_problem. Each partial sum is rational, but the limit of the partial sums is not)
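A quick way to see that concretely (a throwaway sketch of mine, not from the linked article): compute the partial sums exactly as fractions and compare them with pi^2/6.

    # Every partial sum of 1/k^2 is an exact rational, yet the limit pi^2/6 is irrational.
    from fractions import Fraction
    import math

    def partial_sum(n: int) -> Fraction:
        return sum(Fraction(1, k * k) for k in range(1, n + 1))

    target = math.pi ** 2 / 6
    for n in (10, 100, 1000):
        print(n, float(partial_sum(n)), target)  # rationals creeping up on an irrational limit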
So, how does that statement rebut “You can't do rigorous calculus (i.e. real analysis) on rationals alone.”?
When talking about whether arbitrarily precise numbers are real in the universe, it extremely matters.
Sadly, atoms exist. In some ways that makes things more complicated, but it's the truth. Anything made of discrete chunks in a grid can't have arbitrarily precise dimensions.
I'm not saying it does. What I'm saying is that you can make a correspondence with the reals by using only rationals.
You can define convergence without invoking the reals (Cauchy convergence). If you take any such sequence, you give that sequence a name. That name is the equivalent of a real number. You can then define addition, multiplication - any operation on the reals - with respect to those sequences (again, invoking only rational numbers).
So far, we have two distinct entities: The rationals, and the converging sequences.
Then, if you want, you can show that if you take the rationals and those entities we're calling "converging sequences" together, you can make operations involving the two (e.g. adding a rational to that converging sequence) and eventually build up what we know to be the number line.
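A bare-bones rendering of that construction (a sketch with my own naming, not a textbook's): a "real" is literally just a rule producing the n-th rational term of a sequence, and arithmetic is defined termwise on those rules, using rational arithmetic only.

    # A real number as (the name of) a converging sequence of rationals;
    # operations on "reals" are termwise operations on the sequences.
    from fractions import Fraction
    from typing import Callable

    Seq = Callable[[int], Fraction]   # n -> n-th rational term of the sequence

    def sqrt2_seq(n: int) -> Fraction:
        # Newton's iteration starting from 1: every term is rational, and the
        # sequence converges (quadratically) toward sqrt(2).
        x = Fraction(1)
        for _ in range(n):
            x = (x + 2 / x) / 2
        return x

    def add_seq(a: Seq, b: Seq) -> Seq:
        # The sum of two converging sequences converges to the sum of the limits;
        # this is how "+" is defined on the named sequences.
        return lambda n: a(n) + b(n)

    s = add_seq(sqrt2_seq, lambda n: Fraction(1, 2))
    print(float(s(6)))   # ~1.914213562..., i.e. sqrt(2) + 1/2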
I don’t expect this to work. For one thing, we already know the conditions under which the spin precesses. That’s how they measure g-2.
Also, orbital angular momentum is already quantized. So, I don’t know why you say that the “precession” is responsible for the quantized values for the spin.
The representations of SU(2) for composite particles combine in understood ways, where for a combination of an even number of fermions the possible total spin values match up with the possible values for orbital angular momentum.
Could you give an explanation for how you think precession could cause this difference? Because without a mathematical explanation showing otherwise, or at least suggesting otherwise, my expectation is going to be that that doesn’t work.
I have used "precession" for lack of a better term for suggesting its appearance, because while there is little doubt about the existence of 2 separate kinds of rotations in the particles with non-null spin, there exists no complete model of how they are combined.
What it means for there to be infinitely many natural numbers is that for any finite list, there are natural numbers not on that list (something you appear to agree with). If there were finitely many natural numbers, this wouldn't be true.
Some eggs are smaller than others; some are more dense, etc. Yes, the "count" is maybe sort of interesting in some very specific contexts, but certainly not in any reductive physical context. It only works in an economic context because we have standards like what constitutes a "chicken egg large white grade AAA".
Otherwise it's pretty much a dead end unless you're in the weeds. You just mutter "almost everywhere" as a caveat once in a while and move on with your life. Nobody really cares about the immensely large group of numbers that by definition we cannot calculate or define or name except to kowtow to what is in retrospect a pretty bad theoretical underpinning for formal analysis.
You might say, I can imagine 2 apples, but I can't imagine pi apples, but you could just as easily imagine unrolling a circle with a diameter of 1, and you have visualized "pi" just as well as you can visualize 2 apples.