> That is how Whitehead and Russell did it in 1910. How would we do it today? A relation between S and T is defined as a subset of S × T and is therefore a set.
> A huge amount of other machinery goes away in 2006, because of the unification of relations and sets.
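To make the quoted definition concrete, here is a minimal sketch (Python, with made-up example data): a relation between S and T is literally just a set of pairs, and "s is related to t" is just membership.

    # "divides" as a relation between S and T: a plain subset of S x T
    S = {1, 2, 3}
    T = {2, 3, 4, 6}
    divides = {(s, t) for s in S for t in T if t % s == 0}

    assert (2, 6) in divides       # 2 divides 6
    assert (3, 4) not in divides   # 3 does not divide 4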
Relations are a very intuitive thing that I think most people would agree are not the invention of one person. But the language used to describe them and manipulate them mathematically is an invention, and it can have a dramatic effect on the way they are communicated.
For instance, I frequently use the example "1+1=10" in binary to illustrate that, while our reasoning may seem fundamentally different, it's simply because we're starting from different premises, using distinct methods, and approaching the same problem from unique angles.
Really low level embedded work? Most programming I know about effectively works in base 10 or sometimes hex.
One plus one equals two.
One + 0x01 ≡ 2.0
1+1=10 (in binary)
None of these are "vastly different conclusions". None of these are starting from different premises. None of these are using different reasoning. You're literally just writing it differently. Okay, so? This is a pointless distinction that doesn't even apply in a verbal debate at all. It'd be like having a philosophical debate with someone and them suddenly saying "oh yeah, but what if we were arguing in Spanish!? Wouldn't that BLOW YOUR MIND!?" No? It has absolutely nothing to do with anything. I would be annoyed at you if you tried to use this in an argument with me.
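For what it's worth, a one-line sanity check (Python, purely illustrative) that all three spellings denote the very same number:

    assert 1 + 1 == 2 == 0b10 == 0x01 + 1 == int("10", 2)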
There are no problems with the square root of two.
> show me any physical system where an action times an action does not equal a reaction.
Show me any gazzbok where a thrushbloom minus a grimblegork does not equal a fistelblush. Haha, you can't do it, can you!? I WIN!
That is to say: you're using silly made up definitions of "action" and "times" here.
But 1+1=10 and 1+1=2 are not different conclusions, they are precisely the same conclusions but with different representations.
A better example might be 9 vs 6 written on the parking floor: depending on where you're standing, you'll read the number differently (and yet one of the readings is wrong).
Either I misunderstand the notation or there seems to be something missing there - the right hand side of that implication arrow is not a formula.
I would assume that what is meant is α⊂β→α∪(β−α)=β
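A concrete instance of that reading, with throwaway sets (Python), in case it helps:

    a = {1, 2}
    b = {1, 2, 3, 4}
    assert a <= b              # α ⊂ β
    assert a | (b - a) == b    # α ∪ (β − α) = β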
Not quite sure what an action times an action is, but how about rotating a 2d shape 180 degrees? Do that twice and it's the same as not rotating it at all.
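In case a worked example helps (Python, with my own toy encoding of the half-turn): rotating a point 180° about the origin twice lands it exactly where it started.

    # 180-degree rotation about the origin: (x, y) -> (-x, -y)
    def rot180(p):
        x, y = p
        return (-x, -y)

    p = (3.0, -2.5)
    assert rot180(rot180(p)) == p   # two half-turns = doing nothing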
While the article is nice, I believe that the tradition entrenched in mathematics of taking sets as a primitive concept and then defining ordered pairs using sets is wrong. In my opinion, the right presentation of mathematics must start with ordered pairs as the primitive concept and then derive sequences, sets and multisets from ordered pairs.
The reason I believe this is that there are many equivalent ways of organizing mathematics. They differ in which concepts are taken as primitive and which propositions are taken as axioms; the remaining concepts are then defined from the primitives, and the remaining propositions are demonstrated as theorems. But most of these possible organizations cannot correspond to an implementation in a physical device, like a computer.
The reason is that, among the various concepts that can be chosen as primitive in a mathematical theory, some are in fact simpler and some are more complex. In a physical realization, the simple ones have a direct hardware counterpart, while the complex ones cannot be implemented directly, only as structures built from the simpler components. So in the hardware of a physical device there are much more severe constraints on the choice of primitives than in a mathematical theory that only describes the abstract properties of operations like set union, without worrying about how such an operation can actually be executed in real life.
The ordered pair has a direct hardware implementation and it corresponds with the CONS cell of LISP. In a mathematical theory where the ordered pair is taken as primitive and sets are among the things defined using ordered pairs, many demonstrations correspond to how various LISP functions would be implemented. Unlike ordered pairs, sets do not have any direct hardware implementation. In any physical device, including in the human mind, sets are implemented as equivalence classes of sequences, while sequences are implemented based on ordered pairs.
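A rough sketch of that layering (Python standing in for LISP cons cells; the names are mine): pairs give you sequences, and a "set" is then a sequence queried only through membership, ignoring order and duplicates.

    # the ordered pair as the primitive (a cons cell)
    def cons(a, d): return (a, d)
    def car(p): return p[0]
    def cdr(p): return p[1]

    NIL = None
    # a sequence built from pairs: (1 . (2 . (3 . nil)))
    seq = cons(1, cons(2, cons(3, NIL)))

    def member(x, s):
        # "set" membership over a pair-built sequence
        while s is not NIL:
            if car(s) == x:
                return True
            s = cdr(s)
        return False

    assert member(2, seq) and not member(5, seq)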
The non-enumerable sets are not defined as equivalence classes of sequences and they cannot be implemented as such in a physical device but at most as something of the kind "I recognize it when I see it", e.g. by a membership predicate.
However, infinite sets need extra axioms in any kind of theory, and a theory of finite sets defined constructively from ordered pairs can be extended to infinite sets with appropriate additional axioms.
Even base-N representations are an invention: S() and zero are all you need, but Roman Numerals were an improvement over base-1 representations and base-N is significantly more convenient to work with.
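Something like this, say (a toy Python rendering of zero and S(); the spelling is mine), where even writing 7 already gets tedious:

    ZERO = ()
    def S(n): return (n,)   # successor: "one more than n"

    def to_int(n):          # count how many times S was applied
        k = 0
        while n != ZERO:
            n, k = n[0], k + 1
        return k

    seven = S(S(S(S(S(S(S(ZERO)))))))
    assert to_int(seven) == 7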
Luckily, our imaginary reality of precision is close enough to the true reality of probability that it enables us to build things like computer chips (i.e., all of modern civilization). And yet, the nature of physics requires error correction for those chips. This problem becomes more obvious when working at the quantum scale, where quantum error correction remains basically unsolved.
I’m just reframing the problem of finding a grand unified theory of physics that encompasses a seemingly deterministic macro with a seemingly probabilistic micro. I say seemingly, because it seems that macro-mysteries like dark matter will have a more elegant and predictive solution once we understand how micro-probabilities create macro-effects. I suspect that the answer will be that one plus one is usually equal to two, but that under odd circumstances, it is not. That’s the kind of math that will unlock new frontiers for hacking the nature of our reality.
The axiom schema of specification is added to avoid Russell's paradox.
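Roughly, the schema says that for any already-given set z and any property φ you may form {x ∈ z : φ(x)}, i.e. ∀z ∃y ∀x (x ∈ y ↔ (x ∈ z ∧ φ(x))). Because you can only carve subsets out of a set you already have, the Russell collection {x : x ∉ x} never gets formed as a set.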
A set in the naive meaning is just a well-defined collection of objects.
As ordered pairs are a binary relation, foundedness or order are operation-dependent, and assuming an individual set is unordered is a useful assumption.
But IMHO it is problematic from a constructivist mathematics perspective. The ambiguity of a naive set, especially when constructing the natural numbers, which are obviously totally ordered, is a challenge to overcome.
I know the Principia was focused on successor sets, so it mostly avoids the issue, but IMHO they would have hit it when trying to define an equality operation.
If you remember that membership, and not the listing of elements, defines a set:
{a,b,c}=={a,b,c,b}=={c,b,b,a}
In a computing context, there were some protocols that may have been IBM specific that required duplicate members to be adjacent.
So while the first and the third sets would be equivalent, the second wouldn't be, so order mattered.
Most actual implementations just dropped the redundant elements rather than tracking membership, but I was just trying to provide a concrete example.
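In Python terms (my own illustration of the distinction): equality by membership ignores order and duplicates, while comparing the representations does not.

    a = ['a', 'b', 'c']
    b = ['a', 'b', 'c', 'b']
    c = ['c', 'b', 'b', 'a']

    # equality by membership: all three denote the same set
    assert set(a) == set(b) == set(c)

    # equality of representations: order and duplication matter
    assert a != b and b != c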
IIRC the axiom schema of specification is one of those that was folded into others in modern ZFC textbooks so it is easy to miss.
No-one can stop you from using terms as you please and investigating their consequences, but, at least in modern mathematical parlance, a binary relation is the set of ordered pairs that are "related" by it. (Your relation would seem to be just a bare set, or perhaps a unary relation, not a binary relation, which I think is what is usually meant when no modifier is given.)
Zero is (ZERO) and numbers are (ADD1 (ZERO)), (ADD1 (ADD1 (ZERO))), etc. The prover really worked that way internally, as I found out when I input a theorem with numbers such as 65536 in it. I was working on proving some things about 16-bit machine arithmetic, and those big numbers pushed SRI International's DECSystem 2060 into thrashing. Here's the prover building up basic number theory, one theorem at a time.[1] This took about 45 minutes in 1981 and takes under a second now.
Constructive set theory without the usual set axioms is messy, though. The problem is equality. Informally, two sets are equal if they contain the same elements. But in a strict constructive representation, the representations have to be equal, and representations have order. So sets have to be stored sorted, which means much fiddly detail around maintaining a valid representation.
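In miniature, the fiddly detail looks something like this (Python sketch, assumptions mine): every set is kept in a canonical sorted, duplicate-free form, so that representation equality and "same elements" coincide, and every operation has to re-establish that invariant.

    def canon(xs):
        # canonical representation: sorted, duplicates removed
        out = []
        for x in sorted(xs):
            if not out or out[-1] != x:
                out.append(x)
        return tuple(out)

    assert canon([3, 1, 2, 1]) == canon([1, 2, 3])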
What we needed, but didn't have back then, was a concept of "objects". That is, two objects can be considered equal if they cannot be distinguished via their exported functions. I was groping around in that area back then, and had an ill-conceived idea of "forgetting", where, after you created an object and proved theorems about it, you "forgot" its private functions. Boyer and Moore didn't like that idea, and I didn't pursue it further.
Fun times.
[1] https://github.com/John-Nagle/pasv/blob/master/src/work/temp...
Or is the difficulty in introducing a canonical order for the ordered pair, or introducing well/partial-ordering in sets themselves? I guess I see an ordered pair as more of an indexical definition than an ordering definition.
I believe they’re quoting Howard’s Rogan interview, fwiw
So it is a simple example showing that the way humans process language influences the representation/definition of mathematical ideas.
The issue is that 1+1 has no guarantee of being two. Look carefully and you can see the first 1 is exactly the same as the second 1!!!!
Hence, take the set of all Russells who do that kind of maths and add it to another Russell who also does that maths: you still end up with one Russell.
That is why they go to all the trouble of saying there is no intersection, that the first oneness set does not overlap with the second oneness set, etc. etc.
Qed
Also it’s not even true. There is no hardware representation for the ordered pair containing the earth and the moon. You now need a bit encoding of the information.
The distinctions of infinite constructions you mention are already well understood. See “recursively enumerable set”.
Ordered pairs are trivially definable in terms of sets. It’s a distinction which does not change any of the foundational proofs and gives you no new insight. This is like arguing that bounded vs counted ranges are foundationally important. We can show they are equivalent in one paragraph and move on.
An actually new idea will give new results.
Naive set theory. Halmos, Paul R. http://people.whitman.edu/~guichard/260/halmos__naive_set_th...
Note the first entry of "Ordered Pairs"
> What does it mean to arrange the elements of a set A in some order?
Also note how the earlier section on "Unordered Pairs" is more about building up the axiom of pairing, etc., to get to ordered pairs, which gets you to the Cartesian product, which outputs ordered pairs.
It doesn't matter if you go through Zermelo's theorem + Zorn, which states that every set can be well-ordered, or through Cartesian products and/or AC. (Note: in FOL, well-ordering and AC are equivalent, but not in SOL and HOL.)
It is not that sets are expressly unordered, as a set of points in a line segment would very much have an order, but that you didn't actively arrange the elements in order to take advantage of properties that are useful to you.
Maybe I just hit mental blocks, but IMHO when you make the assumption that "there exists a set," it is very important to realize that the set is "unordered" because you haven't imposed an order on it, not because unorderedness is an innate property of its elements.
Hopefully that helps in addressing this from your original post.
> "ordered pair" is not part of set theory
While many creators of both naive and formal set theories may choose not to define (a,b) = {{a},{a,b}} explicitly, the output of the Cartesian product is the ordered pairs, so it doesn't matter: you don't have a useful set theory without them.
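For anyone who hasn't seen it worked through: the Kuratowski encoding can be checked mechanically with frozensets (Python, purely illustrative); the whole point of the definition is that (a,b) = (c,d) exactly when a = c and b = d.

    def kpair(a, b):
        # Kuratowski: (a, b) := {{a}, {a, b}}
        return frozenset({frozenset({a}), frozenset({a, b})})

    assert kpair(1, 2) == kpair(1, 2)
    assert kpair(1, 2) != kpair(2, 1)                    # order is recoverable
    assert kpair(1, 1) == frozenset({frozenset({1})})    # degenerate case collapses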
A strange thing happened to me in mathematics. When I got to the point where these symbols started showing up (ninth grade, more or less) I did not get a thorough explanation of the symbols; they just appeared and I tried to intuit what they meant. As more symbols crept into my math, I tried to ignore them where possible. Eventually this meant that I could not continue learning math, as it became mostly all such symbols.
I got as far as a minor in math. I'm not sure how any of this happened, but I wish I had a table of these symbols in ninth grade.