Number

I spent a year and a half teaching in a Montessori Upper Elementary classroom. We tried to be constantly aware of developmental issues in the light of the best available child development research, and there is some very good data that has been around for several decades. The 9 to 12 year olds are particularly interesting in relation to number and arithmetic. Researchers tell us that somewhere at the end of this age range, true abstract thought becomes possible. I’ve read it is linked to physical development of the brain. Whatever the reason, you can see it happen.

The Montessori approach to teaching math views the learning process as a triangle with three sides. The base is Concrete, and the two sides are Symbolic and Abstract. The goal is to reach the abstract understanding of number and the operations of arithmetic in concert with use of the symbols. The basic operations are of course adding, subtracting, multiplying, and dividing. Once this is achieved, daily use of common symbolic math rests on understanding, not just rote learning. This is the basis for the power of math to aid a person in life.

When doing concrete math, the child manipulates physical objects to solve math problems. To teach basic addition to a young child, you might ask the child to take two objects in separate places and put them together, physically performing the math operation of one plus one equals two. In the upper elementary curriculum there are very sophisticated materials that will physically operate with fractions, perform long division, present negative numbers, and even extract square roots and demonstrate basic calculus concepts.

At the same time, you ask the child to write the equation he or she is solving: 1 + 1 = 2. This is symbolic math. It’s the language we use to write math using symbols. The intent is that by combining the concrete with the symbolic, the child, when developmentally ready, will master the abstract concepts behind the symbols. And it is abstract. A child can take one apple from here and one apple from there, put them together and see that there are now two apples. 1 + 1 = 2, however, can mean *anything*, as long as we start with one of something and another of the same thing and end up with two of them.

When explaining the math program at parent meetings, I would start by holding up a poster with a big number “1” drawn on it and ask the parents to tell me what it is. Someone would correctly answer: “It’s the number 1”. I would say, “One what?”

There is no answer to that question. (You could say, “One number”, but that’s called a confusion of types, and people seem to instinctively understand that it doesn’t answer the question.)

The reason there is no answer is that the “what” of the question refers to a thing, and a number is not a thing when it is viewed according to its function. It is completely abstract, a concept waiting for a thing to refer to so it can operate. If you view it as a thing (one of a list like 2, 5, 7, 86, for example) it ceases to function as a number. This is the problem sometimes when children are asked to learn “math facts” without supporting their study with concrete examples. Guiding children to the abstract understanding of number is one of the most rewarding activities I know. Many children who are asked only to memorize operations in symbolic math never attain this understanding.

Abstract implies it is not concrete. That means in a physical sense it is not real. The implications of that are significant.

We are sometimes told how beautifully math is revealed in Nature. We are not often told how miraculous (or perhaps how trivial---I’ll explain later) that is.

In a very important sense, number, and therefore math, does not exist at all in nature. For any number other than “1” or “0” to have meaning there has to be more than one of the same thing. It is in deciding what “same thing” means that we run into difficulties. In the physical world around us, each thing is unique. If two red round objects fall from the same tree, they will always be different sizes, shapes, and shading *within certain limits.* These limits allow us to generalize the similarities and class them both as apples. Then we can say we have two apples. Voila! The number “2”.

But look what we did to create two of something. We choose to ignore the differences and generalize certain similarities. In other words, we abstracted from the physical world a definition of sameness that allowed us to create two “apples”. We have said that the fact that they are clearly different in size, shape, and color (let alone occupying different spaces!) doesn’t matter. They’re still two examples of the same thing.

NOTE - If we had decided to call them “fruits” then we could have an apple and a pear together and again have the number 2. How we number them depends totally on how we define them. The apple and the pear couldn’t care less. None of this has anything to do with the things themselves. It’s all in our minds.

This is functional and adaptive. We do this to make sense of our world and to operate in it and on it to survive and to thrive. The point is that number is created in our minds. It doesn’t exist in the things themselves.

We forget this. It is one of the reasons why so many wise people have regularly encouraged us to “see things as they are” once in a while. See the differences. See the individual unique nature of each part of our world. As adult mental beings, we become used to the mental classifications of the world that have made us so successful as a species and lose the easy ability to just see what’s in front of us.

Number is the ultimate abstraction from the natural world. Why then do mathematicians find so much math in the natural world? The possible answers have two poles and a wide spectrum between them. The two boundaries are that either 1) the physical world is built up on the concepts of mathematics, because math is fundamental to existence; or 2) humans, being mental creatures, will always find mathematical answers because that’s how we ask the questions, not because it’s fundamentally there. (This is the trivial possibility I mentioned above.)

This debate has been the focus at one time or another of most of the great philosophers in history, so I’m not going to try to decide it here. I just want to present the question again, because I don’t think it should ever be far from our minds. We should regularly stop and ask ourselves how much of what we see and judge around us is actually coming from us, not from what is there.

So what is a number? If it isn’t real, how do we define it? In the rich history of numbers, one development that stands out is the use of the number 0. It’s very rare. Historians generally believe that 0 has been independently invented at most three times, by the Hindus, the Arabs, and the Mayans, and there is a strong possibility that the Arabs learned it from the Hindus, making only *two* independent appearances.

Think about it. Roman numerals don’t have a 0. The ancient Greeks didn’t use it either, nor did the Egyptians, the Mesopotamians, the Chinese, or many other highly advanced cultures.

Knowing this should prepare you for the revelation that number was not successfully defined in a logical way until around the turn of the 20th century. This seems bizarre at first thought. Socrates, Aristotle, Archimedes, Galileo, Newton, and the rest never actually knew what “number” means! This shows a couple of things. Obviously, knowing the definition of number must not be necessary in order to use numbers, even in highly sophisticated ways. It also means that number is so abstract a concept, and so deeply embedded in how we think, that putting the concept into words is something our minds aren’t naturally equipped to do. It’s a little like asking a fish to define “water”. The fish has no need to define something so basic. It just swims and lives in it. We, of course, like to understand things, or at least believe that someone among us somewhere understands them. It’s how we’re wired.

If you try to come up with a definition of number from scratch, your thought process might go like this: A number represents the number of things in all groups of things that have the same number of things. Oops. That won’t work. I’ve used number twice trying to define number. A definition of something has to be completely in terms of concepts that are understood without the use of the concept being defined. After all, it doesn’t do any good to define the color blue by saying it’s the color of all blue things. OK. I did use the concept ‘group’. That’s a different concept. Maybe I can expand on that. Maybe number can be defined by using groups. It’s certainly true that a number defines a kind of group. Maybe the converse is true?

I won’t go through all of this. Different people would pursue the line of thought differently and to different lengths. Frege, with later clarification by Russell, used set theory to define number in exactly this way. Note: In set theory the words ‘class’, ‘group’, and ‘set’ are basically synonyms and are used to distinguish various levels of collections of things.

To paraphrase their definition, a number is the class of all classes that are congruent with each other. “Congruent classes” means that each member of any two of the classes can be paired up with a member of the other class with no members left out in either class. I’ll repeat that. Each member of any two of the classes can be paired up with a member of the other class with no members left out in either class. Just like congruent triangles, there is a sense in which you could put one on top of the other and they will match up exactly. Of course, if one class contains apples and the other contains dump trucks, they won’t LOOK the same, but they can be paired up exactly, one by one, with nothing left over, if the two classes are congruent.
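The pairing idea can be sketched in code. This is my own illustration, not anything from Frege or Russell; the function name `congruent` is mine. Notice that it deliberately avoids counting, matching members one pair at a time, because using number to define number would be circular.

```python
def congruent(class_a, class_b):
    """True if the members of the two classes can be paired up exactly,
    one by one, with no members left out in either class.
    Deliberately avoids counting: we only remove pairs until one
    (or both) of the classes is exhausted."""
    a, b = list(class_a), list(class_b)
    while a and b:
        a.pop()  # take one member from each class
        b.pop()  # and set the pair aside
    # Congruent only if both classes ran out at exactly the same time.
    return not a and not b

# Apples and dump trucks don't LOOK the same, but these classes pair up.
apples = {"red apple", "green apple"}
trucks = {"dump truck A", "dump truck B"}
print(congruent(apples, trucks))   # True
print(congruent(apples, {"pear"})) # False: one apple is left over
```

Under the Frege–Russell definition, the number 2 would then be the class of all classes congruent with either of these.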

There is some more complication to this, which you can look up for yourself, but this is the basic definition. This version should give you a sense of how it was done and serve as an introduction to set theory.

One of the reasons I’ve presented this is to show that set theory is familiar at its root. It, like most useful concepts, is just a formalization of something that all humans do naturally. As I pointed out at the beginning, we naturally class everything into groups (or sets) of related objects.

Let’s go back to the mysterious 0. What exactly does 0 mean?

The first answer is usually “nothing”. It means you have nothing. That’s true in a sense, but 0 is more concrete than that. If you just have nothing, there is no need to write it down. In fact, just to know that you have nothing you have to be aware that there is a particular something you could have in place of that nothing. Otherwise, how could you know you don’t have it?

0 means you don’t have something, but you are aware you could have it. It means you have none of a particular thing. Like any number, it only has meaning when things are attached to it.

Remember, we number things in our minds. They aren’t numbered in the world. That means we choose how we group things when we number them. Our current civilization most commonly uses the decimal system. The decimal system counts sets of units, tens, hundreds, thousands, and so on, separately and adds them together to make a single number.

The number 7,012 means 7 sets of a thousand things, 0 sets of a hundred things, 1 set of ten things, and 2 sets of one thing. The 0 is crucial here. Without it, the number would be 712, which means something very different. 7,012 is the decimal name of the class of all classes congruent with this particular grouping of decimal sets.
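That expansion can be made explicit in a few lines of code. This is just a sketch of the place-value reading described above; the function name `decimal_sets` is mine.

```python
def decimal_sets(n):
    """Break a positive integer into (digit, place value) pairs,
    most significant place first: the 'sets' of the decimal system."""
    digits = [int(d) for d in str(n)]
    places = [10 ** i for i in range(len(digits) - 1, -1, -1)]
    return list(zip(digits, places))

print(decimal_sets(7012))
# [(7, 1000), (0, 100), (1, 10), (2, 1)]
# i.e. 7 sets of a thousand, 0 sets of a hundred, 1 set of ten, 2 units.
# The 0 holds the hundreds place; summing the sets recovers the number.
assert sum(d * p for d, p in decimal_sets(7012)) == 7012
```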

This is the beginning of set theory, and as I said before, it’s something we use naturally every day. It just gets a little complex when we talk about it.

Set theory also incorporates the distinction between 0 and the more general concept “nothing”. “Nothing” is called “null” (the “null set”). This is a universal set of potential. It is completely undefined by things and is not a number at all. It’s sort of a potential for number to exist. It means not only do you not have something, but you don’t even know about the existence of what you don’t have.

0 on the other hand, is a number holding the place of something definite. You may have no dollars, or no apples, or no time, but for it to really be a zero, it has to be able to refer to some particular thing.

If you think about it, I’m sure you can come up with examples of situations in your life when a null turned into a 0. Sometimes, unfortunately, by going through some larger number first. Suppose you suddenly acquired something you didn’t know existed (it was null to you): a surprise gift of a particular rare vase from a friend. Then you had 1 of this vase. Later your dog knocked it off the table, so now you have 0 of the vase. Once you have a category for something (rare vase given to me by Thelma), 0 describes not having it. Before you knew about it, it was null.

It could also have happened if Thelma had only told you about seeing the vase at an expensive shop. You still don’t have it, but the null has turned into a zero with less stress involved.
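The Thelma episode has a loose parallel in programming, where many languages distinguish “no value at all” from “the number zero”. This analogy is mine, using Python’s `None` as a stand-in for null:

```python
# None plays the role of null: the category doesn't exist for us yet.
# 0 means we have the category "vase" and hold none of its members.
inventory = {}

print(inventory.get("vase"))   # None: we don't even know about vases

inventory["vase"] = 1          # Thelma's surprise gift: we have 1
inventory["vase"] -= 1         # the dog strikes: now 0, not null

print(inventory.get("vase"))   # 0: a definite zero of a known thing
```

The dictionary never goes back to null once the key exists; like the vase, the category persists even when we have none of the thing.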

This realization, that there is a practical distinction between nothing and 0, must have been necessary for the historical inventions of 0. It’s no wonder it’s so rare an occurrence.

My intent has been to explain the abstract nature of something that everyone does on a regular basis---use numbers---and to give some appreciation of the logical processes involved with using them. These processes that we consider normal are so subtle that the combined efforts of some of the brightest people who ever lived have only recently been able to begin to describe them and still can’t fully explain them.

That’s OK. Just like a fish in water, we can go on using them. As we do, we might occasionally think about what they are and enjoy the wonder of how easily our minds handle the deep abstraction we are performing.