Unless you’re a mathematician or a computer programmer, the chances are that you’ve never heard the term “number system” before. It simply never comes up in the normal bits of life.

When I was born, only mathematicians knew about number systems, and they were the only ones who could program a computer. Then, despite everyone saying it couldn’t be done, Grace Hopper invented the compiler, which made high level programming languages like FORTRAN and COBOL possible. No longer did you have to be a mathematician to program computers. Her accomplishments should be taught in primary school!

Normally, we think of a number system as simply counting. It seems like there’s no system: you start with one, and if you’re recording your counts, when you reach nine the next number is ten, a one and a zero. But that is a number system. It’s base ten, or “decimal,” so called because it uses ten digits, zero through nine.
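That “one and a zero” business is place value: each digit is worth ten times the digit to its right. A minimal sketch in Python, using a made-up example number:

```python
# Place value in base ten: digits times descending powers of ten.
n = 472  # hypothetical example number
digits = [int(d) for d in str(n)]  # [4, 7, 2]
value = sum(d * 10 ** i for i, d in enumerate(reversed(digits)))
assert value == n  # 4*100 + 7*10 + 2*1 = 472
```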

But that’s not how counting has always worked. Have you ever wondered why clocks go from one to twelve and there are twenty four hours in a day? It’s because at some time in the past, people used a base twelve number system (the zero is a relatively modern concept), which is perhaps why thirteen is now considered unlucky. 12+1=“Time’s up.”

The very first number system wasn’t written down, because nobody had yet invented writing, but it was very likely base six: zero, which was meaningless then, through five. The digits were the fingers on their hands, the very first calculating devices. One finger on one hand was equal to six on the other hand. “One sheep,” one finger; “two sheep,” two fingers... “five sheep,” open hand. “Six sheep,” one finger on the other hand, first hand closed. You could say a closed fist is zero. “Fifteen sheep,” three fingers on one hand and two on the other. Easy to keep track of how many sheep you’ve counted, as long as you don’t have to pick anything up or scratch your ass.
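The shepherd’s two-hand scheme is just integer division: one hand counts units, the other counts sixes. A sketch of that tally (the function name is mine, not anything historical):

```python
# Two-hand base-six tally: one hand counts units (0-5), the other counts sixes.
def fingers(count):
    sixes, units = divmod(count, 6)
    return sixes, units

assert fingers(5) == (0, 5)   # open hand
assert fingers(6) == (1, 0)   # one finger on the other hand, first hand closed
assert fingers(15) == (2, 3)  # two sixes plus three units = fifteen sheep
```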

This base six number system may be where Roman numerals came from, with V signifying an outstretched hand and IV meaning “one fewer than all fingers.” As their society became more complex, so did their method of writing down numbers.

The decimal system was invented in India sometime between the first and fourth centuries, and the Arabs learned it from the Hindus in the ninth century. We use the Arabic marks for the numbers, as does almost everyone else these days, with variations.

After fingers, the first computer was a pile of rocks, and nobody knows when the first pile of rocks was used as a primitive abacus. The rocks later advanced to become beads on strings. You can do math in any number system with an abacus, or a pile of rocks, as long as each string holds enough beads to cover all the digits in your number system.

A modern digital computer uses the binary number system, with two digits: zero and one, on or off. Five in binary is 101, and yes, you can count on your fingers in binary. The prehistoric base six lets you count to thirty five on your fingers, while base ten stops at ten, and binary lets you count all the way to 1,023. Yes, in school you can cheat in math class by using your fingers as an abacus if they won’t let you use a calculator.
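Those three finger-counting limits are easy to check. A quick sketch, assuming ten fingers and the two-hand base-six scheme described earlier:

```python
# How high ten fingers can count under each scheme:
decimal_max = 10              # one finger per count
base_six_max = 5 * 6 + 5      # sixes hand full plus units hand full = 35
binary_max = 2 ** 10 - 1      # each finger is one bit = 1023

assert (decimal_max, base_six_max, binary_max) == (10, 35, 1023)

# Five on your fingers in binary: 101.
assert int("101", 2) == 5
```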

You can do things in binary math that you simply can’t do in decimal, like ANDing or ORing. The Who most likely didn’t know, when they sang “Bargain,” that the lyric “one and one ain’t two, one and one is one” was about not just romantic love, but Boolean algebra. In Boolean algebra, 101 (binary five) AND 011 (binary three) is 001; or:

    101
AND 011
-------
    001

That’s 5 (101 binary) AND 3 (011 binary) equals 1. So if someone says “five and three is one,” they’re correct. With an AND, both bits must be one for the answer’s bit to be one. With an OR, the answer’s bit is one if either bit is one. 5 OR 3 = 7.
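You can verify that arithmetic with Python’s bitwise operators, `&` for AND and `|` for OR:

```python
# The AND/OR examples above, checked with bitwise operators:
assert 5 & 3 == 1   # 101 AND 011 = 001
assert 5 | 3 == 7   # 101 OR  011 = 111
assert 1 & 1 == 1   # "one and one is one"
```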

That’s really handy in programming. Not so much in day to day life.

The prehistoric base six number system became base 12, an easy conversion from base six, for timekeeping. Because, of course, there are twelve full moons in a year (thirteen in a year unlucky enough to have a blue moon).

Programmers also use octal, or base eight, and hexadecimal, or base sixteen, because each octal digit is exactly three binary digits and each hexadecimal digit is exactly four, making them the easiest number systems to convert to and from binary.
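That three-bits-per-octal-digit, four-bits-per-hex-digit correspondence is easy to see with Python’s built-in conversions, using a made-up example number:

```python
# Each octal digit maps to three bits, each hex digit to four.
n = 0b11101101            # 237 in decimal, chosen as an example
assert oct(n) == "0o355"  # 11 101 101  ->  3 5 5
assert hex(n) == "0xed"   # 1110 1101   ->  e d
```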

A digital computer is basically a complex abacus with one bead each on thousands or trillions of strings. Some people say a big enough computer could become sentient. I’m still asking, how many beads do I need to add to my abacus before it becomes self-aware?


You can read or download my books for free here. No ads, no login, just free books.