Tuesday, November 3, 2009

A Brief History of Addition, Part 1

Here is the first part of the Project I student text narrative I recently wrote for a new project-based computer programming course. The course was developed for Pasadena, California public high school students by the Pasadena Educational Foundation (PEF) under the leadership of PEF Executive Director Joan Fauvre. As I've already mentioned, it was written to provide an historical context for the development of a desktop calculator.

The rest of the narrative will follow in future postings, but in my next post I'll introduce the work I did in collaboration with Rory Flynn on her recent book, THE BARON OF MULHOLLAND: A Daughter Remembers Errol Flynn.

A BRIEF HISTORY OF ADDITION

For thousands of years, almost all scientific innovation has come from the work of inspired individuals. History has shown repeatedly that one person can indeed make a difference. The development of the computer is no exception – its story is a fascinating tale tracing contributions from a wide range of innovative thinkers over many years.

The computer today is everywhere, and is involved in most transactions of daily life, from buying groceries to connecting a telephone call, from sending an email message across the country to watching a movie on DVD at home.

Computers also enable governments and huge corporations to accomplish large-scale tasks that would be impossible without them, like providing Social Security checks on time, distributing daily newspapers around the country or running large factories efficiently.

Another way to appreciate the power of the computer, as with all important scientific advances throughout history, is to consider its ability to bring a greater sense of intellectual order to a chaotic and difficult-to-understand universe.

Mankind has probably always sensed that there is a good deal of order in nature, but it was not until Galileo “mathematized” the physical sciences around the year 1600 that scientists could begin to explore exactly how the world worked. His ability to develop formulas in order to explain a range of phenomena, from the rate of acceleration of bodies falling in space to the paths of planetary orbits, demonstrated that the world around us is something that can be understood through observation. In this way Galileo helped pave the way for modern science.

Computers today are so integral to our daily lives that it’s hard to believe that the first computer was only unveiled in 1946. ENIAC – the acronym stands for Electronic Numerical Integrator And Computer – was 100 feet long, 10 feet high, and three feet deep. It contained 70,000 resistors, 18,000 vacuum tubes, 10,000 capacitors and 6,000 switches…not exactly a model for today’s personal computer.

Because ENIAC was developed in secret during the Second World War, few people have heard of John Mauchly, who developed a fascination with electrical devices at an early age, or Presper Eckert, a young genius who was winning science fairs by the age of twelve and applied for his first patent before the age of twenty-one.

Together, Mauchly and Eckert created the ENIAC, and though the entirety of its computing capacity could today sit on a single integrated circuit, Mauchly and Eckert’s invention made all future computers possible.

More than six decades later, we now live in the age of the supercomputer, and most such machines today are actually composed of highly tuned computer clusters capable of astonishingly fast speeds. The world's fastest supercomputer is the IBM Roadrunner, located at Los Alamos National Laboratory. It was built for the U.S. Department of Energy's National Nuclear Security Administration. Operational since 2008, the Roadrunner is about twice as large as the ENIAC and accomplishes feats such as simulating how nuclear materials age in order to keep the United States' nuclear arsenal safe and reliable.

For all the extraordinary things that a computer can do, it remains a machine whose basic task is the collecting, processing, tracking, comparing, retrieving and storing of information, and its capabilities are still based on mathematics and the primary calculations of addition, subtraction, multiplication and division.

To trace the beginnings of the computer, then, let’s take a look at how those basic mathematical operations first developed, made their way around the world and eventually led to the machine we call the computer.

Back to the Beginning
There is some evidence that primitive man perceived numbers as a quality rather than as an abstract representation. In other words, if primitive man saw six horses and later saw four horses, he would certainly be aware of the difference, even though he never thought of that difference in terms of numbers.

Once communication of specific numbers became necessary, the hand became the first counting tool, as children everywhere still demonstrate by learning to count “on their fingers.”

Most likely the necessity for mathematics first arose because of the basic transactions of buying and selling. As societies developed, taxation and the need to measure land also became important.

Historians agree that the earliest man-made counting tool was the abacus. Its earliest version, a flat stone covered with sand on which letters and numbers were formed, has been traced to the ancient land of Sumer, around 2400 BC. Located in present-day southern Iraq, Sumer flourished in the region known as the Cradle of Civilization.

Interestingly, the Sumerian numeral system had sixty as its base; this base-60 system was passed down through various civilizations, and a form of it is still used today in computing time (sixty seconds to the minute, sixty minutes to the hour), in the measurement of angles (360 degrees in a circle), and in mapping coordinates (degrees of latitude and longitude, each subdivided into sixty minutes and sixty seconds of arc).
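To see that base-60 bookkeeping at work today, here is a minimal Python sketch that regroups a raw count of seconds into hours, minutes and seconds; the function name and sample value are just for illustration:

```python
# Regroup a raw count of seconds into hours, minutes, and seconds --
# the same base-60 grouping the Sumerians used over 4,000 years ago.
def to_hours_minutes_seconds(total_seconds):
    minutes, seconds = divmod(total_seconds, 60)  # sixty seconds to the minute
    hours, minutes = divmod(minutes, 60)          # sixty minutes to the hour
    return hours, minutes, seconds

print(to_hours_minutes_seconds(7265))  # -> (2, 1, 5), i.e. 2:01:05
```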

Figure This
What made the Sumerian system so useful is the fact that sixty is the smallest number that is evenly divisible by every number from 1 to 6.

This means that concepts like hours, angles and mapping directions can be evenly divided into small portions. Most other counting systems are based on ten, probably for the simple reason that that's how many fingers humans have. One exception, however, is the base-20 numeral system of the Mayan civilization. One theory holds that the Mayans counted in twenties because the huarache sandals they wore exposed their toes, giving them 20 digits with which to count.
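That property of sixty is easy to confirm by brute force; this short Python sketch simply tests each integer in turn:

```python
# Find the smallest positive integer evenly divisible by every
# number from 1 through 6 -- a brute-force check of the claim above.
n = 1
while any(n % d != 0 for d in range(1, 7)):
    n += 1
print(n)  # -> 60
```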

The abacus – think of it as one of the earliest forms of man-made technology – was created in various forms in many different parts of the world, including Egypt, Greece, Rome, India, China, and even among the indigenous peoples of the Americas:
• In ancient Babylon, an early form of the abacus was used primarily for the operations of addition and subtraction
• A description of the round abacus used in ancient Egypt was written by a Greek historian, and archaeologists have uncovered a number of discs that are thought to have served as Egyptian counting tools
• Archaeological evidence going back to the 5th Century BC indicates that the Greek abacus was a wooden table into which small counters made of either wood or metal were set. This style was passed on to ancient Rome and was used even later in the West, right up to the time of the French Revolution in the late 1700s
• The Incas used a yupana as a counting tool and a series of knotted ropes, called quipu, to record the sums

Perhaps the greatest refinement, however, came with the development of the Chinese abacus. Though the earliest written mention of the abacus in China dates from the 14th Century, an illustration of an early abacus, called a suanpan, can be found in a famous painting by Zhang Zeduan completed some nine hundred years ago, in the early 12th Century.

Instead of using loose beads, the Chinese designed an abacus with beads strung on several parallel rows of thin rods. The beads are used by simply moving them up or down, and sophisticated techniques have been developed to accomplish not only the operations of addition, subtraction, multiplication and division, but also square roots and cube roots, all at a pace that would amaze users of other mechanical counting devices.
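To get a feel for what the abacus mechanizes, here is a minimal Python sketch of column-by-column addition with carrying, essentially what an operator does by sliding beads on each rod. The digit-list representation (least significant column first) is just for illustration:

```python
# Add two numbers column by column, carrying whenever a column
# exceeds 9 -- in essence what an abacus operator does on each rod.
def abacus_add(a_digits, b_digits):
    # Digits arrive least-significant first, one "rod" per column.
    result, carry = [], 0
    for i in range(max(len(a_digits), len(b_digits))):
        a = a_digits[i] if i < len(a_digits) else 0
        b = b_digits[i] if i < len(b_digits) else 0
        carry, digit = divmod(a + b + carry, 10)
        result.append(digit)
    if carry:
        result.append(carry)
    return result

# 276 + 849 = 1125, written least-significant digit first:
print(abacus_add([6, 7, 2], [9, 4, 8]))  # -> [5, 2, 1, 1]
```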

In many parts of the world, especially throughout Asia, the abacus is still in widespread use. In Japan, for example, instruction in the Japanese abacus, known as the soroban, is still offered in elementary school classrooms.

It All Adds Up
The study of these different methods of performing mathematical computations suggests that number systems around the world evolved in similar ways over the past several thousand years, and that different number systems even influenced one another as civilization grew.

For example, the writing of numbers in columns to depict the values of units, tens, hundreds, and so on is common to all of them. Even the Mayan base-20 system has corresponding columns, each representing a power of twenty.

Another example that illustrates how numerical systems evolved in similar ways involves the concept of zero. Zero plays two roles: it is a number meaning no value at all, and, as a digit, it serves as a placeholder within larger numbers. In the number 2,046, for example, the zero is a placeholder in the hundreds column because there are no hundreds in this number.
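A few lines of Python make the placeholder role concrete by splitting 2,046 into its place-value columns:

```python
# Break 2,046 into place-value columns; the zero in the hundreds
# column is purely a placeholder -- the number contains no hundreds.
n = 2046
for place, name in [(1000, "thousands"), (100, "hundreds"),
                    (10, "tens"), (1, "units")]:
    digit, n = divmod(n, place)
    print(f"{name:>9}: {digit}")
# thousands: 2
#  hundreds: 0
#      tens: 4
#     units: 6
```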

The use of zero as a placeholder came about because several early systems, beginning with the Sumerian, simply left a space or a non-numerical symbol wherever no value was to be indicated within a larger number, as in the hundreds column of the number 2,046 cited above.

The problem was that this made computations very difficult. The Greeks, too, were puzzled by the concept of zero, and the question of “whether nothing could be considered something” continued to vex thinkers right up to the Middle Ages.

Fractionally Speaking
Fractions, too, illustrate the development of number systems through history.

The earliest known use of fractions was in ancient Egypt, dated by historians to about 1800 BC. Since all Egyptian writing, even for numbers, was done using hieroglyphs – pictures that represented concepts – any calculations were extremely difficult.

In ancient Rome fractions were also expressed in words, another sign that they were originally devised as a way to express the relationships between whole numbers, rather than to be used as numbers in their own right.

The Sumerians were the first to devise a more useful way to express fractions within their base-60 number system. Because they didn’t use a zero as a placeholder, however, their numbers and computations were often open to interpretation.

Then a breakthrough happened in India around the year 500 AD. It was at this time that several key developments in the Indian number system led to the number system we use today:
• The concept that each numerical figure is represented by an abstract symbol that does not depict the quantity pictorially, as hieroglyphs did
• The notion that the value of each numerical figure depends on its placement within the whole number (think units, tens, hundreds, etc.), and
• The idea that zero, besides meaning a value of nothing, can also be used as a placeholder

The Indians developed our current method of writing fractions by placing the numerator over the denominator, though without the horizontal line we use between them.

Due to the trade between India and Arab countries at that time, the use of the Indian number system soon spread to Arab nations. The Arabs were the ones who added the line to separate the numerator from the denominator, sometimes a horizontal line, sometimes a diagonal one.
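That numerator-over-denominator idea survives directly in modern programming languages. Python's built-in fractions module, for example, stores exactly those two parts and reduces them to lowest terms:

```python
from fractions import Fraction

# A fraction stored as a numerator over a denominator, in the
# spirit of the Indian notation; Python reduces it automatically.
f = Fraction(3, 12)
print(f)                                # -> 1/4
print(f.numerator, f.denominator)       # -> 1 4
print(Fraction(1, 2) + Fraction(1, 3))  # -> 5/6
```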

Once a reliable system of numbers was established, it led to the development of the world's first mathematicians, often brilliant minds who used numbers in various ways to measure, and to help make sense of, the world around them.

In the 9th Century, the caliph of Baghdad, Harun al-Rashid, established the House of Wisdom, a major intellectual center of the Islamic Golden Age. Within it were gathered many of the world's brightest scholars of the day, devoted to collecting and translating all the knowledge of the known world.

One such scholar, a Persian mathematician named Al-Khwarizmi, spent most of his adult life at the House and made significant contributions to the fields of mathematics, astronomy, geography and cartography.

It was Al-Khwarizmi who took the Indian invention of using zero as a placeholder within a positional number system – a positional number system is one in which each digit derives its value from its position in the number – and went on to establish the new field of algebra, advancing mathematics far beyond the Greeks’ main contribution of developing the principles of geometry. Among Al-Khwarizmi’s major works is a book entitled AL-JABR, named for the operation used to solve quadratic equations that he described in the book.
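Al-Khwarizmi solved quadratic equations rhetorically, by completing the square; the modern quadratic formula rests on that same operation. Here is a minimal Python sketch, applied to one of his own examples, x² + 10x = 39:

```python
import math

# Solve a*x**2 + b*x + c = 0 with the quadratic formula, the modern
# descendant of al-Khwarizmi's "completing the square" procedure.
def solve_quadratic(a, b, c):
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return None  # no real roots
    root = math.sqrt(discriminant)
    return ((-b + root) / (2 * a), (-b - root) / (2 * a))

# Al-Khwarizmi's example x**2 + 10*x = 39 becomes x**2 + 10*x - 39 = 0.
print(solve_quadratic(1, 10, -39))  # -> (3.0, -13.0)
```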

Some three centuries later, Leonardo Fibonacci of Pisa (born c. 1170) grew up the son of a state official in charge of a trading customs house in the city today known as Bejaia, Algeria, in North Africa. As a boy he traveled there with his father and became fascinated with Hindu-Arabic numerals. Because they were so much easier and more efficient to use than Roman numerals, he studied them for some years.

At the age of 32, Fibonacci wrote and published LIBER ABACI (BOOK OF CALCULATIONS) and with it introduced to Europe all that he had learned of Hindu-Arabic numerals.

In LIBER ABACI Fibonacci wrote of the nine Arabic numerals we use today. He also presented the concept of zero as a placeholder in numbers. In addition, the book demonstrated the usefulness of the new system in a variety of applications, such as its use in weights and measures, in general bookkeeping and in calculating interest.
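As a small nod to the bookkeeping applications LIBER ABACI made practical, here is a compound-interest sketch in Python; the principal, rate and term are purely illustrative:

```python
# Grow a principal at 5% annual interest for three years -- the kind
# of merchant's arithmetic Hindu-Arabic numerals made tractable.
principal, rate, years = 100.0, 0.05, 3
for year in range(1, years + 1):
    principal *= 1 + rate
    print(f"after year {year}: {principal:.2f}")
# after year 1: 105.00
# after year 2: 110.25
# after year 3: 115.76
```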

Fibonacci’s LIBER ABACI was very well received and eventually caught the attention of Holy Roman Emperor Frederick II, who was a major patron of science and the arts. In this way, learning made its way around the world.

Part 2 is entitled "A Revolution in Ideas."
