Negative Numbers

As we have already mentioned, negative numbers are a relatively new type of number. It was not until the 1800s that they were fully accepted as numbers. Initially, numbers were created by counting things or parts of things, or by talking about the absence of things. To have less than nothing, however, seemed like an impossible situation.

Where, then, did negative numbers come from? From solving equations such as x + 3 = 1 and x^2 + 3x + 2 = 0. The Egyptians and Mesopotamians could solve linear equations more than 3000 years ago, but they did not consider negative solutions. Greek scholars also ignored negative numbers. (Remember, most of the mathematics we study today came from the Greeks.) Greek mathematicians considered negative numbers to be impossible, absurd, false, and so on.

Ancient Chinese mathematicians, on the other hand, did use negative numbers when solving equations, as long as the negative numbers appeared only in the intermediate steps and not in the final result. Ancient Indian mathematicians also worked with negative numbers, though they considered them suspicious and of doubtful applicability.

Brahmagupta, an Indian mathematician of the 7th century, recognized and worked with negative quantities.
• He considered positive numbers as possessions and negative numbers as debts.
• He had rules for adding, subtracting, multiplying, and dividing negative numbers.

Indian mathematics came to Europe through the Arabs. The Arabs, however, did not use negative numbers.
• Al-Khwarizmi's algebra book, for example, ignored negative numbers.
• The Arabs did know how to expand products such as (x - a)(x - b), which led them to use the rules that a negative times a negative is a positive and that a positive times a negative is a negative.

European mathematicians made a number of advances after the Renaissance and at this time began exploring negative numbers.
• Although mathematicians had learned to add, subtract, multiply, and divide with negative numbers, they still had difficulty with the concept of a negative number.
• Because of this, there was still a great deal of skepticism.
• Negative numbers were still considered by some to be "fictitious" or "absurd".
• More hesitation occurred when solving equations produced solutions that involved √-1. If we were going to accept negative numbers, then we would have to accept numbers of this sort as well.

John Wallis (1655) claimed that negative numbers were bigger than infinity. He argued that if 3/0 is infinite, then 3 divided by something smaller than 0 must be even bigger; therefore 3/(-1) = -3 is bigger than infinity.

Isaac Newton (1707) wrote, "Quantities are either Affirmative, or greater than nothing, or Negative, or less than nothing."

European Acceptance of Negative Numbers

By the middle of the 18th century, negative numbers had become more or less accepted as numbers. Euler, in 1770, wrote:

Since negative numbers may be considered as debts, because positive numbers represent real possessions, we say that negative numbers are less than nothing. Thus, when a man has nothing of his own, and owes 50 crowns, it is certain that he has 50 crowns less than nothing; for if any one were to make him a present of 50 crowns to pay his debts, he would still be only at the point of nothing, though really richer than before.
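In symbols, using Euler's debt-as-negative convention (this restatement is only a sketch, not Euler's own notation): a debt of 50 crowns is the number -50, and a gift of 50 crowns adds +50, so

-50 + 50 = 0,    and    0 - (-50) = 50.

The man ends up at 0, "the point of nothing," yet he is 50 crowns richer than he was at -50.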
Despite this overall acceptance, there were still doubters. In 1831, Augustus De Morgan (a British mathematician) wrote:

The imaginary expression √-a and the negative expression -b have this resemblance, that either of them occurring as the solution of a problem indicates some inconsistency or absurdity. As far as real meaning is concerned, both are equally imaginary, since 0 - a is as inconceivable as √-a.

The birth of Abstract Algebra in the early 19th century, however, helped overcome the resistance to negative numbers. Abstract Algebra is a discipline of mathematics in which we look at the rules of algebra and the properties of algebraic systems in a way that is removed from specific numbers or sets of numbers. For example, with addition, we call 0 the "identity element" because a + 0 = a. With multiplication, 1 is the identity element, because a * 1 = a. With addition, -a is the inverse of a, because a + (-a) = 0 (the identity element). With multiplication, 1/a is the inverse of a, because a * (1/a) = 1 (the identity element).
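These abstract rules also settle the old worry about the sign rules. As a short sketch (assuming, in addition to the identities and inverses above, the usual distributive law a * (b + c) = a * b + a * c and the fact that any number times 0 is 0):

0 = (-1) * 0
  = (-1) * (1 + (-1))
  = (-1) * 1 + (-1) * (-1)
  = -1 + (-1) * (-1).

Adding 1 to both sides gives (-1) * (-1) = 1. In other words, "a negative times a negative is a positive" is not an arbitrary convention but a consequence of the rules of the algebraic system.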