- Zero redirects here. For other uses, see Zero (disambiguation).
Zero or nought (0) is the number that precedes all positive numbers and follows all negative numbers.
Zero denotes nothing, null, or an absence of value. For example, if the number of one's brothers is zero, then that person has no brothers. If the difference between the numbers of pieces in two piles is zero, the two piles contain the same number of pieces.
Certain calendars, such as the proleptic Gregorian and proleptic Julian calendars as conventionally reckoned, omit a year zero.
History
Historically, it is important to distinguish the number zero (as in the "zero brothers" example above) from the numeral or digit zero, used in positional numeral systems, where the position of a digit helps determine its value. Successive positions of digits have higher values, so the digit zero is used to skip a position and give appropriate values to the preceding and following digits.
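For instance, in the decimal numeral 205 the digit zero holds the tens place open so that the 2 keeps its hundreds value:

```latex
205 = 2 \times 10^2 + 0 \times 10^1 + 5 \times 10^0
```

Without a zero digit, 205 and 25 could not be told apart in purely positional writing.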
The peoples of the Indus Valley Civilization (c. 2600 BC) provide the earliest known physical use of decimal fractions, in an ancient weight system with ratios of 0.05, 0.1, 0.2, and 0.5. They also adopted a minuscule unit of measure equal to 1.704 mm, the smallest division ever recorded on a scale of the Bronze Age. However, whether they recognized "0" as a placeholder in any sort of symbolic, written representation of these quantities is still unknown.
By the mid-second millennium BC, the Babylonians had a sophisticated sexagesimal positional numeral system. The lack of a positional value (that is, a zero) was indicated by a space between sexagesimal numerals. By 300 BC, a punctuation symbol (two slanted wedges) had been co-opted as a placeholder in the same system.
Records show that the ancient Greeks seemed unsure about the status of zero as a number: they asked themselves "how can 'nothing' be something?", leading to interesting philosophical and, by the medieval period, religious arguments about the nature and existence of zero and the vacuum. The paradoxes of Zeno of Elea depend in large part on the uncertain interpretation of zero. (The ancient Greeks even questioned whether 1 was a number.)
By 130, Ptolemy, influenced by Hipparchus and the Babylonians, was using a symbol for zero (a small circle with a long overbar) within a sexagesimal numeral system otherwise using alphabetic Greek numerals. Because it was used alone, not just as a placeholder, this Hellenistic zero is the earliest known documented use of zero as a number in the Old World. In later Byzantine manuscripts of his Syntaxis Mathematica (Almagest), the Hellenistic zero had morphed into the Greek letter omicron (which otherwise meant 70).
In the New World, however, the late Olmec had already begun to use a true zero (a shell glyph) several centuries before Ptolemy (possibly by the fourth century BC, but certainly by 40 BC), and it became an integral part of Maya numerals. Another true zero was used in tables alongside Roman numerals by 525 (first known use by Dionysius Exiguus), but as a word, nulla, meaning nothing, not as a symbol. When division produced zero as a remainder, nihil, also meaning nothing, was used. These medieval zeros were used by all subsequent computists (calculators of Easter). An isolated use of their initial, N, as a true zero symbol appears in a table of Roman numerals compiled by Bede or a colleague about 725.
The earliest known decimal digit zero is documented as having been introduced by Indian mathematicians about AD 300.
An early documented use of zero by Brahmagupta dates to 628. He treated zero as a number and discussed operations involving it. By this time (the 7th century) the concept had clearly reached Cambodia, and documentation shows the idea later spreading to China and the Islamic world, from which it is recorded to have reached Europe in the 12th century.
The word zero (as well as cipher) comes from Arabic sifr, meaning "empty".
In mathematics
Zero (0) is both a number and a numeral. The natural number following zero is one and no natural number precedes zero. Zero may or may not be counted as a natural number, depending on the definition of natural numbers.
In set theory, the number zero is the size of the empty set: if one does not have any apples, then one has zero apples. In fact, in certain axiomatic developments of mathematics from set theory, zero is defined to be the empty set.
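One common such development uses the von Neumann construction, in which each natural number is the set of all smaller natural numbers, so that zero is the empty set:

```latex
0 = \varnothing, \qquad 1 = \{0\} = \{\varnothing\}, \qquad 2 = \{0, 1\} = \{\varnothing, \{\varnothing\}\}
```

In general, the successor of n is the set n ∪ {n}, and the number n has exactly n elements.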
The following are some basic rules for dealing with the number zero. These rules apply to any complex number x, unless otherwise stated; a short code sketch after the list checks several of them numerically.
- Addition: x + 0 = x and 0 + x = x. (That is, 0 is an identity element with respect to addition.)
- Subtraction: x − 0 = x and 0 − x = −x.
- Multiplication: x · 0 = 0 · x = 0.
- Division: 0 / x = 0, for nonzero x. But x / 0 is undefined, because 0 has no multiplicative inverse, a consequence of the previous rule.
- Exponentiation: x^0 = 1, except that the case x = 0 may be left undefined in some contexts. For all positive real x, 0^x = 0.
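As a minimal sketch of these rules (using double-precision reals to stand in for the complex number x; the variable names are illustrative, not from any standard source), the following C program checks several of the identities numerically:

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    double x = 7.5;   /* an arbitrary nonzero real standing in for x */

    printf("x + 0 = %g\n", x + 0.0);      /* additive identity: 7.5 */
    printf("0 - x = %g\n", 0.0 - x);      /* negation: -7.5 */
    printf("x * 0 = %g\n", x * 0.0);      /* 0 */
    printf("0 / x = %g\n", 0.0 / x);      /* 0, since x is nonzero */
    printf("x^0   = %g\n", pow(x, 0.0));  /* 1 */
    printf("0^x   = %g\n", pow(0.0, x));  /* 0 for positive real x */

    /* x / 0 is mathematically undefined; IEEE 754 floating point
       flags this by returning infinity rather than a finite value. */
    printf("x / 0 = %g\n", x / 0.0);      /* inf */
    return 0;
}
```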
The expression "0/0" is an "indeterminate form". That does not simply mean that it is undefined; rather, it means that if f(x) and g(x) both approach 0 as x approaches some number, then f(x)/g(x) could approach any finite number, or ∞, or −∞; the limit depends on which functions f and g are. See L'Hôpital's rule.
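For example, all three of the following quotients have numerator and denominator tending to 0 as x approaches 0, yet the limits differ:

```latex
\lim_{x \to 0} \frac{\sin x}{x} = 1, \qquad
\lim_{x \to 0} \frac{x^2}{x} = 0, \qquad
\lim_{x \to 0} \frac{x}{x^3} = +\infty
```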
The sum of 0 numbers is 0, and the product of 0 numbers is 1.
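This convention mirrors how accumulators are initialized in code: a running sum starts at the additive identity 0 and a running product at the multiplicative identity 1, so processing zero elements leaves exactly those values. A minimal C sketch (the names are illustrative only):

```c
#include <stdio.h>

int main(void) {
    double values[1];        /* storage exists, but we deliberately process n = 0 items */
    int n = 0;

    double sum = 0.0;        /* additive identity: the empty sum */
    double product = 1.0;    /* multiplicative identity: the empty product */
    for (int i = 0; i < n; i++) {
        sum += values[i];
        product *= values[i];
    }
    printf("empty sum = %g, empty product = %g\n", sum, product);  /* 0 and 1 */
    return 0;
}
```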
Extended use of zero in mathematics
Computer science
Numbering from 1 or 0?
Human beings usually number things starting from one, not zero, yet in computer science zero has become the popular starting point. In many older programming languages (such as Fortran and COBOL), arrays start from index 1 by default, which is natural for humans. As programming languages developed, it became more common for arrays to start from zero by default (zero-based), because with a one-based index, one must be subtracted to obtain the correct offset when computing the location of a specific element, as the sketch below illustrates.
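In C, where arrays are zero-based, element i of an array lives at the base address plus i times the element size; a hypothetical one-based convention would force a subtraction first. A small sketch:

```c
#include <stdio.h>

int main(void) {
    int a[5] = {10, 20, 30, 40, 50};

    /* Zero-based: element i sits at offset i * sizeof(int) from the start,
       so no adjustment is needed before the multiplication. */
    for (int i = 0; i < 5; i++)
        printf("a[%d] = %d (offset %zu bytes)\n", i, a[i], i * sizeof(int));

    /* With a one-based index j, the same element is reached only
       after computing the real offset from (j - 1): */
    for (int j = 1; j <= 5; j++)
        printf("one-based j = %d -> a[j - 1] = %d\n", j, a[j - 1]);
    return 0;
}
```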
Null pointer
A null pointer is a pointer in a computer program that does not point to any object. In C the null pointer usually corresponds to the memory address zero, but it is not required to: some computer architectures use bit patterns other than zero as their null pointer.
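A minimal C sketch of the idea; note that the constant 0 (or the NULL macro) always denotes the null pointer in source code, and the compiler translates it to whatever representation the platform actually uses:

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    int *p = NULL;              /* points to no object */

    if (p == NULL)
        printf("p is null; dereferencing it would be undefined behavior\n");

    p = malloc(sizeof *p);      /* malloc itself returns NULL on failure */
    if (p != NULL) {
        *p = 42;                /* safe only after the null check */
        printf("*p = %d\n", *p);
        free(p);
    }
    return 0;
}
```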
Null value
In databases, a field can have a null value, which is equivalent to the field not having a value at all. For numeric fields, null is not the value zero; for text fields, it is neither blank nor the empty string. The presence of null values leads to three-valued logic: a condition is no longer simply true or false, but can also be undetermined. Any computation involving a null value yields a null result. Consequently, asking for all records with value 0 or value not equal to 0 will not yield all records, since the records whose value is null satisfy neither condition.
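A sketch of three-valued logic in C, using hypothetical names rather than any particular database engine's implementation; the AND rule shown (false dominates, true requires both, otherwise unknown) matches the SQL standard's treatment of comparisons against NULL:

```c
#include <stdio.h>

typedef enum { TV_FALSE, TV_TRUE, TV_UNKNOWN } tristate;

/* Three-valued AND: FALSE dominates, TRUE requires both operands
   to be TRUE, and everything else is UNKNOWN. */
static tristate tv_and(tristate a, tristate b) {
    if (a == TV_FALSE || b == TV_FALSE) return TV_FALSE;
    if (a == TV_TRUE && b == TV_TRUE)   return TV_TRUE;
    return TV_UNKNOWN;
}

int main(void) {
    const char *name[] = { "false", "true", "unknown" };

    /* A comparison against a null field yields UNKNOWN, so a compound
       condition may be neither true nor false: */
    printf("true  AND unknown = %s\n", name[tv_and(TV_TRUE,  TV_UNKNOWN)]);  /* unknown */
    printf("false AND unknown = %s\n", name[tv_and(TV_FALSE, TV_UNKNOWN)]);  /* false   */
    return 0;
}
```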
Distinguishing zero from O
The oval-shaped zero (appearing like a rugby ball stood on end) and the rectangular letter O came into use together on modern character displays. The zero with a dot in the centre seems to have originated as an option on IBM 3270 controllers (it has the problem of resembling the Greek letter theta). The slashed zero, identical to the letter O except for the slash, is used in old-style ASCII graphic sets descended from the default typewheel on the venerable ASR-33 Teletype. This format causes problems for certain Scandinavian languages that use Ø as a letter.
The opposite convention, a slashed letter O and an unslashed zero, was used at IBM and a few other early mainframe makers; this is even more problematic for Scandinavians because it means two of their letters collide. Some Burroughs/Unisys equipment displays a zero with a reversed slash. Yet another convention, common on early line printers, left the zero unornamented but added a tail or hook to the letter O so that it resembled an inverted Q or a cursive capital O.
The typeface used on some European car number plates distinguishes the two symbols by making the O rather egg-shaped and the zero more rectangular, but most of all by opening the zero on the upper right side, so that the circle is no longer closed (as on German plates).
In handwriting, one may not distinguish 0 and O at all, or may add a slash across the zero to show the difference, although this sometimes causes ambiguity with the symbol for the empty set.
On the seven-segment displays of calculators, watches, and similar devices, 0 is usually written with six line segments, though on some historical calculator models it was written with four. This variant glyph has not caught on.
"Zero" as a verb
In computing, zero commonly serves as a default or initial value. To zero (or zeroise or zeroize) a set of data means to set every bit in the data to zero (off). This is usually said of small pieces of data, such as bits or words (especially in the construction "zero out").
Zero can also mean to erase, to discard all data from. This is often said of disks and directories, where "zeroing" need not involve actually writing zeroes throughout the area being zeroed. One may speak of something being "logically zeroed" rather than "physically zeroed".
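Physically zeroing a buffer in C is typically done with memset, as in the minimal sketch below (the buffer name and contents are illustrative). Note that for security-sensitive wiping, an optimizing compiler may elide a memset whose target is never read again:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned char buf[8] = {0xDE, 0xAD, 0xBE, 0xEF, 0xDE, 0xAD, 0xBE, 0xEF};

    memset(buf, 0, sizeof buf);      /* set every byte (and hence every bit) to zero */

    for (size_t i = 0; i < sizeof buf; i++)
        printf("%02x ", buf[i]);     /* prints 00 eight times */
    printf("\n");
    return 0;
}
```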
In firearms, to zero a weapon means to adjust the iron sights or the telescopic sight so that it aims exactly where the bullet strikes at a given distance. If the weapon is zeroed at 100 yards, shooting at a target at 150 yards will require aiming higher, as dictated by ballistics.
Other uses
In names, zero usually means "start" or "origin", as in "zero hour" or "ground zero".