In computing, signedness is a property of variables representing numbers in computer programs. A numeric variable is signed if it can represent both positive and negative numbers, and unsigned if it can only represent non-negative numbers (zero or positive numbers).
While signed variables can represent negative numbers, they give up a range of larger values that only unsigned variables of the same size (in bits) can represent. This is because roughly half of a signed variable's possible values are negative, whereas an unsigned variable can dedicate all of its possible values to the non-negative range.
For example, a signed 16-bit integer in two's-complement representation can hold the values -32768 to 32767, while an unsigned 16-bit integer can hold the values 0 to 65535. In signed values, the leftmost bit (the most significant bit) is the sign bit, indicating whether the value is negative (0 for non-negative, 1 for negative).
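As a sketch of these ranges, the following C++ program (using the fixed-width types from <cstdint>, which assumes a platform that provides them) prints the minimum and maximum of each type:

    #include <cstdint>
    #include <iostream>
    #include <limits>

    int main() {
        // Signed 16-bit: half of the bit patterns represent negative values.
        std::cout << "int16_t:  "
                  << std::numeric_limits<std::int16_t>::min() << " to "
                  << std::numeric_limits<std::int16_t>::max() << '\n';  // -32768 to 32767
        // Unsigned 16-bit: every bit pattern is a non-negative value.
        std::cout << "uint16_t: "
                  << std::numeric_limits<std::uint16_t>::min() << " to "
                  << std::numeric_limits<std::uint16_t>::max() << '\n'; // 0 to 65535
    }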
Lower-level programming languages and their compilers (C++ is a good example) support the declaration of both signed and unsigned data types. An issue that occasionally arises when using both is a signed/unsigned mismatch: for example, comparing a signed variable with an unsigned one, or assigning the value of one to the other.
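A minimal C++ sketch of such a mismatch (the variable names are illustrative only): when a signed and an unsigned operand of the same rank are compared, the signed operand is converted to unsigned first, which can produce a surprising result:

    #include <iostream>

    int main() {
        int      number1 = -1;  // signed
        unsigned number2 = 1;   // unsigned

        // number1 is converted to unsigned before the comparison,
        // so -1 wraps to a very large value and the test is false.
        if (number1 <= number2)
            std::cout << "number1 <= number2\n";
        else
            std::cout << "number1 > number2 (surprising!)\n";
    }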
Compilers will usually emit a warning for such code but continue to compile it anyway, as Visual Studio (2010 in this example) does:

    Warning 25 warning C4018: '<=' : signed/unsigned mismatch [C:/Some file.cpp] 23417
Another issue with signed and unsigned numbers arises when casting (see type conversion) a number of one signedness to a type of the other. This can cause data loss, because the range of values a signed integer can store differs from that of an unsigned integer of the same size.
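A short sketch of this kind of data loss (the -1 result in the signed case assumes a two's-complement platform; before C++20 that conversion is implementation-defined):

    #include <cstdint>
    #include <iostream>

    int main() {
        std::int16_t  negative = -1;
        std::uint16_t large    = 65535;

        // Negative signed value to unsigned: wraps around modulo 2^16.
        auto u = static_cast<std::uint16_t>(negative);  // 65535, not -1
        // Unsigned value above the signed maximum: overflows the signed range.
        auto s = static_cast<std::int16_t>(large);      // -1 on two's-complement systems

        std::cout << u << ' ' << s << '\n';             // prints "65535 -1"
    }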
See also
Sign bit
Two's complement
Signed number representations
Sign (mathematics)