A bit (a contraction of binary digit) is the basic unit of information in computing and telecommunications; a bit takes only one of two values, 1 or 0 (one or zero). The representation may be implemented, in a variety of systems, by means of a two-state device.
In computing, a bit can be defined as a variable or computed quantity that can have only two possible values. These two values are often interpreted as binary digits and are usually denoted by the numerical digits 0 and 1. The two values can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program. The length of a binary number may be referred to as its "bit-length."
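To make these conventions concrete, here is a minimal Python sketch (the variable names are chosen purely for illustration) that maps a single bit onto several two-valued interpretations and queries the bit length of an integer using Python's built-in int.bit_length():

```python
# A bit can stand for any two-valued attribute; the mapping is a convention.
bit = 1

as_logical = bool(bit)                 # logical value: True / False
as_switch = "on" if bit else "off"     # activation state: on / off
as_sign = "+" if bit else "-"          # algebraic sign: + / -

print(as_logical, as_switch, as_sign)  # True on +

# Bit length of a binary number: 0b101011 (decimal 43) needs 6 bits.
n = 0b101011
print(n.bit_length())                  # 6
```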
In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known.
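This definition can be checked with a short calculation. The Shannon entropy of a binary variable that equals 1 with probability p is H(p) = −p log₂ p − (1 − p) log₂(1 − p); at p = 0.5 this is exactly one bit, and any bias yields less. The sketch below, written here as a simple illustration, computes this quantity:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a binary variable that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # outcome is certain: no uncertainty, zero bits
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0    -> equal probability: one full bit of information
print(binary_entropy(0.9))  # ~0.469 -> a biased variable carries less than one bit
```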