July 22, 2011

BIT Definition: 16-bit to 256-bit

A bit is the smallest unit of data in a computer: it records whether a transistor (a chip component) conducts or not. As a binary digit, a bit can take one of two values: 0 or 1.

Inside a computer, bits are handled in groups called words, whose length depends on the system architecture. First-generation computers had 4-bit or 8-bit architectures: they could manipulate 4 or 8 bits at a time.
Technological advances then led to systems capable of working with 16, 32 and 64 bits.
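To make the idea concrete, here is a small illustrative C snippet (not part of the original entry) showing why word size matters: an n-bit word can represent 2^n distinct values, so the range grows enormously as architectures move from 4 to 64 bits.

    #include <stdio.h>

    int main(void) {
        /* A bit holds one of two values: 0 or 1.                      */
        /* An n-bit word can therefore represent 2^n distinct values.  */
        for (int bits = 4; bits <= 64; bits *= 2) {
            /* For 64 bits, 1ULL << 64 would overflow, so handle it    */
            /* separately with an all-ones value.                      */
            unsigned long long max = (bits == 64) ? ~0ULL
                                                  : (1ULL << bits) - 1;
            printf("%2d-bit word: values 0 .. %llu\n", bits, max);
        }
        return 0;
    }

Running it prints the highest unsigned value each word size can hold, from 15 for a 4-bit word up to 18,446,744,073,709,551,615 for a 64-bit word.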

In 2011, the year this glossary entry was updated, people are starting to talk about computers with 128-bit architectures, while 256-bit remains fiction. Still, you can be sure that within a few years, if not a few months, these technologies will become obsolete, replaced by others that are currently beyond imagination.
