In July 1956, German-born American computer scientist Werner Buchholz coined the term byte as a unit of digital information during the early design phase for the IBM 7030 Stretch, IBM's first transistorized supercomputer. A byte was an ordered collection of bits, the smallest units of data that a computer could process. The Stretch incorporated addressing to the bit and variable field length (VFL) instructions with a byte size encoded in the instruction. Byte was a deliberate respelling of bite to avoid accidental confusion with bit.
"Early computers used a variety of 4-bit binary coded decimal (BCD) representations and the 6-bit codes for printable graphic patterns common in the U.S. Army (Fieldata) and Navy. These representations included alphanumeric characters and special graphical symbols. These sets were expanded in 1963 to 7 bits of coding, called the American Standard Code for Information Interchange (ASCII) as the Federal Information Processing Standard which replaced the incompatible teleprinter codes in use by different branches of the U.S. government. ASCII included the distinction of upper and lower case alphabets and a set of control characters to facilitate the transmission of written language as well as printing device functions, such as page advance and line feed, and the physical or logical control of data flow over the transmission media. During the early 1960s, while also active in ASCII standardization, IBM simultaneously introduced in its product line of System/360 the 8-bit Extended Binary Coded Decimal Interchange Code (EBCDIC), an expansion of their 6-bit binary-coded decimal (BCDIC) representation used in earlier card punches. The prominence of the System/360 led to the ubiquitous adoption of the 8-bit storage size, while in detail the EBCDIC and ASCII encoding schemes are different" (Wikipedia article on Byte, accessed 01-15-2015).