Our IT guy posted an article in the lunchroom the other day from itworldcanada.com (he's Canadian). The article reported on how the Tokyo Institute of Technology broke the record for data transfer in the terahertz band. Their transfer rate was 3 Gbps, which beat the old record of 1.5 Gbps by, well, double. The theoretical upper limit of data transfer for terahertz wi-fi is 100 Gbps. I could totally go for 100 Gbps wi-fi at my house.
It was at just this moment that somebody looked over my shoulder and commented on how fast 100 Gigabyte per second data transfer would be. I didn’t bother to correct him, but it did make me realize just how many seemingly intelligent people don’t know the difference between Bit and Byte in this digital age.
The basic definition of Bit and Byte is so fundamental to our use of computers that we take it for granted. Everybody who understands the difference assumes that everybody else also understands it, so they don't bother to explain it. Everybody who doesn't know the difference uses the terms interchangeably and doesn't bother to ask. He's not the first computer-savvy person (mining engineers and geologists included) to mix up the terms, and he won't be the last.
The most basic definition of a Bit is that it is a single piece of information. In electronics this is an open or closed circuit. To illustrate this, think of a light switch. When the circuit is closed, electricity flows through the wires and the light turns on. When the circuit is open, no electricity can flow and the light turns off. This is the information stored in a Bit. On or off. Open or closed.
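Programming languages expose this on/off idea directly. Here's a minimal Python sketch (the light-switch naming is just for illustration) that stores a single bit and flips it:

```python
# A single bit: the light switch is either off (0) or on (1).
light = 0   # circuit open, light off

# Flipping the switch is toggling the bit; XOR with 1 does exactly that.
light ^= 1  # now 1: circuit closed, light on
light ^= 1  # back to 0: circuit open, light off
```

One variable, two possible states, nothing more. That's a bit.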
In electronics, we represent the closed circuit with a straight line ( | ). The open circuit is a circle ( O ). Sometimes these are combined into one symbol, a line inside a broken circle ( ⏻ ), to represent power. Look at your computer right now. I bet this symbol is on the power button. If you don't see it there, try your television or Blu-ray player. This symbol is all over the place in today's world.
Computer people often talk about binary code being a series of ones and zeros. This is not quite accurate. What they really mean is a series of symbols representing open and closed electric circuits. They just look a lot like ones and zeros.
A Bit can't store a whole lot of data (it can store exactly two states, open or closed), so to make this storage format more usable we combine eight bits together and call them a Byte. A Byte can store 256 distinct values (2^8 = 256).
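You can check that arithmetic in any Python shell:

```python
# Two states per bit, eight bits per byte:
states_per_bit = 2
bits_per_byte = 8

values_per_byte = states_per_bit ** bits_per_byte
print(values_per_byte)  # 256 distinct values, 0 through 255
```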
In writing about Bits and Bytes, we use the lowercase letter 'b' to represent bits and the uppercase letter 'B' to represent Bytes. Things like computer storage space are measured in Bytes (B), Kilobytes (KB), Megabytes (MB), or Gigabytes (GB). Things like data transfer rates and computer operating systems are measured in bits (b), Kilobits (Kb), Megabits (Mb), or Gigabits (Gb).
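The practical payoff of knowing b from B is the divide-by-eight conversion between line speed (bits) and file size (bytes). A quick sketch, using the speeds from the article:

```python
# Convert a transfer rate in Gigabits per second to Gigabytes per second.
# Eight bits per Byte, so divide by eight.
def gigabits_to_gigabytes(gbps):
    return gbps / 8

print(gigabits_to_gigabytes(3))    # the 3 Gb/s record moves only 0.375 GB/s
print(gigabits_to_gigabytes(100))  # 100 Gb/s terahertz wi-fi would move 12.5 GB/s
```

So my shoulder-surfing colleague overestimated that record transfer rate by a factor of eight.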
Your 32-bit operating system can address about 4 billion data locations (2^32 = 4,294,967,296), but because each data location is the address for one Byte, your maximum available RAM is about 4 Gigabytes. Now that you know the difference between Bits and Bytes, be sure to refer to your hard drive in Gigabytes (GB) but your internet connection speed in Megabits per second (Mbps).
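That 4 Gigabyte ceiling falls straight out of the same arithmetic, assuming (as above) one Byte per address and binary Gigabytes of 2^30 Bytes:

```python
addresses = 2 ** 32                 # data locations a 32-bit OS can address
total_bytes = addresses * 1         # one Byte stored at each address
gigabytes = total_bytes / 2 ** 30   # a binary Gigabyte is 2^30 Bytes

print(addresses)  # 4294967296
print(gigabytes)  # 4.0
```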