A computer can think at speeds of over a billion machine cycles per second. However, no spoken language is a computer's native tongue. In this lesson, learn how a computer translates analog input into digital signals.
The bits, or binary digits, are 0 and 1
ASCII, pronounced 'ask-ee,' is an acronym for the American Standard Code for Information Interchange. ASCII is a code created in the 1960s for programming consistency.
There are 128 characters, including upper-case, lower-case, and special characters, in the ASCII basic coding. ASCII breaks down into a base-2 system. This is a binary code; remember, 'bi' equals 2. Binary digits, or bits, are either 0 or 1, and if you want to visualize this, they're either 'on' or 'off.' This is what your computer can actually understand.
This particular system strings eight bits together to represent one character. So eight bits equal one character, and one character equals one byte. For example, 01001100 is an L. It is also one byte. It would be very difficult for us to use binary language to communicate. For instance, my name is Lori. In binary language, that would be 01001100 01101111 01110010 01101001. I would run out of space on forms!
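To see that mapping in action, here is a small sketch in Python (the function name `to_binary` is just for illustration) that looks up each character's ASCII code and prints it as an eight-bit pattern:

```python
# A sketch of how ASCII maps characters to 8-bit patterns, using
# Python's built-in ord() to look up each character's numeric code.
def to_binary(text):
    """Return the 8-bit ASCII pattern for each character in text."""
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_binary("L"))     # 01001100 -- one character, one byte
print(to_binary("Lori"))  # 01001100 01101111 01110010 01101001
```

Running it on "Lori" produces exactly the four bytes shown above.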
Let me give you a comparison on bytes. Now remember, again, eight bits is equal to one byte. 1,024 bytes is a kilobyte, so a two-page, double-spaced document of about 1,000 characters is roughly one kilobyte of plain text (a formatted word-processor file can run larger). The rest of these references are approximate. Just over one million bytes is a megabyte; over one billion bytes is a gigabyte; just over one trillion bytes is a terabyte; just over one quadrillion bytes (that's fifteen zeros) is a petabyte. And a new term for you to impress your friends and relatives with is the yottabyte. That would be 1,208,925,819,614,629,174,706,176 bytes!
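Each of those units is a power of 1,024 (which is 2 to the 10th power), so they are easy to check for yourself. A quick sketch:

```python
# Byte units as powers of 1,024 (2**10), the base-2 convention the lesson uses.
KILOBYTE = 2**10   # 1,024 bytes
MEGABYTE = 2**20   # just over one million bytes
GIGABYTE = 2**30   # over one billion bytes
TERABYTE = 2**40   # just over one trillion bytes
PETABYTE = 2**50   # just over one quadrillion bytes
YOTTABYTE = 2**80

print(f"{YOTTABYTE:,}")  # 1,208,925,819,614,629,174,706,176
```

Each step up multiplies by another 1,024, which is why the yottabyte number gets so enormous.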
So you can visualize a little better, here is a capacity comparison: an old 3.5 inch floppy disk holds 1.44 megabytes, which is about 750 pages of plain text. A compact disc holds 700 megabytes, or about an hour's worth of songs. And a single-sided DVD holds 4.7 gigabytes, or just a regular old movie. All of the things we store this way start out as analog input. In other words, what we put into the computer can't actually be processed by the computer in that form. That's where ASCII comes back in.
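The floppy-disk figure is easy to sanity-check with a little arithmetic, assuming roughly 2,000 characters (bytes) of plain text per page; that page size is just an assumption for this sketch:

```python
# A rough sanity check on the floppy-disk comparison, assuming
# about 2,000 characters (bytes) of plain text per page.
FLOPPY_BYTES = 1_474_560        # a "1.44 MB" floppy is 1,440 * 1,024 bytes
BYTES_PER_PAGE = 2_000          # assumed plain-text page size

print(FLOPPY_BYTES // BYTES_PER_PAGE)  # 737 -- roughly 750 pages
```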
The storage capacity of common mediums
The binary language will take what we put in (the analog) and convert it to a digital language the computer can understand and process. It then translates the digital language the computer generates back to the analog response we can see and understand.
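That round trip, human-readable input encoded into bytes, then decoded back into something we can read, can be sketched in a few lines of Python:

```python
# A minimal sketch of the round trip the lesson describes: text we type
# is encoded into the bytes the computer processes, then decoded back
# into text we can read.
message = "Lori"
encoded = message.encode("ascii")   # the bytes the computer works with
print(list(encoded))                # [76, 111, 114, 105]
decoded = encoded.decode("ascii")   # back to human-readable text
print(decoded)                      # Lori
```

Those four numbers, 76, 111, 114, and 105, are the same ASCII codes as the binary patterns for L, o, r, and i shown earlier.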
So remember, the American Standard Code for Information Interchange, or ASCII, is used for consistency in programming, allowing our computer to take what we input and convert it to a language the computer can understand. It's a base-2 system with the binary digit, or bit, consisting of zeros and ones, or 'on' and 'off.' One byte, or set of eight bits, represents one alphabetic or special character and builds to file sizes based on input or storage capacity.