ASCII, pronounced 'ASK-ee,' is an acronym for the American Standard Code for Information Interchange. ASCII is a character code created in the 1960s to give programmers a consistent way to represent text.
There are 128 characters in the basic ASCII code, including upper-case letters, lower-case letters, and special characters. ASCII breaks down into a base-2 system. This is a binary code; remember, 'bi' means 2. Binary digits, or bits, are either 0 or 1; if you want to visualize this, they're either 'on' or 'off.' This is what your computer can actually understand.
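As a quick illustration of the idea above, here is a small Python sketch that prints a few characters alongside their ASCII codes and their bit patterns (the sample characters are arbitrary choices for demonstration):

```python
# Standard ASCII defines 128 characters, with codes 0 through 127.
# Each code can be written as a pattern of eight binary digits (bits).
for ch in "Aa!":
    code = ord(ch)              # the character's numeric ASCII code
    bits = format(code, "08b")  # the same code as eight bits
    print(ch, code, bits)
```

Running this shows, for example, that 'A' is code 65, or 01000001 in binary.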
In this particular system, eight bits are strung together to represent one character. So eight bits equals one character equals one byte. For example, 01001100 is an L. It is also one byte. It would be very difficult for us to use binary language to communicate. For instance, my name is Lori. In binary language that would be 01001100 01101111 01110010 01101001. I would run out of space on forms!
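You can reproduce that name-to-binary conversion yourself. This short Python sketch turns each character into its eight-bit pattern:

```python
# Convert each character of a string into its eight-bit ASCII pattern.
name = "Lori"
binary = " ".join(format(ord(ch), "08b") for ch in name)
print(binary)  # 01001100 01101111 01110010 01101001
```

Four characters, four bytes, thirty-two bits: you can see why nobody fills out forms in binary.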
Let me give you a comparison on bytes. Now remember, again, eight bits is equal to one byte. A two-page, double-spaced document holds about 1,000 characters of actual text, though the saved word-processing file is closer to 15 kilobytes once formatting data is included. 1,024 bytes is a kilobyte. The rest of these references are approximate. Just over one million bytes is a megabyte; over one billion bytes is a gigabyte; just over one trillion bytes is a terabyte; just over one quadrillion bytes (that's fifteen zeros) is a petabyte. And a new term for you to impress your friends and relatives with is the yottabyte. That would be 1,208,925,819,614,629,174,706,176 bytes!
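Each of these units is 1,024 times the one before it, which means they are all powers of two. A few lines of Python make the pattern, and that enormous yottabyte figure, easy to verify:

```python
# Each storage unit is 2**10 (1,024) times the previous one.
KILOBYTE = 2**10   # 1,024 bytes
MEGABYTE = 2**20   # just over one million bytes
GIGABYTE = 2**30   # just over one billion bytes
TERABYTE = 2**40   # just over one trillion bytes
PETABYTE = 2**50   # just over one quadrillion bytes
YOTTABYTE = 2**80
print(f"{YOTTABYTE:,}")  # 1,208,925,819,614,629,174,706,176
```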
So you can visualize a little better, here is a capacity comparison: an old 3.5-inch floppy disk holds 1.44 megabytes, about 750 pages of text. A compact disc holds 700 megabytes, or about an hour's worth of songs. And a single-sided DVD holds 4.7 gigabytes, enough for a regular old movie. All of these storage and byte references begin as analog input. In other words, the letters and symbols we put into the computer can't be processed by the computer directly. That's where ASCII comes back in.
The binary language will take what we put in (the analog) and convert it to a digital language the computer can understand and process. It then translates the digital language the computer generates back to the analog response we can see and understand.
So remember, the American Standard Code for Information Interchange, or ASCII, is used for consistency in programming, allowing our computer to take what we input and convert it to a language the computer can understand. It's a base-2 system with the binary digit, or bit, consisting of zeros and ones, or 'on' and 'off.' One byte, or set of eight bits, represents one alphabetic or special character and builds to file sizes based on input or storage capacity.