ASCII and Unicode to Represent Characters in Binary Code

Lesson Transcript
Instructor: Eric Garneau
A computer can run at speeds of over a billion machine cycles per second. However, no spoken language is a computer's native tongue. Learn in this lesson how a computer translates the characters we type into the binary code it actually works with.

ASCII

ASCII, pronounced 'ask-ee,' is an acronym for the American Standard Code for Information Interchange. ASCII is a character code created in the 1960s so that different computers could represent and exchange text consistently.

Bits

The basic ASCII set contains 128 characters, including upper-case letters, lower-case letters, digits, and special characters. ASCII is built on a base-2 system. This is a binary code; remember, 'bi' means 2. Binary digits, or bits, are either 0 or 1, and if you want to visualize this, they're either 'on' or 'off.' This is what your computer can actually understand.
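As a quick sanity check (not part of the original lesson), the 128-character count follows directly from the base-2 arithmetic: seven binary digits give 2 to the 7th power distinct patterns.

```python
# Seven bits, each either 0 or 1, give 2**7 possible combinations,
# which is exactly the size of the basic ASCII character set.
combinations = 2 ** 7
print(combinations)  # 128
```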

In this system, eight bits are strung together to represent one character. So eight bits equals one character, which equals one byte. (Strictly speaking, basic ASCII only needs seven of those bits; the eighth fills out the byte.) For example, 01001100 is an L. It is also one byte. It would be very difficult for us to use binary language to communicate. For instance, my name is Lori. In binary language that would be 01001100 01101111 01110010 01101001. I would run out of space on forms!
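The translation above can be reproduced in a few lines of Python; this is just an illustrative sketch, with `to_binary` being a helper name chosen for this example.

```python
# Convert each character of a string to its 8-bit ASCII pattern.
def to_binary(text):
    # ord() gives the ASCII code point; "08b" formats it as 8 binary digits.
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_binary("L"))     # 01001100
print(to_binary("Lori"))  # 01001100 01101111 01110010 01101001
```

Running it confirms that a four-letter name already takes 32 bits to write out.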

Bytes

Let me give you a comparison on bytes. Now remember, again, eight bits equals one byte. A kilobyte is 1,024 bytes, so a two-page double-spaced document of a few thousand characters comes to only a few kilobytes. The rest of these references are approximate: just over one million bytes is a megabyte; just over one billion bytes is a gigabyte; just over one trillion bytes is a terabyte; just over one quadrillion bytes (that's fifteen zeros) is a petabyte. And a new term for you to impress your friends and relatives with is the yottabyte. That would be 1,208,925,819,614,629,174,706,176 bytes!
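Each of these units is 1,024 times the one before it, so all of the figures above come from powers of 1,024. A short Python sketch (again, not from the original lesson) can generate the whole ladder:

```python
# Each binary storage unit is 1,024 (2**10) times the previous one.
units = ["kilobyte", "megabyte", "gigabyte", "terabyte", "petabyte"]
for i, name in enumerate(units, start=1):
    print(f"1 {name} = {1024 ** i:,} bytes")

# The yottabyte is the eighth step up the ladder:
print(f"1 yottabyte = {1024 ** 8:,} bytes")
# 1 yottabyte = 1,208,925,819,614,629,174,706,176 bytes
```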
