Many people know that computers only understand binary (1s and 0s). So when we type characters such as ‘a’ or ‘b’, how does the computer understand and store them?

The answer is encoding: characters are mapped to numbers, which the computer can easily translate to binary. There are different encoding standards. Let’s examine two notable ones: ASCII and Unicode (the most widely used today).
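To make that idea concrete, here is a minimal Python sketch (Python chosen purely for illustration, not part of the original article) showing a string being turned into numbers and back again:

```python
# Encoding turns characters into numbers (bytes); decoding reverses it.
text = "ab"
encoded = text.encode("ascii")   # the bytes 97 and 98
print(list(encoded))             # [97, 98]
print(encoded.decode("ascii"))   # 'ab' -- decoding recovers the characters
```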

ASCII

Until the 1980s, ASCII was the leading character encoding standard. ASCII maps the most common Western characters (plus digits, punctuation, and some control codes) to the numbers 0 to 127. An example is shown below for the character ‘a’.


‘a’ maps to the number 97, which ASCII encodes to binary as 01100001. …
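You can check this mapping yourself. The short Python sketch below (again, just an illustration) looks up the number for ‘a’ and prints it as 8 binary digits:

```python
# 'a' maps to 97 under ASCII; 97 written as 8 binary digits is 01100001.
code_point = ord("a")
print(code_point)                 # 97
print(format(code_point, "08b"))  # 01100001
print(chr(code_point))            # 'a' -- the reverse mapping
```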
