Many people know that computers only understand binary (1s and 0s). So when we type a character such as 'a' or 'b', how does the computer understand and store it?
The answer is encoding: each character is mapped to a number, which the computer can easily translate to binary. There are several encoding standards. Let's examine two notable ones: ASCII and Unicode (the most widely used today).
Until the 1980s, ASCII was the leading character encoding standard. ASCII maps the most common Western characters (plus digits, punctuation, and a few control codes) to the numbers 0 through 127. An example is shown below for 'a'.
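As a quick sketch of this mapping, Python's built-in `ord` and `chr` functions convert between a character and its code point (for characters in the ASCII range, the code point is the ASCII value):

```python
# The character 'a' maps to the ASCII code 97.
code = ord("a")
print(code)        # 97

# The computer ultimately stores that number in binary.
print(bin(code))   # 0b1100001

# chr reverses the mapping: number -> character.
print(chr(code))   # a
```

Note that `ord` actually returns a Unicode code point, but Unicode deliberately reuses the ASCII values for its first 128 characters, so the two agree here.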