ASCII is one way in which we can represent characters as numbers.
There are 128 different characters (codes 0 to 127) in the standard (non-extended) ASCII table:
As you can see, the table doesn't show binary; that's because it would take up too much space. But we can convert the decimal number to binary very easily.
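A quick sketch of that conversion, using Python's built-in `ord` and `format` (the characters chosen here are just examples):

```python
# Convert each character's decimal ASCII code to a 7-bit binary string.
for ch in "Hi!":
    code = ord(ch)               # decimal ASCII code, e.g. 'H' -> 72
    bits = format(code, "07b")   # the same value, padded to 7 binary digits
    print(ch, code, bits)        # e.g. H 72 1001000
```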
This means it would need 7 bits to represent a character, right?
Nope, ASCII actually uses a whole byte (8 bits). This is because we can use the extra bit to add extended ASCII (extra characters). Alternatively, we can use the 8th bit as an error-checking method (we will talk about this in another post).
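The error-checking idea mentioned above is usually a parity bit. As a hedged sketch (the function name here is made up for illustration), with even parity the 8th bit is set so the byte always has an even number of 1s:

```python
# Sketch of even parity: set bit 8 so the total count of 1-bits is even.
def with_even_parity(code):
    ones = bin(code).count("1")   # how many 1s in the 7-bit code
    parity = ones % 2             # 1 if that count is odd, else 0
    return (parity << 7) | code   # place the parity bit as the 8th bit

a = ord("A")                              # 65 -> 1000001, two 1s (even)
print(format(with_even_parity(a), "08b"))  # parity bit stays 0: 01000001
```

If a single bit flips in transmission, the count of 1s becomes odd and the receiver knows something went wrong (though not which bit).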
Unicode
Not much to say about this, but basically:
- It originally used 16 bits, which is a huge amount: so huge that it can represent all languages and mathematical symbols. (Modern Unicode goes even further than 16 bits.)
- Its code points are normally written in hex.
- So get familiar with it at http://www.unicode.org/charts//
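You can explore those hex code points from Python too; a small sketch (the sample characters are arbitrary):

```python
# Print each character's Unicode code point in hex, as the charts show them.
for ch in "Aé∑":
    print(ch, hex(ord(ch)))   # 'A' -> 0x41, 'é' -> 0xe9, '∑' -> 0x2211
```

Note that 'A' is 0x41 (decimal 65) here too: the first 128 Unicode code points are the same as ASCII.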
Good summary - is there any relationship between the ASCII codes assigned to the Latin characters and the Unicode codes assigned to those letters?