How does Unicode cope with hexadecimal digits?
The hexadecimal number system, used in computing, is not that special: a positional number system can be based on any natural number greater than 1. The most widely used base is 10, but 2, 8, and 12 have also seen extensive use as number bases, whether in computing or in older mathematics. It would therefore be impractical to define a dedicated set of digits for every number system somebody might wish to use. Instead, Unicode, much like its predecessors, assumes that hexadecimal numbers are written with the ordinary (decimal) digits (representing zero through nine) and the letters A through F (representing ten through fifteen). Only from context does it become clear whether a string of digits is meant as a number at all, and if so, in which number system. Most applications therefore define particular syntax rules to help distinguish decimal, octal, and hexadecimal numbers from other input tokens; e.g., in some programming languages, “2010” is a decimal number, “0x7DA” is a hexadecimal number, and “thisYear” is an identifier.
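As a brief illustration of the point above, here is a sketch in Python (variable names are ours, chosen for this example) showing how the same value is written with decimal digits in base 10 and with digits plus letters in base 16, and how a syntactic cue like the “0x” prefix tells the parser which base is meant:

```python
# Parse the same value from two different digit strings.
# int(s, base) interprets the string s in the given base;
# base 16 accepts the letters A-F (case-insensitive) as digits.
decimal_form = int("2010", 10)  # ordinary decimal digits 0-9
hex_form = int("7DA", 16)       # digits 0-9 plus letters A-F

assert decimal_form == hex_form == 2010

# Python, like many languages, uses the "0x" prefix as the
# context cue that a literal is hexadecimal, not decimal:
assert 0x7DA == 2010
print(hex(2010))  # -> '0x7da'
```

Note that no special “hexadecimal digit” characters are involved: the string "7DA" consists of ordinary Unicode characters (the digit SEVEN and the Latin letters D and A); only the parsing context gives them their numeric meaning.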
Related Questions
- I received some digits at the end of my SMS/WAP enquiry, in the format refXXXXXX. What do these figures mean?
- How can I enter Unicode characters in Tramigo? I hear I need to use four-digit hexadecimal codes or escape characters?
- How many hexadecimal digits are required to represent decimal numbers up to 20000?