How Does Java Use Unicode?

How Does Java Use Unicode?
« on: August 22, 2010, 10:07:15 PM »
Java was created at a time when the Unicode standard defined values for a much smaller set of characters. Back then it was felt that 16 bits would be more than enough to encode every character that would ever be needed. With that in mind, Java was designed around a 16-bit encoding (strictly speaking, the original fixed-width UCS-2 encoding, which later evolved into UTF-16). In fact, the char data type was originally intended to represent a single 16-bit Unicode code point.

Since Java SE 5.0, a char represents a UTF-16 code unit rather than a full code point. This makes little difference for characters in the Basic Multilingual Plane, because there the value of the code unit is the same as the code point. For characters in the supplementary planes, however, two chars (a surrogate pair) are needed. The important thing to remember is that a single char can no longer represent every Unicode character.
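
Here is a minimal sketch illustrating the difference between chars (code units) and code points, using the code-point methods added in Java SE 5.0; the class name UnicodeDemo and the example characters are just for illustration:

[code]
public class UnicodeDemo {
    public static void main(String[] args) {
        // U+00E9 (é) is in the Basic Multilingual Plane: one char, one code point.
        String bmp = "\u00E9";
        System.out.println(bmp.length());                          // 1 (code units)
        System.out.println(bmp.codePointCount(0, bmp.length()));   // 1 (code points)

        // U+1F600 (grinning face emoji) lies outside the BMP, so it needs a
        // surrogate pair: two chars for a single code point.
        String emoji = new String(Character.toChars(0x1F600));
        System.out.println(emoji.length());                           // 2 (code units)
        System.out.println(emoji.codePointCount(0, emoji.length()));  // 1 (code point)

        // Iterate by code point rather than by char to handle all planes correctly.
        for (int i = 0; i < emoji.length(); ) {
            int cp = emoji.codePointAt(i);
            System.out.printf("U+%04X%n", cp);    // prints U+1F600
            i += Character.charCount(cp);         // advances by 2 for surrogate pairs
        }
    }
}
[/code]

The point of the loop is that calling charAt on a supplementary character would return only half of a surrogate pair, whereas codePointAt and Character.charCount let you walk the string one full character at a time.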