COMPUTER FUNDAMENTALS

DATA REPRESENTATION AND NUMBER SYSTEMS

ASCII AND UNICODE CHARACTER ENCODING

Question
How many bits are used in Unicode?
A. 8
B. 16
C. 2
D. 32
Answer: B (16)

Explanation:

Detailed explanation-1: Unicode uses two encoding forms, 8-bit and 16-bit, chosen according to the type of data being encoded. The default encoding form is 16-bit, where each character is 16 bits (2 bytes) wide.
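As a quick illustration (a minimal Python sketch, not part of the original explanation), a basic character in the 16-bit form occupies exactly 2 bytes:

    # A character in the default 16-bit encoding form (UTF-16) is 2 bytes wide.
    # "utf-16-be" is used here so the 2-byte byte-order mark (BOM) is not counted.
    encoded = "A".encode("utf-16-be")
    print(encoded)       # b'\x00A'
    print(len(encoded))  # 2 bytes = 16 bits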

Detailed explanation-2: Unicode groups all characters, irrespective of program, language, or platform, and assigns each a unique code value for processing. The initial version of Unicode used 16 bits to encode each character. Using 2 bytes per character, only 65,536 characters in total could be represented.
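The 65,536 limit is simply the number of distinct 16-bit patterns, which a one-line check confirms:

    # 16 bits yield 2**16 distinct code values.
    print(2 ** 16)  # 65536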

Detailed explanation-3: In all, the Unicode standard provides codes for over 100,000 characters from alphabets, ideograph sets, and symbol collections, including classical and historical texts of many written languages.

Detailed explanation-4: UTF-8 requires 8, 16, 24, or 32 bits (one to four bytes) to encode a Unicode character; UTF-16 requires either 16 or 32 bits to encode a character; and UTF-32 always requires 32 bits to encode a character.
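A short Python sketch (illustrative; the example characters are chosen here rather than taken from the explanation) makes these widths concrete:

    # Bytes needed per character under each Unicode encoding form.
    # The "-be" variants exclude the byte-order mark from the count.
    for ch in ["A", "é", "€", "𝄞"]:  # 1-, 2-, 3-, and 4-byte UTF-8 cases
        print(ch,
              len(ch.encode("utf-8")),      # 1, 2, 3, then 4 bytes
              len(ch.encode("utf-16-be")),  # 2 bytes each, except 4 for "𝄞"
              len(ch.encode("utf-32-be")))  # always 4 bytes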
