LEXICAL ANALYSIS
ROLE OF THE LEXICAL ANALYZER
Question: When the expression sum=3+2 is tokenized, what is the token category of 3?
Choices:
- Integer Literal (correct)
- Addition Operator
- Assignment Operator
- Identifier
Detailed explanation-1: The process of forming tokens from an input stream of characters is called tokenization. When the expression sum=3+2 is tokenized, the tokens and their categories are: sum (identifier), = (assignment operator), 3 (integer literal), + (addition operator), 2 (integer literal), and ; (end of statement). So 3 is an integer literal.
Detailed explanation-2: In programming languages, keywords, constants, identifiers, strings, numbers, operators, and punctuation symbols can all be considered tokens.
Detailed explanation-3: The process of forming tokens from an input stream of characters is called tokenization, and the lexer categorizes each token according to a symbol type.
Detailed explanation-4: The sequence of characters that makes up a token is called a lexeme.
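The tokenization described above can be sketched as a small regex-based lexer. This is a minimal illustration, not a production scanner; the category names and pattern set are assumptions chosen to match the quiz's terminology.

```python
import re

# Token categories and their patterns (illustrative names chosen to
# match the quiz's terminology, not from any particular toolkit).
TOKEN_SPEC = [
    ("INTEGER_LITERAL",     r"\d+"),
    ("IDENTIFIER",          r"[A-Za-z_]\w*"),
    ("ASSIGNMENT_OPERATOR", r"="),
    ("ADDITION_OPERATOR",   r"\+"),
    ("END_OF_STATEMENT",    r";"),
    ("SKIP",                r"\s+"),  # whitespace, discarded
]

MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Yield (category, lexeme) pairs from an input character stream."""
    for match in MASTER.finditer(source):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

for category, lexeme in tokenize("sum=3+2"):
    print(lexeme, category)
```

Running this on sum=3+2 reproduces the categorization in explanation-1: sum is an IDENTIFIER, = an ASSIGNMENT_OPERATOR, 3 and 2 are INTEGER_LITERALs, and + is an ADDITION_OPERATOR. Note that each pair here pairs a token category with its lexeme, the character sequence described in explanation-4.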