MICROPROCESSOR AND MICROCONTROLLER

INTRODUCTION TO MICROPROCESSOR

MICROCOMPUTER SYSTEM

Question
It is the maximum deviation in step size from the ideal step size.
A. Accuracy
B. Full-Scale Error
C. Linearity Error
D. Resolution

Correct answer: C. Linearity Error
Explanation: 

Detailed explanation-1: Linearity error is the maximum deviation in step size from the ideal step size. Some D/A converters have a linearity error as low as 0.001% of full scale. The linearity of a D/A converter is defined as the precision or exactness with which the digital input is converted into an analog output.

Detailed explanation-2: The linearity error is defined as the maximum deviation of the instrument response curve from the linear fitted curve, where the slope is the amplification coefficient [47].

Detailed explanation-3: The resolution is the smallest increment of output that the DAC can produce. An 8-bit DAC has a resolution of 8 bits, or one part in 2^8 (256), which corresponds to about 0.39% of full scale. Linearity is the maximum allowable deviation from an ideal straight line drawn between the zero-scale and full-scale outputs.
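
As a rough numerical illustration of the definitions above, the short Python sketch below computes the ideal step size, the resolution as a percentage of full scale, and the linearity error as the maximum step-size deviation. The 5 V full-scale value and the per-step measurements are assumed example numbers, not data from the question.

    # Illustrative sketch only: full-scale voltage and measured step sizes are assumed values.
    n_bits = 8
    v_full_scale = 5.0                              # hypothetical full-scale output (volts)

    ideal_step = v_full_scale / (2 ** n_bits)       # ideal step size, 1 LSB (about 19.5 mV here)
    resolution_pct = 100.0 / (2 ** n_bits)          # one part in 2^8, about 0.39% of full scale

    # Hypothetical per-step measurements (in practice these come from characterising the DAC).
    measured_steps = [0.01953, 0.01950, 0.01960, 0.01948]

    # Linearity error: maximum deviation of any actual step from the ideal step size,
    # expressed as a percentage of full scale.
    max_deviation = max(abs(step - ideal_step) for step in measured_steps)
    linearity_error_pct = 100.0 * max_deviation / v_full_scale

    print(f"Ideal step size : {ideal_step * 1000:.3f} mV")
    print(f"Resolution      : {resolution_pct:.2f} % of full scale")
    print(f"Linearity error : {linearity_error_pct:.4f} % of full scale")

Running this prints a resolution of 0.39%, matching the figure quoted in explanation-3; the linearity error for these assumed measurements works out to well under 0.01% of full scale.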
