INTRODUCTION TO MICROPROCESSOR
MICROCOMPUTER SYSTEM
Question
Accuracy
Full-Scale Error
Linearity Error
Resolution
Detailed explanation-1: -Linearity error is the maximum deviation in step size from the ideal step size. Some D/A converters have a linearity error as low as 0.001% of full scale. The linearity of a D/A converter is defined as the precision or exactness with which the digital input is converted into an analog output.
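To make this definition concrete, the short sketch below computes a linearity error as the largest deviation of a measured step from the ideal step size, expressed as a percentage of full scale. The 8-bit width, 5 V full-scale value, and measured step values are hypothetical and chosen only for illustration; they are not taken from the question.

# Hypothetical example: linearity error taken as the maximum deviation of the
# measured step size from the ideal step size, as a % of full scale.

n_bits = 8
v_full_scale = 5.0                        # assumed full-scale output (volts)
ideal_step = v_full_scale / (2 ** n_bits) # ideal step size (1 LSB), about 19.53 mV

# Assumed measured step sizes for a few consecutive codes (illustrative only)
measured_steps = [0.01953, 0.01950, 0.01960, 0.01948]

max_deviation = max(abs(s - ideal_step) for s in measured_steps)
linearity_error_pct = 100.0 * max_deviation / v_full_scale
print(f"Linearity error = {linearity_error_pct:.4f}% of full scale")

With these sample values the result comes out near 0.001% of full scale, the same order of magnitude quoted in the explanation above.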
Detailed explanation-2: -The linearity error is defined as the maximum deviation of the instrument response curve from the linear fitted curve where the slope is the amplification coefficient [47].
Detailed explanation-3: -The resolution is the smallest increment of output that the DAC can produce. An 8-bit DAC has a resolution of 8 bits, or one part in 2^8 (256), which corresponds to about 0.39% of full scale. Linearity is the maximum allowable deviation from an ideal straight line drawn between the zero-scale and full-scale outputs.
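The 0.39% figure follows directly from the bit count; a minimal sketch of that arithmetic, assuming nothing beyond the 8-bit width stated above:

# Resolution of an N-bit DAC: one part in 2**N, as a percentage of full scale.
n_bits = 8
steps = 2 ** n_bits              # 256 distinct output levels
resolution_pct = 100.0 / steps   # 100 / 256
print(f"{n_bits}-bit DAC: 1 part in {steps} = {resolution_pct:.2f}% of full scale")
# Prints: 8-bit DAC: 1 part in 256 = 0.39% of full scale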