What is an A/D converter?
Basic operation of an A/D converter
Let's take a look at the basic operation of an A/D converter.
The A/D converter samples the amplitude of the analog signal at discrete intervals, and each sampled amplitude is then converted into a digital value. The resolution of an analog-to-digital converter (the number of discrete values it can produce over a range of analog values) is typically expressed as a number of bits. In the above case of a 3-bit A/D converter, the highest-order bit (b2) is referred to as the Most Significant Bit (MSB) and the lowest-order bit (b0) as the Least Significant Bit (LSB).
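As a rough numeric sketch of what "resolution in bits" means, each 3-bit output code is the weighted sum of its bits b2, b1, b0 (the bit names follow the text; everything else here is illustrative):

```python
# Enumerate all 3-bit codes: each code's value is the weighted sum
# b2*4 + b1*2 + b0*1, giving 2**3 = 8 discrete output levels.
codes = {}
for value in range(2 ** 3):
    b2, b1, b0 = (value >> 2) & 1, (value >> 1) & 1, value & 1
    codes[f"{b2}{b1}{b0}"] = b2 * 4 + b1 * 2 + b0 * 1
print(codes)  # '000' maps to level 0, ..., '111' maps to level 7
```

Doubling the bit count squares the number of levels, which is why even a few extra bits of resolution matter.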
The graph below shows the relationship between the analog input and the digital output. The minimum change in analog amplitude that can bring about a change in the digital output is called the Least Significant Bit (LSB), while the rounding error that occurs between the analog signal and its digital representation is referred to as quantization error.
In addition, the first digital transition point (000→001), which occurs at 0.5 LSB, is called zero scale, while the last transition point (110→111) is termed full scale, and the interval from zero scale to full scale is referred to as the full-scale range.
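A minimal sketch of these relationships, assuming an ideal rounding 3-bit converter and a made-up 8 V full-scale range (neither figure is from the text): 1 LSB equals the full-scale range divided by 2^n, the zero-scale transition sits at 0.5 LSB, and the quantization error never exceeds 0.5 LSB.

```python
# Ideal 3-bit ADC over an assumed 8 V full-scale range (illustrative values).
n_bits = 3
full_scale_range = 8.0                  # volts (assumption for this example)
lsb = full_scale_range / 2 ** n_bits    # 1 LSB = 1.0 V here

def quantize(v):
    """Round the analog value to the nearest code, clamped to 0..2**n_bits - 1."""
    return min(max(round(v / lsb), 0), 2 ** n_bits - 1)

# Zero-scale transition (000 -> 001) occurs at 0.5 LSB:
assert quantize(0.49 * lsb) == 0 and quantize(0.51 * lsb) == 1
# Quantization error = analog input minus reconstructed value, at most 0.5 LSB:
error = 2.2 - quantize(2.2) * lsb
print(f"1 LSB = {lsb} V, quantization error at 2.2 V input = {error:+.1f} V")
```

Note how a 2.2 V input and a 1.8 V input would both produce code 010: that ambiguity is exactly the quantization error described above.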
Analog signals are converted into digital format by undergoing the following steps.
- Sampling is the process of taking amplitude values of the continuous analog signal at discrete time intervals (sampling period Ts).
[Sampling Period Ts = 1/Fs (Sampling Frequency)]
Sampling is performed using a Sample and Hold (S&H) circuit.
- Quantization involves assigning a numerical value to each sampled amplitude value from a range of possible values covering the entire amplitude range (based on the number of bits).
[Quantization error: Sampled Value - Quantized Value]
- Once the amplitude values have been quantized, they are encoded into binary using an Encoder.
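The three steps above can be sketched end to end. The sine-wave input, the 1 kHz sampling frequency, and the 8 V range are all assumptions chosen for illustration, not values from the text:

```python
import math

# Illustrative parameters (assumptions, not from the text).
n_bits = 3
full_scale = 8.0            # volts
fs = 1000.0                 # sampling frequency Fs in Hz
ts = 1.0 / fs               # sampling period Ts = 1/Fs
lsb = full_scale / 2 ** n_bits

def sample(t):
    # Step 1 (sampling): the S&H circuit holds the analog amplitude at time t.
    return 4.0 + 3.5 * math.sin(2 * math.pi * 50 * t)   # 50 Hz test signal

def quantize(v):
    # Step 2 (quantization): map the held value to one of 2**n_bits levels.
    return min(max(round(v / lsb), 0), 2 ** n_bits - 1)

def encode(level):
    # Step 3 (encoding): emit the level as an n-bit binary code.
    return format(level, f"0{n_bits}b")

codes = [encode(quantize(sample(k * ts))) for k in range(5)]
print(codes)
```

Each list element is one n-bit output word; taken in sequence at intervals of Ts, they form the digital representation of the analog waveform.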