Digital predistortion (DPD) is a popular technique for linearizing a power amplifier (PA) to reduce spurious emissions and spectral regrowth. DPD requires learning the inverse of the PA nonlinearity by training on the PA output. In practical systems, the analog PA output must pass through an analog-to-digital converter (ADC) so that training can be performed on a digital processor. Quantization degrades signal quality and may limit the performance of a DPD learning algorithm; on the other hand, a lower-resolution ADC costs less and reduces the computational complexity of the digital processing. We study this trade-off to determine how much ADC precision DPD systems require, and find that a full-band DPD can operate reliably with as few as 6 bits, while a sub-band DPD can operate with a single-bit ADC.
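The trade-off described above can be illustrated with a minimal numerical sketch. This is not from the paper: the memoryless cubic PA model, the uniform mid-rise quantizer, the odd-order polynomial basis, and the indirect-learning least-squares fit are all illustrative assumptions, chosen only to show how a predistorter can be trained from quantized (here 6-bit) feedback.

```python
import numpy as np

rng = np.random.default_rng(0)

def pa(x):
    # Hypothetical memoryless PA model: linear gain plus a weak
    # odd-order nonlinearity (an assumption, not the paper's model).
    return x + 0.1 * x * np.abs(x) ** 2

def quantize(y, bits):
    # Uniform mid-rise quantizer spanning the signal's observed range,
    # standing in for the feedback ADC.
    levels = 2 ** bits
    lo, hi = y.min(), y.max()
    step = (hi - lo) / levels
    idx = np.clip(np.floor((y - lo) / step), 0, levels - 1)
    return lo + (idx + 0.5) * step

def polynomial_basis(v, order):
    # Memoryless polynomial basis: v, v|v|, v|v|^2, ...
    return np.column_stack([v * np.abs(v) ** k for k in range(order)])

# Training signal and 6-bit-quantized PA feedback.
x = rng.uniform(-1.0, 1.0, 4096)
yq = quantize(pa(x), bits=6)

# Indirect learning: fit a postdistorter mapping the quantized PA
# output back to the PA input, then copy it as the predistorter.
coeffs, *_ = np.linalg.lstsq(polynomial_basis(yq, 5), x, rcond=None)

# Apply the learned predistorter and measure the linearization error.
u = polynomial_basis(x, 5) @ coeffs
err_raw = np.mean((pa(x) - x) ** 2)   # PA error without DPD
err_dpd = np.mean((pa(u) - x) ** 2)   # PA error with DPD
print(f"MSE without DPD: {err_raw:.2e}, with DPD: {err_dpd:.2e}")
```

Even with only 6 bits of feedback resolution, the fitted predistorter substantially reduces the mean-squared error at the PA output in this toy setup, consistent with the full-band result stated above.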