Abstract or Demonstration Description: Linearity is a key performance metric for analog-to-digital converters (ADCs). Differential nonlinearity (DNL) and integral nonlinearity (INL) indicate how much the actual ADC transfer curve deviates from the ideal transfer curve. While engineers have been measuring linearity for decades, the constant innovation in ADCs with higher resolution, higher sampling rates, and more varied input ranges means that measuring linearity for each ADC can bring new challenges. These challenges include longer test times as the number of bits increases, finding a precision source, and staying within budget. A linearity measurement consists of a precision source generating a voltage across the entire input range of the ADC; an algorithm then determines the actual transfer function of the ADC and thus its linearity. Different types of signal sources, such as ramp, triangle, and sine wave, are used to drive the ADC inputs. Each type of signal source requires different instrumentation, such as a precision DC source or a low-distortion frequency generator. One common challenge in high-precision ADC testing is finding a source that is more accurate than the ADC itself. The algorithms also require tuning, as parameters such as the number of samples, input frequency, and test duration can greatly affect the accuracy of the results. The objective of this study is to compare different linearity testing methods with different types of instrumentation and present guidelines for selecting the best combination for different ADC types. The device under test (DUT) used in this study is a high-precision 16-bit ADC.
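As a minimal sketch of the algorithmic step the abstract describes, the following hypothetical helper (not taken from the study) estimates DNL and INL from a code histogram, assuming a slow full-scale ramp input so that an ideal ADC would hit every code equally often; the function name, the end-code exclusion, and the use of 16 bits as a default are illustrative assumptions:

```python
import numpy as np

def dnl_inl_from_histogram(codes, n_bits=16):
    """Estimate DNL and INL (in LSB) from a code histogram.

    Assumes a slow full-scale ramp input, so every code of an
    ideal ADC is hit equally often. Illustrative sketch only;
    a production histogram test adds many refinements.
    """
    n_codes = 2 ** n_bits
    hist = np.bincount(codes, minlength=n_codes).astype(float)
    # Exclude the two end codes: they absorb any out-of-range input
    # and would otherwise distort the per-code hit count.
    core = hist[1:-1]
    ideal = core.mean()           # expected hits per code for an ideal ADC
    dnl = core / ideal - 1.0      # per-code step-width error, in LSB
    inl = np.cumsum(dnl)          # INL as the running sum of DNL
    return dnl, inl
```

For example, feeding in a perfectly uniform code record (every code hit the same number of times) yields DNL and INL of zero everywhere, while a missing code shows up as a DNL of -1 LSB.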