
Chapter 8
Interpreting Meter Specifications

Resolution

Resolution is the numeric ratio of the maximum displayed value to the
minimum displayed value on a selected range. Resolution is often
expressed in percent, parts per million (ppm), counts, or bits. For example,
a 6 1/2-digit meter with 20% overrange capability can display a measurement
with up to 1,200,000 counts of resolution. This corresponds to
about 0.0001% (1 ppm) of full scale, or 21 bits including the sign bit. All four
ways of expressing resolution are equivalent.
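
The conversions in this example can be checked directly. A minimal sketch
in Python follows; the 1,200,000-count figure comes from the text above,
and everything else is arithmetic:

    import math

    counts = 1_200_000                      # 6 1/2 digits with 20% overrange

    ppm_of_full_scale = 1e6 / counts        # ~0.83 ppm, i.e. about 1 ppm
    percent_of_full_scale = 100 / counts    # ~0.0001% of full scale
    bits = 1 + math.log2(counts)            # sign bit + ~20.2 magnitude bits

    print(f"{counts} counts ~= {ppm_of_full_scale:.2f} ppm "
          f"~= {percent_of_full_scale:.5f}% ~= {bits:.0f} bits")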

Accuracy

Accuracy is a measure of the "exactness" to which the meter's measurement
uncertainty can be determined relative to the calibration reference used.
Absolute accuracy includes the meter's relative accuracy specification plus
the known error of the calibration reference relative to national standards
(such as those maintained by the U.S. National Institute of Standards and
Technology). To be meaningful, accuracy specifications must be accompanied
by the conditions under which they are valid. These conditions should
include temperature, humidity, and time.
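
As a concrete illustration of how the two terms combine, consider the
sketch below. The numeric values are illustrative assumptions, not
specifications taken from the 34420A's data sheet:

    # Absolute accuracy = relative accuracy + known calibration-reference error.
    relative_accuracy_ppm = 2.0   # meter spec vs. its calibration reference (assumed)
    reference_error_ppm = 0.5     # reference's known error vs. national standards (assumed)

    absolute_accuracy_ppm = relative_accuracy_ppm + reference_error_ppm
    print(f"absolute accuracy ~ {absolute_accuracy_ppm} ppm of reading")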
There is no standard convention among meter manufacturers for the
confidence limits at which specifications are set. The table below shows the
probability of non-conformance for each specification criterion, assuming
normally distributed errors.
Specification Criteria    Probability of Failure
Mean ± 2 sigma            4.5%
Mean ± 3 sigma            0.3%
Mean ± 4 sigma            0.006%

Variations in performance from reading to reading, and from instrument to
instrument, decrease as the number of sigmas behind a given specification
increases. This means that, for the same published accuracy number, a
specification set at more sigmas delivers greater actual measurement
precision. The 34420A is designed and tested to meet performance better
than mean ± 3 sigma of the published accuracy specifications.
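
The probabilities in the table follow from the tail areas of a normal
distribution; a short check of the probability-of-failure column:

    import math

    # P(outside mean +/- k*sigma) = erfc(k / sqrt(2)) for a normal distribution.
    for k in (2, 3, 4):
        p = math.erfc(k / math.sqrt(2))
        print(f"mean ± {k} sigma: {100 * p:.3f}% probability of non-conformance")

Running this prints 4.550%, 0.270%, and 0.006%, matching the table's
rounded values.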
