
Measurement Calibration

Measurement calibration is an accuracy enhancement procedure that effectively removes the
system errors that cause uncertainty in measuring a test device. It measures known standard
devices, and uses the results of these measurements to characterize the system.
This section discusses the following topics:
■ causes of measurement errors
■ calibration considerations
■ effectiveness of accuracy enhancement
■ correcting for measurement errors
■ ensuring a valid calibration
■ modifying calibration kits
■ power meter calibration

What Is Accuracy Enhancement?

A perfect measurement system would have infinite dynamic range, isolation, and directivity
characteristics, no impedance mismatches in any part of the test setup, and flat frequency
response. In any high frequency measurement there are measurement errors associated with
the system that contribute uncertainty to the results. Parts of the measurement setup such as
interconnecting cables and signal-separation devices (as well as the analyzer itself) all introduce
variations in magnitude and phase that can mask the actual performance of the test device.
Vector accuracy enhancement, also known as measurement calibration or error-correction,
provides the means to simulate a nearly perfect measurement system.
For example, crosstalk due to the channel isolation characteristics of the analyzer can
contribute an error equal to the transmission signal of a high-loss test device. For reflection
measurements, the primary limitation of dynamic range is the directivity of the test setup.
The measurement system cannot distinguish the true value of the signal reflected by the test
device from the signal arriving at the receiver input due to leakage in the system. For both
transmission and reflection measurements, impedance mismatches within the test setup cause
measurement uncertainties that appear as ripples superimposed on the measured data.
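As a rough illustration of how raw directivity bounds a reflection measurement, the Python sketch below treats the directivity leakage as an error vector of unknown phase added to the true reflected signal; this is what produces both the dynamic-range limit and the ripple described above. The numeric values are assumptions chosen for illustration, not HP 8753E specifications.

```python
import math

# Illustrative, assumed values -- not HP 8753E specifications.
directivity_db = -30.0   # raw directivity: leakage that reaches the receiver
return_loss_db = -25.0   # true return loss of the test device

leak = 10 ** (directivity_db / 20)    # |Ed|, linear magnitude of the leakage
gamma = 10 ** (return_loss_db / 20)   # |Gamma|, linear magnitude of the true reflection

# The leakage adds vectorially to the true reflected signal. Because its phase
# relative to the device reflection is unknown (and varies with frequency),
# the measured magnitude can fall anywhere between these two extremes,
# which appears as ripple on uncorrected reflection data.
upper_bound = 20 * math.log10(gamma + leak)
lower_bound = 20 * math.log10(abs(gamma - leak))
print(f"true return loss: {return_loss_db:.1f} dB")
print(f"uncorrected reading can range from {upper_bound:.1f} dB to {lower_bound:.1f} dB")
```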
Error-correction simulates an improved analyzer system. During the measurement calibration
process, the analyzer measures the magnitude and phase responses of known standard devices
and compares each measurement with the known characteristics of that standard. The analyzer
uses the results to characterize the system and, using vector math capabilities internal to the
network analyzer, effectively removes the system errors from the measurement data of a test device.
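The vector math itself is internal to the analyzer, but the idea can be sketched with the standard one-port error model (directivity, source match, and reflection tracking). The Python below is a minimal illustration under assumed conditions, not the instrument's implementation: it assumes ideal open, short, and load standards at a single frequency point, solves for the three error terms from the calibration measurements, and then applies the correction to a raw reflection measurement of a test device.

```python
def solve_one_port_error_terms(m_load, m_short, m_open):
    """Solve directivity (Ed), source match (Es), and reflection tracking (Er)
    from raw reflection measurements of ideal load (0), short (-1), open (+1)."""
    Ed = m_load            # an ideal load reflects nothing, so the residual is leakage
    a = m_short - Ed       # equals -Er / (1 + Es)
    b = m_open - Ed        # equals +Er / (1 - Es)
    Es = (a + b) / (b - a)
    Er = b * (1 - Es)
    return Ed, Es, Er

def correct(m_dut, Ed, Es, Er):
    """Remove the one-port error terms from a raw test-device measurement."""
    return (m_dut - Ed) / (Er + Es * (m_dut - Ed))

# Example with made-up error terms at a single frequency point.
Ed_true, Es_true, Er_true = 0.03 + 0.01j, 0.05 - 0.02j, 0.97 + 0.05j

def raw(gamma):
    """Forward one-port error model: what the receiver sees for a given reflection."""
    return Ed_true + Er_true * gamma / (1 - Es_true * gamma)

gamma_dut = 0.2 + 0.1j                           # "true" reflection of the test device
Ed, Es, Er = solve_one_port_error_terms(raw(0), raw(-1), raw(1))
print(correct(raw(gamma_dut), Ed, Es, Er))       # recovers approximately 0.2+0.1j
```

In practice this correction is repeated at every measured frequency point, and real calibration kits supply modeled (not ideal) standard definitions, which is why the accuracy of the standard definitions limits the corrected result.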
When you use a measurement calibration, the dynamic range and accuracy of the measurement
are limited only by system noise and stability, connector repeatability, and the accuracy to
which the characteristics of the calibration standards are known.