Your analyzer has a calibration check feature that can compute and display
corrected measurement uncertainties (residual errors) that apply to the
current instrument settings and calibrations.
During a calibration check, you are prompted to connect calibration standards
to your measurement ports. These standards are measured with the current
calibration corrections applied, and the residual errors are then calculated
and can be displayed.
The calibration check feature is a useful tool for troubleshooting poor
measurements and for determining how often to perform calibrations on your
analyzer.
If you suspect that you may not be making valid measurements, you can use
the calibration check to confirm that your current measurement calibration is
valid or to reveal that it is faulty. Using a different set of calibration
standards for the calibration check than the set you used for the initial
calibration helps you rule out degraded or faulty cal standards as the cause.
Using Calibration Check
Using the calibration check feature can help you determine the optimum
calibration interval for your analyzer in your particular environment.
Calibrations degrade over time due to temperature drift, connector wear,
cable movement, and similar factors.
To use the calibration check feature to determine the best calibration interval
for your measurements:
1. Perform the type of calibration desired.
2. Determine the residual errors by doing a calibration check.
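As a rough illustration of how repeated calibration checks can reveal the best
calibration interval, the sketch below trends a logged residual error against a
limit. The limit value, the logged readings, and the helper function are all
hypothetical examples, not instrument specifications or part of the analyzer's
interface.

```python
# Hypothetical sketch: trending residual errors from repeated calibration
# checks to choose a recalibration interval. The limit and readings below
# are illustrative values, not instrument specifications.

SPEC_LIMIT_DB = -40.0  # assumed residual-directivity limit for a valid cal

# (days since calibration, worst residual directivity in dB) from cal checks
cal_checks = [
    (0, -52.0),
    (7, -49.5),
    (14, -45.0),
    (21, -41.0),
    (28, -38.5),  # residual error now worse (less negative) than the limit
]

def days_until_invalid(checks, limit_db):
    """Return the first check day on which the residual error exceeds the
    limit, or None if the calibration stayed within spec throughout."""
    for day, residual_db in checks:
        if residual_db > limit_db:  # less negative = larger residual error
            return day
    return None

day = days_until_invalid(cal_checks, SPEC_LIMIT_DB)
if day is not None:
    # Recalibrate comfortably before the residuals drift out of spec.
    print(f"Residuals exceeded the limit by day {day}; "
          f"recalibrate at least every {max(day - 7, 1)} days.")
else:
    print("Calibration remained within spec over the observed interval.")
```

Repeating the calibration check at regular intervals and logging the residual
errors in this way shows how quickly your calibration degrades in your
particular environment.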