Fisher FIELDVUE DLC3010 Digital Level Controller
Instruction Manual D102748X012
October 2014

Calibration

Introduction: Calibration of Smart Instruments

Analog instruments generally have only one interface that can be calibrated by the user. A zero and span output
calibration is normally performed at the corresponding two input conditions. Zero/Span calibration is very simple to
use, but provides little versatility. If the 0% and 100% input conditions are not available to the user, a calibration can
sometimes be accomplished, but the gain and offset adjustments will likely interact, requiring considerable iteration
to achieve accuracy. In contrast, intelligent instruments have many interfaces that can be calibrated or scaled by the
user, and are consequently far more versatile.
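
For illustration, the relationship behind a conventional zero/span calibration can be written as output = gain x input + offset. The short sketch below (generic Python, not specific to any particular instrument) shows that two known input/output pairs always determine the gain and offset; when the pairs are the true 0% and 100% conditions the zero and span adjustments are independent, whereas with analog zero/span screws and two intermediate points each adjustment disturbs both readings, which is why manual iteration is needed.

def two_point_calibration(x1, y1, x2, y2):
    """Solve output = gain * input + offset from two (input, desired output) pairs.

    Generic illustration of a linear calibration; not DLC3010 firmware.
    """
    gain = (y2 - y1) / (x2 - x1)
    offset = y1 - gain * x1
    return gain, offset

# Calibrating at the true 0% and 100% input conditions (e.g. 0 and 100 units
# of level mapped to 4 and 20 mA): the zero adjustment sets the offset and the
# span adjustment sets the gain, independently of each other.
gain, offset = two_point_calibration(0.0, 4.0, 100.0, 20.0)
print(f"{gain:.3f} mA per unit, {offset:.3f} mA")   # 0.160 mA per unit, 4.000 mA

# Two intermediate conditions (e.g. 30% and 70% of span) define the same line,
# but on an analog instrument each screw moves the output at both points, so
# reaching this solution by hand takes repeated back-and-forth adjustment.
gain, offset = two_point_calibration(30.0, 8.8, 70.0, 15.2)
print(f"{gain:.3f} mA per unit, {offset:.3f} mA")   # 0.160 mA per unit, 4.000 mA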
Refer to table 4‐5 for a list of relationships in the DLC3010 that can be calibrated or configured by the user. Note that
not all relationships are listed here.
Table 4‐5. Relationships in the FIELDVUE DLC3010 that can be User Calibrated or Configured

Torque Tube Rate: The scale factor between the internal digital representation of the measured pilot shaft rotation and the physical torque input to the sensor.
Reference (dry) Coupling Point: The angle of pilot shaft rotation associated with the zero buoyancy condition (the zero reference for the input of the PV calculation).
Driver Rod Length: The scale factor (moment arm) between a force input to the sensor driver rod and the torque developed as input to the torque tube.
Displacer Volume: The scale factor relating the density of the process fluid to the maximum force that can be produced as an input to the driver rod of the sensor.
SG: The density of the process fluid normalized to the density of water at reference conditions; the scale factor that transforms displacer volume and measured buoyancy into a level signal normalized to displacer length.
Displacer Length: The scale factor to convert normalized level to level on the displacer in engineering units.
Level Offset: The zero reference for the output of the PV calculation, referred to the location of the bottom of the displacer at the zero buoyancy condition.
URV (Upper Range Value): The value of the computed process variable at which a 20 mA output (100% of range) is desired.
LRV (Lower Range Value): The value of the computed process variable at which a 4 mA output (0% of range) is desired.
D/A Trim: The gain and offset of the D/A converter, which executes the digital commands to generate the output.
Instrument Temperature Offset: Bias to improve the accuracy of the ambient temperature measurement used to provide temperature compensation for the mechanical‐to‐electronic transducer.
Proc Temp Offset: Bias to improve the accuracy of the RTD temperature measurement used to provide compensation for process‐temperature‐related density changes.
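
To make the roles of these parameters concrete, the sketch below chains them into a level calculation and a 4-20 mA output in the order the table implies: shaft rotation to torque, torque to buoyant force, buoyant force to normalized level, normalized level to level in engineering units, and finally PV to loop current via LRV and URV. The formulas, constants, and parameter names are illustrative assumptions for the sketch, not the DLC3010's internal algorithm.

# Illustrative signal chain only; the constants and formulas are assumptions
# made for this sketch, not the DLC3010's internal algorithm.
WATER_DENSITY = 999.0   # kg/m^3 at reference conditions (assumed)
G = 9.807               # m/s^2

def level_from_rotation(rotation_deg, p):
    """Chain the table 4-5 relationships into a level estimate (illustrative)."""
    # Rotation relative to the dry (zero buoyancy) coupling point, scaled by
    # the torque tube rate, gives the torque produced by buoyancy.
    torque = p["torque_tube_rate"] * (rotation_deg - p["dry_coupling_point"])
    # The driver rod length is the moment arm converting that torque back to
    # the buoyant force acting on the displacer.
    buoyant_force = torque / p["driver_rod_length"]
    # Full-submersion buoyancy for a fluid of the configured specific gravity.
    max_force = p["sg"] * WATER_DENSITY * G * p["displacer_volume"]
    # Normalized level on the displacer, scaled to engineering units and
    # referred to the configured zero (level offset).
    normalized = buoyant_force / max_force
    return p["level_offset"] + normalized * p["displacer_length"]

def loop_current_mA(pv, lrv, urv):
    """Map the computed PV to 4-20 mA using the configured range values."""
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)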
These parameters are factory‐set to the most common values for the 249 sensors. Therefore, for the bulk of units sold
in simple level applications, it is possible to accept the defaults and proceed to Trim Zero. If any of the advanced
features of the instrument are to be used, accurate sensor and test fluid information should generally be entered
before beginning the calibration.
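
For intuition only: a zero trim, in general terms, captures the instrument's reading while the process is held at a known reference condition and stores the difference from the expected value as a bias applied to later readings. The sketch below is a generic illustration with hypothetical names; it is not the DLC3010's Trim Zero routine, which should be run from the Field Communicator prompts.

class ZeroTrim:
    """Generic zero-trim bookkeeping (hypothetical; not DLC3010 firmware)."""

    def __init__(self):
        self.offset = 0.0

    def trim_zero(self, raw_reading, expected_value=0.0):
        # Store the bias observed while the process sits at a known
        # reference condition (e.g. the zero buoyancy / 0% level point).
        self.offset = expected_value - raw_reading

    def corrected(self, raw_reading):
        # Apply the stored bias to subsequent readings.
        return raw_reading + self.offset

trim = ZeroTrim()
trim.trim_zero(raw_reading=0.5, expected_value=0.0)  # reads 0.5 at true zero
print(trim.corrected(0.5))   # 0.0
print(trim.corrected(25.5))  # 25.0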

Primary

Guided Calibration

Field Communicator
Configure > Calibration > Primary > Guided Calibration (2-5-1-1)
Guided Calibration recommends an appropriate calibration procedure for use in the field or on the bench, based on
your input. Follow the Field Communicator prompts to calibrate the digital level controller.
