How to calibrate a HART transmitter

A HART transmitter has a microprocessor that manipulates the input data. Normally there are three calculation sections involved, and each of these sections can be individually tested and adjusted; a sketch illustrating the chain follows the list below.

  • First, the microprocessor measures the input electrical property that is affected by the process variable of interest. The measured value may be millivolts, capacitance, reluctance, inductance, frequency, or some other property.

  • Next, the measured value (the PV) is converted into its equivalent milliamp representation, based on the configured range.

  • Finally, in the output section, the calculated output value is converted to a count value that can be loaded into a digital-to-analog converter, which produces the actual analog electrical signal.
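As a rough illustration of this chain, the sketch below models the three sections for a hypothetical 4-20 mA pressure transmitter ranged 0-100 inches of water. The sensor sensitivity, range values, and D/A scaling are invented for the example and do not correspond to any particular device.

```python
# Minimal sketch of the three calculation sections described above, for a
# hypothetical 4-20 mA pressure transmitter ranged 0-100 inH2O.
# All names and coefficients are illustrative, not from any real device.

def input_section(raw_mv: float) -> float:
    """Input section: convert the measured electrical property (here, mV
    from a hypothetical sensor) into the digital process variable (PV)."""
    SENSOR_SENSITIVITY_MV_PER_INH2O = 0.5   # assumed characterization
    return raw_mv / SENSOR_SENSITIVITY_MV_PER_INH2O

def range_section(pv: float, lrv: float = 0.0, urv: float = 100.0) -> float:
    """Second section: convert the PV into its equivalent mA representation
    using the configured range (LRV/URV) and a linear transfer function."""
    percent = (pv - lrv) / (urv - lrv)
    return 4.0 + 16.0 * percent

def output_section(ma: float) -> int:
    """Output section: convert the calculated mA value into a count value
    for the D/A converter, which produces the actual analog signal."""
    COUNTS_PER_MA = 3276.8                  # assumed D/A scaling
    return round(ma * COUNTS_PER_MA)

# 50 inH2O applied -> 25 mV -> PV 50.0 -> 12.0 mA -> D/A counts
print(output_section(range_section(input_section(25.0))))
```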

Calibrating the input and output sections:

To calibrate the input section, the basic multiple-point test-and-adjust technique is employed. To run a test, use a calibrator to measure the applied input, and read the associated output (the PV) with a communicator. The error calculation assumes a linear relationship between the input and the output. Although a linear transfer function is the most common, pressure transmitters often have a square root option.
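The error calculation can be set up as in the sketch below, which compares the PV read over HART with the value applied by the calibrator and expresses the error as a percent of span. The span, tolerance, and test readings are assumed values for illustration only.

```python
# Sketch of the multiple-point test for the input section: compare the PV
# read with a communicator against the input applied by the calibrator.

SPAN = 100.0          # assumed range: 0-100 inH2O
TOLERANCE_PCT = 0.25  # assumed accuracy spec, % of span

# (applied input from calibrator, PV read with communicator) -- invented data
test_points = [(0.0, 0.04), (25.0, 25.07), (50.0, 50.11),
               (75.0, 75.04), (100.0, 99.95)]

for applied, pv in test_points:
    error_pct = (pv - applied) / SPAN * 100.0
    status = "PASS" if abs(error_pct) <= TOLERANCE_PCT else "FAIL"
    print(f"{applied:6.1f} inH2O  PV={pv:7.2f}  "
          f"error={error_pct:+.3f}% of span  {status}")
```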

If it does not pass the test, follow the manufacturer’s recommended procedure for trimming the input section. This is often called a sensor trim and typically involves one or more trim points. The sensor’s underlying characterization is usually established by the manufacturer, but most HART instruments include commands to make these field adjustments.
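The trim itself is performed inside the instrument by the manufacturer’s HART commands; the snippet below only illustrates the arithmetic a generic two-point (zero and upper) sensor trim implies, using invented as-found readings.

```python
# Illustration only: the arithmetic behind a generic two-point sensor trim.
# Real devices perform this internally via HART trim commands; the names and
# the linear-correction form here are assumptions.

def two_point_trim(low_applied, low_read, high_applied, high_read):
    """Return gain and offset that map the as-found readings back onto the
    reference values applied at the low and high trim points."""
    gain = (high_applied - low_applied) / (high_read - low_read)
    offset = low_applied - gain * low_read
    return gain, offset

gain, offset = two_point_trim(0.0, 0.12, 100.0, 100.30)

def corrected_pv(raw_pv):
    return gain * raw_pv + offset

print(corrected_pv(0.12), corrected_pv(100.30))   # ~0.0 and ~100.0 after trim
```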

To calibrate the output section, the same basic multiple-point test-and-adjust technique is employed, but with a new definition of input. To run a test, use a communicator to put the transmitter in a fixed current output mode. The input value for the test is the mA value that the transmitter is commanded to produce. Obtain the output value by using a calibrator to measure the resulting current. This test also assumes a linear relationship between input and output.
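A minimal sketch of this output-section test follows; the commanded points, measured currents, and acceptance limit are assumptions for illustration.

```python
# Sketch of the output-section test: the communicator commands fixed mA
# values and a calibrator measures the current actually produced.

TOLERANCE_MA = 0.016   # assumed acceptance limit (about 0.1% of span)

# (mA the transmitter is commanded to produce, mA measured by the calibrator)
loop_test_points = [(4.0, 4.012), (12.0, 12.009), (20.0, 20.006)]

for commanded, measured in loop_test_points:
    error = measured - commanded
    status = "PASS" if abs(error) <= TOLERANCE_MA else "FAIL"
    print(f"commanded {commanded:5.1f} mA  measured {measured:6.3f} mA  "
          f"error {error:+.3f} mA  {status}")
```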

If the test does not pass, follow the manufacturer’s recommended procedure for trimming the output section. This may be called a 4-20 mA trim, a current loop trim, or a D/A trim.
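The actual trim is carried out by the device once the measured currents are keyed into the communicator; the sketch below only shows the arithmetic a linear two-point D/A trim implies, with invented measurements.

```python
# Illustration only: the arithmetic behind a generic two-point D/A trim.
# Real devices compute the correction internally; the linear model here is
# an assumption, and the measured values are invented.

def da_trim(measured_at_4, measured_at_20):
    """Given the currents measured while 4 and 20 mA were commanded, return
    a function that pre-corrects future requests so the loop actually
    produces the target current (assuming a linear D/A stage)."""
    gain = (measured_at_20 - measured_at_4) / 16.0
    offset = measured_at_4 - 4.0 * gain
    def corrected_request(target_ma):
        return (target_ma - offset) / gain
    return corrected_request

request = da_trim(4.020, 20.035)
print(request(4.0), request(20.0))   # requests that should yield 4.000 / 20.000 mA
```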

Digital range change:

A range change is often mistaken for calibration, but it is not. Changing the range affects only the second section, where the PV is converted to a mA value. True calibration requires a reference standard, usually in the form of calibration equipment, to provide an input and measure the output.
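A quick arithmetic example, with hypothetical numbers, shows why: re-ranging changes the mA output, but an uncorrected sensor error stays in the digital PV.

```python
# Hypothetical numbers: the sensor reads 1.0 inH2O high. Re-ranging from
# 0-100 to 0-200 inH2O changes the loop current, but the PV is still wrong.

def ma_out(pv, lrv, urv):
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

true_pressure = 50.0
pv = true_pressure + 1.0          # uncorrected sensor error of 1.0 inH2O

print(ma_out(pv, 0.0, 100.0))     # 12.16 mA with the original range
print(ma_out(pv, 0.0, 200.0))     # 8.08 mA after re-ranging; PV error remains
```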

Zero and span adjustment:

Using only the external zero and span adjustments to calibrate a HART transmitter often corrupts the internal digital readings. The zero and span buttons can only change the range, because the instrument does not know the true value of the reference input. Only a digital command that transmits the reference value allows the instrument to make the appropriate internal adjustments.

The correct way to correct a zero shift is to perform a zero trim. This adjusts the input section of the instrument so that the digital PV agrees with the calibration standard. When digital process values are used for trends, statistical calculations, and so on, disable the external zero and span buttons and avoid using them entirely.
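The hypothetical numbers below contrast the two approaches for a 0.5 inch-of-water zero error: a zero trim corrects the PV and the loop current together, while the zero button leaves the digital PV wrong even though the loop current looks right.

```python
# Hypothetical illustration of why the zero/span buttons are avoided when
# the digital PV matters. With 0 inH2O applied, the PV reads 0.5 inH2O high.

def ma_out(pv, lrv, urv=100.0):
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

# Zero trim: the reference value is sent digitally, so the input section is
# corrected and both the PV and the loop current are right.
pv_trimmed = 0.0
print(pv_trimmed, ma_out(pv_trimmed, lrv=0.0))   # PV 0.0, 4.0 mA

# Zero button: the device does not know the true input, so it can only shift
# the range (LRV). The loop current looks right, but the PV read over HART
# (and any trend built on it) is still 0.5 high.
pv_uncorrected = 0.5
print(pv_uncorrected, ma_out(pv_uncorrected, lrv=0.5))   # PV 0.5, 4.0 mA
```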

Loop current adjustment:

A loop current adjustment is sometimes made so that, with an accurate input applied to the instrument, the transmitter output agrees with some display device on the loop.

Suppose there is a digital indicator on the loop that should display 0.0 at 4 mA and 100.0 at 20 mA. During the test it reads 1.0 with both ports vented, and 101.0 with 100 inches of water applied. Using the communicator, the technician makes a loop current adjustment so that the display reads correctly at 0 and 100.
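Working through that example with the numbers given: the indicator maps 4-20 mA onto 0.0-100.0, so its readings imply the actual loop current and the size of the offset the adjustment must remove. The helper below is only a worked version of that arithmetic.

```python
# The indicator maps 4-20 mA onto a 0.0-100.0 display, so each reading
# implies the actual loop current. Numbers follow the example above.

def indicator_to_ma(reading):
    return 4.0 + 16.0 * reading / 100.0

low_ma  = indicator_to_ma(1.0)     # 4.16 mA with both ports vented
high_ma = indicator_to_ma(101.0)   # 20.16 mA with 100 inH2O applied

# Both points read 1.0 (of display) high, i.e. 0.16 mA, so a loop current
# adjustment that lowers the output by about 0.16 mA makes the display read
# 0.0 and 100.0 as intended.
print(low_ma - 4.0, high_ma - 20.0)   # +0.16, +0.16
```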