•Basic Purpose



Basic Purpose of Instrumentation

The basic purpose of instrumentation in any process is to obtain the requisite information pertaining to the successful completion of the process.

Definition of Instrumentation

Instrumentation is the philosophy based on --

–the proposition that the conditions of human society and of industrial processes & operations should be controlled.
–the principle that before a condition can be controlled, it must be measured.
–the dictum that in order to measure a condition or property, it must be segregated.
–the logic "if you can control it manually, you can control it automatically".

Progress of Instrumentation

Progress of instrumentation in industry took place largely in the 1930s, with the introduction of temperature measurement. With the continuous growth of process industries, the need for continuous measurement of various process parameters like pressure, level, weight, flow etc. became urgent.


•What is measurement ?

•Aim of measurement

What is measurement ?

It is a comparison of a given unknown quantity with one of its predetermined standard values.

Aim of Measurement

Primary aim of measurement in process industries is to aid in the economics of Industrial operations by --

–improving the quality of the product

–improving the operating efficiency of production

Hence, measurement & its quality are very important as far as the above two conditions are concerned.

Functional Elements of an Instrument

All instruments contain various parts that perform prescribed functions in converting a variable quantity or condition into a corresponding indication.

*Primary Sensing Element

*Variable Conversion Element

*Variable Manipulation Element

*Data Transmission Element

*Data Presentation Element

Functional Elements of an Instrument System


Characteristics of an Instrument

A knowledge of an instrument's characteristics is required to choose the most suitable instrument for a specific measuring task.

Characteristics can be categorized as :

*Static Characteristics

*Dynamic Characteristics

Static Characteristics

All static characteristics of an instrument are obtained in one form or another through a process called calibration.

*What is Calibration ?

*Related terms & their definitions

What is Calibration ?

Calibration is a broad subject but, in general, it may be defined as the process of determining, by measurement or comparison with a standard, the correct value of each scale reading on a meter or other measuring instrument.

Related terms & their definitions

Error

It is determined as the maximum amount by which the result differs from the true value.

Accuracy

Degree of exactness for which an instrument is designed or intended to perform.

Precision

Closeness of agreement among a number of consecutive measurements of the output for the same value of input, under the same operating conditions.

Reproducibility

Closeness of agreement among repeated measurements of the output for the same value of input, made under the same operating conditions over a period of time, approaching from both directions.

Drift

Undesired change in output over a period of time that is unrelated to changes in input, operating conditions or load.

Sensitivity

Ratio of change in output to a specified change in input.

Resolution

Least incremental value of input or output that can be detected, caused or otherwise discriminated by the measuring device.

Dead zone

Largest range of values of a measured variable to which the instrument does not respond.

Backlash

Lost motion or free play which is inherent in mechanical elements (such as gears, linkages).

True value

Errorless value of the measured variable.

Static error

Static error of a measuring instrument is the numerical difference between the true value of a quantity and its value as obtained by measurement.

Sources of Error

*Insufficient knowledge of process parameters and design conditions

*Poor design / Design limitations

*Changes in process parameters, irregularities, upsets etc.

*Poor maintenance

*Errors caused by the people who operate the instrument


Dynamic Characteristics

Instruments rarely respond instantaneously to changes in the measured variable. Instead, they exhibit slowness or sluggishness due to inherent mass and, in some cases, internal reactions. Therefore, the dynamic behaviour of an instrument is as important as its static behaviour.

Dynamic Characteristics (Cont.)

The dynamic behaviour of an instrument is determined by subjecting its primary element to some known and predetermined variation in the measured quantity. The three most common variations are:

*Step change

*Linear change

*Sinusoidal change

Related terms & their definitions

*Speed of Response

*Fidelity

*Measuring Lag

*Dynamic Error

Speed of response

It is the rapidity with which an instrument responds to changes in the measured quantity.


Fidelity

It is the degree to which an instrument indicates the changes in the measured variable without dynamic error.


Measuring Lag

Retardation or delay in the response of an instrument to changes in the measured quantity.

Dynamic Error

Difference between the true value of a quantity changing with time and the value indicated by the instrument, assuming zero static error.


*Direct Indicating Instruments (instruments with no electrical connection)

•Flow meters, thermometers, manometers, oil level gauges etc.

*Signal Forming Instruments

•are those which convert the physical value to an electrical signal.
•this signal can be transmitted to a central control system for further processing.






Process Variables

1.Temperature

2.Pressure

3.Flow

4.Level

5.Weight / Displacement

6.Speed



Temperature

•Most fundamental parameter; the most widely measured and controlled industrial variable.
•The temperature of a substance is the thermal state of a body or substance, which determines whether it will give heat to, or receive heat from, other bodies. If two bodies are placed in contact, heat tends to flow from the body at the higher temperature to the body at the lower temperature, just as water flows from higher to lower levels.

Temperature (Cont.)

*Temperature Scales

*Methods of Temperature measurement

*Comparison among Various methods

Temperature Scales

*Based upon some recognized fixed points which are constant in temperature:

–Lower fixed point, or ice point
–Upper fixed point, or steam point

The temperature scale is divided into equal intervals between these fixed points.

*Relation between different scales

•°C = 5/9 (°F - 32)
•°C = K - 273.15
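The scale relations above can be exercised with a short sketch (plain Python; the function names are mine, and the exact kelvin offset 273.15 is used):

```python
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius: C = 5/9 (F - 32)."""
    return (deg_f - 32.0) * 5.0 / 9.0

def k_to_c(kelvin):
    """Convert kelvin to degrees Celsius: C = K - 273.15."""
    return kelvin - 273.15

print(f_to_c(212.0))   # 100.0 -- steam point
print(k_to_c(273.15))  # 0.0   -- ice point
```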

Methods of Temperature measurement

•Expansion Thermometers

e.g. Bimetallic, Liquid-in-glass, Liquid-in-metal

•Electrical temperature Instruments

e.g. RTD, Thermocouple, Thermistor


•Pyrometers (non-contact)

e.g. Optical, Radiation

Expansion Thermometers

These instruments are of the direct indicating type and are mainly used to measure temperatures up to a maximum of about 500 °C. Their construction is robust and hence they are widely used in the field in different applications.

Resistance Temperature Detectors (RTD)

Principle: with an increase of temperature, the electrical resistance of certain metals increases in direct proportion to the rise of temperature.

Platinum, Copper & Nickel are generally used in RTDs.

Resistance elements are generally spring like wires enclosed in a metal sheath with insulation in between.
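As a rough illustration of this proportionality (a sketch, not a calibration standard), the linear approximation for a platinum RTD can be written as follows; the coefficient is a typical value for industrial Pt100 elements:

```python
def rtd_resistance(temp_c, r0=100.0, alpha=0.00385):
    """Linear approximation R(T) = R0 * (1 + alpha * T) for a Pt100 RTD.

    r0    -- resistance at 0 degC (100 ohm for a Pt100)
    alpha -- typical temperature coefficient of platinum, ~0.00385 per degC
    """
    return r0 * (1.0 + alpha * temp_c)

print(rtd_resistance(0.0))    # 100.0 ohm at the ice point
print(rtd_resistance(100.0))  # 138.5 ohm at 100 degC
```

In practice RTD linearization uses the Callendar–Van Dusen equation; the linear form above is adequate only for illustration.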



Thermocouples

•Types of thermocouples & measuring ranges

•Comparison of EMF of various thermocouples

Thermocouples (Cont.)

•Principle: It is based on the thermoelectric effect, i.e. if two dissimilar metals are joined together and one of the junctions is heated, a current flows in the circuit; the amount of current produced depends on the difference in temperature between the two junctions and on the characteristics of the two metals.

Thermocouples (Cont.)

•Different Ranges are here as under:


Thermistors (thermally sensitive resistors)

•Thermistors are semiconductors made from a specific mixture of pure oxides of nickel, manganese, copper, cobalt, iron, magnesium, titanium and other metals, sintered at temperatures above 982 °C.

•Small size & fast response, high negative temperature coefficient
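A common way to model this negative-coefficient behaviour is the beta equation; the sketch below assumes illustrative values for a generic 10 kΩ NTC and is not tied to any particular device:

```python
import math

def ntc_resistance(temp_c, r25=10_000.0, beta=3950.0):
    """Beta-model resistance of an NTC thermistor:
    R(T) = R25 * exp(B * (1/T - 1/T25)), with T in kelvin.
    r25 and beta are illustrative values for a common 10 kohm NTC."""
    t = temp_c + 273.15
    t25 = 25.0 + 273.15
    return r25 * math.exp(beta * (1.0 / t - 1.0 / t25))

print(ntc_resistance(25.0))                          # 10000.0 ohm at the reference point
print(ntc_resistance(50.0) < ntc_resistance(25.0))   # True -- resistance falls as T rises
```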


Pyrometers

Non-contact measurements


Radiation Pyrometers

•Based upon measurement of the radiant energy emitted by the hot body

•Temperature measurements up to 4000 °C


Optical Pyrometers

•Based on comparing the intensity of radiation emitted by the unknown body with that of a source of known intensity.

•Temperature range: 600 to 3000 °C

Comparison among methods


Pressure

•Pressure is defined as the amount of force applied to a surface or distributed over it, and is measured as force per unit area.

•Pressure instruments usually refer to those used for measurement of the pressure exerted by a fluid or gas.

Pressure (Cont.)

•Unit of Pressure & interrelations

•Related terms

•Methods of measurement

•Comparison among the methods

Units of pressure & interrelations

Related terms

•Gauge pressure (Above Atmospheric)

•Vacuum (Below Atmospheric)

•Barometric (Standard atmospheric - 760 mmHg, 14.7 psi or approx. 1.013 bar)

•Absolute or Total (Gauge + Barometric)
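The relation between these terms can be sketched as a trivial helper (the function name is mine; 14.7 psi is the standard barometric pressure quoted above):

```python
def absolute_pressure(gauge_psi, barometric_psi=14.7):
    """Absolute (total) pressure = gauge pressure + barometric pressure."""
    return gauge_psi + barometric_psi

# A gauge reading of 30 psig corresponds to 44.7 psia:
print(absolute_pressure(30.0))  # 44.7
```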

Methods of Pressure Measurement

•Manometers

•Elastic pressure transducers

•Force balance

•Electrical pressure transducers


Manometers

•Water & mercury are the most widely used manometer fluids

•ΔP = ρ x g x h

ΔP = Differential pressure

ρ = Density of fluid

g = Acceleration due to gravity

h = Difference in fluid levels
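A minimal sketch of the manometer relation ΔP = ρgh in SI units (the mercury density used is approximate, and the function name is mine):

```python
def manometer_dp(height_m, density=13_560.0, g=9.81):
    """Differential pressure dP = density * g * h, in pascals.
    density defaults to mercury (~13560 kg/m3)."""
    return density * g * height_m

# A 100 mm mercury column:
print(manometer_dp(0.1))  # 13302.36 Pa, i.e. ~13.3 kPa
```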

Elastic Pressure transducers

•Bourdon tube

–Sealed at one end

–Made of Phosphor bronze, steel

–Mainly used in pressure gauges


•Diaphragms & Bellows

–Mainly used in pressure switches and also in low pressure gauges

Force balance pressure transducers

•Dead weight piston gauge

–Used for the measurement of higher, steady pressures

–For checking the instrument, the force produced on a piston of known area is measured directly by the weight it will support

Electrical pressure transducers

•Strain gauge type

•Capacitive type

Strain gauge type

–It is a passive resistance pressure transducer whose electrical resistance changes when it is stretched or compressed. It can be attached to a pressure sensing diaphragm.

Capacitive type

–Principle :

The capacitance of a capacitor varies when -

1) The area of the plates is changed (directly proportional)

2) The distance between the two plates is changed (inversely proportional)

3) The dielectric constant of the medium between the plates is changed (directly proportional)
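These three proportionalities follow from the parallel-plate formula C = ε0·εr·A/d; a small sketch with illustrative dimensions confirms, for example, the inverse dependence on plate spacing:

```python
EPS0 = 8.854e-12  # permittivity of free space, F/m

def plate_capacitance(area_m2, gap_m, eps_r=1.0):
    """Parallel-plate capacitance C = eps0 * eps_r * A / d -- proportional
    to plate area and dielectric constant, inversely proportional to gap."""
    return EPS0 * eps_r * area_m2 / gap_m

c1 = plate_capacitance(1e-4, 1e-3)    # 1 cm2 plates, 1 mm gap
c2 = plate_capacitance(1e-4, 0.5e-3)  # same plates, half the gap
print(c2 / c1)  # 2.0 -- halving the gap doubles the capacitance
```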

Comparison among Various methods(Pressure)


Flow

•Measurement of flow rate & quantity is the oldest of all measurements of process variables.

•Without flow measurements, plant material balancing, quality control and even the operation of any continuous process would be almost impossible

•Q = K·A·√(2gh)

(Q = Flow rate, K = Flow coefficient, A = Cross-sectional area of pipe / duct, h = Differential pressure head, g = Acceleration due to gravity)
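Assuming the head-type relation Q = K·A·√(2gh) with an illustrative flow coefficient, a sketch shows the square-root law: quadrupling the differential head only doubles the flow.

```python
import math

def flow_rate(area_m2, head_m, k=0.62, g=9.81):
    """Head-type flow equation Q = K * A * sqrt(2 * g * h).
    k is the flow coefficient of the restriction (illustrative value)."""
    return k * area_m2 * math.sqrt(2.0 * g * head_m)

q1 = flow_rate(0.01, 1.0)
q4 = flow_rate(0.01, 4.0)
print(q4 / q1)  # 2.0 -- square-root relationship between head and flow
```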

Flow (Cont.)

•Liquid Flow

–Variable area meters (Rotameters)
•Gas flow

–Annubar tubes

• Solid flow

–Impact type (Sankyo)


Variable Area Flow Meters (Rotameters)

•Most extensively used to measure up to 900 L/hr.

•The size of the restriction is adjusted by the amount necessary to keep the pressure differential constant when the flow rate changes, and the amount of adjustment required is proportional to the flow rate.

Annubar tubes

•Consists of two probes - one facing upstream, sensing velocity pressure, and another facing downstream, sensing static pressure. The difference between the two pressures gives the average velocity.

Impact scale solid flow meters

•Works on the principle of momentum & impact

•F_H = k (W/t) cos θ

(F_H = Horizontal force on the impact plate, W/t = Mass flow rate, θ = Angle of the impact plate, k = Constant)

•This force ultimately drives the core of an LVDT, the output of which is calibrated in terms of feed rate.


Level Measurement

–RF type

Independent but identical low-power RF signals are applied to the active & shield sections of the probe, whereas the reference ground of the electronics is connected to the vessel. The signal applied to the active section varies with the change of media between the probe & the vessel shell.


Weight Measurement

•Strain Gauge (Load cell)

•Linear Variable Differential Transformer (LVDT)

Strain Gauge

•It converts a mechanical displacement (which may be in terms of weight or so) into a change of resistance.

•The resistance change of the strain gauge is usually converted into a voltage by connecting four gauges as the arms of a Wheatstone bridge and applying excitation to the bridge.
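As an illustration (not a specific load-cell design), the output of a full bridge with four active gauges, adjacent arms strained in opposite senses, can be approximated as Vout ≈ Vex·GF·ε:

```python
def full_bridge_output(excitation_v, gauge_factor, strain):
    """Approximate output of a full Wheatstone bridge with four active
    strain gauges (adjacent arms in opposite senses): Vout = Vex * GF * strain."""
    return excitation_v * gauge_factor * strain

# 10 V excitation, gauge factor 2, 1000 microstrain:
vout = full_bridge_output(10.0, 2.0, 1000e-6)
print(vout * 1000)  # 20.0 mV
```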


Linear Variable Differential Transformer (LVDT)

•It converts linear motion into an electrical signal

•The construction of the LVDT is such that displacement of the core causes an increase in voltage in one secondary winding and a reduction in the other. The difference of the two voltages is the final output.


Speed Measurement

•Tachometer Generators

–AC Tachometer

–DC Tachometer

•Contactless Tachometers (Proximity type)

Tachometer Generators

•It is an electromechanical device that generates a voltage output proportional to shaft speed

•AC Tacho: similar to a two-phase induction motor

•DC Tacho: consists of a permanent magnet to provide the magnetic flux, and an output winding placed on the rotor

Contactless Tachometers

•It consists of a proximity sensor which, when energized, develops a magnetic flux; the flux drops off each time a tooth of a gear mounted on the equipment shaft (whose speed is to be measured) passes the sensor. The frequency of the voltage build-ups (and collapses) is linearly proportional to the RPM.
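Converting the pulse frequency to shaft speed is then a one-line calculation (one pulse per gear tooth is assumed; the function name is mine):

```python
def shaft_rpm(pulse_freq_hz, teeth):
    """Shaft speed from a gear-tooth proximity pickup:
    one pulse per tooth, so RPM = 60 * f / N."""
    return 60.0 * pulse_freq_hz / teeth

# A 60-tooth gear producing 1 kHz of pulses:
print(shaft_rpm(1000.0, 60))  # 1000.0 RPM
```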


Controllers

•What it does ?

•Basic Functions

•How it works ?

•Type of actions


What it does ?

•Most industrial processes require that certain variables such as flow, level, temperature and pressure remain at or near a reference value, called a set point.

•The device that serves to maintain a process variable value at the set point is called a controller.

Basic Functions

–to receive the actual measured value of the variable being controlled

–to compare the value with a reference or desired value (set point)

–to determine the magnitude and direction of any error or deviation

–to provide an output control signal as some function of the deviation, so as to reduce the deviation to zero or to a small value

How a Controller works ?

Type of Actions

•On-Off type

–e.g. Slide gates, Water spray valves, HSS pumps

•Continuous type (P, I, D)

–DC Drives, Damper actuators, Control valves (e.g.”Fisher”)

PID Actions

•Proportional (P) or Gain

m ∝ Kp × e

•Derivative (D) or Rate

m ∝ Td × de/dt

•Integral (I) or Reset

m ∝ (1/Ti) × ∫e dt   or   Ti × dm/dt ∝ e

( m = Output, Kp = Proportional gain, e = Error, Td = Derivative time, Ti = Integral time )
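The three actions above can be combined in a minimal discrete sketch; the gains, the parallel form m = Kp(e + (1/Ti)∫e dt + Td·de/dt), and the example numbers are illustrative, and a real controller would add output limits and anti-windup:

```python
class PID:
    """Minimal discrete PID controller: m = Kp * (e + (1/Ti) * integral(e)
    + Td * de/dt). Illustrative sketch only -- no output limiting or
    anti-windup, which practical controllers require."""

    def __init__(self, kp, ti, td):
        self.kp, self.ti, self.td = kp, ti, td
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement, dt):
        error = setpoint - measurement          # deviation e
        self.integral += error * dt             # accumulate integral of e
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * (error + self.integral / self.ti + self.td * derivative)

pid = PID(kp=2.0, ti=10.0, td=0.5)
# First step with a deviation of 10 units:
print(pid.step(setpoint=100.0, measurement=90.0, dt=1.0))  # 32.0
```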


•Measurement & controls are effective if and only if --

–due care is taken of the environmental conditions (temperature, vibration, dust, water & corrosive gases) at the measuring point as well as at the field instruments.

–the equipment / instruments are correctly calibrated and maintained.

–due care is taken by operating personnel / end users in understanding & handling the equipment / instruments.