RTDs are based on the principle that the resistance of a metal increases with temperature. The temperature coefficient of resistance (TCR) of a resistance temperature detector, denoted by α, is normally defined as the average resistance change per °C over the range 0 °C to 100 °C, divided by the resistance of the RTD at 0 °C, R_{0}:

α = (R_{100} − R_{0}) / (100 × R_{0}) ……. Equation (1)

where,

R_{0} = resistance of the RTD at 0 °C (ohm), and

R_{100} = resistance of the RTD at 100 °C (ohm).

Note: This discussion applies to the PT100 RTD only.

As a first approximation, the relationship between resistance and temperature may then be expressed as (see Figure 2):

R_{t} = R_{0}(1 + αt) ……. Equation (2)

where:

R_{t} = resistance of the RTD at temperature t (ohm),

R_{0} = resistance of the RTD at 0 °C (ohm), and

α = temperature coefficient of resistance (TCR) at 0 °C (per °C)

**Example**

**A platinum PT100 RTD measures 100 Ω at 0 °C and 139.1 Ω at 100 °C.**

1. **Calculate the TCR (α) for platinum.**
2. **Calculate the resistance of the RTD at 50 °C.**
3. **Calculate the temperature when the resistance is 110 Ω.**

**Calculate the Temperature Coefficient of RTD PT100**

From Equation – 1 :

α = (R_{100} − R_{0}) / (100 × R_{0}) = (139.1 − 100) / (100 × 100) = 0.00391 per °C
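As a quick sanity check, this TCR calculation can be reproduced in a few lines of Python (the variable names are just for illustration; the resistance values are those of the example above):

```python
# Compute the TCR (alpha) of a PT100 from its 0 °C and 100 °C resistances
# using Equation 1: alpha = (R100 - R0) / (100 * R0)
R0 = 100.0    # resistance at 0 °C (ohm)
R100 = 139.1  # resistance at 100 °C (ohm)

alpha = (R100 - R0) / (100.0 * R0)  # average change per °C, normalized to R0
print(round(alpha, 5))  # 0.00391
```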

**Calculate the resistance of the RTD at 50 °C**

From Equation – 2 :

R_{50} = R_{0}(1 + αt) = 100(1 + 0.00391 × 50) = 119.55 Ω
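The same step can be sketched as a small Python function built directly on Equation 2 (the function name is a hypothetical helper, not part of any library):

```python
# Approximate PT100 resistance at temperature t using Equation 2:
# Rt = R0 * (1 + alpha * t)
R0 = 100.0       # resistance at 0 °C (ohm)
alpha = 0.00391  # TCR from the example (per °C)

def rtd_resistance(t):
    """Return the approximate PT100 resistance (ohm) at temperature t (°C)."""
    return R0 * (1 + alpha * t)

print(round(rtd_resistance(50), 2))  # 119.55
```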

**Calculate the temperature when the resistance is 110 ohms**

From Equation – 2:

R_{t} = R_{0}(1 + αt) ⇒ 110 = 100(1 + 0.00391t)

1 + 0.00391t = 1.1 ⇒ 0.00391t = 0.1 ⇒ t = 0.1 / 0.00391 = 25.58 °C
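Converting the other way, from a measured resistance to temperature, is just Equation 2 rearranged for t. A minimal Python sketch (the helper name is illustrative only):

```python
# Invert Equation 2, Rt = R0 * (1 + alpha * t), to get:
# t = (Rt / R0 - 1) / alpha
R0 = 100.0       # resistance at 0 °C (ohm)
alpha = 0.00391  # TCR from the example (per °C)

def rtd_temperature(Rt):
    """Return the approximate temperature (°C) for a measured PT100 resistance (ohm)."""
    return (Rt / R0 - 1) / alpha

print(round(rtd_temperature(110.0), 2))  # 25.58
```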

