Soil thermal resistivity testing measures the ability of the ground to conduct and dissipate heat. An accurate understanding of the thermal properties of a soil, or of a layer of made ground, is important in the design and installation of underground pipelines and power transmission cables, helping to avoid premature failures. The heat produced by current flowing through an underground power cable must be properly dissipated.

The thermal resistivity of the soil determines whether a buried power cable stays cool or overheats. A build-up of heat around the cable can reduce transmission efficiency or, in the worst cases, cause the cable to melt. Potential problems can be identified by measuring the thermal resistivity of the in-situ soil. Remedial measures include changing the capacity or insulation of the cables, or installing a corrective thermal backfill in the cable trench.
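To illustrate how soil thermal resistivity feeds into cable heating, the sketch below estimates the steady-state temperature rise of a single buried cable from its I²R losses and the external thermal resistance of the soil, using the Kennelly approximation familiar from IEC 60287-style ampacity calculations. All numerical values and variable names are hypothetical, chosen only to show the form of the calculation; a real rating study would also include the internal thermal resistances of the cable and any mutual heating between circuits.

```python
import math

# Illustrative values only (hypothetical cable and soil, not from the text)
rho_soil = 1.2         # soil thermal resistivity, C.m/W
burial_depth = 0.9     # m, depth to cable axis
cable_diameter = 0.05  # m, external cable diameter
current = 400.0        # A, load current
r_ac = 8e-5            # ohm/m, AC conductor resistance at operating temperature

# Heat generated per metre of cable (conductor I^2.R losses only)
heat_per_metre = current ** 2 * r_ac  # W/m

# External thermal resistance of the soil around a single isolated cable
# (Kennelly approximation, valid when burial depth >> cable diameter)
u = 2 * burial_depth / cable_diameter
t_soil = (rho_soil / (2 * math.pi)) * math.log(u + math.sqrt(u ** 2 - 1))  # C.m/W

# Conductor temperature rise above ambient soil due to the soil alone
delta_t = heat_per_metre * t_soil
print(f"Losses: {heat_per_metre:.1f} W/m, "
      f"soil resistance: {t_soil:.2f} C.m/W, rise: {delta_t:.1f} C")
```

Doubling the soil resistivity in this sketch doubles the temperature rise, which is why a poorly conducting native soil may force a lower cable rating or a thermal backfill.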
Above: In-situ thermal resistivity testing with a needle probe.

Typical Applications
- Assists the design and layout of underground pipelines
- Prevents heat build-up around power transmission cables
- Measures the ground's heat-dissipation properties for a pipeline installation
- Calculates the optimum cable specification for the local ground conditions
- Determines whether in-situ ground must be replaced by a cable-bedding thermal backfill
Survey Operation
Soil thermal resistivity testing is normally carried out before a cable is actually laid. Ideally, the testing should be done at the proposed cover depth of the installation. Transient line heat source methods have been used to measure the thermal resistivity of porous materials for many years. Typically, a handheld instrument is used, comprising a needle probe with a heater and temperature sensor inside, conforming to the specifications of IEEE Std 442 and ASTM D5334.
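For context, the standard long-time approximation of the line heat source solution, which underlies the analysis described below, relates the temperature rise of the needle to the logarithm of elapsed time. The symbols here are introduced for illustration and do not appear in the original text:

\[
\Delta T(t) \;\approx\; \frac{q}{4\pi\lambda}\,\ln t \;+\; C,
\qquad
\rho \;=\; \frac{1}{\lambda} \;=\; \frac{4\pi S}{q},
\]

where q is the heater power per unit length of needle (W/m), λ is the soil thermal conductivity, C is a constant, S is the slope of temperature plotted against ln t, and ρ is the thermal resistivity.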
The probe is inserted into the ground to a specified depth and a current is passed through the heater. The system monitors the temperature of the sensor over a period of time, and analysis of the temperature record determines the thermal resistivity of the material surrounding the probe. Heating times are kept as short as possible, to minimise thermally induced water movement and reduce reading times. Temperatures can be measured to an accuracy of one thousandth of a degree. The final results are given in units of °C·m/W, where °C = degrees Celsius, m = metres and W = watts.
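As a minimal sketch of how such an analysis can be implemented, the Python snippet below fits the late-time portion of a heating run against ln(time) and converts the slope to a resistivity, following the line heat source relation above. The function name, cut-off choice, power level and synthetic data are hypothetical; commercial instruments apply their own corrections, for example for the finite probe radius and ambient drift.

```python
import numpy as np

def thermal_resistivity(times_s, temps_c, heater_power_w_per_m):
    """Estimate soil thermal resistivity (C.m/W) from a needle-probe heating run.

    Fits the late-time temperatures against ln(time); the slope S gives
    resistivity via rho = 4*pi*S / q, per the line heat source model.
    """
    times = np.asarray(times_s, dtype=float)
    temps = np.asarray(temps_c, dtype=float)

    # Discard the early transient, where the finite probe radius distorts the fit
    mask = times > times.max() / 10.0
    slope, _intercept = np.polyfit(np.log(times[mask]), temps[mask], 1)

    return 4.0 * np.pi * slope / heater_power_w_per_m

# Hypothetical check: synthetic data for a soil of resistivity 1.5 C.m/W
q = 5.0                                   # heater power, W/m
t = np.linspace(10, 300, 60)              # measurement times, s
dT = (q * 1.5 / (4 * np.pi)) * np.log(t)  # idealised temperature rise
print(f"Recovered resistivity: {thermal_resistivity(t, 20.0 + dT, q):.2f} C.m/W")
```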