I understand the advantages of 1000 ohm RTDs over 100 ohm RTDs, but what are the advantages of 100 ohm over 1000 ohm RTDs? I assume there are some, since 100 ohm RTDs are more popular. I am ordering some RTDs and want to make an informed decision about which type to buy.
100 ohm RTDs are more common and have less self-heating effect than 1000 ohm RTDs. With our dataloggers, we use a 10 kOhm resistor in the circuit to limit the self-heating current for either type.
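For a rough sense of scale, here is a minimal Python sketch of the power dissipated in each RTD when it sits in series with the 10 kOhm resistor. The 2.5 V excitation is an assumed example value, not a spec for any particular logger:

```python
# Back-of-the-envelope self-heating comparison for a 100 ohm vs 1000 ohm
# RTD in series with a 10 kOhm resistor across a fixed excitation voltage.
V_EX = 2.5         # excitation voltage (V) -- assumed example value
R_SERIES = 10_000  # series current-limiting resistor (ohms)

for r_rtd in (100.0, 1000.0):
    i = V_EX / (R_SERIES + r_rtd)  # loop current (A)
    p = i ** 2 * r_rtd             # power dissipated in the RTD element (W)
    print(f"{r_rtd:6.0f} ohm RTD: {i * 1e3:.3f} mA, {p * 1e6:.1f} uW dissipated")
```

With the same series resistor, the 1000 ohm element dissipates roughly eight times more power (about 52 uW vs 6 uW here), which is why it self-heats more.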
You don't have to worry as much about cable resistance with the higher-value RTDs, because the lead resistance is a much smaller fraction of the sensor's resistance.
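To see why, here is a minimal sketch of the worst-case (2-wire, uncompensated) error from lead resistance, assuming 1 ohm per conductor and the nominal IEC 60751 sensitivity of 0.385% of R0 per degree C:

```python
# Approximate temperature error from lead resistance if the leads are
# not compensated. 1 ohm per conductor is an assumed example value.
R_LEAD = 1.0  # resistance of one lead conductor (ohms) -- assumed

for r0 in (100.0, 1000.0):
    sensitivity = 0.00385 * r0        # ohms per degree C near 0 C
    err_c = 2 * R_LEAD / sensitivity  # both conductors are in the loop
    print(f"{r0:6.0f} ohm RTD: ~{err_c:.2f} C error from 2 ohm of leads")
```

A 3-wire half bridge cancels most of this, but any residual from mismatched leads scales the same way, so the 1000 ohm RTD is about ten times less sensitive to cabling.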
Is the 10 kOhm resistor built into the datalogger, or is it applied externally via a TIM (terminal input module) or a homemade circuit? What is CSI's authoritative documentation for using 3-wire and 4-wire RTDs with CSI dataloggers?
The resistor is in the TIM module. The manuals for the TIM modules are the best documentation for measuring a PRT on the datalogger.
https://www.campbellsci.com/3whb10k
https://www.campbellsci.com/4wpb1k