How does changing the gate resistor affect the performance of a voltage mode detector?
Reducing the gate resistor in an uncompensated voltage-mode detector is a convenient way to improve the stability of the offset voltage during temperature ramps, so it is often used when a thermally compensated detector would be too expensive. The trade-off is noise: the detectivity scales with sqrt(R), so lowering the gate resistor increases the relative noise in proportion to 1/sqrt(R). As a rule of thumb, adding thermal compensation reduces a detector's detectivity to about 70% of the uncompensated value. To reach comparable offset stability without compensation, the gate resistor must instead be reduced to about 1/16 of its original value (e.g. 5 GOhm instead of 82 GOhm), which cuts the detectivity to sqrt(1/16) = 1/4. In terms of detectivity, thermal compensation is therefore about 3 times better (0.70 vs. 0.25) than reducing the gate resistor.
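As a back-of-the-envelope check, the comparison above can be worked through numerically. The short Python sketch below assumes the sqrt(R) detectivity scaling stated above and uses the 82 GOhm and 5 GOhm values from the example; the 70% figure for a compensated detector is the rule of thumb quoted in the answer, not a measured value.

```python
import math

# Rule-of-thumb comparison: thermal compensation vs. reduced gate resistor.
# Assumption (from the answer above): detectivity D* scales with sqrt(R_gate),
# since signal responsivity ~ R while Johnson noise voltage ~ sqrt(R).

R_FULL = 82e9   # original gate resistor (82 GOhm, from the example)
R_SMALL = 5e9   # reduced gate resistor (5 GOhm, roughly R_FULL / 16)

# Detectivity of the resistor-reduced detector relative to the original:
d_reduced = math.sqrt(R_SMALL / R_FULL)   # ~ sqrt(1/16) = 0.25

# Rule of thumb: compensation costs ~30% of detectivity.
d_compensated = 0.70

print(f"reduced-resistor detectivity : {d_reduced:.2f} of original")
print(f"compensated detectivity      : {d_compensated:.2f} of original")
print(f"compensation advantage       : {d_compensated / d_reduced:.1f}x")
# -> roughly a factor of 3, matching the rule of thumb in the answer.
```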
Related Questions
- CalFUSE gives warnings about the detector voltage changing during an exposure, but the data look OK to me. Is this something that I should be worried about?
- What are the required resistor values for a voltage divider such that a 12 V input yields a 5 V output?