What is an Electrometer?
Essentially, an electrometer is a highly refined digital multimeter (DMM). Electrometers can be used for virtually any measurement task that a conventional DMM can perform, and they offer the advantages of very high input resistance when used as voltmeters and ultra-low current sensitivity with low voltage burden when used as ammeters. In these respects, electrometers outperform DMMs by three to eight orders of magnitude. That makes them the instruments of choice for measuring voltage from high-impedance sources or current from low-impedance sources (i.e., signals from non-ideal sources). Electrometers can also measure charge directly.
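To see why input resistance matters so much for voltmeter use, consider the voltage divider formed by the source resistance and the meter's input resistance. The short Python sketch below compares a typical DMM with an electrometer; the specific values (a 1 V source with 1 GΩ source resistance, a 10 MΩ DMM input, a 100 TΩ electrometer input) are illustrative assumptions, not specifications for any particular instrument.

```python
def measured_voltage(v_source, r_source, r_input):
    """Voltage a meter actually reads: the source resistance and the
    meter's input resistance form a divider that loads the source."""
    return v_source * r_input / (r_input + r_source)

v_source = 1.0   # volts: the true open-circuit voltage (assumed)
r_source = 1e9   # 1 GOhm: a high-impedance (non-ideal) voltage source

for name, r_input in [("DMM (10 MOhm input)", 10e6),
                      ("Electrometer (100 TOhm input)", 100e12)]:
    v = measured_voltage(v_source, r_source, r_input)
    error_pct = 100 * (v_source - v) / v_source
    print(f"{name}: reads {v:.6f} V, loading error {error_pct:.4f}%")
```

With these assumed values, the DMM reads only about 0.01 V of the true 1 V (a loading error of roughly 99%), while the electrometer's error is on the order of 0.001%. That is the practical meaning of the orders-of-magnitude advantage described above.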
More broadly, an electrometer is any device used to measure electric charge or potential difference. Available in a variety of designs, the instrument is essential for finding the voltage between two points in an electrical circuit. It can also be used to study the electromagnetic interactions of subatomic particles.

The first electrometers were developed in the 1700s by Alessandro Volta and Abraham Bennet. Bennet's device featured an electrode connected to two pieces of gold foil. The electrode was charged either through direct contact or by induction, and the gold foil pieces would repel each other, indicating the presence of an electric charge. The measurements were very crude, and the device needed to be surrounded by shielding to prevent the charge from leaking away. A number of new designs have been developed over the course of time.

The most common use for electrometers today is recording ionizing radiation in the field of nuclear physics. One familiar device that utilizes an electrometer is the personal dosimeter worn to monitor radiation exposure.