Why does only the amps accuracy degrade?
For loads under 60 watts, the current and power factor displays will have lower accuracy. However, the wattage and other displays will still be within 1.5%, plus 3 display counts. For instance, for a 1 watt load the meter should display 1.0 +/- 0.3 watts (1.5% of 1 watt rounds to 0.0 watts at the display's 0.1 watt resolution, plus 3 counts = 0.3 watts, for a total of 0.3 watts), i.e. between 0.7 and 1.3 watts. However, the meter will likely display 0.010 amps, which could be off considerably if the load has a low power factor and the actual current is, say, 0.040 amps. The amps lose accuracy even though the wattage does not because we use a power metering chip that is specialized for watts. Watts are integrated over 1 second intervals, which averages out some of the error, whereas amps are computed as discrete RMS values over individual sine-wave cycles.
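The arithmetic above can be sketched out. This is an illustrative calculation only: the 120 V line voltage and the 0.25 power factor are assumed values, not figures from the meter's specification, and the 0.1 watt display resolution is inferred from the "3 counts = 0.3 watts" example.

```python
# Tolerance band for the wattage display: 1.5% of reading, rounded to the
# display resolution, plus 3 counts (assumed 0.1 W per count).
RESOLUTION_W = 0.1
load_w = 1.0
pct_term = round(0.015 * load_w / RESOLUTION_W) * RESOLUTION_W  # rounds to 0.0 W
count_term = 3 * RESOLUTION_W                                   # 0.3 W
tolerance_w = pct_term + count_term
print(f"display range: {load_w - tolerance_w:.1f} to {load_w + tolerance_w:.1f} W")

# Why a low power factor throws the amps off: for the same real power,
# actual RMS current is I = P / (V * PF), not I = P / V.
V_RMS = 120.0   # assumed line voltage
PF = 0.25       # hypothetical low power factor
unity_pf_amps = load_w / V_RMS         # ~0.008 A, shown as 0.010 A
actual_amps = load_w / (V_RMS * PF)    # ~0.033 A, several counts higher
print(f"unity-PF estimate: {unity_pf_amps:.3f} A, actual: {actual_amps:.3f} A")
```

With these assumed numbers, a 1 watt load at a 0.25 power factor draws roughly four times the current that the wattage alone would suggest, which is why the amps reading can be far off while the watts stay in spec.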