
Why does only the amps accuracy degrade?


For loads less than 60 watts, the current and power factor displays will have lower accuracy. However, the wattage and other displays will still be within 1.5% plus 3 display counts. For instance, with a 1 watt load the meter should display 1.0 +/- 0.3 watts (1.5% of 1.0 watt rounds to 0.0 watts at the display resolution, plus 3 counts of 0.1 watt = 0.3 watts, for a total of 0.3 watts), i.e. between 0.7 and 1.3 watts. However, the meter will likely display 0.010 amps, which could be off considerably if the load has a low power factor and the actual current is 0.040 amps. The amps lose accuracy even though the wattage does not because we use a power metering chip that is specialized for watts: watts are integrated over 1-second intervals, which averages out some of the error, whereas amps are discrete RMS values taken from single sine-wave cycles.
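To make the tolerance arithmetic and the power-factor effect concrete, here is a minimal sketch. The 0.1 watt display resolution, the 120 V line voltage, and the 0.2 power factor are assumptions chosen for illustration only; they are not specifications taken from the meter's documentation.

```python
# Sketch of the accuracy arithmetic above. The 0.1 W display resolution
# and the 120 V line voltage are illustrative assumptions, not values
# taken from the meter's specifications.

def watt_tolerance(load_w, pct=0.015, counts=3, resolution_w=0.1):
    """Tolerance band for the wattage display: percentage error plus counts."""
    pct_error = round(load_w * pct / resolution_w) * resolution_w  # 1.5% of 1 W rounds to 0.0 W
    count_error = counts * resolution_w                            # 3 counts of 0.1 W = 0.3 W
    return pct_error + count_error

def current_from_watts(watts, power_factor, volts=120.0):
    """True RMS current implied by a wattage: I = P / (V * PF)."""
    return watts / (volts * power_factor)

tol = watt_tolerance(1.0)
print(f"A 1 W load should display between {1.0 - tol:.1f} and {1.0 + tol:.1f} W")

# At unity power factor, 1 W at 120 V is roughly 0.008 A; at a power
# factor of 0.2 the same 1 W draws roughly 0.042 A -- close to the
# 0.040 A in the example, even though a watts-only estimate would
# suggest only about 0.010 A.
print(f"PF 1.0: {current_from_watts(1.0, 1.0):.3f} A")
print(f"PF 0.2: {current_from_watts(1.0, 0.2):.3f} A")
```

The point of the second function is simply that current scales as 1 / power factor for a fixed wattage, which is why a low-power-factor load can draw several times the current that the displayed watts alone would suggest.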

