E4B02 - What is the significance of voltmeter sensitivity expressed in ohms per volt?

Question

What is the significance of voltmeter sensitivity expressed in ohms per volt?

Answer Options

  • A) The full scale reading of the voltmeter multiplied by its ohms per volt rating is the input impedance of the voltmeter
  • B) The reading in volts multiplied by the ohms per volt rating will determine the power drawn by the device under test
  • C) The reading in ohms divided by the ohms per volt rating will determine the voltage applied to the circuit
  • D) The full scale reading in amps divided by ohms per volt rating will determine the size of shunt needed

Correct Answer: A


Explanation

The term ‘ohms per volt’ is an older measure of sensitivity used specifically for analog (moving coil) voltmeters, and it determines the meter's DC input impedance. Numerically, it is the reciprocal of the current required for full-scale deflection of the meter movement; for example, a 20,000 ohms-per-volt meter uses a 50 µA movement. Sensitivity matters because a voltmeter should draw as little current as possible so that it does not significantly alter the voltage it is trying to measure in the circuit under test.

In a moving coil meter, the full-scale voltage for a given range (e.g., 10 V) multiplied by the meter’s ohms-per-volt rating determines the total input resistance for that range. Therefore, the full scale reading of the voltmeter multiplied by its ohms per volt rating is the input impedance of the voltmeter on that specific range. For example, a 20,000 ohms-per-volt meter on the 10 V range has an input impedance of 20,000 × 10 = 200,000 ohms.
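The calculation above can be sketched in a few lines of Python (the function name is just an illustration, not from any standard library):

```python
def input_impedance_ohms(ohms_per_volt: float, full_scale_volts: float) -> float:
    """Input impedance of an analog voltmeter on a given range:
    impedance = sensitivity (ohms/volt) x full-scale voltage of the range."""
    return ohms_per_volt * full_scale_volts


# The worked example: a 20,000 ohms-per-volt meter on the 10 V range.
print(input_impedance_ohms(20_000, 10))  # 200000.0 ohms

# The same meter on a higher range presents a higher impedance,
# which is why analog meters load the circuit less on high ranges.
print(input_impedance_ohms(20_000, 100))  # 2000000.0 ohms
```

Note that this per-range impedance is what distinguishes analog meters from modern digital multimeters, whose input impedance (typically 10 MΩ) is fixed regardless of range.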
