How a Digital Multimeter Works

Digital multimeters, DMMs, have been available for many years, yet it can still be difficult to find information about how a digital multimeter works.

The operation of a digital multimeter is relatively straightforward, although there are naturally differences in how digital multimeters from different manufacturers work.

Despite this, there are many similarities and some general principles of how digital multimeters work.

Basics of how a Digital Multimeter works

The key process that occurs within a digital multimeter for any measurement that takes place is that of voltage measurement. All other measurements are derived from this basic measurement.

Accordingly the key to understanding how a digital multimeter works is in understanding this process.

There are many forms of analogue to digital converter, ADC, but the one most widely used in digital multimeters, DMMs, is the successive approximation register, or SAR. Some SAR ADCs have resolutions of only 12 bits, but those used in test equipment including DMMs generally have 16 bits or more, dependent upon the application. Typically DMMs use 16-bit resolution, with sampling speeds of around 100k samples per second. These speeds are more than adequate for most DMM applications, where high speed is not normally required.
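To put these resolution figures in perspective, the smallest voltage step an n-bit converter can resolve is simply the full-scale range divided by 2^n. The sketch below illustrates this; the 2 V full-scale range is an illustrative assumption, not the specification of any particular DMM.

```python
def adc_resolution(full_scale_volts, bits):
    """Return the smallest voltage step (volts per LSB) an n-bit ADC can resolve."""
    return full_scale_volts / (2 ** bits)

# A 16-bit ADC over an assumed 2 V range resolves steps of about 30.5 microvolts.
print(adc_resolution(2.0, 16))
```

This is why a 16-bit converter is a comfortable fit for a typical handheld or bench DMM display, which shows far fewer distinct counts than the converter can produce.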

As the name implies, the successive approximation register ADC operates by successively homing in on the value of the incoming voltage.

The first stage of the process is for the sample and hold circuit to sample the voltage at the input of the DMM and then to hold it steady.

With a steady input voltage, the register starts at half its full-scale value. This typically means the most significant bit, MSB, is set to "1" and all the remaining bits are set to "0". Assuming the input voltage could be anywhere in the range, starting in the middle gives a faster settling time, as the converter only ever has to move a maximum of half the full scale rather than possibly the whole of it.

To see how it works, take the simple example of a 4-bit SAR. Its output starts at 1000. If the input voltage is less than half the maximum capability, the comparator output will be low and the register will move to 0100. If the voltage is above this level, the register will move to 0110, and so forth until it homes in on the nearest value.

It can be seen that SAR converters need one approximation cycle for each output bit, i.e. an n-bit ADC requires n cycles.
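The bit-by-bit homing-in described above can be sketched in a few lines of code. This is a minimal simulation of the successive approximation principle, assuming an ideal comparator and DAC; the voltages used are illustrative only.

```python
def sar_convert(v_in, v_ref, bits=4):
    """Successive approximation: decide one bit per cycle, MSB first."""
    code = 0
    for i in reversed(range(bits)):
        trial = code | (1 << i)              # tentatively set this bit
        # The internal DAC generates the voltage for the trial code; the
        # comparator keeps the bit only if the input is at or above it.
        if v_in >= trial * v_ref / (2 ** bits):
            code = trial
    return code

# The 4-bit example from the text: an input below half scale drops the MSB
# (register 0100), then moves up to 0110 as the lower bits are tested.
print(format(sar_convert(0.40, 1.0, bits=4), "04b"))  # prints 0110
```

Note that the loop runs exactly once per bit, matching the observation that an n-bit SAR needs n approximation cycles.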

Digital Multimeter operation

Although the analogue to digital converter forms the key element within the instrument, in order to fully understand how a digital multimeter works, it is necessary to look at some of the other functions around the ADC.

Although the ADC takes very many samples, the overall digital multimeter will not display or return every sample taken. Instead the samples are buffered and averaged to achieve high accuracy and resolution. This overcomes the effects of small variations such as noise; noise created by the analogue first stages of the DMM is an important factor that must be overcome to achieve the highest accuracy.
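The buffering and averaging step can be sketched as follows. The sample values are made-up illustrative readings around an assumed 1.5 V input, not data from any real instrument.

```python
import statistics

def averaged_reading(samples):
    """Average a buffer of ADC samples to suppress random noise."""
    return statistics.mean(samples)

# Noisy samples scattered around a true 1.5 V input (illustrative values).
buffer = [1.502, 1.498, 1.501, 1.499, 1.500]
print(averaged_reading(buffer))
```

Because random noise tends to cancel across samples, the averaged figure is closer to the true input than most individual samples, which is exactly why the DMM does not report every raw conversion.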

[Flow diagram: operation of a DMM]

Measurement time

One of the key areas in understanding how a digital multimeter works is the measurement time. Apart from the basic measurement, a number of other functions are required, and these all take time. Accordingly, the measurement time of a digital multimeter, DMM, may not always appear straightforward.

The overall measurement time for a DMM is made up from several phases where different activities occur:

  • Switch time:   The switch time is the time required for the instrument to settle after the input has been switched. This includes the time to settle after a measurement type has been changed, e.g. from voltage to resistance, etc. It also includes time to settle after the range has been changed. If auto-ranging is included the meter will need to settle if a range change is required.

  • Settling time:   Once the value to be measured has been applied to the input, a certain time will be required for it to settle. This allows any input capacitance to charge when high-impedance tests are made, and more generally gives the circuit and instrument time to settle.

  • Signal measurement time:   This is the basic time required to make the measurement itself. For AC measurements, the frequency of operation must be taken into account because the minimum signal measurement time is based on the minimum frequency required of the measurement. For example, an aperture of four times the period is typically required, i.e. 80 ms for a 50 Hz signal, or 67 ms for a 60 Hz signal, etc.

  • Auto-zero time:   When autorange is selected, or range changes are made, it is necessary to zero the meter to ensure accuracy. Once the correct range is selected, the auto-zero is performed for that range.

  • ADC calibration time:   In some DMMs a calibration is periodically performed. This must be accounted for, especially where measurements are taken under automatic or computer control.
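The phases above simply add up to give the time for one complete reading. The sketch below illustrates this; the individual durations are illustrative assumptions, as real values are instrument-specific and come from the manufacturer's data sheet.

```python
def total_measurement_time(switch, settle, signal, auto_zero=0.0, cal=0.0):
    """Sum the phases (in seconds) that make up one DMM reading."""
    return switch + settle + signal + auto_zero + cal

# AC measurement at 50 Hz: aperture of four periods = 4 * (1/50) = 80 ms.
signal_time = 4 * (1 / 50)

# Assumed phase durations: 10 ms switch, 5 ms settling, 2 ms auto-zero.
print(total_measurement_time(0.010, 0.005, signal_time, auto_zero=0.002))
```

This kind of budget matters most under automatic or computer control, where the controller must wait for the full reading time, not just the signal measurement time, before fetching a result.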

It is always useful to know how a digital multimeter works in order to make the best use of it and obtain the most accurate measurements. However, it should be remembered that multimeters from different manufacturers may work in different ways. It is therefore always helpful to consult the manufacturer's instructions to understand how a particular digital multimeter works.