Accuracy refers to the maximum allowable error under the specified operating conditions. In other words, accuracy indicates how close the digital multimeter's reading is to the actual value of the signal being measured.
For a digital multimeter, accuracy is usually expressed as a percentage of the reading. For example, a reading accuracy of 1% means that when the multimeter displays 100.0V, the actual voltage may lie anywhere between 99.0V and 101.0V.
The detailed specifications may add a fixed number of counts to this basic accuracy. This number indicates how many steps of the rightmost digit on the display are added to the percentage error. In the previous example, the accuracy might be stated as ±(1% + 2). Since the last digit of a 100.0V reading represents 0.1V, the 2 counts correspond to 0.2V. Therefore, if the meter reads 100.0V, the actual voltage lies between 100.0 - (100 × 1% + 0.2) and 100.0 + (100 × 1% + 0.2), that is, between 98.8V and 101.2V.
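The calculation above follows a simple pattern: percentage of the reading plus counts times the range resolution. As a minimal sketch, the function below shows that arithmetic; the name accuracy_bounds and its parameters are illustrative only and do not come from any multimeter vendor's documentation.

```python
def accuracy_bounds(reading, percent, counts, resolution):
    """Return (low, high) limits for a reading, given a spec of
    ±(percent% of reading + counts), where one count equals the
    resolution (last-digit step) of the selected range."""
    error = reading * percent / 100.0 + counts * resolution
    return reading - error, reading + error

# ±(1% + 2) spec, 100.0 V displayed, 0.1 V last-digit resolution
low, high = accuracy_bounds(100.0, 1.0, 2, 0.1)
print(f"{low:.1f} V to {high:.1f} V")   # 98.8 V to 101.2 V
```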
For example, the instruction manual of a three-and-a-half-digit multimeter states the accuracy of the 2V DC voltage range as ±(0.5% + 1). The maximum display on the 2V range of a three-and-a-half-digit meter is 1.999, so the 1 in parentheses corresponds to 0.001V. If the voltage measured across the terminals of an AAA dry cell (Chinese size 7 battery) is 1.755V, the possible range of the true terminal voltage is 1.755V ± (1.755V × 0.5% + 0.001V), which is approximately 1.755V ± 0.010V, so the true value lies between 1.745V and 1.765V.
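Applying the same accuracy_bounds sketch from above to this battery example (again, the numbers are from the worked example, the code itself is only illustrative):

```python
# ±(0.5% + 1) on the 2 V range (resolution 0.001 V), reading 1.755 V
low, high = accuracy_bounds(1.755, 0.5, 1, 0.001)
print(f"{low:.3f} V to {high:.3f} V")   # about 1.745 V to 1.765 V
```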