Obtain a few resistors with known resistance values. If your resistance meter is a standard model, which typically measures up to kilohms (1,000 ohms) and megohms (1 million ohms), you will need about five resistors that range from 10 ohms to 1 million ohms.
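One convenient way to choose the values is to space them a decade apart, so each of the meter's settings gets exercised. A minimal Python sketch, assuming decade spacing (the exact values are illustrative, not required):

```python
# Generate decade-spaced nominal test values from 10 ohms to 1 megohm.
# Any set of resistors spanning the meter's ranges works just as well.
test_values_ohms = [10 ** k for k in range(1, 7)]  # 10, 100, ..., 1,000,000
for value in test_values_ohms:
    print(f"{value:>9,} ohms")
```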
Turn on your resistance meter, also called an ohmmeter, and switch it to the lowest setting. Most ohmmeters have different modes for different orders of magnitude. For example, you may have to flick a switch or turn a knob to change it from the mode in which it measures hundreds of ohms to the mode in which it measures thousands of ohms.
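Because the right mode depends only on the order of magnitude of the resistance you expect, the choice can be written as a simple rule. This sketch assumes a meter with three range positions labeled ohms, kilohms, and megohms; actual dial labels and cutoffs vary by meter:

```python
def suggest_range(nominal_ohms: float) -> str:
    """Pick a range position for a nominal resistance.
    Labels and cutoffs are hypothetical; check your meter's dial."""
    if nominal_ohms < 1_000:
        return "ohms"
    if nominal_ohms < 1_000_000:
        return "kilohms"
    return "megohms"

print(suggest_range(18))         # -> ohms
print(suggest_range(4_700))      # -> kilohms
print(suggest_range(2_200_000))  # -> megohms
```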
Plug the test leads into the correct holes. The test leads usually have plugs that fit into the ohmmeter on one end, with metal probes on the other end that you touch to a component or circuit to measure its resistance. These holes vary between ohmmeters, so consult the manufacturer’s manual for the correct way to plug in the leads.
Place the resistor with the lowest value flat on the table in front of the ohmmeter. The resistor should lie horizontally, so that the wire coming out of each end is parallel to the bottom of the ohmmeter.
Touch each wire coming out of the resistor with one of the probes and read the resistance value on the ohmmeter’s screen. This value should be close to the resistor’s known value; small deviations are expected, since resistors have a rated tolerance (commonly ±5%) and the meter has its own accuracy limits. For example, if you are testing an 18-ohm resistor and your meter reads 17 ohms, your meter is probably functioning correctly. However, if your meter reads an 18-ohm resistor as 58 ohms, it needs to be recalibrated.
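To make "close to the known value" concrete, you can compute the percent error of each reading. A small sketch, assuming a 10% pass/fail threshold (an arbitrary cutoff for illustration, not a manufacturer specification):

```python
def percent_error(nominal: float, measured: float) -> float:
    """Percent difference between measured and nominal resistance."""
    return abs(measured - nominal) / nominal * 100

ALLOWED_PERCENT = 10  # assumed threshold, not a standard
# Hypothetical readings from the 18-ohm example above.
for measured in (17, 58):
    err = percent_error(18, measured)
    verdict = "OK" if err <= ALLOWED_PERCENT else "recalibrate"
    print(f"18-ohm resistor read as {measured} ohms: {err:.1f}% error -> {verdict}")
```

The 17-ohm reading comes out around 5.6% error, while 58 ohms is over 200%, matching the pass/fail calls in the step above.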
Turn the knob on the ohmmeter to the next-highest setting (for example, from ohms to kilohms). Then repeat Steps 4 and 5 with a resistor of the next-higher value to see how accurate your resistance meter is at reading that value. Testing the meter at each setting with a matching resistor verifies whether it is accurate across its whole range.
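If you jot down the reading for each test resistor, the whole check can be summarized in one pass. A sketch using made-up example readings and the same assumed 10% threshold as above:

```python
# Hypothetical (nominal, measured) readings in ohms, one per test resistor.
readings = [
    (10, 10.2),
    (100, 99),
    (1_000, 1_010),
    (10_000, 9_950),
    (100_000, 101_200),
    (1_000_000, 998_000),
]

ALLOWED_PERCENT = 10  # assumed threshold, as in the previous sketch
for nominal, measured in readings:
    err = abs(measured - nominal) / nominal * 100
    status = "OK" if err <= ALLOWED_PERCENT else "RECALIBRATE"
    print(f"{nominal:>9,} ohms: read {measured:>11,.1f} -> {err:5.2f}% {status}")
```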