A current test involves setting the multimeter to measure AC or DC current, plugging in the test leads and touching the probes to test points on the circuit. For an accurate reading, the multimeter must be connected in series with the rest of the circuit. This leaves only one possible path for the electricity and ensures that the full circuit current flows through the multimeter.
Breaking a circuit to measure it in series is often dangerous or impractical; instead, electricians often use Ohm's Law to calculate the current. This law states that the current equals the voltage divided by the electrical resistance across the circuit. Measuring voltage requires a working power source, whereas measuring resistance requires the power to be disconnected. Dividing the voltage by the resistance yields the current: for example, 120 volts across 40 ohms of resistance produces 3 amperes of current.
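To make the arithmetic concrete, here is a minimal Python sketch of the Ohm's Law calculation; the function name and the zero-resistance guard are illustrative choices, not part of any multimeter software.

```python
def current_from_ohms_law(voltage_v: float, resistance_ohms: float) -> float:
    """Return circuit current in amperes using Ohm's Law (I = V / R)."""
    if resistance_ohms <= 0:
        raise ValueError("Resistance must be positive to compute current.")
    return voltage_v / resistance_ohms

# Worked example from the text: 120 volts across 40 ohms yields 3 amperes.
print(current_from_ohms_law(120, 40))  # 3.0
```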
Digital multimeters have two jacks that accept the positive lead for current measurement: one for currents below 200 milliamperes and another for currents from 200 milliamperes to 10 amperes. The red positive test lead goes in one of these jacks, and the black lead goes in the "COM" jack. The dial on the face of the multimeter further specifies the range, unless the multimeter selects the range automatically.
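As a rough illustration of the jack-selection rule just described, the sketch below encodes the 200-milliampere split in Python. The thresholds come from the text; the function name and return strings are hypothetical, and a real meter's jack labels will vary by model.

```python
def select_current_jack(expected_amps: float) -> str:
    """Suggest which positive-lead jack fits an expected current.

    Uses the common layout described above: a milliampere jack for
    currents below 200 mA and a separate jack for 200 mA to 10 A.
    """
    if expected_amps < 0.2:
        return "mA jack (currents below 200 mA)"
    if expected_amps <= 10:
        return "10 A jack (200 mA to 10 A)"
    return "out of range: do not measure this current directly"

print(select_current_jack(0.05))  # mA jack (currents below 200 mA)
print(select_current_jack(3))     # 10 A jack (200 mA to 10 A)
```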
Setting the correct range and inserting the leads correctly are only two aspects of digital multimeter use. The test probes should never touch each other unless the meter is being calibrated, and the test leads need to remain clean, dry and unfrayed. A damaged multimeter gives inaccurate measurements and puts the user at risk.