Voltage drop in a conductor is the loss of electrical energy that occurs as current flows through the resistance of the wiring on its way to the load or appliances. To reduce its effects, use the correct wire size; a larger wire has less resistance, so less electrical energy is lost to voltage drop.
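As a rough illustration, voltage drop follows Ohm's law, V = I x R, where R is the resistance of the wire itself. The short Python sketch below estimates the drop for a copper run; the resistivity figure, current, run length, and cross-section are illustrative assumptions, not values from this article or any electrical code.

```python
# Minimal sketch: estimate voltage drop in a copper branch circuit.
# All figures here (resistivity, current, run length, wire area) are
# illustrative assumptions.

RESISTIVITY_CU = 1.68e-8      # ohm-metres, copper at roughly 20 C

def voltage_drop(current_a, one_way_length_m, area_mm2):
    """V = I * R, with R = rho * (2 * L) / A for the out-and-back run."""
    area_m2 = area_mm2 * 1e-6
    resistance = RESISTIVITY_CU * (2 * one_way_length_m) / area_m2
    return current_a * resistance

# Example: 15 A load, 20 m from the panel, 2.5 mm^2 copper conductor
print(round(voltage_drop(15, 20, 2.5), 2), "V lost in the wire")
```

Doubling the cross-sectional area in this sketch halves the wire resistance and therefore the computed drop, which is why moving to a larger wire reduces the loss.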
Temperature plays a major role in determining wire resistance. A rise in temperature increases resistance, which in turn increases the amount of electrical energy lost in the form of heat to the environment. Increasing wire thickness and reducing wire length both lower this internal resistance. Wires used in high-temperature environments, grouped in bundles, or routed through conduits are the most affected by heat.
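If a worked example helps, conductor resistance is often approximated as R(T) = R_ref x (1 + alpha x (T - T_ref)). The sketch below assumes a temperature coefficient for copper of about 0.00393 per degree Celsius; the baseline resistance and temperatures are made-up illustrative figures.

```python
# Minimal sketch: how conductor resistance rises with temperature.
# The coefficient and example figures are illustrative assumptions.

ALPHA_CU = 0.00393  # per degree C, copper near room temperature

def resistance_at(r_ref_ohm, temp_c, ref_temp_c=20.0):
    """R(T) = R_ref * (1 + alpha * (T - T_ref))."""
    return r_ref_ohm * (1 + ALPHA_CU * (temp_c - ref_temp_c))

# A run measuring 0.27 ohm at 20 C, bundled in a hot conduit at 60 C:
print(round(resistance_at(0.27, 60), 3), "ohms")   # about 16% higher
```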
The current a circuit is intended to carry determines the size of the wires used. The total current load on a circuit, which is the sum of the loads of the individual appliances connected to it, dictates the wire size needed to supply enough power to those appliances. Smaller wires are adequate for circuits with small current loads, whereas larger wires are needed for larger loads. Using undersized wires for large loads can overheat the conductors and burn their insulation.
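A minimal sketch of that sizing logic follows, assuming a hypothetical ampacity table and made-up appliance currents; real selections must follow the applicable electrical code.

```python
# Minimal sketch: total the appliance currents on a circuit and pick a
# wire size whose ampacity covers the load. The ampacity table and the
# appliance figures are illustrative assumptions, not code values.

AMPACITY_AWG = {14: 15, 12: 20, 10: 30, 8: 40}   # gauge -> max amps (example)

def pick_gauge(appliance_amps):
    total = sum(appliance_amps)
    # Smaller AWG number = thicker wire; check the thinnest wire first and
    # return the first one whose ampacity covers the total load.
    for gauge in sorted(AMPACITY_AWG, reverse=True):
        if AMPACITY_AWG[gauge] >= total:
            return total, gauge
    raise ValueError("load exceeds the largest wire in the table")

total, gauge = pick_gauge([6.0, 8.5, 3.0])   # e.g. three kitchen appliances
print(f"{total} A total -> use {gauge} AWG or thicker")
```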
The length of conductor needed from the power source to the load is also a factor in selecting wire size, particularly its cross-sectional area. If a run must be lengthened to reach a particular load, simply using more of the same wire may not be enough; you may need to switch to a thicker wire to keep the voltage drop acceptable. Use the American Wire Gauge (AWG) system to determine the appropriate thickness.
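As a sketch of how run length feeds into gauge selection, the example below searches a small table of AWG cross-sections for the thinnest copper wire that keeps the drop under an assumed 3 percent limit; the limit, supply voltage, and load are illustrative assumptions, not code requirements.

```python
# Minimal sketch: find the smallest cross-section that keeps voltage drop
# under a chosen limit for a given run length. The limit, current, and
# supply voltage are illustrative assumptions.

RESISTIVITY_CU = 1.68e-8   # ohm-metres, copper

AWG_AREA_MM2 = {14: 2.08, 12: 3.31, 10: 5.26, 8: 8.37}   # gauge -> mm^2

def min_gauge(current_a, one_way_length_m, supply_v=120.0, max_drop_pct=3.0):
    allowed_drop = supply_v * max_drop_pct / 100
    for gauge in sorted(AWG_AREA_MM2, reverse=True):       # thinnest first
        area_m2 = AWG_AREA_MM2[gauge] * 1e-6
        drop = current_a * RESISTIVITY_CU * (2 * one_way_length_m) / area_m2
        if drop <= allowed_drop:
            return gauge, drop
    raise ValueError("run too long for the gauges in the table")

gauge, drop = min_gauge(15, 30)    # 15 A load, 30 m from the panel
print(f"use {gauge} AWG (about {drop:.2f} V drop)")
```

In this sketch the same 15 A load that a thinner wire could carry over a short run forces a step up in gauge once the run gets long, which is the trade-off the paragraph above describes.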