The ampere, amp for short, is the standard unit of electrical current. It is defined as the constant current that, flowing in two parallel and infinitely long wires separated by one meter, produces a force of 2 × 10⁻⁷ newtons per meter of length between them.
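
That figure follows from Ampere's force law, F/L = μ₀I₁I₂/(2πd). A minimal Python sketch, using only the standard library, confirms that two one-ampere currents one meter apart give 2 × 10⁻⁷ newtons per meter (the function name is illustrative):

    import math

    MU_0 = 4 * math.pi * 1e-7  # magnetic constant, newtons per ampere squared

    def force_per_meter(i1_amps, i2_amps, separation_m):
        # Ampere's force law: F/L = mu0 * I1 * I2 / (2 * pi * d)
        return MU_0 * i1_amps * i2_amps / (2 * math.pi * separation_m)

    # Two wires, 1 A each, 1 m apart:
    print(force_per_meter(1.0, 1.0, 1.0))  # 2e-07 newtons per meter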

According to Ohm’s law, one ampere of current is produced when one volt of potential difference exists across a conductor with one ohm of resistance. One ampere is also equal to the flow of one coulomb of electric charge per second.
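
Both relationships reduce to one-line formulas. A minimal Python sketch (the function names are illustrative):

    def current_from_ohms_law(volts, ohms):
        # Ohm's law: I = V / R
        return volts / ohms

    def charge_in_coulombs(amps, seconds):
        # One ampere moves one coulomb of charge per second: Q = I * t
        return amps * seconds

    print(current_from_ohms_law(1.0, 1.0))  # 1.0 ampere
    print(charge_in_coulombs(1.0, 1.0))     # 1.0 coulomb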

History of the Amp

The ampere was defined at an international conference in 1881, and is named after André-Marie Ampère (1775-1836), a French physicist. The symbol is A, which is always written in uppercase.

Measuring Amperes

Amperes are measured using an ammeter or multimeter. Ammeter is the correct spelling, not ampmeter. Regular ammeters are placed in series with the circuit, meaning that all the current flows through the meter. For high currents, a clamp ammeter may be a better choice.

The clamp ammeter does not require any electrical connection to the circuit; it simply clips around the cable. The magnetic field produced by the flowing current induces a voltage in the clamp, and this voltage is proportional to the strength of the current. To get an accurate reading, only one conductor should be inside the clamp: if both the live and neutral wires are enclosed, their equal and opposite currents cancel and the meter reads close to zero.
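
The scaling from induced voltage back to current is linear. A minimal Python sketch assuming a hypothetical clamp sensitivity of 1 mV per ampere (real instruments vary and handle this conversion internally):

    # Hypothetical sensitivity: assume the clamp outputs 1 mV for every ampere
    # flowing in the conductor. The actual figure depends on the instrument.
    SENSITIVITY_MV_PER_AMP = 1.0

    def current_from_clamp(millivolts):
        # Convert the clamp's voltage output back to the conductor current.
        return millivolts / SENSITIVITY_MV_PER_AMP

    print(current_from_clamp(9.5))  # 9.5 amperes for a 9.5 mV reading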

Amperes and Milliamperes

A milliampere (mA) is one thousandth of an ampere. Unlike volts and hertz, the range of common ampere values is quite low. A clock radio draws about twenty milliamperes, a light globe about half an ampere, and an air conditioner around ten amperes. Starter motors draw several hundred amperes, but only for a few seconds.
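
Converting between the two units is just a factor of one thousand. A minimal Python sketch using the example figures above:

    def ma_to_amps(milliamps):
        # 1 mA = 0.001 A
        return milliamps / 1000.0

    def amps_to_ma(amps):
        # 1 A = 1000 mA
        return amps * 1000.0

    print(ma_to_amps(20))    # clock radio: 0.02 A
    print(amps_to_ma(0.5))   # light globe: 500.0 mA
    print(amps_to_ma(10))    # air conditioner: 10000.0 mA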

Wire Thickness

The thickness of a wire needs to be large enough to allow current to flow without the conductor overheating or, in the extreme, melting. Tables are available that show the maximum current, or ampacity, for various wire diameters.
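
Such a table is easy to consult programmatically. A minimal Python sketch, where the gauge-to-amp figures are rough illustrative values for copper wire, not a substitute for an actual ampacity table or local electrical code:

    # Illustrative ampacity figures only (copper wire, common household gauges).
    # Always check a real ampacity table or local code before sizing wire.
    MAX_AMPS_BY_AWG = {14: 15, 12: 20, 10: 30}

    def wire_is_adequate(awg_gauge, load_amps):
        # True if the illustrative rating for this gauge covers the load.
        return load_amps <= MAX_AMPS_BY_AWG.get(awg_gauge, 0)

    print(wire_is_adequate(14, 10))  # True: 10 A is within the 15 A figure used here
    print(wire_is_adequate(14, 20))  # False: 20 A exceeds it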