Using the voltage meters installed on the control panel of a welder has little meaning if the welding is performed in the field where the leads are relatively long. The only meaningful voltage reading is between the electrode holder and the work piece. The voltage drop increases the farther you are from the welding machine, i.e., the reading shown by the voltmeter at the machine is always higher than the actual voltage across the arc. When the welding cables are short, the voltage drop can be ignored, but as the length of the leads increases, so does the voltage drop. The diameter of the welding cable also plays a part: the higher the welding current, the smaller the cable diameter, and the longer the welding cable, the greater the voltage drop.
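To put rough numbers on that cable drop, here is a minimal sketch using Ohm's law with the resistance of a copper lead (R = resistivity x length / cross-sectional area). The cable size, length, and current are illustrative values I picked, not figures from any standard; check the actual cable specification before relying on a number like this.

```python
# Rough estimate of the voltage drop along a pair of copper welding leads.
# All values below are illustrative, not from any code or standard.
RHO_CU = 1.68e-8  # approximate resistivity of copper, ohm-meters

def cable_voltage_drop(current_a, length_m, diameter_mm):
    """Voltage drop for the round trip (electrode lead out, work lead back)."""
    area_m2 = 3.14159 * (diameter_mm / 1000 / 2) ** 2
    resistance = RHO_CU * (2 * length_m) / area_m2  # both leads in series
    return current_a * resistance

# A 200 amp arc fed through 30 m of 10 mm diameter cable:
print(round(cable_voltage_drop(200, 30, 10), 2))  # about 2.57 volts
```

Even this idealized case (no connections, no corrosion) loses a couple of volts, which is why the panel voltmeter reads high relative to the arc.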
Amperage, on the other hand, is constant wherever you measure it in the welding circuit. So, if the welding machine has an ammeter installed on the control panel, its readings can be used regardless of the length of the welding cables.
Another problem that can be encountered when welding in the field is ensuring the power supply can provide sufficient voltage at the point where welding is being done. There is an equation that can be used to verify the power supply is large enough to provide sufficient voltage to the arc. The equation is E = 20 + (0.04 x I), where E is the dynamic voltage deliverable by the power supply (in volts), I is the rated amperage of the welding machine, and 20 is a constant of 20 volts.
When the welding cables are long, the voltage drop due to cable length, loose or corroded connections, etc. must be figured in so that a power supply large enough can be provided to overcome the voltage drop. For instance, assuming the voltage drop across the arc is 26 volts (under load) and assuming a three volt drop in the cables and connections, the machine must be capable of providing 29 volts under load (while welding). The maximum open circuit voltage is static voltage, not the voltage under load (dynamic voltage). In our example the size of the welding machine would have to be at least:
29 = 20 + (0.04 x I)
Solving for I: 0.04 x I = 29 - 20 = 9
then I = 9 / 0.04 = 225 amps
In other words, irrespective of the actual amperage required to weld with a given diameter of electrode, a 225 amp power supply must be provided to deliver the required 26 volts at the welding arc once the expected voltage drop through the connections and leads is accounted for.
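The sizing calculation above is simple enough to wrap in a few lines of code. This is just the same equation, E = 20 + (0.04 x I), rearranged for I; the function name and inputs are mine, not anything from a standard.

```python
def required_amperage(arc_volts, cable_drop_volts):
    """Minimum rated amperage from E = 20 + (0.04 x I), solved for I.

    arc_volts: voltage needed across the arc under load
    cable_drop_volts: estimated drop in leads and connections
    """
    e = arc_volts + cable_drop_volts  # voltage the machine must deliver under load
    return (e - 20) / 0.04

# The worked example: 26 V at the arc plus a 3 V cable drop.
print(required_amperage(26, 3))  # 225.0
```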
From the equation it is clear that when the welding leads are very long, when there are several connections used to join several short lengths of welding cable, or when the cable diameter is small, increased resistance causes increased voltage drops that must be factored in when selecting a power supply to ensure sufficient voltage is provided at the welding arc.
Playing the devil's advocate:
Calibrated voltmeters on the control panel? Limited value at best unless welding in a shop with short welding cables; the readings are affected by voltage drop along the length of the welding cables, by cable diameter, current, and connections.
The value of calibrated meters on the control panel: better than a sharp stick in the eye in that it lets the welder know the machine is at least running and gives the inspector something to do while walking around the shop floor.
The truth of the matter is if the inspector or the welder is going to use voltage and amperage as a means of ensuring compliance with the WPS, the parameters should be checked with a calibrated multi-meter. The parameters (at least the voltage) should be checked as close to the arc as practical, i.e., at the electrode holder/torch when the process is manual or at the wire feeder if the process is semi-automatic, to mitigate the effects of long welding cables, loose or corroded connections, etc. As mentioned previously, amperage is the same throughout the welding circuit.
As for the accuracy of the meters, there is no real agreement from what I've seen. A common requirement is +/- 10% of the meter's range, so the acceptable accuracy for a 200 amp meter is +/- 20 amps. The actual value is going to fluctuate as the welder welds due to variations in the circuit's resistance as the metal droplets are transferred through the arc plasma. Therefore, when checking the meters, a load cell should be used to provide a steady-state load so the meters provide stable readings.
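The +/- 10% of full-scale check is easy to express as a small helper. This is just my illustration of the tolerance arithmetic described above, assuming the steady-state reference value comes from a load cell:

```python
def within_tolerance(reading, true_value, meter_range, pct=0.10):
    """True if the meter reading is within pct of the meter's full-scale range
    of the reference value (e.g., from a calibrated load cell)."""
    return abs(reading - true_value) <= pct * meter_range

# A 200 amp meter is acceptable within +/- 20 amps of the reference:
print(within_tolerance(reading=185, true_value=200, meter_range=200))  # True
print(within_tolerance(reading=175, true_value=200, meter_range=200))  # False
```

Note the tolerance is a percentage of the meter's range, not of the reading, so a 200 amp meter checked at 100 amps would still be allowed +/- 20 amps.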
The bottom line is that most welding standards do not require the ammeter or the voltmeter to be calibrated for good reason. QC types get all wigged out about something that has limited value in the real world of welding. The meters are simply a reference to provide a level of comfort that the welder is welding within the prescribed parameters listed by the WPS.
Think about this for a moment. When most people record the values for amperage and voltage on the PQR, they do so as a single value. In reality, both the voltage and the amperage vary somewhat as the welder traverses the length of the joint. Many people simply record what they feel is the average value of the parameters. Others will record the "range" observed as the welder traverses the length of the joint. The problem is somewhat mitigated if the process is mechanized and the human element is taken out of the equation, but there is still some fluctuation because the welding arc is a dynamic system rather than a static system. I suppose that if we were to do everything properly, the voltage would have to be taken at the electrode holder, the amperage anywhere that was convenient, travel speed would have to be controlled by mechanical means so as to be constant, and the instantaneous values of the parameters stored and averaged for each weld bead. There are instruments available with that capability, but the cost is steep and what real benefits would be derived?
Best regards - Al