When the governing construction Code requires impact testing, ASME IX specifies that an increase in heat input becomes an essential variable.
I'm from the great white north, Alberta, Canada. We do a lot of low-temp work with mandatory impact testing.
Per ASME IX, heat input is measured in Joules per inch and can be calculated with the following formula: Heat Input (J/in) = (Amps x Volts x 60) / Travel Speed (in/min);
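For anyone who wants to check a reading quickly, the formula above is a one-liner. This is just a sketch with made-up example numbers, not values from any qualified procedure:

```python
def heat_input(amps, volts, travel_speed_ipm):
    """Heat input in Joules per inch: (Amps x Volts x 60) / Travel Speed (in/min)."""
    return amps * volts * 60 / travel_speed_ipm

# Example only: 180 A, 24 V, 8 in/min travel speed
# 180 * 24 * 60 / 8 = 32,400 J/in
print(heat_input(180, 24, 8))  # 32400.0
```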
Or,
An increase in heat input is defined as "an increase in size or a decrease in length of weld bead per unit length of electrode."
Typically, during qualification of low-temp procedures, the amperage, voltage, and travel speed are documented and used to calculate the maximum heat input qualified.
The method of selecting the specific voltage to use in the calculation is not well defined. Most procedure writers I have talked with use calibrated voltmeters and take the average reading per pass, then select the pass whose combination of amperage, voltage, and travel speed yields the highest Joules/in, which sets the upper end of their heat input range.
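The pass-selection step described above amounts to taking the maximum heat input over all passes. A minimal sketch, using hypothetical per-pass readings (not data from any actual PQR):

```python
def heat_input(amps, volts, travel_speed_ipm):
    """Heat input in Joules per inch: (Amps x Volts x 60) / Travel Speed (in/min)."""
    return amps * volts * 60 / travel_speed_ipm

# Hypothetical averaged readings per pass: (amps, volts, travel speed in ipm)
passes = [
    (150, 22, 10),  # root
    (180, 24, 8),   # fill
    (200, 25, 9),   # cap
]

# The pass with the highest J/in limits the qualified heat input range
max_hi = max(heat_input(a, v, s) for a, v, s in passes)
print(round(max_hi))  # 33333
```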
The table I want to provide is a reference tool to provide a reasonably accurate indication of compliance for our Welders and Inspectors.
Note: We have 39 welding machines (23 Phoenix 456s, 4 Dimension 652s, 4 XMT 304s, 4 Dimension 400s, 4 Aerowaves) in our shop that have continuous voltage readout gauges.
Is there a good way to select a reasonably accurate, representative average voltage for each specific AWS electrode classification, diameter, and amperage?