Hello Prakash, I find your latest response a bit odd. You spoke of a deviation from the original question, but I believe the responses from everyone here are directly relevant to it. A large majority of welding machines, whether engine driven or otherwise, are equipped with dials that use scaling numbers such as 1 to 10, or in some cases no numbers at all. Once an output level has been "dialed in" by observing either a calibrated analog or digital meter, the operator goes to welding.
The number on the dial really only serves as a point of quick reference in case the operator needs to return to the same machine later after it has been changed by someone else for another purpose, or after the operator has had to perform some other function and then returns to the original task. For machines without numbering or some other form of identification on the dial, you would be completely dependent on the meter readings.
Volt/amp curves have little to do with CC power output other than to change how easily or with what difficulty the arc is initially started, and then to change further when the arc length is altered while welding. Or, in the case of some machines equipped with arc dig controls, there can be a much more radical volt/amp curve modification based on arc length change and arc voltage being lowered or raised.
CV sources with an inductance adjustment could be modified from a "standard" volt/amp curve, if there were such a thing, but I believe every manufacturer has its own recipe for volt/amp curve relationships on non-adjustable machines.
Machine calibration is all about matching the best form of output verification at the terminals and having the associated meters provide an accurate display of that current and/or voltage. It will be what it is and will not change once it has been measured and verified by whatever standard is required in the applicable fabrication/welding contract. Follow any required re-calibration regimen and call it good. You will be chasing your tail if you have to provide further documentation of accuracy based on the point of weld, meaning having to verify measurements at the stinger, welding gun, or whatever other type of apparatus is providing the actual welding arc. You already know that yourself, as you mentioned length of leads, condition of conductors, size of cables, etc. all causing variations from true machine output. You will need to re-educate whoever is making this sort of demand on you, as they are truly overstepping what is reasonable.
I would probably not use this same sort of argument if this was a mechanized or robotic system, as I do believe there are ways that they could be calibrated and be repeatable. But for manual or semi-automatic type welding with the power sources you have listed I do believe the expectations are unreasonable. Best regards, Allan
I have to agree with Allan and js55 very strongly.
Someone is trying to make this TOO hard, time consuming, and unreasonable. That, or you are misunderstanding what they are expecting. It is also why, for accurate calibrating, we generally defer to outside agencies that have the proper equipment, experience, and training to do actual machine calibrations, with adjustments to any digital readout on the front of the machine. But for the most part that is more for the company's own QC Manual and what it says is required on the welding floor to maintain Pre-Approved Fabricator Status. If they called things out differently, they wouldn't need that.
Many of the parameters you mention are even affected by arc length, the length of a stick electrode as it burns down, and many other factors. Start combining those factors and you can get quite a variance.
Most of what I (me personally in my particular situation) need to document can be handled just fine with a simple volt/amp meter that has capability of amperage readings high enough for welding currents. When more technical calibration is REQUIRED for whatever reason by the customer, be that customer the product end user or the fabrication shop, then I refer them to EXPERTS at that form of testing and calibration.
Personally, I don't think you are that kind of expert. I also don't think they have a clue why they want this calibration: too much head knowledge with no practical knowledge or understanding.
Have a Great Day, Brent
Hello Allan, Thanks for the valuable input; to be clear, I didn't mean that you deviated from the subject.
I explained to the customer the purpose of the scale and the factors to be considered while checking its accuracy. It is a fact that, as per BS/NEMA, a current control scale graduated in amps should match the output.
As an example, a 500 A welding power source should have:
As per BS 7570:1992, clause 4.2 for grade 1 welding power sources / Table 1 for validation accuracies:
1) Current: +/-10%
2) Rated no-load voltage: +/-5%
3) Load voltage: 14 + 0.05 x load current
As per NEMA EW-1, clause 5.4.2 for class 1 welding power sources / Table 5.1 for ratings and performance:
1) Load amperes or load volts, or both, shall be permitted to be less than, but shall not be more than, 100 A, 24 V
2) Load amperes or load volts, or both, shall be permitted to be more than, but shall not be less than, 500 A, 40 V
3) The load voltages are based upon the equation E = 20 + 0.04 x load current. For load currents larger than 600 A, the load voltage is 44 V.
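For anyone wanting to check a meter reading against these figures, the two load-voltage equations and the grade 1 current tolerance above can be worked out with a few lines of Python. This is just a sketch of the arithmetic quoted in this thread; the function names are my own, and you should of course verify any acceptance band against the actual standard text that applies to your contract.

```python
def bs7570_load_voltage(i_amps):
    """BS 7570:1992 conventional load voltage (V) for a set current (A)."""
    return 14 + 0.05 * i_amps

def nema_ew1_load_voltage(i_amps):
    """NEMA EW-1 conventional load voltage (V); capped at 44 V above 600 A."""
    return 44.0 if i_amps > 600 else 20 + 0.04 * i_amps

def bs7570_current_band(i_set):
    """Grade 1 validation band for current: +/-10% of the set value (A)."""
    return (0.9 * i_set, 1.1 * i_set)

rated = 500
print(bs7570_load_voltage(rated))    # 39.0 V at 500 A
print(nema_ew1_load_voltage(rated))  # 40.0 V at 500 A
print(bs7570_current_band(rated))    # (450.0, 550.0) A acceptance band
```

So at the 500 A rating, a measured load voltage near 39 V (BS 7570) or 40 V (NEMA EW-1) and a measured current between 450 A and 550 A would sit inside the tolerances quoted above.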
Regards
Prakash