American Welding Society Forum
Up Topic Welding Industry / Technical Discussions / Welding power source calibration
- - By Prakash Vasudevan Date 11-02-2010 17:39
Hi,

Could you please advise on the standard procedure for testing the welding power sources used for SMAW and GMAW applications (diesel and electric driven), and the acceptance criteria?

One of our clients is insisting that the machine output shall be within +/-10% of the current control scale graduated in amps.

As far as I know, the scale is only an indication for setting the required current range, and it is subject to various factors (cable size, length, and input voltage if it is an electric type). Also, the same current control potentiometer is connected in series with the remote control potentiometer, graduated 0 - 10, which the welder adjusts while welding (changing the current without reading the scale).

I think the machine's output at the 4 - 6 range should be compared to the respective volt-ampere curve, not the scale.

Kindly assist
Parent - By aevald (*****) Date 11-02-2010 17:47
Hello Prakash Vasudevan, I have a calibration booklet from Miller that might help to answer some of your questions. I will try to attach it here so you can look it over. Best regards, Allan
Attachment: CALIBRATIONBOOKLET.zip (478k)
- - By prakashv59 (*) Date 11-03-2010 11:56
Thanks for the response and the attachment, which describes the meters only. My concern is the welding power source accuracy, as the client is expecting the output to be within +/-10% of the scale setting, which is impossible / subject to cable size, length, input power, etc.

Regards

Prakash
Parent - By aevald (*****) Date 11-03-2010 15:02 Edited 11-03-2010 15:28
Hello Prakash, a number of years back I was hit with what I considered to be unreasonable requirements for certification of our shop for welder testing purposes. During the course of that time I shared dialogue with many individuals on this forum as well as technicians from Miller welding and a number of other folks who were responsible for welding machine calibration and related items.
     The idea behind machine calibration is essentially to verify that the meters on the machine, whether digital or analogue, match the dial settings the operator selects, and that they also match the calibration equipment used to verify their accuracy in the first place. Digital-metered machines from most of the major manufacturers are very accurate and are less likely to need adjustment; analogue-metered machines are more likely to need syncing, but that is not difficult to do. Machines that do not have meters are not calibrated, although you can install inline monitoring meters or tattletale systems, at a fairly substantial cost, that will provide you with welding parameters for analysis. Regardless of which type you are dealing with, a load bank is the best way to accomplish this task; a clamp meter on the electrode cable will not give you a true level of accuracy. You are correct when you add that cable sizes, lengths, connection quality, and other variables come into play when you are truly trying to match machine meter settings to actuals. It will not happen, and if you really took the time to figure out a way, you would be in financial ruin trying to verify all of that information every time you went to make a weld.
     One other way to look at this would go something like this: I have two machines that are identical makes and models; one of them will provide 50 more amps of output than the other when the controls are maxed out. So long as the meters accurately display the output of the two machines, and I am able to set the welding parameters to match a specific WPS, I will be able to make very adequate welds with either machine to satisfy the WPS requirements for a specific job. Hope this somewhat explains my take on your issue. Good luck and best regards, Allan
Parent - - By prakashv59 (*) Date 11-03-2010 17:02
Hi Allan,

I fully agree with your views. Calibrated meters are one of the major criteria for ensuring proper welding. However, the power source is equally important, as it has to provide the required performance (volt-ampere curve).

I wonder why the booklet from Miller covers only the meters, while all the confusion is in the process of testing machine performance. The client, who doesn't have enough information, insists on matching the machine scale settings to the output, which is impossible in most cases.

The two GENSET (Italy) MPM 20/500 PS machines supplied to a customer were rejected by their laboratory, stating that the output is not within +/-10% of the current control settings.

The machines, engine-driven and running at 1,500 rpm, give 70 VDC OCV and 500 A at 60% duty cycle (35 rpm speed drop at full load) with a single, continuously variable (potentiometer) current control, while a similar machine from Miller, the Big Blue 500DX, running at 1,875 rpm, gives 95 VDC OCV and 500 A at 60% duty cycle (75 rpm speed drop) with a 5-range current selector and rheostat-type fine control.

It is a pity that people don't realize these are power sources and have different volt-ampere curves from brand to brand.
Parent - - By aevald (*****) Date 11-03-2010 17:12
Hello again Prakash, I was afraid you were going to say something like that. With inductance/arc force controls on some machines and, as you stated, the different volt/amp curves employed by different manufacturers, folks can get into a real pickle if an engineer or someone decides to try to put restrictions on the welding process based on those variables. I will be interested to see if you get responses from others who have faced similar issues, and how they have dealt with them. Best regards, Allan
Parent - - By eekpod (****) Date 11-03-2010 20:00
The load cell calibration method is the most accurate and reliable I've found; the c-clamp amprobe method has a lot of variables.
Chris
Parent - - By js55 (*****) Date 11-03-2010 20:46
C'mon guys. Just how accurate does it need to be?
Parent - By waccobird (****) Date 11-03-2010 21:23
js55
Depends on who you have to appease.
Also Welders today need all the help they can get.
Production-wise, it is nice that when personnel are shifted around they don't have to play with the machine on some scrap before they can produce a sound weld.
Just my ¢¢'s.
Marshall
Parent - - By aevald (*****) Date 11-03-2010 21:28
Hello js55, to some degree that's exactly my point. I believe there is way too much definition of weld parameters being spelled out by some governing bodies or individuals, which is where the whole calibration scheme comes into play. Certainly, there is a need for a common ground for weld parameters that is repeatable regardless of geographical or physical location. Testing and history prove that the acceptable precision is likely much broader than certain folks would like us to believe. Unfortunately, wherever there are boundaries of any kind there will be those who push them, and that is the point where failures occur. So I suppose the logic exists that if the parameters are tight enough, those who are pushing the bounds will still fall into a safe range. I'm rambling a bit here, and as you said, "just how accurate does it need to be?" Best regards, Allan
Parent - - By prakashv59 (*) Date 11-04-2010 09:48
Hi Guys,

As usual, you have deviated from the subject, which is "accuracy of the current control scale".

What are the criteria for concluding whether a power source is good for welding or not? Should the output measured (with a load bank) be compared against the volt-ampere curve or against the current control scale setting?
Parent - By js55 (*****) Date 11-04-2010 12:23
You're making this too difficult.
Measure the thing. Compare it to the meters. Establish your acceptance criteria and call it a day.
Parent - - By aevald (*****) Date 11-05-2010 05:21
Hello Prakash, I find your latest response a bit odd; you spoke of deviation from the original question, and I believe the responses by everyone here are definitely relevant to it. A large majority of welding machines, whether they be engine driven or otherwise, are equipped with dials that use scaling numbers such as 1 to 10, or in some cases none at all. Once an output level has been "dialed in" by observing either a calibrated analog or digital meter, the operator goes to welding.
     The number on the dial really only serves as a point of quick reference should the operator need to return to using this same machine later and it has been changed by someone else for another purpose, or the particular individual that was using it has had to do some other function and then goes back to the original task. For machines without the numbering or some other form of identification on the dial you would be completely dependent upon the meter readings.
     Volt/Amp curves have little to do with CC power output other than to change when the arc is initially started (how easily or with what difficulty) and then to change further when the arc length is altered while welding. Or, in the case of some machines equipped with arc dig controls, there can be a much more radical Volt/Amp curve modification based on arc length change and arc voltage being lowered or raised.
     CV sources with inductance adjustment potential could be modified from a "standard" Volt/Amp curve, if there was such a thing, but I believe every manufacturer has their own recipe for Volt/Amp curve relationships on non-adjustable machines.
     Machine calibration is all about matching the best form of output verification at the terminals and having the associated meters provide an accurate display of that current and/or voltage. It will be what it is and will not change once it has been measured and verified by whatever standard is required in the fabrication/welding contract that applies. Follow any required re-calibration regimen and call it good. You will be chasing your tail if you have to provide further documentation of accuracy based on the point of weld, meaning having to verify measurements at the stinger, welding gun, or whatever other apparatus is providing the actual welding arc. You already know that yourself, as you mentioned length of leads, condition of conductors, size of cables, etc., all causing variations from true machine output. You will need to re-educate whoever is making this sort of demand on you, as they are truly overstepping what is reasonable.
      I would probably not use this same sort of argument if this was a mechanized or robotic system, as I do believe there are ways that they could be calibrated and be repeatable. But for manual or semi-automatic type welding with the power sources you have listed I do believe the expectations are unreasonable. Best regards, Allan
Parent - By welderbrent (*****) Date 11-05-2010 14:18
I have to agree with Allan and js55 very strongly.

Someone is trying to make this TOO hard, time consuming, and unreasonable.  That, or you are misunderstanding what they are expecting.  It is also why, for accurate calibrating, we generally defer to outside agencies who have the proper equipment, experience, and training to do actual machine calibrations, with adjustments to any digital readout on the front of the machine.  But for the most part that is more for the company's own QC Manual and what they say they require on the welding floor to maintain their Pre-Approved Fabricator Status.  If they called things out differently, they wouldn't need that.

Many of the parameters you mention are even affected by arc length, the length of the stick electrode as it burns down, and many other factors.  Start combining those factors and you can get quite a variance.

Most of what I (me personally, in my particular situation) need to document can be handled just fine with a simple volt/amp meter that can read amperages high enough for welding currents.  When more technical calibration is REQUIRED for whatever reason by the customer, be that customer the product end user or the fabrication shop, then I refer them to EXPERTS at that form of testing and calibration.

Personally, I don't think you are that kind of expert.  I also don't think they have a clue why they want this calibration, too much head knowledge with no practical knowledge or understanding.

Have a Great Day,  Brent
Parent - By prakashv59 (*) Date 11-05-2010 17:07
Hello Allan, thanks for the valuable input; note that I don't mean you deviated from the subject.

I explained to the customer the purpose of the scale and the factors to be considered while checking accuracy. It is a fact that, as per BS/NEMA, the current control scale graduated in amps should match the output.

As an example, a 500 A welding power source should have:

As per BS 7570:1992, clause 4.2 for grade 1 welding power sources / Table 1 for validation accuracies:
1) Current: +/-10%
2) Rated no-load voltage: +/-5%
3) Load voltage: U = 14 + 0.05 x load current

As per NEMA EW-1, clause 5.4.2 for class 1 welding power sources / Table 5.1 for ratings and performance:

1) Load amperes or load volts, or both, shall be permitted to be less than, but shall not be more than, 100 A / 24 V
2) Load amperes or load volts, or both, shall be permitted to be more than, but shall not be less than, 500 A / 40 V
3) The load voltages are based on the equation E = 20 + 0.04 x load current; for load currents larger than 600 A, the load voltage is 44 V
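Purely as an illustration of how those criteria work out numerically (a minimal sketch; the tolerances and load-voltage formulas are the ones quoted from the clauses above, but the function names are my own):

```python
def bs7570_grade1_check(set_current, measured_current,
                        rated_ocv=None, measured_ocv=None):
    """BS 7570:1992 grade 1 validation: current within +/-10% of the
    setting, rated no-load voltage within +/-5% (if values are given)."""
    results = {"current_ok":
               abs(measured_current - set_current) <= 0.10 * set_current}
    if rated_ocv is not None and measured_ocv is not None:
        results["ocv_ok"] = abs(measured_ocv - rated_ocv) <= 0.05 * rated_ocv
    return results

def bs7570_load_voltage(load_current):
    """BS 7570 conventional load voltage: U = 14 + 0.05 x I."""
    return 14 + 0.05 * load_current

def nema_ew1_load_voltage(load_current):
    """NEMA EW-1 conventional load voltage: E = 20 + 0.04 x I,
    held at 44 V for load currents above 600 A."""
    return 44.0 if load_current > 600 else 20 + 0.04 * load_current

# Example: a source set to 450 A that delivers 430 A on the load bank
print(bs7570_grade1_check(450, 430))   # deviation 20 A, within the 45 A band
print(bs7570_load_voltage(500))        # 39.0
print(nema_ew1_load_voltage(500))      # 40.0
```

Note the two standards give different conventional load voltages at 500 A (39 V vs. 40 V), which is one more reason the contract has to name which standard applies.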

Regards

Prakash
Parent - By js55 (*****) Date 11-05-2010 15:40
Let's also keep in mind that with modern power sources we aren't really doing calibration. Calibration is a mechanical adjustment; what we are doing these days is verifying solid state boards.
If the thing doesn't read right, there is nothing to calibrate: you change a board. Analogue meters are still common, though.
Parent - - By jon20013 (*****) Date 11-04-2010 11:18
Could not agree more with Allan's and js55's comments.
Parent - - By 803056 (*****) Date 11-05-2010 17:18
Amen. Check the welding machine's meters; if they are within 10% of the full range, move on.

The meters are used as a check to verify the welder is working within the parameters of the WPS, no more, no less.

Put the welding machine on a load bank to stabilize the load, check the meters with a calibrated multimeter and ammeter, and it is a done deal.
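That comparison can be sketched in a few lines (a minimal illustration, assuming a simple +/-10%-of-reading acceptance band; the readings are made-up numbers, not data from any machine):

```python
def meter_within_tolerance(machine_reading, reference_reading, tol=0.10):
    """Compare the machine's meter against a calibrated reference
    instrument and pass/fail on a fractional tolerance (default 10%)."""
    if reference_reading == 0:
        return machine_reading == 0
    deviation = abs(machine_reading - reference_reading) / abs(reference_reading)
    return deviation <= tol

# Load-bank spot checks: (machine meter, calibrated reference meter), in amps
checks = [(250, 243), (400, 388), (500, 560)]
for shown, actual in checks:
    status = "OK" if meter_within_tolerance(shown, actual) else "OUT OF TOLERANCE"
    print(f"meter {shown} A vs reference {actual} A: {status}")
```

Whether the tolerance is applied to the reading or to the full scale is exactly the kind of detail the governing contract or standard should spell out.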

Best regards - Al
Parent - - By prakashv59 (*) Date 11-05-2010 19:33
Try to understand that the power source is equally as important as the meters. As per BS and NEMA, both are to be verified/calibrated to ensure proper welding.

Meters cannot provide welding power

Regards

Prakash
Parent - By CHGuilford (****) Date 11-09-2010 18:00
Wow! I agree that this is going way too deep.

Keep in mind that all you need to do is verify that the welding is being done within established parameters.

But many folks can't even agree on how precise a calibration needs to be, let alone how to do it.

We all like numbers, digital read-outs, and what-not.  There is nothing wrong with that.  However, you could simply adjust the machine and lock down the adjuster so it doesn't move. Does the welder have to know what amperage he is at? No, as long as someone has already verified the output and nothing has been changed.

You could replace meter numbers with letters.  ("A" has been calibrated for 100 amps; "B" is 125; "C" is...whatever)

Or "this pencil line is for vertical welding with this wire".

My point being: there are many opinions and many methods to calibrate welding machines.  It all boils down to being able to verify that the welding is being done in accordance with an approved procedure.  If it does not need to be complicated, then why make it that way?  Just do what makes sense for the application.
