American Welding Society Forum
Up Topic Welding Industry / Technical Discussions / Welding machine calibration
- - By rsx-s-02 (*) Date 05-22-2007 20:59
I have a question about welding machine calibration.  During our AISC audit, the auditor pointed out that our documented tolerance for the machines was +/- 10% (volts and amps) and he stated that the best industry practice is +/- 3%.  I was told by our welding supply rep that the tolerance is +/- 10%.  He is also the one who brings his calibrated meter in once a year to verify that our machines are working properly (within the 10%).  Can someone tell me what the requirements are or where I can research this?  Thanks in advance.
Parent - - By js55 (*****) Date 05-22-2007 21:32
To my knowledge the +/- 3% is essentially a 'rectumus extracticus' on the part of the auditor. Though not entirely. I've been required to hold the same standard and it was explained to me that it's the lowest rounded number from half the voltage tolerance under AWS. You won't find 3% anywhere except in the auditors' heads. And now in your manuals. It has some logic to it, to be sure, but considering that so many other standards organizations recommend, require, or allow much more liberal tolerances, there seems to be no real mechanical or metallurgical justification. It just sounds good to 'em when they discuss it around the table.
OK. That's my sarcasm for the day.
Parent - By RANDER (***) Date 05-22-2007 22:10
You have a great command of the Latin language
Parent - - By aevald (*****) Date 05-22-2007 22:04
Hello rsx-s-02, a while back there was a posting with the title "Load Bank versus a Test Meter Calibration"; you may want to search for it. In that thread I have a response which included an attachment with information from Miller regarding meter calibration. It may have some information that is helpful with your question. Regards, aevald
Parent - - By 803056 (*****) Date 05-23-2007 03:39
Let me show my ignorance. I don't recall a requirement in AWS D1.1 to calibrate the welding machine. There is a requirement for calibration in D1.5 for bridges, but I don't recall a requirement addressing the +/-3% you mention.

I could be wrong, but my response to the auditor would be, "show me where it states that requirement."

Best regards - Al
Parent - - By aevald (*****) Date 05-23-2007 04:14
Hello Al, I wouldn't disagree with you on your calibration statements regarding D1.1 and I don't personally know about AISC. I would like to know more about this topic and hopefully through the response that you have posted and those of others I will learn the answer.
     I have had requirements to do machine calibration in my facility for the testing that I do under the state of Washington and their version of AWS code and certification referred to as WABO (it is administered and regulated by the Washington Association of Building Officials). They do require calibration, yet they aren't very clear in their description and don't specify much other than being able to trace calibration back to national standards; I feel it is very open to interpretation. In the event that a person requires an explanation or a means for considering and doing calibration, I included some information that Miller provided to me regarding machine calibration. I posted this in a thread some time ago and gave the heading that it can be found under in my last post to this thread, to hopefully provide a way to retrieve it. Best Regards, Allan
Parent - - By DaveBoyer (*****) Date 05-23-2007 04:45
"It just sounds good to 'em when they discuss it around the table." I guess the idea is that You should be able to determine heat input from the readings. With respect to amperage You should get a usable number, but the voltage at the machine COULD be quite a bit higher than the actual voltage across the arc due to resistance. I guess You have to comply with however it is written, whether it helps anything or not.
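For reference, the heat-input arithmetic behind that point is simple. A minimal sketch using the common formula H = (60 × V × I) / travel speed; the settings below are illustrative numbers, not values from this thread:

```python
def heat_input_kj_per_in(volts: float, amps: float, travel_ipm: float) -> float:
    """Heat input in kJ/in from meter readings:
    H = (60 * V * I) / (1000 * S), with S the travel speed in in/min."""
    return 60.0 * volts * amps / (1000.0 * travel_ipm)

# Illustrative settings: 24 V, 250 A, 10 in/min travel speed.
print(heat_input_kj_per_in(24.0, 250.0, 10.0))  # 36.0 kJ/in
```

A voltmeter reading 10% high shifts the computed heat input by the same 10%, which is the practical stake in the tolerance debate.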
Parent - - By js55 (*****) Date 05-23-2007 13:34
Al is correct. There is no requirement for calibration in D1.1. There is no requirement for calibration in ASME except for a vague reference to 'measuring equipment' in NQA-1 that has been interpreted (perhaps legitimately; I don't know, I've never sat in on Section III discussions) as being applicable to power source gages. The calibration for non-bridge structural steel, to my knowledge, is being driven by the AISC auditing process. And the 3% is the anointed tolerance.
I have no beef against calibration per se. It has its uses and importance at times. I just wish those who impose such processes would actually do some research into it so that tolerances are based in empirical data and not arbitrary logic. I would argue that the great, great majority of piping fabrication (I have not been exposed to structural steel enough to expand this point to it) is being performed without calibrated power sources. And so I would ask, where are the failures related to voltages beyond 7%, 10%, 25%? ASME and D1.1 in their empirical wisdom implicitly, in my opinion, concur.
Parent - - By jon20013 (*****) Date 05-23-2007 13:44
Once upon a time, many years ago, D1.1 had a requirement for calibrating weld machines.  It was removed in the 1990's.  There are also no requirements within ASME for calibrating weld machines, although many would argue it's good practice, and that's hard to argue against.  There are also those who use "essential variables" as part of their argument; to that end I use a calibrated multimeter whenever qualifying a WPS, simply to avoid the discussion.  As for state specific requirements such as WABO or others, well, the question was directed toward AWS, and of course each jurisdiction can be more restrictive than the constructing code so long as the minimums are met.  In my humble opinion, I believe AISC is starting to go overboard on many of its requirements, but then they have no effect on me, so my mouth remains shut.
Parent - - By js55 (*****) Date 05-23-2007 14:28
Jon,
I too think it is good practice, and would without requirement implement some kind of calibration verification, but there is little empirical evidence for overly restrictive tolerances. It has always been my opinion that +/- 10% is perfectly acceptable (I believe ISO concurs). And if a lesser tolerance is required, let those who require such justify it with research. Of course, since they have the power to tell us 'up yours' they have obviously chosen to do so, in a manner of speaking.
I do not think that just because the latest power source technology allows us to maintain lesser tolerances we should by requirement have to do so, any more than just because we can look at atoms in a microstructure we should impose SEMs and TEMs and such on QC verification.
As for your keeping silent: why, when your point is dead-on accurate?
Parent - - By jon20013 (*****) Date 05-23-2007 14:36
I have annual calibration performed on a strictly voluntary basis... sometimes it slips to every 18 months, but we do periodically monitor our equipment, and I also use the 10% rule as a go / no-go tolerance, if you will.  In reality, if the machine is running cold the welder is going to turn it up; if it's running hot, the welder's going to turn it down.  I know this is very "unscientific" and certainly not an answer one would provide as an audit response, but it seems to be reality, and after many years in the business, one I find difficult to argue against... provided one has competent welders, of course.  I have had ASME III Auditors ask for calibration records and have successfully fended them off... no code requirements and typically not even a manufacturer's recommendation.  As you mention, this is not to say it isn't good practice.
Parent - - By aevald (*****) Date 05-23-2007 15:03
Hello Jon, I chuckle whenever the calibration issues come up and here's why. A few years back I went through a WABO audit which was designed to bless my site as a certified WABO testing agency. I was one of the first in a new round of "getting a handle on things and doing it by the book" whatever that was supposed to mean. At the time the auditor initially came in and said that I needed to have verification of all machines being calibrated, all flowmeters on machines calibrated, all regulators on oxy-acetylene combos calibrated, all wire-feeders calibrated for wire speed, calibrated thermometers in the rod ovens, a way to check flow at the nozzle of the wire feeders with a calibrated flow device, the list went on. Needless to say there was such an uproar over this overly restrictive scenario that there were some changes made before the audit process was completed. They backed way off on a great number of these requirements. I have always maintained that the statement that you made concerning a welder turning it up if it's cold or down if it's hot pretty much covers the industry, especially where field work is being performed. To me part of being certified as a welder is having the skill to recognize when a bead is being run at the proper heat and such. I hardly see the practical side of trying to require all welding machines on job sites to be calibrated, that would be a nightmare. Just my $.02 Regards, Allan
Parent - - By jon20013 (*****) Date 05-23-2007 15:18
you have mail! ;-)
Parent - - By dmilesdot (**) Date 05-23-2007 16:31
I'm gonna go out on a limb here and play devil's advocate.  If welding machines are not checked to ensure the set amperage is correct, then what good is the PQR?  What have you got if your "set" amperage isn't what you really have, and the only way to make sure of that is calibrated machines.  If welders are allowed to set their machine any way they want, why do we bother with Procedure Quals?  Some welders have enough experience to set their machines and be within the tolerances of the WPS and make good welds.  There are others out there that believe "the hotter, the better" with no knowledge of what's happening in the weld and to the base metal.  When I was still witnessing PQRs I used a multi-tester to verify amps and volts.  If the settings on the machine were off from the actual readings I would tell the fabricator.  Just my thoughts.
Parent - By jon20013 (*****) Date 05-23-2007 16:56
Well, while your points certainly contain elements of validity, there are also gaping holes.  Yes, we must qualify WPSs in many if not most industries, but using your logic, what's to keep the engineer or whomever is actually writing the WPS variables from going too far to one extreme or another?  Yes, there are rules in most Codes, but how often have we all seen less than competent engineers running the show?  Or perfectly competent civil engineers preparing WPSs for piping applications?  Also, there is added credence for specifying in-process inspections and other examinations by many codes... monitoring things while work progresses is, and should be, a shared function of management, quality control and competent engineering.  I reiterate, an experienced welder will tell the engineer he/she is all wet if the WPS is written incorrectly.  Similarly, if there's 200 feet of lead streaming from the equipment, the equipment readings may be way higher than the output at the stinger / torch / gun.
Parent - - By js55 (*****) Date 05-23-2007 17:28
Nobody is advocating welders setting their machines wherever they want. At least not in anything I've read to date that I remember. They still have to set them to the parameters on a WPS. And these parameters still have to be verified by QC. What I am arguing for is the realization that it's the puddle/arc characteristics that determine where the gages will end up, not the gages that determine the puddle/arc. Even a pre-WPS for qual purposes relies on someone's experience as to the proper range of settings. If not, you will quickly find out.
Some minimum verification is needed. But where? That's the question.
Ask yourself this: why is it that ASME Section IX does not dictate ranges in which volts and amps are to be set, even with a qualification? (In fact, voltages for certain processes are stated on the WPS for information purposes only; it is not required to dictate voltage.) Somewhere in the answer to this question is an understanding that is consistent with this debate. I do not believe the Section IX guys are overlooking something. I believe we are.
Parent - By jon20013 (*****) Date 05-23-2007 17:45
js55, as usual, your insight is marvelous.  I'm certainly not advocating setting welding machines wherever the welder wishes them to be set, even if he/she may be among the best suited to determine whether or not sound weld metal is being deposited.  I am simply reluctant to have this type issue mandated by codes / standards.  There was good reason D1.1 decided to remove this requirement and also why ASME has decided not to include it.  If AISC, WABO or other jurisdictions wish to bog themselves down in the minutia of audit trail then so be it.  For me, I will continue performing "reasonable verification" on a voluntary basis.
Parent - - By aevald (*****) Date 05-23-2007 18:17
Hello dmilesdot, I agree with the logic of your post; where I have issues with it has to do with the many machines that do not have meters to read volts/amps and the like. There are tons of machines that don't have this capability: Miller's 8PAKs, 6PAKs, SRH444s, Lincoln SAE 400s, 200s, 250s, etc. I believe you get the idea. I contacted Miller directly to inquire about calibration of their machines; they essentially told me that you could calibrate the meters on machines that are so equipped, but that any machines that didn't have meters could not be "calibrated". They also went on to say that a true machine calibration scenario can only be done by using a load bank and the associated calibrated electronics for verification and getting accurate readings. They also said that you could calibrate the machine, but when you throw variables into the mix such as lead length, cable size, ground clamp condition, stinger condition, arc length, and a host of other things, they can all affect machine output and performance. That's why I put more credence into inspection of finished welds and a bit less into the machine calibration issue. My $.02 Regards, Allan
Parent - By jon20013 (*****) Date 05-23-2007 18:47
Very well said Allan and I completely agree with your position on this matter.
Parent - - By Joseph P. Kane (****) Date 05-23-2007 19:47
Dave

You have hit the nail on the head.  Why do we need PQRs and WPSs if nobody is going to follow them?  How can they be sure what they are doing if they don't have some way of checking that the machine is running at the desired parameters?  Perhaps "Calibrated" in the NIST sense of the word is not necessary.  Perhaps the parameters just need to be "Checked" or "Verified" against a standard, even if it is just the in-shop clamp-probe that was used when the PQR was welded.  However, what process would be used to require the Oxy-Acetylene gages to be "Calibrated"?  If you are doing some sort of powder spraying, or some sort of zinc spray galvanizing, you need some assurance that the process is working at the optimum, and the Oxy-Acetylene gages need to be checked.

If anyone gets to see the NIST calibration requirements for steel tape measures they would die laughing.  Nobody could build a building with the allowances they permit.  Yet, because the tape measure is used for measuring in the process, it has to be calibrated traceable to NIST.

Parameter settings on the welding machine are set in the morning, but then the welder decides to change from backhand to forehand welding.  The welder changes his stickout, and that is something that the inspector will not readily catch.  So even if everything was properly calibrated, the actions of the welder throw a wrench in the process.  So, what good did your "Calibration" system do?
Parent - By aevald (*****) Date 05-23-2007 20:25
Hello Joseph P. Kane, my take on calibration and all the associated components of it is one that brings out the vast variability of all of the factors that make up these requirements. PQRs and WPSs are definitely needed to verify the correctness of a given welding application and its use. They are there to show that the process will work and accomplish the intended purpose the engineers have set forth in a given welding design and manufacture method. I also believe environmental issues dictate a different use and application of these standards as required. When I say that, I mean to differentiate between shop fabrication practices and field practices. I don't feel you can apply the same set of rules to these two very different scenarios. Shops have the latitude of a greater amount of control over welding processes and other associated environmental factors; hence it makes sense to look to control these parameters more closely. Field conditions, on the other hand, have a different set of parameters, and I don't feel that you can attempt to impose the same set of controls on the welding that occurs in that case. That is why I feel that inspection of finished welds is more critical to attaining proper quality levels, especially in the field sense. This is a view that I have on the subject. I always try to consider everyone's input on a topic and this is no different; that is why I am following this thread with interest. Best regards, Allan
Parent - - By jon20013 (*****) Date 05-23-2007 20:34
Joe, does it seem somewhat bizarre that AISC work should be held to a higher standard than nuclear work?  I'm not one to answer that question, but I can tell you that over the past 27 years of being more into than out of the nuclear field, tape measures and certain other measuring devices are excluded from the "Measuring & Test Equipment" category... how about squares, are those also required to be calibrated?  Are levels also?  I can see some things falling under the NIST umbrella, but a lot that should not.
Parent - - By js55 (*****) Date 05-23-2007 21:12
I have to say that as far as laughable is concerned, the whole AISC verification of squares, tapes, etc. is laughable (not to necessarily hijack the thread). It should be absolutely amazing to anybody defending this practice how so much structural, vessel, boiler, piping, and whatever (in fact, the vast, vast majority) of metal fabrication is ever even accomplished without a single bit of third party verification of these tools. And never a piping or vessel failure in history because the hook on the end of a tape was a bit loose or a square was slightly out of square.
The idea is, we can verify stuff because we have a logic that we work hard to convince ourselves is important, or we can verify stuff because we have evidence that it is a real service problem.
Accuracy in measurement is most assuredly important. But to have it as an auditing point in third party verification? Nah.
If ASME is so wrong in this I await the evidence to the contrary.
Parent - By jon20013 (*****) Date 05-24-2007 09:18
Sadly, this entire third party verification bit has completely lost focus of technical merit.  I think a few of us know the whole bit is simply to create "programs" which force constructors into paying big $$$$ for audits and auditors.  The technical benefit?  Perhaps in the end it may create better constructors, but like most quality systems the pendulum has swung way too far to one side....
Parent - By rsx-s-02 (*) Date 05-24-2007 14:28 Edited 06-06-2007 15:50
Jon, we just completed our annual AISC audit and calibration of squares was listed as a concern.
Parent - - By js55 (*****) Date 05-23-2007 16:04
Allan,
Calibration run amok is exactly what happens when requirements can be mandated by entities whose income relies upon its continued growth and application.
Bottom lines take precedence over empirical reality.
Parent - By aevald (*****) Date 05-23-2007 17:04
js55, you are oh so right! Regards, Allan
Parent - By js55 (*****) Date 05-23-2007 16:01
Jon,
You really have hit the crux of the issue. Calibration (though minimally necessary) to verify power sources in a sense puts the cart before the horse. In reality it is the puddle/arc that determines the gage, not the reverse. This is, after all, the whole purpose behind PQRs. If not, we could just read a parameter guide and write procedures from that. No need to do a qual. And you also significantly added 'competent welder', which is of course important. Calibration and gages are much more significant to inexperienced welders. Competent welders do not need a gage to tell them what it's supposed to look like.
I believe that your minimum standard is actually quite adequate. Perhaps unscientific, as you say, but where are the failures that would make its unscientific nature significant?
And, my hats off to you. I never was able to fend off calibration for nukies. But then, our QC department at the time wasn't as motivated, resourceful, or aggressive as I am assuming you are.
Parent - - By jwright650 (*****) Date 05-23-2007 16:01
Concerning D1.1 or an AISC audit requiring calibration of the welding machines....even though I don't recall seeing it written, I still think that the reason that they can hold us to the task of calibration is because we have written WPSs and we even have essential variables listed in Table 4.5 of D1.1. So for an auditor to ask if the welder(person) is welding within a WPS's permissible ranges listed in Table 4.5, you would have to KNOW that the machine is reading correctly or that the gages are close enough to say.... "yes this welder(person) is welding within the WPS's ranges".

My two cents...now can you follow my train of thought?....LOL (or should I say ramblings?).
Parent - - By js55 (*****) Date 05-23-2007 16:08
John,
I think your point is valid. And I actually agree with it. I think the debate is more along the lines of how stringent. I myself, to reiterate, would impose calibration even without being required to do so by a governing body. Just not as stringently as is being imposed upon me.
Parent - - By swnorris (****) Date 05-23-2007 17:22 Edited 05-23-2007 17:25
I agree with your ramblings John.  You won't know if you're within the parameters of your WPS if you don't know what your actual readings are.  The range in our procedures is +/-5%.
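A percentage window like that is easy to state precisely. A minimal sketch of a go/no-go check against a WPS setpoint; the ±5% default simply mirrors the range mentioned above and is not a code value:

```python
def within_tolerance(setpoint: float, measured: float, pct: float = 5.0) -> bool:
    """Go/no-go: True if the measured value is within +/- pct% of the setpoint."""
    return abs(measured - setpoint) <= setpoint * pct / 100.0

print(within_tolerance(250.0, 260.0))  # True  (reading is 4% high)
print(within_tolerance(250.0, 265.0))  # False (reading is 6% high)
```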
Parent - - By rsx-s-02 (*) Date 05-23-2007 18:48
First I want to thank everyone for their input. It's nice to know that there are places like this to get experienced help. Thanks Al, I too looked through D1.1 and couldn't find anything.  It's nice to have another person verify it.  I work for a smaller fabricator and wear many hats and don't always have a "go to" person when I have a question.  Aside from being the QA manager/CWI, I'm also the drafting manager and the safety director.  For the life of me I couldn't find ANYTHING that would indicate 3% or 10%.  This auditor did not write this up as a Corrective Action, but did list it as a concern, which, as all you AISC certified fabricators know, must be addressed internally by next year because they will be reviewing it.  I agree that documentation is required for the acceptable parameters of meter calibration.  Scott, 5% sounds good to me.  :-)
Thanks again everyone.
Parent - - By swnorris (****) Date 05-23-2007 19:29
I hear ya.  I'm the production manager, the drafting manager, and the AISC Rep.
Parent - - By rsx-s-02 (*) Date 05-23-2007 20:22
never a dull moment.  I too am the AISC rep.  (Scott, I sent you an email)
Parent - - By swnorris (****) Date 05-23-2007 21:12
That's right.  I'm so busy I don't even have time to go to the bathroom.  I just have to do my business in my pants and keep on working.
Parent - - By XPERTFAB (**) Date 05-24-2007 04:35
In case you should desire to update your company's increased productivity program to provide some comfort while maintaining your business in your pants, be well advised that "Depends" are flammable. As such, they should be used or stored safely away from areas promoting the potential for combustion.  Be additionally advised of the significant risk that your "onboard fire suppression system" may be lacking in sufficient volume and pressure to fully extinguish flames should accidental combustion occur.
Parent - By rsx-s-02 (*) Date 05-24-2007 14:44
Not just flammable but potentially explosive.  Not to worry though: as long as the proper MSDS sheets are written, the hazard communication has been completed, employees are trained to the new requirement, everyone wears the proper PPE, the "onboard fire suppression" equipment is periodically tested and certified, and OSHA and the EPA have been notified, there should be no problems.
Parent - - By calib man Date 06-28-2007 13:26
I do have 20+ years of experience in the instrumentation field, but I never had the opportunity to calibrate the volt and ammeters on welding equipment at my previous employer.  The company I've just been hired into does fabrication for the nuclear industry, and their welding equipment is now "due."  Unfortunately, there's no one remaining here at the plant that has done the task before, and the procedure isn't at all specific:  "Step 1. Connect the welder voltage and current instrumentation and connect the load resistor.  Step 2. Step the welder through various test points by varying welder adjustments and load resistance.  Note: Make sure the high frequency circuits are turned off to prevent damage to test meters."
My only welding experience is a 40-hr arc welding course at the local vocational school, courtesy of the re-training allowance from my ex-employer.
I've found what I assume is a load bank that was fabricated here, as well as a shunt resistor used to take mV readings that would correlate to amperage values.

I want to make sure I'm doing this safely as well as accurately.  Can someone make some procedural recommendations, including how to connect things up?  I've included a sketch of the items on-hand: the load bank, the ground connection, and the shunt.
[IMG]http://i196.photobucket.com/albums/aa229/calibman/loadbank.jpg[/IMG]

And what of the high frequency circuits ... the last thing I need to do is ruin the Fluke meter that I use to calibrate everything else in the plant.

Thanks!
Parent - - By DaveBoyer (*****) Date 06-29-2007 04:46 Edited 06-29-2007 04:58
I hope someone who has actually done this will reply... Here goes: The high frequency will be on TIG welders. There will usually be a 3-position switch that says Start, Off, Continuous. If it is a DC-only machine it May only have Start & Off. Obviously You want Off. The shunt goes in series with the load bank, and gets hooked to the machine's output terminals. Some of the settings will be selected by heavy knife switches; these must NEVER be moved under load. Other settings may be changed by a potentiometer; these can be changed under load. The open circuit voltage could be in the 90s, so there is a shock danger. You will be comparing load volts and amps between the machine's gauges and the Fluke meter, measuring the voltage across the output posts and the amperage derived from the voltage drop across the shunt. You should record the indicated and actual readings at each load bank setting at various machine settings. The actual working voltage of the machines will be from 15 to 30 volts; I would test within this range if it were up to Me [but it is up to whoever writes the standards You are working to]. You may or may not be able to adjust the gauges on the machines; check the manuals if they are available.
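The indicated-versus-actual comparison described above reduces to a percent-error figure at each test point. A minimal sketch with made-up readings (none of these values come from the thread), taking the calibrated meter as the reference:

```python
def gauge_error_pct(indicated: float, actual: float) -> float:
    """Percent error of the machine gauge relative to the calibrated meter."""
    return (indicated - actual) / actual * 100.0

# Made-up voltage readings at three load-bank settings: (machine gauge, calibrated meter).
for indicated, actual in [(20.5, 20.0), (26.0, 25.0), (31.0, 30.0)]:
    print(f"{indicated:5.1f} V indicated vs {actual:5.1f} V actual: "
          f"{gauge_error_pct(indicated, actual):+.1f}% error")
```

Recording this figure at each setting, within the 15-30 V working range mentioned, gives the audit trail whichever tolerance (3%, 5%, or 10%) is being enforced.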
Parent - - By calib man Date 06-29-2007 11:58
Thanks for the reply, Dave.

The previous data sheets dictate the different test values, which vary machine-to-machine, and are largely determined by the range of the meter that's installed (0-30V, 0-50V, etc.) on a particular unit.
I appreciate the info on adjusting knife switches vs. potentiometers.  SAFETY FIRST!  :)

The shunt in series?  I expected that, simply by definition, the shunt would be connected in parallel.  I've tried different variations of the hook-up using a 32V / 3A lab power supply, but the readings in each case weren't significant enough to make any kind of determination on whether I was heading in the right direction.  I'd like to be sure before I connect up to something with some real power behind it.

Hopefully I will hear from someone who has experience doing this kind of check.  The retiree who did this work at the plant prior to me has since died, so I can't even call him.

I guess it's a good question when everybody doesn't jump at the chance to answer it!

Joe
Parent - - By DaveBoyer (*****) Date 06-30-2007 03:01
The machine current goes through the shunt on its way to the load bank [series]. You measure the voltage drop from end to end of the shunt [the VOM is parallel to the shunt]. I hope this clears things up.
Parent - - By calib man Date 07-03-2007 12:55
Thanks, Dave.

I tried it with the shunt (apparently a misnomer) in series as you recommended and things worked fine.
Parent - By DaveBoyer (*****) Date 07-04-2007 02:59
The "shunt" or "load shunt" is the actual name of that device. It is a common and inexpensive way to measure DC amperage, particularly if the instrumentation is at a distance from the load circuit. In a pinch You could make one from any wire or conductor that is sufficient to carry the load without becoming too hot, and calibrate the measured voltage across the ends with a known load close in magnitude to what You need to measure. A commercially made shunt uses resistor wire [or strips] in order to keep the measured voltage/amperage ratio linear.
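The mV-to-amps conversion behind this reduces to one ratio determined by the shunt's rating. A minimal sketch; the 500 A / 50 mV rating here is a common shunt size used only as an example, not a value from the thread:

```python
def amps_from_shunt_mv(measured_mv: float, rated_amps: float = 500.0,
                       rated_mv: float = 50.0) -> float:
    """Convert the millivolt drop across a series shunt to amperage.

    The shunt rating fixes its resistance: R = rated_mv / rated_amps,
    so I = measured_mv / R = measured_mv * rated_amps / rated_mv.
    Example rating (500 A / 50 mV) is illustrative.
    """
    return measured_mv * rated_amps / rated_mv

print(amps_from_shunt_mv(27.5))  # 275.0 A for a 27.5 mV drop
```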

Powered by mwForum 2.29.2 © 1999-2013 Markus Wichitill