American Welding Society Forum
Topic: Technical Standards & Publications / Opinions on undercut when machining is involved?
- - By jon20013 (*****) Date 05-30-2007 18:56
This is "somewhat" hypothetical, but I'm very interested in opinions from our D1 experts:

Consider a case where the thickness of the base metal permits a maximum undercut of 1/16" under the D1.1 Code.  After welding, the base metal is machined to a thickness at which 1/16" undercut is no longer acceptable.  The time of inspection is before machining, and the machining does not affect the weld.

Acceptable or not?
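For reference, a minimal sketch of the thickness-dependent check under discussion. The thresholds are illustrative, taken from the 1" cutoff and the 1/32" / 1/16" limits cited later in this thread; verify them against the visual acceptance criteria of the governing D1.1 edition.

    # Minimal sketch of the thickness-dependent undercut check discussed in
    # this thread. Thresholds are illustrative (1/32 in. below an assumed
    # 1 in. thickness cutoff, 1/16 in. at or above it); verify against the
    # visual acceptance criteria of the governing D1.1 edition.

    def max_allowable_undercut(thickness_in: float) -> float:
        """Maximum allowable undercut depth (inches) for a base metal thickness."""
        return 1 / 16 if thickness_in >= 1.0 else 1 / 32

    def undercut_acceptable(undercut_in: float, thickness_in: float) -> bool:
        return undercut_in <= max_allowable_undercut(thickness_in)

    # The scenario in question: the same 1/16 in. undercut passes at the
    # as-welded thickness but would fail at the machined thickness.
    print(undercut_acceptable(1 / 16, 1.00))  # True  (at time of inspection)
    print(undercut_acceptable(1 / 16, 0.95))  # False (machined below 1 in.)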
Parent - - By HgTX (***) Date 05-30-2007 19:26
You mean it is machined to where the undercut is less than 1/16"?

Hg
Parent - - By jon20013 (*****) Date 05-30-2007 19:32
No, if that were the case I wouldn't even ask the question.  The base metal is machined to the point where 1/16" undercut becomes unacceptable.  I should clarify: I am working to the 1994 Edition of D1.1.
Parent - - By HgTX (***) Date 05-30-2007 22:06
Oh, okay, you're just right at the 1" thickness that allows the 1/16" undercut, and then you reduce the thickness.  Sorry.

How far below the 1" cutoff are you machining it?

Seems to me that for any significant reduction in thickness, it shouldn't be allowed because the requirement is based on some percentage of total plate thickness in the final stressed state (not that I was around when they wrote the provision, but I'm guessing).  Otherwise you could get around the 1/32" requirement all the time (if you wanted to be silly and wasteful) by starting with thicker plate and machining it down.

What this means, though, is that the welder and/or inspector need to know what the final thickness will be in order to determine their criteria.

Then again, if you're supposed to be nominal 1" and you're just machining for flatness and you wind up 0.01" (or even 0.04") below the cutoff and the nominal dimension on the plans for the finished product is 1", I'd probably allow it.  That's a 1" cutoff, not a 1.000".  The line had to be drawn somewhere, and it was drawn somewhere convenient, not at some scientific tipping point.

Hg, possessor of humble opinions
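To put rough numbers on the percentage argument above (purely illustrative; the code specifies fixed limits, not percentages):

    # Why a fixed 1/16 in. limit is a much larger fraction of thin plate
    # than of thick plate (illustrative only; D1.1 states fixed limits,
    # not percentages).
    UNDERCUT_IN = 1 / 16
    for t in (0.25, 0.5, 1.0, 2.0, 3.0):
        print(f"{t:4.2f} in. plate: 1/16 in. undercut is {UNDERCUT_IN / t:5.1%} of thickness")
    # 0.25 in. plate: 1/16 in. undercut is 25.0% of thickness
    # 1.00 in. plate: 1/16 in. undercut is  6.2% of thickness
    # 3.00 in. plate: 1/16 in. undercut is  2.1% of thickness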
Parent - - By dmilesdot (**) Date 05-31-2007 14:53
It would seem to me that this is a good example of why the drawing should note what the final thickness of the material will be, so acceptable/unacceptable undercut can be determined.  Can the inspection of the weld be performed at the final stage of manufacturing?  If so, the thickness of the material and the amount of undercut can be determined at that point.  Another question: does the machining reduce the undercut any?  For instance, are you cutting the other side of the plate (opposite the weld), or are you cutting the welded side and reducing the undercut at the same time?
Parent - - By jon20013 (*****) Date 05-31-2007 15:24
dmilesdot, you bring up some very good points, and all valid questions.  In response, I'll say the machining has no effect on the undercut; what is there will be there after machining.  The time specified for weld inspection in the shop travelers is before machining.  I was asked my opinion, and my response was that at the time of inspection the undercut was acceptable, and it was accepted by the inspector.  My customer does not disagree but was simply inquiring based on what his customer may say.  My opinion, for what it's worth, is that the condition is acceptable.  Although loading was not defined in the design details, the part is in fact statically loaded unless an upset condition requires it to actually become operational, in which case it would see a brief cycle.
Parent - - By js55 (*****) Date 05-31-2007 18:05
Isn't BM thickness supposed to be considered as the thickness at the toe of the weld? So how is it you can reduce the BM thickness without reducing the undercut?
And consistent with this thinking I would agree with you jon.
Parent - By jon20013 (*****) Date 05-31-2007 18:28
Thanks js55.  In this case, the undercutting occurred on a boss assembly which had extra stock to clean up for tight dimensional tolerances (clevis pin arrangement).  The issue isn't a major one, just one I wished to gather some additional input on.  I appreciate the nod!
Parent - - By aevald (*****) Date 05-31-2007 18:31
Hello js55, as I read this post it is describing a scenario where a column or leg has a base plate welded to it. After the welding has been done, the bottom of the base plate is machined to provide flatness, thus reducing the BM thickness; this machining hasn't altered the welding conditions, so the undercut that was present is still there in its "as welded" condition. Now that the base metal has a reduction in thickness, does this qualify the weld for a change in inspection criteria? Unless I am misunderstanding, this is essentially what is being described. I don't have an answer to this, as it is not in my arena of knowledge. Regards, Allan
Parent - - By js55 (*****) Date 05-31-2007 18:48
jon,
I would say, in consideration of Allan's scenario, that unless the thickness, and therefore cross-sectional area, is reduced at the location of the undercut (for example, the back side of a base plate underneath the weld undercut, or the ID of the boss opposite the undercut), then you have no concern. However, once you reduce the thickness in the very cross section of the undercut, even though the letter of the code has been satisfied, you may, in my opinion (let me emphasize, opinion), have a design consideration.
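One way to put numbers on js55's distinction (a hypothetical net-section view, not anything stated in the code): material removal only compounds the undercut when it comes out of the same cross section.

    # Hypothetical net-section view of js55's point: machining only matters
    # to the undercut if it removes material from the same cross section,
    # e.g., the back side of the plate directly beneath the undercut.
    def net_thickness(thickness_in: float, undercut_in: float) -> float:
        """Remaining base metal thickness through the undercut."""
        return thickness_in - undercut_in

    print(net_thickness(1.00, 1 / 16))         # 0.9375 in., as welded
    print(net_thickness(1.00 - 0.10, 1 / 16))  # 0.8375 in. if 0.10 in. is cut
                                               # away under the undercut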
Parent - - By 803056 (*****) Date 05-31-2007 19:31 Edited 05-31-2007 19:51
Let me add my $0.02 to the discussion. I want to make sure I understand the situation, so review the scenario I've given below to see if it jibes with your case.

Let's say the plate is 3 inches thick and the boss is 2 inches in diameter and projects from the plate 3 inches, i.e., 3 inches long. The boss is welded to the plate with a fillet weld (all-around) and there is undercut along the toe of the fillet weld against the boss, i.e., the undercut in question is in the 2 inch diameter boss, not the 3 inch thick plate or three inch length of the boss.

The depth of the undercut is acceptable based on the thickness of the boss, in this case the 2 inch diameter. The length of the boss has no bearing on acceptance or rejection of the undercut.

As the boss is machined to reduce the length, i.e., from 3 inches, to 2 1/2 inches, to 2 inches, to 1 1/2 inches, to 1 inch, to 1/2 inch, etc., the depth of the undercut is still based on the diameter of the boss, not the length of the boss (the distance it projects above the plate). As a matter of fact, if the boss is machined to the point where the length (thickness?) of the boss is equal to the leg of the fillet weld securing it to the plate, there is no undercut (or is the undercut equal to the diameter of the boss?).

Does this reflect your situation? If it does, then I would say the depth of undercut is based on the diameter of the boss, not the final thickness (the distance it projects above the surface of the plate).

I have to go now, my brain hurts.

Best regards - Al
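Al's geometric point, sketched as an interpretation of his scenario (not code text): the governing dimension for undercut in the boss is its diameter, which machining the projection length never touches.

    # Sketch of Al's scenario: undercut depth is judged against the boss
    # diameter, so reducing the projection length leaves the acceptance
    # criterion unchanged (an interpretation, not code text).
    BOSS_DIAMETER_IN = 2.0

    def governing_thickness(projection_length_in: float) -> float:
        # The undercut cuts into the boss wall, so the diameter, not the
        # projection length, is the governing cross section.
        return BOSS_DIAMETER_IN

    for length in (3.0, 2.5, 2.0, 1.0, 0.5):
        print(f"boss machined to {length} in. long: criterion based on "
              f"{governing_thickness(length)} in.")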
Parent - By js55 (*****) Date 05-31-2007 19:59
Ah, another piece of that Macaroni Grill pie Al and you'll be just fine.
Parent - - By jon20013 (*****) Date 05-31-2007 20:47
Kinda, sorta, but not exactly... lol!!!  The boss (for lack of a better term) is to accept a clevis pin when final machining occurs.  Currently, it is approximately the same length as it is thick.  Final machining will open the bore of the boss and also reduce the length.  Both reductions will be slight, but at least theoretically they may reduce the thickness to where 1/16" undercut would be unacceptable if it were welded at that time... HOWEVER, when welded (and inspected), 1/16" undercut was acceptable under D1.1 rules... kind of a weird scenario and, as posted, "nearly" hypothetical, but it could come up as a question later from a third party.

Why don't we just repair it and shut up?  Well, good question, but accessibility is one issue.  Attempts to repair could end up reducing the base metal to an unacceptable thickness.  The issue is really "time of visual inspection."
Parent - By HgTX (***) Date 05-31-2007 21:53
How far below 1" do you think you might end up?

Hg
Parent - - By CWI555 (*****) Date 06-03-2007 00:26 Edited 06-03-2007 00:29
Acceptance criteria for undercut are intended to take into account potential stress risers. That, I believe, is the intent.
However, the traveler is going to be considered a contract document. In it, the time of inspection is specified as after welding and before machining.

I don't have the '94 edition handy, but the sentence in Section 1 of D1.1 that states "The Engineer may add to, delete from, or otherwise modify the requirements of this code to meet the particular requirements of a specific structure" would apply.
The cognizant engineer is responsible for designating the NDE when not specifically addressed by the code, and for any change in the requirements of the code.
Your contract documents state that the visual is performed before machining; therefore, by the letter of the code, the results of that inspection are for the record.
You could argue intent, but that's a slippery slope. If I were in your shoes, I'd throw the ball back into the engineer's court, which is technically the right thing to do. It is Engineering's responsibility as stated by D1.1.

My two cents worth,
Gerald
Parent - - By HgTX (***) Date 06-04-2007 15:30
I don't think you can claim that only the set of conditions existing precisely at the time of inspection applies.  If that were the case, we would never be able to take into account the state of in-service stress.  If it's known that the final part will be under 1", then the under-1" conditions apply.  Not to say there can't be some reason applied when the machining takes one just barely under 1".

Hg
Parent - - By CWI555 (*****) Date 06-04-2007 15:43
I've re-read Section 1 and can't find anywhere that the inspector can make that call. Only the engineer can make this call. Having said that, some level of common sense should apply, but any inspector who tries to apply common sense when there are clear statements in the code to the contrary leaves themselves and their company open to risk. My personal opinion and experience tell me that the state of in-service stress "should" be taken into account, as you've said, but the code is the code, and it states that the engineer details the contract documents in regard to the subject matter at hand. I try not to let my personal opinion come into it.
Parent - - By HgTX (***) Date 06-04-2007 15:54
I agree it should go to the engineer.

Hg
Parent - - By js55 (*****) Date 06-04-2007 17:07
From a viability standpoint, we are talking about 32 thousandths of an inch difference in undercut determining the difference between an assembly that is viable and one that fails, when fretting over this issue as jon has stated it (and remember, this is a non-fatigue application). 32 thousandths!!! IMO, if the design is that tight, you need a new engineer. If the code is that tight, then we need new requirements in the code because the fellers are cutting it too close. But of course there are always fudge factors built into any engineering design, or at least there are supposed to be, especially if done by code. And there is certainly a huge fudge factor built into code requirements. Take a look at ASME Section IID and then take a look at the MSTS's of the applicable materials.
The problem is defending it in court if there is a failure. In cases like that it doesn't matter whether it actually IS the cause of the failure or not; it would most likely be a glaring issue for a jury that probably has little if any engineering background.
Upon further review, it's probably best to toss it up to the engineer.
Parent - By HgTX (***) Date 06-04-2007 18:20
My guess is that the reason for the two different undercut tolerances for different thickness ranges is that 1/16" was determined to be too high a percentage of material thickness for the thinner materials (6% or higher), so they went to a smaller number for that range of thickness.  That means that in some applications the allowable is 1/32" and in some applications it's 1/16".  The other choice would be to limit it to 1/32" in all cases, and many fabricators would scream at that one.

Or make it purely a function of thickness (say 5%, 1%, whatever).  But that makes inspection much less efficient because you can't just carry a couple of go/no go undercut gages with you; you'd need to get an actual measurement.
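Sketching that trade-off (gage sizes and the 5% fraction are hypothetical):

    # Tiered fixed limits can be checked with two go/no-go gages, while a
    # percentage rule needs a measured undercut depth and the local
    # thickness every time (gage sizes and the 5% fraction hypothetical).
    def tiered_check(undercut_in: float, thickness_in: float) -> bool:
        gage = 1 / 16 if thickness_in >= 1.0 else 1 / 32  # two gages cover all cases
        return undercut_in <= gage

    def percentage_check(undercut_in: float, thickness_in: float,
                         fraction: float = 0.05) -> bool:
        return undercut_in <= fraction * thickness_in

    print(tiered_check(1 / 16, 1.0), percentage_check(1 / 16, 1.0))  # True False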

Any engineer in their right mind, if indeed the nominal thickness of the material is still 1" and it's machined to just barely below (I still haven't gotten an answer as to just how thin the material gets), should okay it.  But the engineer is expected to know how close "just barely below" is.  It's not the inspector's job to determine that.  It's also not the inspector's job to take all cases of 1/32" tolerance in the entire code and round 'em all up to 1/16" tolerance because "it's only a 0.032" difference".

The code is not there to describe the circumstances under which a structure will or will not fail.  It describes what is considered robustly acceptable, in part so the inspector can do a yes/no assessment of each case without there having to be a structural viability analysis on every imperfection.  If someone wants to fight the grey areas, they can.  But that's not where the inspector is involved.

Hg