Let me add my $0.02 to the discussion. I want to make sure I understand the situation, so review the scenario I've given below to see if it jibes with your case.
Let's say the plate is 3 inches thick and the boss is 2 inches in diameter and projects 3 inches from the plate, i.e., it is 3 inches long. The boss is welded to the plate with a fillet weld (all-around), and there is undercut along the toe of the fillet weld against the boss, i.e., the undercut in question is in the 2 inch diameter boss, not in the 3 inch thick plate or along the 3 inch length of the boss.
The depth of the undercut is acceptable based on the thickness of the boss, in this case the 2 inch diameter. The length of the boss has no bearing on acceptance or rejection of the undercut.
As the boss is machined to reduce the length, i.e., from 3 inches, to 2 1/2 inches, to 2 inches, to 1 1/2 inches, to 1 inch, to 1/2 inch, etc., the depth of the undercut is still based on the diameter of the boss, not the length of the boss (the distance it projects above the plate). As a matter of fact, if the boss is machined to the point where the length (thickness?) of the boss is equal to the leg of the fillet weld securing it to the plate, there is no undercut (or is the undercut equal to the diameter of the boss?).
Does it reflect your situation? If it does, then, I would say the depth of undercut is based on the diameter of the boss, not the final thickness (as it projects above the surface of the plate).
I have to go now, my brain hurts.
Best regards - Al
Ah, another piece of that Macaroni Grill pie Al and you'll be just fine.
Kinda, sorta, but not exactly... lol!!! The boss (for lack of a better term) is to accept a clevis pin when final machining occurs. Currently, it is approximately the same length as it is thick. Final machining will open the bore of the boss and also reduce the length. Both reductions will be slight, but at least theoretically they may reduce the thickness to where 1/16" undercut would be unacceptable if it were welded at that time... HOWEVER, when it was welded (and inspected), 1/16" undercut was acceptable under D1.1 rules... kind of a weird scenario and, as posted, "nearly" hypothetical, but it could come up as a question later from a third party... Why don't we just repair it and shut up? Well, good question, but accessibility is one issue. Attempts to repair could end up reducing the base metal to an unacceptable thickness. The issue is really "time of visual inspection."
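(Side note, just to put rough numbers on the before/after comparison being described: below is a minimal Python sketch. The 1" breakpoint and the 1/32" and 1/16" limits are the figures quoted elsewhere in this thread, not a substitute for the actual D1.1 acceptance table, and the 0.97" final thickness is a made-up value since the actual number hasn't been given.)

# Illustrative only -- limits taken from the figures discussed in this thread,
# not from the code text; verify against your edition of D1.1.
def allowable_undercut(thickness_in):
    # assumed rule: 1/16" allowed at 1" and over, 1/32" below 1"
    return 1/16 if thickness_in >= 1.0 else 1/32

undercut = 1/16            # depth found at visual inspection, as welded
as_welded = 1.0            # governing thickness when inspected
after_machining = 0.97     # hypothetical final thickness (actual value not stated)

print(undercut <= allowable_undercut(as_welded))        # True  -> acceptable at time of inspection
print(undercut <= allowable_undercut(after_machining))  # False -> the "nearly" hypothetical problem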
How far below 1" do you think you might end up?
Hg
Acceptance criteria for undercut are intended to take into account potential stress risers. That, I believe, is the intent.
However, the traveler is going to be considered a contract document. In it, the time of inspection is specified as after welding and before machining.
I don't have '94 handy, but the sentence in Section 1 of D1.1 that states, "The Engineer may add to, delete from, or otherwise modify, the requirements of this code to meet the particular requirements of a specific structure," would apply.
The cognizant engineer is responsible for designating the NDE when not specifically addressed by the code, and for any change in the requirements of the code.
Your contract documents state that the visual is performed before machining, and therefore by the letter of the code, the results of that inspection are for the record.
You could argue intent, but that's a slippery slope. If I were in your shoes, I'd throw the ball back into the engineer's court, which is technically the right thing to do. It is Engineering's responsibility as stated by D1.1.
My two cents worth,
Gerald
I don't think you can claim that only the set of conditions existing precisely at the time of inspection applies. If that were the case, we would never be able to take into account the state of in-service stress. If it's known that the final part will be under 1", then the under-1" conditions apply. Not to say there can't be some reason applied when the machining takes one just barely under 1".
Hg
I've re-read Section 1, and can't find anywhere that the inspector can make that call. Only the engineer can make this call. Having said that, some level of common sense should apply, but any inspector who tries to apply common sense when there are clear statements in the code to the contrary leaves themselves and their company open to risk. My personal opinion and experience tell me the state of in-service stress "should" be taken into account, as you've said, but the code is the code, and it states that the engineer details the contract documents in regards to the subject matter at hand. I try not to let my personal opinion come into it.
I agree it should go to the engineer.
Hg
From a viability standpoint, we are talking about 32 thousandths of an inch difference in undercut determining the difference between an assembly that is viable and one that fails, when fretting over this issue as Jon has stated it (and remember this is a non-fatigue application). 32 thousandths!!! IMO, if the design is that tight, you need a new engineer. If the code is that tight, then we need new requirements in the code because the fellers are cutting it too close. But of course there are always fudge factors built into any engineering design, or at least there are supposed to be, especially if done by code. And there is certainly a huge fudge factor built into code requirements. Take a look at ASME Section IID and then take a look at the MSTS's of the applicable materials.
The problem is defending it in court if there is a failure. In cases like that it doesn't matter whether it actually IS the cause of the failure or not; it would most likely be a glaring issue for a jury that probably will have little if any engineering background.
Upon further review, it's probably best to toss it up to the engineer.
My guess is that the reason for the two different undercut tolerances for different thickness ranges is that 1/16" was determined to be too high a percentage of material thickness for the thinner materials--6% or higher. So they went to a smaller number for that range of thickness. That means that in some applications the allowable is 1/32" and in some applications it's 1/16". The other choice would be to limit it to 1/32" in all cases, and many fabricators would scream at that one.
Or make it purely a function of thickness (say 5%, 1%, whatever). But that makes inspection much less efficient because you can't just carry a couple of go/no go undercut gages with you; you'd need to get an actual measurement.
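For what it's worth, here's the arithmetic behind the "6% or higher" comment and a sketch of why a percentage rule makes inspection less efficient. Python, purely illustrative; the 5% figure is just the example number above, not anything out of the code.

# 1/16" as a fraction of member thickness: about 6% at 1" thick, more as the material gets thinner.
for t in (1.0, 0.75, 0.5):
    print(f'{t}" thick: 1/16" undercut is {(1/16)/t:.1%} of thickness')
# 1.0" thick: 1/16" undercut is 6.2% of thickness
# 0.75" thick: 1/16" undercut is 8.3% of thickness
# 0.5" thick: 1/16" undercut is 12.5% of thickness

# A fixed limit can be verified with a go/no-go gage; a percentage rule needs an
# actual depth measurement plus the member thickness at every indication.
def passes_fixed(depth_in, limit_in=1/32):
    return depth_in <= limit_in

def passes_percentage(depth_in, thickness_in, fraction=0.05):  # 5% is only the illustrative figure above
    return depth_in <= fraction * thickness_in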
Any engineer in their right mind, if indeed the nominal thickness of the material is still 1" and it's machined to just barely below (I still haven't gotten an answer as to just how thin the material gets), should okay it. But the engineer is expected to know how close "just barely below" is. It's not the inspector's job to determine that. It's also not the inspector's job to take all cases of 1/32" tolerance in the entire code and round 'em all up to 1/16" tolerance because "it's only a 0.032" difference".
The code is not there to describe the circumstances under which a structure will or will not fail. It describes what is considered robustly acceptable, in part so the inspector can do a yes/no assessment of each case without there having to be a structural viability analysis on every imperfection. If someone wants to fight the grey areas, they can. But that's not where the inspector is involved.
Hg