By 803056
Date 07-01-2009 12:30
Edited 07-01-2009 12:34
The preheat isn't used to control the size of the HAZ, only to ensure the cooling rate is slow enough to prevent the formation of a hard microstructure. This is a separate issue from toughness. A certain amount of ductility is needed to provide toughness, i.e., hard, brittle microstructures in the weld or HAZ will not promote toughness. So, if toughness is required when welding carbon and high strength low alloy steels, some preheat is required, but not excessively high preheat. The minimum preheat specified by the applicable code is typically sufficient to prevent cracking, but not so high as to reduce toughness when those requirements are a factor.
Not all service conditions adversely affect toughness. As the material thickness increases and the service temperature decreases, toughness becomes more problematic. Most welding standards aren't concerned with toughness unless the service temperature drops below some threshold temperature defined by the applicable code or standard. For thick materials the threshold can be as high as 70 to 80 degrees F; for thinner materials it may be below 30 degrees F. Many steels experience a transition from ductile to brittle failure modes around 30 degrees F (a generalization). Rolling practices, chemistry, and welding procedures influence toughness, which can only be as good as the base metal and filler metal you start with. In other words, if the base metal or the filler metal has a low (or no) toughness requirement, there is little reason to worry about the toughness of the weld or HAZ as long as it is as good as the weak link, i.e., the BM or FM. Some base metal specifications and filler metal classifications are not required to meet toughness requirements, so the engineer has to select the proper materials if toughness at "low" service temperatures is a factor.
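The ductile-to-brittle transition mentioned above is usually described with a hyperbolic-tangent fit through Charpy data. As a rough sketch only: the shelf energies, transition temperature, and curve width below are made-up illustrative numbers, not values for any real steel; actual numbers come from testing the heat in question.

```python
import math

# Sketch of the common tanh model for a Charpy transition curve.
# lower/upper shelf energies (ft-lb), DBTT and curve width (deg F)
# are ASSUMED values for illustration only.
def charpy_energy(temp_f, lower=5.0, upper=80.0, dbtt_f=30.0, width_f=40.0):
    """Absorbed energy (ft-lb) vs. test temperature (deg F), tanh fit."""
    mid = (upper + lower) / 2.0
    half = (upper - lower) / 2.0
    return mid + half * math.tanh((temp_f - dbtt_f) / width_f)

at_dbtt = charpy_energy(30)    # mid-shelf energy at the assumed 30 F DBTT
cold = charpy_energy(-60)      # well below DBTT: near the brittle lower shelf
warm = charpy_energy(120)      # well above DBTT: near the ductile upper shelf
```

The point of the model is simply that absorbed energy falls off steeply over a fairly narrow temperature band, which is why codes key their toughness requirements to a threshold service temperature.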
As for the interrelationship between preheat/interpass temperature and arc energy (heat input), it has to do with cooling rates, microstructure, and grain size. That is delving into the subject a little deeper than I am able to go. Stephan or Gerald may be able to provide some insight on those relationships. I simply go to my references to see what information is provided, or I look to the manufacturer to see if they have any recommendations. Then it's a crap shoot: use my best judgment to develop a preliminary WPS, weld and monitor the welding of the test plate, test the resulting samples, and hope for the best.
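One way to see how preheat and arc energy pull in the same direction is through the 800-500 C cooling time (t8/5). A minimal sketch, using the EN 1011-2 style estimate for thick plate (three-dimensional heat flow); the voltage, current, travel speed, and process efficiency below are assumed example numbers, not values from any WPS:

```python
# Sketch: preheat and arc energy both slow the 800-500 C cooling rate.
# Formula is the EN 1011-2 form for 3-D heat flow in thick plate.

def heat_input_kj_per_mm(volts, amps, travel_mm_per_s, efficiency=0.8):
    """Arc energy corrected by an assumed process efficiency factor."""
    return efficiency * volts * amps / travel_mm_per_s / 1000.0

def t85_thick_plate(q_kj_mm, preheat_c):
    """Approximate t8/5 cooling time (s), 3-D heat flow, EN 1011-2 form."""
    return (6700 - 5 * preheat_c) * q_kj_mm * (
        1.0 / (500 - preheat_c) - 1.0 / (800 - preheat_c)
    )

q = heat_input_kj_per_mm(26, 280, 6)   # assumed example parameters
cold = t85_thick_plate(q, 20)          # welding on a cold plate
warm = t85_thick_plate(q, 150)         # same heat input, 150 C preheat
# Higher preheat -> longer t8/5 -> slower cooling -> less chance of a
# hard, crack-sensitive microstructure (but, per the discussion above,
# excessive heat input/preheat can degrade toughness instead).
```

This is only a screening estimate; codes give the governing preheat, and testing gives the real answer.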
Best regards - Al
The answer is testing.
There is an interrelationship, but it varies greatly depending upon the material in question.
We have to understand what toughness really is, especially when considering the Charpy test. It's not a 'real' material property.
We have arbitrarily (based upon extensive research into why the Liberty ship hulls were cracking in the North Atlantic cold) decided that with quick, cheap testing we can determine some material response that can be used to develop viable procedures. To get closer to 'real' material properties you need to use something like the CTOD test, where equations can be generated from the results and tied into design criteria.
Also, toughness, as measured by the Charpy test, is not necessarily opposed to strength or hardness. A tempered martensitic microstructure can demonstrate far more toughness than a microstructure of highly ductile polygonal ferrite.
It also depends on whether you test for impact strength or lateral expansion.
By the time you factor in microstructure, grain size, element distribution and volume percent, low-melt eutectic impurities, material thickness (related not only to biaxial and triaxial fracture regimes but to through-thickness cooling rates as well), welding process, arc density, and heat input, the only answer can be testing, testing, testing.
So, the info you're looking for is out there, but it's dispersed and huge.