substructure of the cladding. Figure 3D provides a lower magnification view that demonstrates that the cracks initiate at the valleys of the weld ripples. The dendrite cores in the cladding exhibit a minimum in alloy concentration due to the relatively rapid solidification conditions associated with welding. As a result, the corrosion rate is accelerated in these regions and localized attack occurs at the dendrite cores. These localized penetrations form stress concentrations, which eventually grow into full-size fatigue cracks under the influence of residual and service-applied stresses, where the service-applied stresses arise primarily through thermal cycling. As shown in Fig. 3D, most cracks initiate at a region in the valley of a surface weld ripple where an additional stress concentration exists. The high residual stress that results from welding also probably contributes to the cracking problem. In addition, dilution from the underlying steel tube substrate, which reduces the alloy content of the cladding, compromises its corrosion resistance. The primary metallurgical factors that contribute to corrosion-fatigue cracking (weld ripples, microsegregation, high residual welding stresses, dilution) are all associated with the localized heating, melting, and solidification of the welding process. Thus, use of a coating that can be applied with more uniform heating in the solid state should help mitigate these problems and improve the cracking resistance.

Work is in progress (Ref. 11) to evaluate coatings made by the coextrusion process in order to eliminate or reduce the inherent problems of weld cladding. With this process, a cylindrical shell of a corrosion-resistant alloy is first joined to a steel substrate by explosive welding, and the bimetallic billet is then coextruded at elevated temperature to produce a tube with an outer coating. Since there is no melting/solidification involved, the coating microstructure consists of equiaxed grains with no microsegregation, similar to that expected for a wrought alloy.

The corrosion resistance of Alloy 600 (Ni-16Cr-8Fe) and Alloy 622 (Ni-22Cr-13Mo-2Fe) weld claddings and coextruded coatings has recently been compared using thermogravimetric testing in a simulated combustion gas (10%CO-5%CO2-2%H2O-0.12%H2S-N2, in vol-%) at 600°C, and the results are shown in Fig. 4 (Ref. 11). A sample of wrought Alloy 600 was also tested. The coextruded coatings exhibit significantly better corrosion resistance than the weld cladding of the matching alloy. Figure 5 shows a weld cladding sample that was corrosion tested under solid-state conditions and then etched to reveal the dendritic substructure. Note that preferential corrosion has occurred at the dendrite cores (arrows). Figure 6 provides an EDS line scan that was acquired across the dendritic substructure of the weld cladding. As expected (Ref. 12), the dendrite cores are depleted in Mo, with Mo concentration levels down to ~11 wt-% (the nominal Mo concentration of the filler metal is ~13 wt-%). Figure 7 shows the microstructure of the coextruded coating, and an EDS line scan acquired across several grains of the coating is shown in Fig. 8. The coextruded coating exhibits a uniform, equiaxed grain structure and a uniform distribution of alloying elements. For Alloy 600, the corrosion resistance of the coextruded coating is comparable to that of the wrought alloy. This indicates that the coextrusion coating process has no adverse effect on the inherent corrosion resistance of the alloy.
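The degree of dendrite core depletion measured in Fig. 6 can be rationalized with a simple Scheil-type estimate. The sketch below is illustrative only: it assumes Scheil-type solidification (complete liquid mixing, no solid-state diffusion) and back-calculates an apparent Mo partition coefficient from the measured core composition rather than taking a value from Ref. 12.

```python
# Illustrative estimate of dendrite core depletion using the Scheil relation,
# Cs = k * C0 * (1 - fs)**(k - 1), where Cs is the composition of the solid
# forming at solid fraction fs.  Assumes complete liquid mixing and no
# solid-state diffusion; k is inferred from the measured core composition.

C0 = 13.0      # nominal Mo content of the filler metal, wt-%
core = 11.0    # measured Mo content at the dendrite core, wt-%

# The dendrite core is the first solid to form (fs = 0), where Cs = k * C0,
# so the apparent partition coefficient is:
k = core / C0  # ~0.85 (< 1, i.e., the cores are solute depleted)

def scheil_solid(C0, k, fs):
    """Mo content of the solid forming at solid fraction fs (0 <= fs < 1)."""
    return k * C0 * (1.0 - fs) ** (k - 1.0)

for fs in (0.0, 0.5, 0.9):
    print(f"fs = {fs:.1f}: solid forms at {scheil_solid(C0, k, fs):.1f} wt-% Mo")
# fs = 0.0 -> ~11.0 wt-% (dendrite core); fs = 0.9 -> ~15.7 wt-% (interdendritic)
```

The estimate reproduces the trend seen in the Fig. 6 line scan: the first solid to form (the dendrite cores) lies below the nominal Mo content, while the solid forming late in the interdendritic regions lies above it, which is why corrosive attack concentrates at the cores.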
The improved corrosion resistance of Alloy 622 relative to Alloy 600 is attributed to the higher Cr and Mo contents of Alloy 622. The difference in corrosion performance between the two coating types can be attributed to two factors. First, the weld claddings evaluated in this work contain an additional 10 wt-% Fe from dilution with the underlying steel substrate; such dilution is always present in any weld cladding. The addition of Fe results in a corresponding decrease in the Cr and Mo contents and often has a detrimental effect on corrosion resistance. The 10% dilution value used to prepare the samples for these tests represents a lower limit on the dilution level for commercially applied coatings (a simple mass balance illustrating this effect is sketched at the end of this section). Such dilution effects do not occur with the coextruded coating. Second, the weld cladding exhibits microsegregation of alloying elements, resulting in preferential corrosion of the alloy-depleted dendrite core regions.

The coextruded coatings also have a uniform coating thickness and smooth surface finish that should help reduce the localized stress concentrations that can aggravate the corrosion-fatigue problem. In addition, the heating and cooling cycles experienced during coextrusion are much less severe and more uniform than those of fusion welding, so the residual stresses should be significantly reduced.

While the mechanism of corrosion-fatigue cracking is generally understood, there is a need to understand the corrosion-fatigue behavior on a more fundamental basis in order to assess the relative

Fig. 3 — A — Photograph of a weld cladding with extensive circumferential cracks; B — cross-sectional scanning electron photomicrograph of several small cracks early in the cracking stage; C — distribution of alloying elements across the dendritic substructure of the cladding; D — photograph showing crack initiation at the valley of the weld ripple.
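As a back-of-the-envelope illustration of the dilution effect discussed above, the composition of an as-deposited cladding can be estimated with a simple mass balance between filler metal and substrate. The sketch below assumes 10% dilution, treats the carbon steel substrate as essentially pure Fe, and uses the nominal Alloy 622 chemistry quoted earlier; it is illustrative only and is not the procedure used to prepare the samples in this work.

```python
# Illustrative mass balance for weld-cladding dilution:
#   C_deposit = (1 - D) * C_filler + D * C_substrate
# Assumes D = 0.10 (the lower-bound dilution level cited in the text) and a
# plain carbon steel substrate treated as essentially pure Fe; the filler is
# the nominal Alloy 622 chemistry (Ni-22Cr-13Mo-2Fe).  All values in wt-%.

D = 0.10  # dilution fraction

filler    = {"Cr": 22.0, "Mo": 13.0, "Fe": 2.0}   # balance Ni
substrate = {"Cr": 0.0,  "Mo": 0.0,  "Fe": 99.0}  # carbon steel, approx.

deposit = {el: (1 - D) * filler[el] + D * substrate[el] for el in filler}

for el, c in deposit.items():
    print(f"{el}: {c:.1f} wt-% in the as-deposited cladding")
# Fe rises from ~2 to ~12 wt-% (the 'additional 10 wt-% Fe' noted above),
# while Cr and Mo drop to ~19.8 and ~11.7 wt-%, respectively.
```

Even at this lower-bound dilution level, the Cr and Mo contents of the cladding fall measurably below those of the coextruded coating, which retains the nominal alloy chemistry.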

