
Welding Journal | April 2015

Figure 1 illustrates the schematic of the proposed training system. In this system, a super welder model correlates the outputs Ω (the 3D weld pool characteristic parameters, i.e., the weld pool length, width, and convexity (Ref. 16)) to an optimal welding speed S*, which serves as the desired speed for the human welder to follow. The arm movement controller (Ref. 11) then calculates the needed visual signal input A (an arrow with direction and amplitude). The human welder sees the visual signal superimposed on the weld pool image and moves his/her arm accordingly. The arm movement speed S is then inputted into the welding process as the welding speed.

Augmented Reality Welder Training System Setup

The system (Fig. 2) consists of two workstations: a welding station and a virtual station. In the virtual station, the human welder views the mockup on which the weld pool image is rendered and displayed, and moves the virtual welding torch accordingly as if he/she were right in front of the workpiece. The human welder's movement was accurately captured by a Leap Motion sensor, and the obtained virtual torch tip 3D coordinates were sent to the PC.

The welding station consists of an industrial welding robot, an eye-view camera, and a compact 3D weld pool sensing system (Ref. 10). The robot utilized in this study is a Universal Robots UR-5 with six degrees of freedom. The robot is connected to a controller, which is used to control the motions of the robot. The robot (client) and the PC (server) communicated via Ethernet using the TCP/IP protocol and socket programming. The PC calculated the next pose of the robot tool by adding the distance traveled at the welding speed (calculated by the super welder) within one sampling period to the current robot tool position.

Fig. 4 — Measured system input (welding speed) and outputs (weld pool width, length, convexity) in six dynamic experiments.
Fig. 5 — Histogram of the input and outputs in six dynamic experiments.
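The pose update described above — the next tool position is the current position advanced by (welding speed × sampling period) along the travel direction — can be sketched as follows. The function name, argument names, and the fixed travel direction are illustrative assumptions, not part of the published system.

```python
# Hypothetical sketch of the PC-side pose update: advance the torch tip
# by speed * dt along a unit travel direction (assumed here to be +x).
def next_tool_position(current_xyz, speed_mm_s, dt_s, direction=(1.0, 0.0, 0.0)):
    """Return the next tool position given speed (mm/s) and sampling period (s)."""
    step = speed_mm_s * dt_s  # distance traveled in one sampling period
    return tuple(p + step * d for p, d in zip(current_xyz, direction))

# Example: 2 mm/s welding speed and a 0.5-s sampling period advance x by 1 mm.
print(next_tool_position((100.0, 50.0, 20.0), 2.0, 0.5))  # (101.0, 50.0, 20.0)
```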
The robot arm equipped with the welding torch received the next pose (robot tool 3D position and orientation) via Ethernet from the PC. It then executed the command and sent the current measured robot tool position back to the PC.

Figure 3 depicts a detailed view of the 3D weld pool sensing system (Ref. 10). Camera 2 captures the weld pool image and sends it back to the PC. A visual signal (an arrow with direction and amplitude) is added to the weld pool image for the welder to view (a sample image is shown in the upper right). A low-power laser (19 × 19 structured light pattern) is projected onto the weld pool.

Table 2 — Model Order Selection

              Input Order    Previous Measurements Order
Width              1                      3
Length             1                      5
Convexity          1                      5

Table 3 — Model Parameters

              Input      Previous Measurements
Width         0.7273     1.7093, –0.5674, 0.3929
Length        0.8341     1.0687, 0.5568, –0.1396, –1.0487, 0.6717
Convexity     0.6589     0.0480, –0.001609, –0.01991, –0.01368, 0.03125

Table 4 — Model Errors

              Eave (mm)    RMSE (mm)
Width          0.5172       0.6594
Length         0.6318       0.8118
Convexity      0.0200       0.0250
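The orders and coefficients in Tables 2 and 3 suggest a linear difference-equation (ARX-type) model for each output. A minimal sketch under that assumption, using the width row (input order 1, previous-measurement order 3); the exact delay structure and the assignment of Table 3 coefficients to terms are assumptions for illustration:

```python
# Assumed width sub-model implied by Tables 2 and 3, read as:
#   w(k) = 1.7093*w(k-1) - 0.5674*w(k-2) + 0.3929*w(k-3) + 0.7273*S(k-1)
A_WIDTH = (1.7093, -0.5674, 0.3929)  # previous-measurement coefficients (Table 3)
B_WIDTH = 0.7273                     # input (welding speed) coefficient (Table 3)

def predict_width(prev_widths, prev_speed):
    """One-step width prediction from the last three widths and the last speed."""
    ar = sum(a * w for a, w in zip(A_WIDTH, prev_widths))
    return ar + B_WIDTH * prev_speed

print(predict_width((5.0, 4.9, 4.8), 2.0))
```

The length and convexity rows would follow the same pattern with five previous-measurement terms each.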

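The PC (server) / robot (client) socket exchange described earlier — the server sends the next pose, the robot executes it and reports its measured tool position — can be sketched as a loopback exchange. The message format and values here are illustrative assumptions, not the actual UR-5 protocol.

```python
# Loopback sketch of the described TCP/IP exchange between PC and robot.
import socket
import threading

def pc_server(conn):
    """PC side: command the next pose, then read the measured position back."""
    conn.sendall(b"POSE 101.0 50.0 20.0 0 0 0\n")  # next pose (x y z rx ry rz), assumed format
    return conn.recv(1024)                          # measured tool position

def robot_client(port):
    """Robot side: receive the commanded pose, report a measured position."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.recv(1024)                                # commanded pose
        s.sendall(b"MEAS 100.9 50.0 20.0\n")        # measured position, assumed format

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=robot_client, args=(port,))
t.start()
conn, _ = srv.accept()
reply = pc_server(conn)
t.join()
conn.close()
srv.close()
print(reply.decode().strip())
```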
