Topic: Welding Industry / General Welding Discussion / Newb Question About Power Supply
- - By zona70 Date 01-16-2006 17:42


All - please humor my “newb” question. I am a hobbyist welder - mostly on cars etc. I have a basic 120V MIG unit (Lincoln SP-135T) with a bottle, and as I begin to work on heavier (for me) materials like .1875 or .250 wall box tubing that are at the limit of this welder, I am dealing with lots of breaker pops and poor performance. I know that the "best" answer would be to buy a bigger welder, but that is not in the budget right now, and the 1/4" material is really at the outer edge of what I will be working on, so I would like to make my 135 work if I can.

I am getting ready to install a dedicated 120V/20A drop for the welder, but as I look at this, I think it would not be any more difficult and only nominally more expensive to install a 120V/30A drop. I have been meaning to replace the cord on the welder for some time with something that gives me a little more range, and I have a 30' section of 10 GA 3-conductor cable with a twist-lock 120V/30A connector that I can use as the new feed cord for the welder.

So - the questions…

1) Does the SP-135T have an internal load control that limits the current it can pull? In other words, can I hurt the welder by plugging it into a circuit capable of providing 30A @ 120V?

2) Given that the welder performs much better on a 20A dedicated circuit than on a 15A - will going to a 30A circuit allow the welder to perform any better than a 20A does?

Many thanks for your time…
Nathan
Parent - By OSUtigger (**) Date 01-16-2006 19:02
Zona,

The amperage rating of the breaker has no bearing on what you plug into it--for example, if I plug a 7.5 amp grinder into the correct voltage outlet on a 50 amp breaker (probably not going to happen, but say I found one), it would do absolutely no harm to anything on the grinder side. The machine only pulls the amperage it needs--the breaker just sets how much is available to do the work.

So, the answer to question 1 is: technically yes, the machine settings regulate the amperage it pulls, but no, a bigger circuit will not hurt the machine.

Question two is unknown to me. I would guess that the welder is designed to pull a certain amperage, and that is what the breaker needs to be sized for. You are popping breakers because the welder wants to pull 18 or 20 amps and the breaker only ALLOWS 15. It sees the extra three or four amps (or however much more) as something that should not be allowed (such as an accidental short that could start a fire) and breaks the circuit. The machine should have its ratings on a label on the back, so check those and size accordingly. You can size the circuit right at that rating, or, if you think you may add something that pulls more amperage in the future, oversize with the 30 (though I don't know of many 120V circuits with 30 amp breakers).
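
As a rough illustration of that point - the load draws only the current its own power demand requires, and the breaker just caps what is available - here is a minimal Python sketch. The ~2400 W welder input is an assumed figure (roughly 20 A at 120 V), and breaker trip curves and continuous-load derating are ignored:

# Minimal sketch with illustrative numbers: a load draws the current its own
# power demand requires; the breaker only opens if that draw exceeds its rating.
LINE_V = 120.0  # nominal line voltage

def draw_amps(power_w):
    """Current pulled at nominal line voltage: I = P / V."""
    return power_w / LINE_V

loads = {
    "60 W lamp": 60.0,
    "small 120V MIG near max (assumed ~2400 W input)": 2400.0,
}

for breaker_a in (15, 20, 30):
    for name, watts in loads.items():
        amps = draw_amps(watts)
        verdict = "holds" if amps <= breaker_a else "will eventually trip"
        print(f"{breaker_a:>2} A breaker | {name}: ~{amps:.1f} A -> {verdict}")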

As always, be careful when wiring. Good luck!

gls
Parent - By Arcandflash (**) Date 01-16-2006 19:32
1. As OSUtigger said, no it won't hurt the machine. The machine takes what it requires assuming you are applying the proper voltage. You plug lamps requiring 0.5A (60W) into 15A circuits all the time.

2. I assume from your first post you have it powered by either a 15A circuit or a 20A circuit with other loads running at the same time. The big problem people run into with high-current-draw devices like your welder is wire that is undersized for the length of the run feeding the device.

The input voltage your welder sees is the voltage at the back of the box, not the voltage at your electric panel. Ideally they are the same, but if your supply wire - both within the wall and from the plug to the welder - is too light, you will have a voltage drop along the wire and the welder will think it's running on reduced voltage. This limits its power and, depending on the design, may cause it to draw excessive current and pop the breaker.

I assume that if you have a 15A circuit you have #14 wire, so going to a 20A circuit with #12 wire will mean less voltage drop, since the #12 has less resistance. You will get better performance because the welder sees an input voltage closer to the full 120V. This also assumes the original circuit was long enough to have appreciable voltage drop. So will going from #12 (20A) to #10 (30A) improve performance? Not as likely, and only if, once again, the run from the panel to the welder is long. So if you want to put a longer cord on the welder, going to #10 may be the thing to do - it won't hurt anything and you will know it is "good".
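
To put rough numbers on the voltage drop, here is a small Python sketch. The wire resistances are approximate ohms per 1000 ft for copper, and the 20 A draw and 50 ft panel-to-welder run are assumed example figures, not anything measured from this setup:

# Rough voltage-drop estimate for a copper circuit feeding the welder.
# Resistances are approximate ohms per 1000 ft for common copper gauges.
OHMS_PER_1000FT = {"#14": 2.525, "#12": 1.588, "#10": 0.999}

def volts_at_welder(supply_v, amps, gauge, one_way_ft):
    """Voltage left at the welder after the round-trip drop in the supply wire."""
    round_trip_ft = 2 * one_way_ft  # current flows out and back
    r_wire = OHMS_PER_1000FT[gauge] * round_trip_ft / 1000.0
    return supply_v - amps * r_wire

# Assumed example: 20 A draw, 50 ft from panel to welder.
for gauge in ("#14", "#12", "#10"):
    print(f"{gauge}: ~{volts_at_welder(120.0, 20.0, gauge, 50.0):.1f} V at the welder")

With those assumed numbers the round-trip drop is about 5 V on #14, 3 V on #12, and 2 V on #10, so the gauge matters most when the run is long.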

You can use a 30A breaker if all of your wire is #10 and you have 30A plugs. If you go this route, it would be cheaper to use a 20A breaker and your 20A plugs with the #10 wire, since the welder should only draw its max rated current (20A??).

The circuit breaker in the panel protects the wire in the wall from overheating; it is not intended to protect anything plugged into the receptacles.
Parent - - By zona70 Date 01-17-2006 05:52
Thanks for the replies....

If I run a 20A circuit it will be #12 wire, and if a 30A, #10. In either case the new outlet will be less than 10' from the breaker box. I guess the remaining question is whether there is any real advantage to ensuring that the welder has all it can possibly take by running the 30A circuit. I have sent a question to Lincoln, and if the actual max draw on the box is 20A I will probably put in the 30A drop/breaker. If it is 17A - 18A I will probably just go with the 20A circuit. Just trying to remove any barrier to getting as much as I can out of what I have...
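
For what it's worth, here is a quick Python check using the figures discussed in this thread: an assumed 20 A draw over roughly 10 ft of branch circuit plus the 30 ft cord, treating the whole run as a single gauge for simplicity, with approximate copper resistances:

# Quick check: assumed 20 A draw, ~10 ft branch circuit plus a 30 ft cord,
# whole run treated as one gauge of copper wire.
OHMS_PER_1000FT = {"#12": 1.588, "#10": 0.999}  # approximate ohms per 1000 ft

amps = 20.0
one_way_ft = 10.0 + 30.0
for gauge, ohms_per_kft in OHMS_PER_1000FT.items():
    drop = amps * ohms_per_kft * (2 * one_way_ft) / 1000.0  # round-trip drop
    print(f"{gauge}: about {drop:.1f} V drop, welder sees roughly {120.0 - drop:.1f} V")

With those assumptions the difference between #12 and #10 works out to around a volt on a run that short, which suggests the 20A circuit would give up very little compared with the 30A one.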

Parent - By billvanderhoof (****) Date 01-17-2006 08:06
http://content.lincolnelectric.com//pdfs/products/navigator/im/IM724.pdf
If you don't have it, that link will get you an operator manual. The manual says 20 amps at rated output. It also says there is a breaker in the machine as well as thermal protection. Thus you will get very little advantage from oversizing the circuit, since the machine's own protection will fault out if the input exceeds what is expected. It also means that oversizing the input will cause no harm. The danger in providing too large a breaker is that a machine with some fault in it has to draw the larger current in order to trip the breaker, which in turn may cause a fire - again, prevented by the on-board protection. There is no reason not to feed #10 wire from a 20 amp breaker if you are looking toward a future upgrade. If I were looking toward the future I would run cable with the extra wire (usually red) that would allow an upgrade to 240 volts, and deadhead the red wire until needed. When you get a bigger welder it will most likely want 240 volt service.
Bill