AC Adapter Question

From: Dan (HERMAND) 16 Oct 2014 20:41
To: fixrman 9 of 40
The voltage needs to be identical (or very close if you're feeling brave*) but the max current output on the adapter only needs to meet or exceed the requirement for the device.

I.e. if you have a 12v device needing 500mA, a 12v 1A adapter will be fine. A 14v 500mA one will not.

*I wouldn't be brave at all with modern electronics, but you used to be able to get away with it on some simpler devices.
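
To put the same rule in rough code terms, here's a throwaway Python sketch - the function and the figures are mine, purely for illustration:

# Voltage must match; the adapter's current rating is a ceiling that must
# meet or exceed what the device actually draws.
def adapter_ok(device_volts, device_ma, adapter_volts, adapter_max_ma):
    return adapter_volts == device_volts and adapter_max_ma >= device_ma

print(adapter_ok(12, 500, 12, 1000))  # True  - 12v 1A adapter on a 12v 500mA device
print(adapter_ok(12, 500, 14, 500))   # False - 14v 500mA adapter on the same device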

I'd almost forgotten polarity. Apart from laptops which are their own special beast, I'm not sure I own anything with a good old fashioned transformer. Everything is either USB standardised or IEC to the appliance.
EDITED: 16 Oct 2014 20:44 by HERMAND
From: koswix 16 Oct 2014 21:50
To: Dan (HERMAND) 10 of 40
I'd guess (and it is a guess) that voltage is less of an issue now, or at least no more of an issue than it used to be (within reason).

With microcontrollers in everything these days the design is going to incorporate voltage regulators to provide either the 5v or 3.3v required for the logic side of things, so they shouldn't be affected. The rest of the circuit is probably no more or less voltage sensitive than older stuff.

If you supply 14v instead of 12v, I'd guess that everything would run fine (albeit a bit hotter).

I'm not about to test it on anything I value, though :D
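
For what it's worth, a back-of-envelope version of the "a bit hotter" guess, assuming (purely my assumption) a plain linear regulator dropping the input down to 5v for the logic:

# Heat dissipated in a linear regulator is roughly (V_in - V_out) * current.
# The 200mA of logic current is an invented figure, just for illustration.
def regulator_heat_watts(v_in, v_out=5.0, amps=0.2):
    return (v_in - v_out) * amps

print(regulator_heat_watts(12))  # 1.4 W of heat at 12v in
print(regulator_heat_watts(14))  # 1.8 W at 14v in - same device, roughly 30% more heat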
From: fixrman 16 Oct 2014 22:15
To: koswix 11 of 40
Yeah, maybe he should wrap the damn cord around his waist and stick the business end in his mouth...

...said the bishop.

He would also do well to put about 7 strands of Christmas lights on.  :-D

Current condition... love it!  (dance)
From: fixrman 16 Oct 2014 22:17
To: Dan (HERMAND) 12 of 40
I'd not want to use anything but a damn close match myself; 500mA more current could damage some things, not knowing how well that wall wart is made.
 
Quote: 
*I wouldn't be brave at all with modern electronics, but you used to be able to get away with it on some simpler devices.

If you said that, you said a lot there.
 
Quote: 
I'd almost forgotten polarity.
You'd think it would be standard, but it isn't.

 
From: fixrman 16 Oct 2014 22:21
To: koswix 13 of 40
Quote: 
that voltage is less of an issue now

The Chinese crap we get now?  8-O
 

Quote: 
if you supply 14v instead of 12v I'd guess that everything would run fine (albeit a bit hotter).

You also might pop some cheap capacitors and electrolytics.

Quote: 
I'm not about to test it on anything I value, though

Exceedingly prudent, my good man.

From: Dan (HERMAND) 16 Oct 2014 22:59
To: fixrman 14 of 40
But a current rating is simply a maximum output - current is drawn, not pushed, else your 60w lightbulb wouldn't last many minutes on a 1200w (average) circuit!
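
Quick worked version of that, taking 230v mains purely for the sake of numbers:

# The load sets the current it draws: I = P / V.
bulb_watts = 60.0
mains_volts = 230.0
print(bulb_watts / mains_volts)  # ~0.26 A, however many amps the circuit could actually supply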
From: koswix 17 Oct 2014 00:05
To: fixrman 15 of 40
What Dan said - your power supplies are constant voltage, not constant current. The current drawn will be whatever is needed by the device.

Generally if you try and draw too much for an extended period the supply will die.
From: koswix 17 Oct 2014 00:09
To: fixrman 16 of 40
>>You also might pop some cheap capacitors and electrolytics.

Possibly, but 2 volts extra shouldn't do much. I'd wager that most components (even in cheap, dodgy stuff) will be over-specified, because it's cheaper to go for the mass-made stuff that's readily available than it is to get something that exactly matches your requirements.

If 14v would kill it, I'd wager it's got a fairly limited life on 12v anyway.

I should probably stress again that I'm just guessing here. I'm no electromagician.
EDITED: 17 Oct 2014 00:09 by KOSWIX
From: fixrman 17 Oct 2014 02:55
To: Dan (HERMAND) 17 of 40
If I have an electronic unit that I like, why would I take a chance that the adapter is crap and smokes it? You can with yours, I'll just use the right one thanks.
From: fixrman 17 Oct 2014 02:58
To: koswix 18 of 40
Quote: 
Possibly, but 2 volts extra shouldn't do much. I'd wager that most components (even in cheap, dodgy stuff) will be over-specified, because it's cheaper to go for the mass-made stuff that's readily available than it is to get something that exactly matches your requirements.

Tell that to the speaker amp I just repaired.

Quote: 
If 14v would kill it, I'd wager it's got a fairly limited life on 12v anyway.

Or maybe it is old and just can't handle it any more. Surely you've replaced a motherboard at least once for failed caps?
 

From: koswix 17 Oct 2014 07:00
To: fixrman 19 of 40
>>Surely you've replaced a motherboard at least once for failed caps

Nope, I missed that bandwagon. That debacle was due to badly made electrolytic capacitors, I think.
From: fixrman 17 Oct 2014 12:15
To: koswix 20 of 40
Yes, that is true. Bad or poorly made power supplies could accelerate the popcorn effect.
From: CHYRON (DSMITHHFX) 17 Oct 2014 17:19
To: fixrman 21 of 40
Had an IDE channel on an old-ish motherboard (or associated caps, which were visibly popped) & an attached HDD blown by a lightning-induced power surge.
From: fixrman 17 Oct 2014 17:30
To: CHYRON (DSMITHHFX) 22 of 40
Yes, they don't like that much, do they?
EDITED: 17 Oct 2014 17:30 by FIXRMAN
From: Dan (HERMAND) 17 Oct 2014 23:38
To: fixrman 23 of 40
Because the chance is zero, due to physics!
From: CHYRON (DSMITHHFX) 18 Oct 2014 01:13
To: fixrman 24 of 40
What surprised me was that it still ran off the other channel. Apparently the PSU itself was undamaged.
From: fixrman 18 Oct 2014 03:42
To: Dan (HERMAND) 25 of 40
I'll bet you whistle in the dark as well, for safety.
From: Dan (HERMAND) 18 Oct 2014 16:13
To: fixrman 26 of 40
I'm not with you now
From: fixrman 18 Oct 2014 16:40
To: Dan (HERMAND) 27 of 40
That's OK. Basically I am saying that a poorly made power supply can cause problems irrespective of its rating. You don't happen to agree with me. If either the adapter or the device is designed poorly, using an adapter that is over spec for the device it is to power may still cause a problem and could damage it - even though your physics explanation makes sense. An inferior product can always damage another component.

Many people do not know that one is not really supposed to charge a cell phone via the same adaptor they charge their iPad with and vice versa, even though it is more or less "approved" by Apple, to wit:
 
Quote: 

All iOS devices (and most smartphones) charge at 5 volts, the standard for USB devices. The difference between the iPhone and iPad adapters is the rated amperage—the iPad charger is rated to handle 2.1 amps, while the iPhone charger is rated for 1 amp. But the amperage rating is only a measure of the adapter's maximum capability—the actual amperage is determined by the load (i.e., the iPad or iPhone). According to Steve Sandler, founder and chief technical officer of AEi Systems, an electronics analysis company, modern battery-powered electronics have a lot of complexity between the charger and the battery, including battery-charging circuits within the device and battery-protection circuits in the lithium-ion battery itself. These circuits are designed to manage the flow of electricity to the battery, and if the circuits inside the iPhone were designed to tolerate 1 amp, but are routinely exposed to 2 amps, that could stress the system over time. "Even though you may not instantaneously say, 'Wow, I just destroyed my battery!' you may limit its life over the long term," Sandler says, "but you wouldn't know for a year or more." Our advice: Since Apple claims compatibility between the iPad charger and iPhone, pay for the extended two-year warranty for the iPhone to ride out your cell contract, and charge it however you like. If your battery degrades severely after the first year make Apple give you a new one.
~ From Popular Mechanics

I certainly wouldn't charge my iPad with the cell phone charger. I have seen it done and the adapter gets hot. The battery life of an iPhone likely will be shortened if one charges the cell (iPhone) via the iPad charger. The battery in the iPhone is not user replaceable (for most consumers), so despite Apple's assertion that they are compatible, a shortened battery life means Apple gets to sell a new phone. We all know iPhones do not enjoy stellar battery life anyway, so if the battery life is shortened by incorrect charger use, the chargers are not 100% compatible.

I don't have an iPhone so Fruitco devotees are free to abuse their equipment as they see fit. I'd say the same logic would apply to pretty much any consumer grade device that needs to be charged or would otherwise use an AC adapter, but that is my opinion only.
EDITED: 18 Oct 2014 16:42 by FIXRMAN
From: Dan (HERMAND) 18 Oct 2014 17:35
To: fixrman 28 of 40
The red bit (the claim that routine exposure to 2 amps could stress the iPhone) is factually wrong. A device will draw what it needs - all USB plugs are 5v; the only danger is plugging a high-current device into a low-current output.

How can an iPhone be 'exposed' to 2A when it only draws 1A?
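
Rough sketch of what I mean - the figures are illustrative, not measurements:

# On a fixed 5v USB supply the device decides how much current flows;
# the adapter's rating is only a ceiling, never a push. Trouble only
# starts when the device wants more than the adapter can deliver.
def actual_draw_amps(device_wants, adapter_rating):
    return min(device_wants, adapter_rating)

print(actual_draw_amps(1.0, 2.1))  # iPhone on an iPad adapter: still draws ~1A
print(actual_draw_amps(2.1, 1.0))  # iPad on an iPhone adapter: starved at ~1A, adapter strained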