
Why were some DC adapters center ground, shield positive?

bitfixer

Veteran Member
Joined
Apr 6, 2011
Messages
679
Location
San Francisco, CA
Whenever I see a device that uses a barrel-jack DC adapter where the center pin is negative and the shield is positive, I feel a mild annoyance. Seems like the device is daring me to destroy it by accident. I was wondering if there was an actual engineering reason for doing this, apart from perhaps only having that type of DC adapter available in quantity or something like that.
This seems to be something that is not done much anymore, at least not on any device I've seen made in the last couple of decades or so. But I could be wrong.
One thought was that it could be useful for devices with batteries, where inserting the plug interrupts the battery connection and completely disconnects the battery from the rest of the circuit. But I feel like I have seen some that do this with center positive as well.
I am just curious to know why this was done, and if there was a real reason for it other than lack of standardization. Any ideas?
 
In my mind it was just a lack of standardisation, or a way for the manufacturer to be different so that you had to purchase their power supply unit. Thinking of Sinclair here of course - but others are guilty...

This is the advantage of having an agreed Standard beforehand.

Dave
 
As far as I can tell, it's all completely arbitrary. I've got wall warts that are tip-positive, tip-negative, and even AC, ranging in voltage from about 4 V to more than 20 V. Why anyone thought this was good design practice remains a mystery. Not only that, but you'll often be hard put to find the voltage, polarity, and current rating marked on much equipment.

Of course, if you consider that the wall wart was an easy way to get around the problems of electrical code compliance certification, it makes complete sense. The makers of such equipment were cheapskates.
 
Center negative is common in guitar pedals for battery switching, as the OP described. This seems to carry over into other gear like Roland MIDI modules.
 
Why were some DC adapters center ground, shield positive?
It's just DC voltage, there is no "shield". And that's pretty much the answer: it makes no electrical difference which contact carries plus and which carries minus. Some manufacturers chose center-positive, some center-negative. In the end, center-positive became the de facto standard, most likely because it has the small advantage that the barrel of the plug makes contact first, which is good, since you always want to have ground connected first.
 
Thanks. For some reason it felt intuitive to me that the thing with more metal on it should be ground, but who knows. The worst of course is when it's not even printed near the jack.
Right, the guitar stuff. They always seem to be 9 V so the adapter can be a direct replacement for a battery.
 
One of the many reasons that USB is quickly becoming the choice for +5V supplies (as much as I hate USB connectors).

USB-C is a decent connector, but, man, Micro-USB is just the worst. Charging cables for phones and tablets using that plug have life expectancies that make most insects look like Methuselahs.
 
And was there really any technical advantage to the thinner connector?

The phones that used micro-USB seemed thick enough to take mini.
The micro-USB port is supposed to last through 10,000 insertion cycles instead of a rated 5,000 for mini. The advantage to phone makers is the micro AB plug supports USB OTG and lets one attach USB devices to a phone without needing two USB ports.
 
The micro-USB port is supposed to last through 10,000 insertion cycles instead of a rated 5,000 for mini. The advantage to phone makers is the micro AB plug supports USB OTG and lets one attach USB devices to a phone without needing two USB ports.
10,000 rating but breaks off the PCB in 250
 
10,000 rating but breaks off the PCB in 250
That was part of the tradeoff. Better latches keep the cable inserted but increase strain on the PCB. Some micro-USB sockets have through-hole support stakes, which should handle the strain better than the simple surface-mount ones, but the extra penny in cost seems daunting for even premium phones.
 
If you really want to see some fun, witness what happens when a pair of frustrated kids who don't quite understand that different styles of jacks and charging cables exist for phones decide to make an Apple Lightning plug "fit" into a micro-USB socket. If I didn't know better, I'd think Apple specifically designed Lightning to be able to ream out its major competitor without taking a scratch...
 
The micro-USB port is supposed to last through 10,000 insertion cycles instead of a rated 5,000 for mini. The advantage to phone makers is the micro AB plug supports USB OTG and lets one attach USB devices to a phone without needing two USB ports.
I'm not sure that I follow. Are you saying that the micro plug enables composite host/device capabilities, where the mini does not? I've got at least one device here that has the micro installed on the PCB, but has pads (and anchor slots) for the mini on the reverse side, unpopulated. My OTG-capable MCU boards can do either through a mini USB--it's mostly a matter of software.

There are some devices with "shape shifting" USB plugs; standard size USB A shell can be flipped up to reveal a USB micro plug.

It's also noteworthy that Apple has fought adopting USB-C, even though Lightning is an inferior protocol.
 
...it has the small advantage that the barrel of the plug makes contact first, which is good, since you always want to have ground connected first.
Does it, though? I thought center-negative was chosen because the center contacted first. I don't have a connector on hand right now, but IIRC in most PCB-mount sockets the connection to the barrel is quite far in, and the center pin is also quite long. To be honest, I would have to meter the socket and plug to work out what contacts first.

RCA jacks have to be the worst for signal-connects-first. I blew a video capture card once when the source was one of those unearthed set-top boxes.
 
I'm not sure that I follow. Are you saying that the micro plug enables composite host/device capabilities, where the mini does not? I've got at least one device here that has the micro installed on the PCB, but has pads (and anchor slots) for the mini on the reverse side, unpopulated. My OTG-capable MCU boards can do either through a mini USB--it's mostly a matter of software.

There are some devices with "shape shifting" USB plugs; standard size USB A shell can be flipped up to reveal a USB micro plug.

It's also noteworthy that Apple has fought adopting USB-C, even though Lightning is an inferior protocol.
The mini-AB plug was deprecated back in 2007 and was fairly scarce even before then. The continual insertion and removal of devices that OTG expects, plus more frequent recharging since attached devices drain the battery faster, was not good for the short life span of the mini plug. I know it is possible to do a lot of non-standard things with USB; the software can't know that the hardware is wrong. I think the dogmatic insistence in USB OTG that devices should have only a single USB port was a mistake.
 