Discussions

Even just going by the USB specification, the lowest voltage it allows is 4.0 V, and that's in USB 3.0 – earlier versions have higher requirements. If I remember correctly, USB 2.0 requires 4.5 V – and that's the requirement for the device to operate, not charge! Charging is fundamentally limited by battery voltage, unless the device itself upconverts the voltage it receives – but why would it do that, if the spec is intentionally designed to give it a voltage high enough to charge?

Now, with cheap USB portable solar chargers specifically, I can believe that they are undervolted – this would mirror my personal experience with them being basically useless in anything other than direct sunlight, and weak even then. I have one sitting outside right now, actually, for almost a week straight – and it can't get past 40% charge on the indicator. If a fully discharged Li-Ion is 3 V and a fully charged one is 4.2 V, then 40% works out to about 3.48 V. A charger that's outputting around 3.8 V could realistically get it to that point, accounting for cable and connector losses.

You should try this experiment with those chargers that you've measured – take a few fully discharged devices and see how much they can get charged before they flatline. If a charger can only get something to 30-40% even in the best conditions, that's an important thing to know, IMO.

However, this is for solar chargers, and I presume the reason is that they don't bother with a stepping circuit. USB power banks should never do this. What you should see when measuring voltage is a mostly steady, gradually decreasing output within the allowed limits, until the very end, when the internal battery voltage is so low that the stepping circuit can't compensate anymore – then the output starts falling very suddenly and rapidly.

This might also depend on the current. I know that cheaper ones are often undervolted under load – they advertise 5 V at 2 A, but in reality you get 5 V or 2 A, not both. This is why some are unable to charge powerful smartphones at all – the smartphone negotiates the highest current it can get and then draws it, but the voltage sags too low for the phone to actually make use of it.
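To make the arithmetic behind that 40% figure explicit, here's a rough Python sketch of the interpolation I'm doing above – it assumes 3.0 V = empty and 4.2 V = full, and treats the curve as linear, which a real Li-Ion discharge curve isn't, so it's a ballpark only:

```python
# Rough linear estimate of Li-Ion cell voltage vs. state of charge.
# Assumptions: 3.0 V = fully discharged, 4.2 V = fully charged, linear curve.

V_EMPTY = 3.0   # assumed fully discharged cell voltage (V)
V_FULL = 4.2    # assumed fully charged cell voltage (V)

def soc_to_voltage(soc_percent: float) -> float:
    """Approximate cell voltage for a given charge percentage."""
    return V_EMPTY + (V_FULL - V_EMPTY) * (soc_percent / 100.0)

def voltage_to_soc(volts: float) -> float:
    """Approximate charge percentage for a given cell voltage."""
    return 100.0 * (volts - V_EMPTY) / (V_FULL - V_EMPTY)

print(soc_to_voltage(40))    # -> 3.48 V, the figure quoted above
print(voltage_to_soc(3.8))   # -> ~66.7%, the theoretical ceiling for a 3.8 V
                             #    source before cable and connector losses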

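As for the "undervolted under load" failure mode, a crude way to picture it is an ideal 5 V source behind some series resistance – the more current the phone draws, the lower the voltage it actually sees. A quick sketch along those lines (the 0.5 Ω figure is made up purely for illustration, not measured from any real charger):

```python
# Back-of-the-envelope model of why a weak "5V/2A" supply sags under load:
# an ideal 5 V source behind an assumed series (source + cable) resistance.

V_NOMINAL = 5.0   # advertised output voltage (V)
R_SOURCE = 0.5    # assumed combined source + cable resistance (ohms), illustrative

def output_voltage(load_current_a: float) -> float:
    """Voltage actually seen at the device for a given load current."""
    return V_NOMINAL - R_SOURCE * load_current_a

for amps in (0.5, 1.0, 2.0):
    print(f"{amps:.1f} A draw -> {output_voltage(amps):.2f} V at the device")
# 0.5 A -> 4.75 V (fine), 1.0 A -> 4.50 V (marginal), 2.0 A -> 4.00 V (low
# enough that many phones back off or refuse to charge at that current)
```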
