iPad/iPod/iPhone Charging rant

By Jeff Loughlin

I make my living developing applications for the iPad (among other things). I don't have much use for an iPad personally, so mine get used exclusively as testbeds for development. I have three of them, one of each generation, and most of the time they sit in a drawer unused until I'm ready to test my application on a real device (most of the development/debugging cycle happens in the simulator running on the Mac, which is much faster and easier than debugging on the real device). Since the iPads sit unused most of the time, 9 times out of 10 when I need to use them, their batteries are dead. And when I plug them in, I have to wait 20-30 minutes until the battery has enough charge before the device will boot.

Now, I'm a software guy, so maybe I'm way off base here. I have enough knowledge about electronics to understand charging current and voltage and all that stuff, but I'm no EE. But why on earth, when the device is plugged into a power source, can't I use it until the battery reaches a certain charge level (10 percent, maybe)? I mean, if the charger is able to supply enough current to charge the battery, is it not also able to supply enough current to operate the device? Maybe it can't supply enough to do both at the same time - that's understandable when it runs off a USB port, since the USB standard limits the amount of current a device can draw (500 mA, I think). But if it can't do both at the same time, then shouldn't priority be given to operating the device rather than charging the battery? These devices clearly discharge more slowly than they charge, which means the amount of current required to operate them is LESS than the amount of current required to charge their batteries. Which means the power supply CAN and DOES supply enough current to operate them, independent of the charge status of the battery.
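Just to put some rough numbers on that argument - and these are round figures I'm assuming for the sake of illustration (roughly a 25 watt-hour battery, Apple's 10 watt adapter, 10 hours of runtime, something like 4 hours to charge from empty), not measurements - here's the back-of-envelope math as a little C program:

#include <stdio.h>

int main(void)
{
    /* Round numbers assumed for the sake of argument, not measurements:
       roughly an original-iPad-class battery and Apple's 10 W adapter.  */
    const double battery_wh = 25.0;       /* battery capacity, watt-hours  */
    const double runtime_h  = 10.0;       /* hours of use on a full charge */
    const double charge_h   = 4.0;        /* hours to charge from empty    */
    const double charger_w  = 10.0;       /* wall adapter output, watts    */
    const double usb_port_w = 5.0 * 0.5;  /* USB 2.0 port: 5 V x 500 mA    */

    double run_power    = battery_wh / runtime_h; /* average power to operate    */
    double charge_power = battery_wh / charge_h;  /* average power into battery  */

    printf("Average power to run the device:     %.1f W\n", run_power);
    printf("Average power to charge the battery: %.1f W\n", charge_power);
    printf("Wall charger can supply:             %.1f W\n", charger_w);
    printf("A bare USB 2.0 port can supply:      %.2f W\n", usb_port_w);

    return 0;
}

Run that and you get roughly 2.5 watts to operate the device versus roughly 6 watts to charge the battery - so the wall charger has power to spare for both, and even a bare USB port can just about cover running the device on its own.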

So why doesn't power get diverted to running the device, rather than charging the battery in the case where the battery is completely dead? In software geek pseudocode, it would go something like this:

if (device is switched "ON")
{
    redirect all needed power to device operations;
    use whatever power is left (if any) to charge battery;
}
else
{
    charge battery;
}

I know that's drastically oversimplified, but you get the idea. If the user wants to use the device, divert all available power to letting them do so. Charging the battery should become a secondary task as soon as the user hits the "ON" button.
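For the curious, here's a slightly less oversimplified sketch of what I imagine the charging controller's loop could look like. This is still just my guess, and the function names (read_system_load_mw, set_charge_power_mw, and so on) are made-up placeholders standing in for whatever the real power-management hardware actually exposes:

/* Hypothetical power-path loop: the running device gets first claim on the
   charger's output, and the battery is charged with whatever is left over.
   All of the names below are placeholders, not a real API.               */

#define CHARGER_BUDGET_MW  10000            /* assume a 10 W adapter       */

extern int  device_is_on(void);             /* has the user hit "ON"?      */
extern int  read_system_load_mw(void);      /* what the device is drawing  */
extern void set_charge_power_mw(int mw);    /* cap the battery's share     */

void power_path_tick(void)
{
    if (device_is_on()) {
        /* Operating the device comes first...                             */
        int load_mw  = read_system_load_mw();
        int spare_mw = CHARGER_BUDGET_MW - load_mw;

        /* ...and the battery gets whatever headroom is left (if any).     */
        set_charge_power_mw(spare_mw > 0 ? spare_mw : 0);
    } else {
        /* Device is off: the battery can have the whole budget.           */
        set_charge_power_mw(CHARGER_BUDGET_MW);
    }
}

The point being, nothing in that loop requires the battery to have any charge at all before the device can run.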

Now for me, it's just an occasional 30-minute wait before I can test my application. It's annoying, but I can live with it. But here's an example of a real-world situation where this could be a big problem: say you're in your car and there's an emergency of some kind - an accident with injuries, for example. You reach into your pocket and pull out your iPhone to call for help. Uh oh...battery is dead. No problem, just plug it into the cigarette lighter charger and make the call, right? Oops. Nope. You have to wait 30 minutes until the battery has enough charge for the phone to boot.

So why can't it run off the charger? iDevices have been like this since the earliest iPods, so it's not an issue that's unique to the higher power requirements of the iPad and iPhone. And every other phone I've ever owned has been able to make a call as soon as I plugged it into power, regardless of the charging status of the battery, so why are Apple's iDevices different?

That's a serious question. Contact me if you have an answer, I'd really like to know.





Last update: April 17, 2013