Hi,
This is perhaps a simple question, but aside from measuring the voltage of batteries (e.g. AA, AAA, C, D, 9V, etc.) with a multimeter, how do I make sure they actually perform properly under load? A bunch of batteries will all show the normal 1.5V across the terminals, but if I switch the multimeter to amps, some read 5-7 A straight from the terminals while others read much less (1-2 A). For example, my 9V battery reads 9.31V on the terminals, but as soon as I check amps across the terminals, I get 0.040 A and dropping. Is this a good secondary check? Is there a rating system I can use... for example, at 5-7 A the battery is fresh and good for heavy-duty toys, and at 1-2 A it may still be OK for remote controls or other low-drain devices?
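To put rough numbers on that 9V example (just Ohm's law, treating the ammeter as a near-short across the terminals, so nearly all the voltage drops across the battery's internal resistance), a quick sketch:

```python
# Rough internal-resistance estimate from a short-circuit amp reading.
# The ammeter is close to a dead short, so R_internal ~ V_open / I_short.
# Values are the ones quoted above for the 9V battery.
v_open = 9.31    # volts measured across the open terminals
i_short = 0.040  # amps read with the meter directly across the terminals
r_internal = v_open / i_short  # Ohm's law: R = V / I
print(round(r_internal, 1))    # ~232.8 ohms
```

A couple hundred ohms of internal resistance is why that battery still shows a healthy voltage unloaded but can barely push any current.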
Does it make sense to first filter my batteries by terminal voltage... and then run a secondary test under load (the amps reading) on the ones that show a good voltage? Or should I set up a circuit and measure the voltage under load to see how far it drops? What would be an easy way to do this? A battery holder hooked up to a number of different resistors, measuring the voltage across the battery with each resistor value?
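The resistor-based test described above can also give an internal-resistance number: measure the open-circuit voltage, then the voltage across a known load resistor, and the sag tells you the internal resistance. A minimal sketch (the AA cell values and the 4.7-ohm resistor are made-up example numbers, not real measurements):

```python
# Estimate a battery's internal resistance from an open-circuit reading
# and a reading taken across a known load resistor.
def internal_resistance(v_open, v_loaded, r_load):
    """Return internal resistance in ohms.

    The battery drives current I = v_loaded / r_load through the load;
    the missing voltage (v_open - v_loaded) is dropped across the
    internal resistance, so R_int = (v_open - v_loaded) / I.
    """
    i = v_loaded / r_load            # load current in amps (Ohm's law)
    return (v_open - v_loaded) / i   # volts lost internally / current

# Example: a AA cell reading 1.58 V open-circuit that sags to 1.40 V
# across a 4.7-ohm resistor.
print(round(internal_resistance(1.58, 1.40, 4.7), 3))  # ~0.604 ohms
```

A fresh cell sags very little under a sensible load; a worn-out one sags a lot, which is the same information the amp reading gives but without short-circuiting the battery through the meter.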