Understanding current consumption

Can anyone explain more about what it means when you measure a light and learn its current consumption?

Let's say a light uses 400 milliamps. For example, say I'm using a Powerlight with a fully charged 2000 mAh Eneloop and the current draw is 400 mA. That would mean I could expect about 5 hours of light if it's digitally regulated. But what does a 400 mA reading mean if the light isn't digitally regulated and the current drops as the hours of use go on? I'm pretty sure the Powerlight I have gives close to a dozen hours.
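For the regulated case, the arithmetic is just capacity divided by current. Here's a minimal Python sketch of that estimate, using the numbers from the post and assuming the driver holds a constant 400 mA for the whole discharge (driver losses ignored):

```python
# Naive runtime estimate: capacity divided by draw. Only meaningful if
# the driver holds the current constant over the whole discharge
# (i.e. a digitally regulated light) and driver losses are ignored.
capacity_mah = 2000   # fully charged Eneloop, nominal capacity
draw_ma = 400         # measured current draw

runtime_h = capacity_mah / draw_ma
print(f"Estimated runtime: {runtime_h:.1f} h")  # ~5.0 h
```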

Another example is my 3AA lantern. It draws 60 mA on low. Three Eneloops give 6000 mAh. But the lantern doesn't run for 100 hours on a single set of batteries. There's likely a minimum operating voltage of around 3 V, so when the batteries drop below 3 V the lamp turns off.
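One rough way to account for that cutoff is to derate the pack's capacity by the fraction it can deliver before sagging below the lamp's minimum voltage. A sketch using the figures quoted above; the 0.7 usable fraction is purely an illustrative assumption, not a measured value for Eneloops in this lantern:

```python
# Derated runtime estimate: not every rated milliamp-hour is usable,
# because the lamp shuts off once the pack drops below its cutoff voltage.
rated_capacity_mah = 6000   # figure quoted above for the 3AA set
draw_ma = 60                # low-mode current
usable_fraction = 0.7       # assumed share delivered above the ~3 V cutoff

runtime_h = rated_capacity_mah * usable_fraction / draw_ma
print(f"Derated runtime estimate: {runtime_h:.0f} h")  # ~70 h
```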

So how do we know the true run time when we measure the consumption? Any thoughts?

This isn't the way to figure it anyway. You're only taking the current into account, without taking the voltage into account. What matters is the watts in the case of the emitter and the watt-hours in the case of the battery.

If the battery is 2000 mAh at 1.5 V, then that's 3 Wh of capacity.

If the emitter runs at 3.6 V and draws 400 mA, its power requirement is 1.44 W, so 3 Wh / 1.44 W ≈ 2.08 hours.
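The same arithmetic in a few lines, using the hypothetical numbers above and assuming a 100%-efficient driver (no real boost circuit achieves that):

```python
# Energy-based estimate: battery watt-hours divided by emitter watts.
battery_voltage = 1.5       # cell voltage used in the example above
battery_capacity_ah = 2.0   # 2000 mAh
emitter_voltage = 3.6       # emitter forward voltage
emitter_current_a = 0.4     # 400 mA drive current

battery_wh = battery_voltage * battery_capacity_ah   # 3.0 Wh
emitter_w = emitter_voltage * emitter_current_a      # 1.44 W
runtime_h = battery_wh / emitter_w                   # ~2.08 h
print(f"Energy-based runtime estimate: {runtime_h:.2f} h")
```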

Then you still have to deal with what you guys are talking about, the regulation and the cutoff voltage. Have I missed something?