Any interest in an LED/Battery analyzer device?

I added a couple of 10:1 voltage dividers to the ADC inputs on my prototype so that I can measure voltages higher than 5V (up to 50 volts). I used 90K/10K ohm resistors and ran into a small problem. The ADC (analog-to-digital converter) in the microcontroller works by first charging an internal sampling capacitor to store the input voltage while it makes the measurement.
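For reference, the divider math works out like this (a minimal sketch assuming a 5V Vcc reference and 10-bit conversions; the names are mine, not from the actual firmware):

```cpp
// 90K over 10K: Vadc = Vin * 10K / (90K + 10K) = Vin / 10
const float VREF       = 5.0;   // ADC reference voltage (Vcc here)
const float DIV_RATIO  = 10.0;  // (90K + 10K) / 10K
const int   ADC_COUNTS = 1024;  // 10-bit AVR ADC

float adcToVolts(int raw) {
  return raw * (VREF / ADC_COUNTS) * DIV_RATIO;  // 0..1023 -> 0..~50 V
}
```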

The datasheet says the chip is optimized for a 10K ohm source, and I am using a 90K source. The problem is that the chip can’t charge the sampling capacitor fast enough at low input voltages. A 1.60V AA cell was measuring 1.450V, and a 0.10V source was measuring as 0.035V. Signals over 10V or so were pretty much correct. The measurement error is exponential (it increases dramatically at low voltages), not linear. The simple solution was to add a calibration lookup table to compensate for the funky readings. Now all the readings are accurate to within around 0.5% (with millivolt resolution).
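Something like the following is the general shape of that kind of table (a sketch with made-up breakpoints based on the numbers above; a real table would want more points clustered at the low end where the error curve is steepest):

```cpp
// Hypothetical calibration table: measured millivolts -> true millivolts.
// The actual breakpoints would come from characterizing the channel.
struct CalPoint { unsigned int raw_mV; unsigned int true_mV; };

const CalPoint calTable[] = {
  {    35,   100 },  // 0.10 V source read as 0.035 V
  {  1450,  1600 },  // 1.60 V cell read as 1.450 V
  { 10000, 10000 },  // roughly correct above 10 V
  { 50000, 50000 }
};
const int CAL_POINTS = sizeof(calTable) / sizeof(calTable[0]);

// Linear interpolation between the two bracketing breakpoints.
unsigned long calibrate(unsigned long raw_mV) {
  if (raw_mV <= calTable[0].raw_mV) return calTable[0].true_mV;
  for (int i = 1; i < CAL_POINTS; i++) {
    if (raw_mV <= calTable[i].raw_mV) {
      unsigned long span = calTable[i].raw_mV  - calTable[i - 1].raw_mV;
      unsigned long rise = calTable[i].true_mV - calTable[i - 1].true_mV;
      return calTable[i - 1].true_mV
           + (raw_mV - calTable[i - 1].raw_mV) * rise / span;
    }
  }
  return raw_mV;  // above the table: pass through unchanged
}
```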

I did some more testing with the recommended 10K source impedance and found that the readings were still at least 2% low. Without an input resistor they were spot on. So much for trusting chip makers’ data sheets… people using AVR chips to measure voltages from anything but a low-impedance source need to be aware of this issue.

Not sure if you require extremely high speed acquisition or high sample rates, but our solution to the crappy AVR ADC has always been to simply slow down the ADC clock and give the cap time to charge… If we require a high-speed ADC, we don’t use an AVR.
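For anyone following along, slowing the ADC clock on an AVR is just the ADPS prescaler bits in ADCSRA, roughly like this (assuming a 16 MHz part):

```cpp
#include <avr/io.h>

// Slow the ADC clock as far as it will go: divide-by-128 prescaler.
// At 16 MHz that gives a 125 kHz ADC clock, so the sample cap gets
// the maximum charging time per conversion.
void adcSlowClock(void) {
  ADCSRA |= _BV(ADPS2) | _BV(ADPS1) | _BV(ADPS0);  // prescaler = 128
}
```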

I suspect you’ll find look-up tables to be highly error-prone. The chips vary greatly in how far they’re off when you don’t give the sample cap time to charge.

Additionally, you’ll almost certainly have to calibrate the readings against the reference voltage (unless you’re using an external ADC Vref), as the ATMega328 has a terrible internal reference (1.1V nominal, with a tolerance range from 1.0V to 1.2V).

PPtk

Will it be able to measure/calculate the tint (Kelvin) of the LED?

Nope… that would take a spectrometer to do correctly.

Under the same Kelvin temperature you can have various tints. Take 5700K: you can have both 2S and 2R tint bins, where one is green and one is pink. So knowing the Kelvin still does not tell you how the light looks.

I have a reference voltage calibration factor in there… it currently uses the 5V Vcc as the reference. I could use the internal bandgap; it’s pretty good, as long as you measure it and don’t just assume its value. I’d have to put voltage dividers on the current reference chips to do that (and would lose the ratiometric goodness of driving them with the ADC reference voltage).
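For reference, the usual bandgap trick is to read the internal 1.1V reference against Vcc and back-solve for Vcc. A rough ’328 sketch (BANDGAP_mV is a placeholder for the per-chip measured value, per the caveat above):

```cpp
// Read the internal 1.1V bandgap against Vcc to back out the actual Vcc.
const unsigned long BANDGAP_mV = 1100;  // measure per chip: 1.0-1.2V range

unsigned int readVcc_mV(void) {
  ADMUX = _BV(REFS0) | 0x0E;     // AVcc as reference, MUX = 1.1V bandgap
  delay(2);                      // let the mux/reference settle
  ADCSRA |= _BV(ADSC);           // start a conversion
  while (ADCSRA & _BV(ADSC)) ;   // wait for it to finish
  unsigned int raw = ADC;        // 10-bit result
  return (BANDGAP_mV * 1024UL) / raw;  // Vcc(mV) = 1100 * 1024 / raw
}
```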

I am trying to sample the ADC at 10 kHz. Each of the six channels is sampled sequentially (1,667 samples per second per channel). The samples are averaged over 1-second intervals to get an effective 15-bit resolution.
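The resolution gain comes from oversampling: every 4x increase in sample count is worth roughly one extra bit, so ~1024 of those ~1667 samples buy about 5 extra bits on top of the 10-bit converter. A minimal sketch of the accumulate-and-scale step (my names, not the actual firmware):

```cpp
// Oversample-and-average: assumes enough noise on the input to dither.
unsigned long sum = 0;
unsigned int  n   = 0;

void accumulate(unsigned int raw10bit) {
  sum += raw10bit;
  n++;
}

// Called once a second: scale the 10-bit mean up to a 15-bit result.
unsigned long average15bit(void) {
  unsigned long result = (sum * 32UL) / n;  // mean << 5 = ~15 bits
  sum = 0;
  n = 0;
  return result;
}
```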

I am going to play with slowing the sample rate down and see how that works out. I tried sampling each channel multiple times in a row, but it looks like the sampling cap gets zeroed on every sample.

The lookup table is used more for the shape of the error curve than for exact calibration numbers. Absolute errors can be handled by the reference voltage calibration factor. I could also automatically generate a custom table for each individual chip.

I checked the code… the ADC is already running as slowly as it can (the prescaler is divide-by-128) with the 16 MHz CPU clock. Even slowing the system clock to 1 MHz did not help. It looks like software compensation is going to be the way to go.

That’s really weird. I’ve had issues with AVR ADCs, but never anything that couldn’t be fixed by slowing them down. This sounds more like an Ohm’s law problem (AKA voltage drop across the resistor). Do you have one of the other peripherals turned on (a digital input, maybe?) that could be leaking some current to ground and causing a delta-V across your input resistor (i.e., acting as a voltage divider)?

Nope, the DIDR register has the digital inputs disabled, and the internal pullups are disabled too. I measured the input leakage current on the ADC pins and it is in the low-nanoamp range.
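For anyone wiring up something similar, that configuration on a ’328 looks roughly like this:

```cpp
// Disable the digital input buffers on all six ADC pins (keeps the digital
// stage from loading the analog inputs) and turn the pull-ups off.
void setupAnalogPins(void) {
  DIDR0 = _BV(ADC5D) | _BV(ADC4D) | _BV(ADC3D)
        | _BV(ADC2D) | _BV(ADC1D) | _BV(ADC0D);
  PORTC = 0;  // ADC0-ADC5 live on PORTC; clearing it disables the pull-ups
}
```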

I may go to a lower-impedance voltage divider to lower the error, but it would still need to be compensated for. A 10K input impedance is pushing the power dissipation limits of 1/4 watt resistors (50V across 10K is 0.25 watts), so 20K is as low as I would like to go. Since one of the dividers is connected directly across the battery input, I’d like to keep the impedance high so that it would not drain a cell as fast if one accidentally left a battery connected for an extended period of time. 20K would draw around 5 mAh a day from a single lithium cell (4.2V / 20K ≈ 210 µA).

Very strange. Personally, I guess I’d nix the ATMEGA and replace it with, oh, I dunno, maybe a PIC24? That’s actually one of my go-to chips when I need reasonable speed and good analog functionality.

I’m too set up for doing AVR’s at the moment to change…

If you find yourself with some time and an AVR, try reading an ADC channel. Drive it with an AA/AAA cell (or, even better, something below 0.5V). Take readings through 0, 10K, and 100K ohm series resistors and see what happens…
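A bare-bones Arduino sketch for that experiment might look like this (assuming the source feeds A0 and the series resistor is swapped by hand between runs):

```cpp
// Print raw counts and volts twice a second; compare the readings as the
// series resistance between the source and A0 changes.
void setup() {
  Serial.begin(9600);
  analogReference(DEFAULT);  // Vcc (5V) reference
}

void loop() {
  int raw = analogRead(A0);
  float volts = raw * (5.0 / 1023.0);
  Serial.print(raw);
  Serial.print(" counts = ");
  Serial.print(volts, 3);
  Serial.println(" V");
  delay(500);
}
```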

I just ordered a 16” sphere from Barnards to go with my LED analyzer project. The price is quite reasonable (go up a couple of inches and it’s not quite so reasonable). http://www.barnardltd.com/product.jsp?prodId=1878&catId=950 There is no way you could justify making your own sphere at that price.

If it works out, I’ll probably get a 24” one.

Actually, it may be able to! I’m looking at supporting the TAOS color sensor chips. They can measure the light output in the red, green, and blue bands. With that data, one could probably cobble together a color temperature value. No telling how accurate it would be, though.
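If anyone wants to experiment, a common approach is to convert the RGB readings to XYZ chromaticity and run McCamy’s approximation. This is only a sketch: the RGB-to-XYZ matrix below is a generic sRGB one, and a real build would need coefficients fitted to the sensor’s actual spectral response:

```cpp
// Very rough correlated color temperature (CCT) estimate from RGB readings.
float estimateCCT(float r, float g, float b) {
  // Generic sRGB -> XYZ matrix (placeholder for sensor-specific values).
  float X = 0.4124f * r + 0.3576f * g + 0.1805f * b;
  float Y = 0.2126f * r + 0.7152f * g + 0.0722f * b;
  float Z = 0.0193f * r + 0.1192f * g + 0.9505f * b;

  float x = X / (X + Y + Z);  // chromaticity coordinates
  float y = Y / (X + Y + Z);

  // McCamy's approximation.
  float n = (x - 0.3320f) / (0.1858f - y);
  return 449.0f * n * n * n + 3525.0f * n * n + 6823.3f * n + 5520.33f;
}
```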

I would prefer having the sensor parts as an open-source hardware Arduino shield.
That might encourage this community to write fantastic applications for battery/flashlight analysis.

I would also like to bring up the topic of absolute and relative accuracy.
What is good enough? Personally, I am really not interested in extreme accuracy or resolution.
Is it really necessary to have millivolt resolution?

ATMEGA328 with an external Vref is fine with me.

Using an Arduino basically adds $30 to the cost. Forget it. I am planning on using a ’328 CPU with the Arduino bootloader in it. Think of it as an Arduino plus shield on one board. You should still be able to program it using that horrid piece ’o crap Arduino development environment.

I understand the cost impact, but a Chinese UNO at DX is around 17 USD, which is a fairly good price.
Making an “Arduino shield” would also attract others beyond just us flashaholics.
It would be so versatile and could be used by many others in the Arduino community.

You get what you pay for…

This board would be the equal of 8 logging DVMs, two logging lux meters, etc., etc., etc. Figure at least $1000 to do it using off-the-shelf instruments, plus over 30,000 lines of code in the host app. The target cost here is in the $100 range…

I ordered a 16” sphere from Barnards yesterday (http://www.barnardltd.com/product.jsp?prodId=1878&catId=950). The owner called me back a couple of times to confirm some shipping address details, and when I returned her calls, she answered the phone within the first ring or two. So far, their customer service can’t be beat. The sphere is going out today.

Supposedly some light and current sensor boards have arrived from China…

I got in the light sensor boards from Old Cathay. They spent two weeks being shuttled between USPS sort facilities in San Francisco.

They seem to work really well! I hooked one up and my code worked the first time (damn, I’m good :party: ). The readings agree very well (within 10) with my lux meters (which only agree with each other within 20).

With the default gain on the chip, it reads up to around 54,000 lux and down to 0.8 lux. By changing one register it can be made anywhere from 9 times more sensitive down to around half as sensitive.
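Since the exact chip isn’t named here, only a generic sketch: however the gain register is set, the host code just folds that sensitivity factor into the counts-to-lux conversion (both constants below are placeholders, not values from the actual part):

```cpp
// Generic counts-to-lux conversion with a gain/sensitivity factor.
const float LUX_PER_COUNT = 0.83f;  // placeholder full-scale calibration
float sensitivity = 1.0f;           // 0.5 .. 9.0, tracks the gain register

float countsToLux(unsigned int raw) {
  return raw * LUX_PER_COUNT / sensitivity;
}
```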

I also got in some 20A and 30A current sensor boards. I haven’t played with those yet…

Turns out that I got 5A and 20A sensor boards; the 30A ones are on their way from another seller in Old Cathay. I hooked up one of the 20A units and it seems to work pretty well. I am getting milliamp resolution and well under 1% absolute error. There was a 30 mA offset in the reading at the zero-amp point, which was easily removed by the calibration code.
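Removing that kind of zero offset is straightforward: average a batch of no-load readings at startup and subtract the result from everything after. A sketch, assuming an ACS7xx-style hall sensor that idles at Vcc/2 (the sensitivity constant is a made-up example):

```cpp
const float MV_PER_AMP = 100.0;  // placeholder, depends on the sensor model

// Raw reading: mV above/below the 2.5V zero-current point, scaled to mA.
long readCurrent_mA_raw(void) {
  float mV = analogRead(A0) * (5000.0 / 1023.0);
  return (long)((mV - 2500.0) * 1000.0 / MV_PER_AMP);
}

long zeroOffset_mA = 0;

// Average a batch of no-load readings and remember the offset
// (e.g. the ~30 mA error seen at the zero-amp point).
void calibrateZero(int samples) {
  long sum = 0;
  for (int i = 0; i < samples; i++) {
    sum += readCurrent_mA_raw();
    delay(2);
  }
  zeroOffset_mA = sum / samples;
}

long readCurrent_mA(void) {
  return readCurrent_mA_raw() - zeroOffset_mA;
}
```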

I haven’t decided which current sensors to use… I’m leaning towards the +/- 50 amp ones. They have around 100 micro-ohms of burden resistance (probably 1000 times lower than most DMMs), so they should introduce no funky effects when measuring LED or tailcap current.