Any interest in an LED/Battery analyzer device?

That’s really weird. I’ve had issues with AVR ADCs, but never anything that couldn’t be fixed by slowing them down. This sounds more like an Ohm’s law problem (AKA voltage drop across the resistor). Do you have one of the other peripherals turned on (digital input, maybe?) that could be leaking some current to ground and causing a delta V across your input resistor (i.e., acting as a voltage divider)? Even 1 µA of leakage through a 100K source resistor would show up as 100 mV of error.

Nope, DIDR reg has the digital inputs disabled and the internal pullups are disabled. I measured the input leakage current on the ADC pins and it is in the low nanoamp range.

I may go to a lower impedance voltage divider to lower the error, but it still needs to be compensated for. A 10K input impedance is pushing the power dissipation limits of 1/4 watt resistors, so 20K would be as low as I would like to go. Since one of the dividers is connected directly across the battery input, I’d like to keep the impedance high so that it would not drain a cell as fast if one accidentally left a battery connected for an extended period of time. 20K would draw around 5 mAh a day from a single lithium cell.
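Rough math behind those numbers, for anyone checking me (the 4.2 V cell voltage and the 50 V worst-case input are just assumptions for the example, not the actual design limits):

```
// Back-of-the-envelope divider math.  The 4.2 V cell voltage and 50 V
// worst-case input are assumptions for the example, not the real design.
#include <cstdio>

int main() {
    const double r_div  = 20e3;   // total divider resistance, ohms
    const double v_cell = 4.2;    // fully charged Li-ion cell, volts (assumed)
    const double v_max  = 50.0;   // worst-case input voltage, volts (assumed)

    double i_drain = v_cell / r_div;            // continuous drain, amps
    double mah_day = i_drain * 1000.0 * 24.0;   // mAh pulled per day
    double p_diss  = (v_max * v_max) / r_div;   // dissipation in the whole divider, watts

    printf("Drain on one cell: %.2f mA (%.1f mAh/day)\n", i_drain * 1000.0, mah_day);
    printf("Worst-case dissipation: %.3f W\n", p_diss);
    return 0;
}
```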

Very strange. Personally, I guess I’d nix the ATMEGA and replace it with, oh, I dunno, maybe a PIC24? That’s actually one of my go-to chips when I need reasonable speed and good analog functionality.

I’m too set up for doing AVR’s at the moment to change…

If you find yourself with some time and an AVR, try reading an ADC channel. Drive it with an AA/AAA cell (or, even better, something below 0.5V). Take readings through 0, 10K, and 100K resistors and see what happens…
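Something like this throwaway Arduino sketch is all it takes to see it (A0 and the two-second pace are just my choices; the double read is the usual trick people suggest for high source impedance):

```
// Quick-and-dirty AVR ADC source impedance test (Arduino-style, ATmega328).
// Feed an AA/AAA cell into A0 through 0 ohm, 10K, and 100K in turn and
// compare what comes back.  The double read is the common workaround for
// high source impedance: the first conversion charges the S/H cap,
// the second is the one you keep.
const int ADC_PIN = A0;

void setup() {
  Serial.begin(9600);
  analogReference(DEFAULT);   // Vcc reference; use EXTERNAL if you have a real Vref wired up
}

void loop() {
  int first  = analogRead(ADC_PIN);   // throwaway read to charge the sample/hold cap
  int second = analogRead(ADC_PIN);   // the reading to compare against your DMM
  Serial.print("first=");
  Serial.print(first);
  Serial.print("  second=");
  Serial.println(second);
  delay(2000);
}
```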

I just ordered a 16” sphere from Barnards to go with my LED analyzer project. The price is quite reasonable (go up a couple of inches and it’s not quite so). http://www.barnardltd.com/product.jsp?prodId=1878&catId=950 No way you could justify making your own sphere at that price.

If it works out, I’ll probably get a 24” one.

Actually, it may be able to do so! I’m looking at supporting the Taos color sensor chips. They can measure the light output in the red, green, and blue bands. With that data, one could probably cobble together a color temperature value. No telling how accurate, though.
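If it comes to that, the usual back-of-the-napkin approach is to map the RGB counts to CIE XYZ with a sensor-specific matrix and then run McCamy’s approximation on the xy chromaticity. A rough sketch of the idea (the matrix below is a generic sRGB-ish placeholder, not the real calibration for the Taos part):

```
#include <cmath>
#include <cstdio>

// Rough RGB -> correlated color temperature estimate.
// The RGB-to-XYZ matrix below is a generic sRGB-ish placeholder; the real
// Taos chip would need its own calibration matrix (from the datasheet or
// from measurements against known sources).
double estimateCCT(double r, double g, double b) {
    // Placeholder RGB -> XYZ conversion (sRGB primaries, D65)
    double X = 0.4124 * r + 0.3576 * g + 0.1805 * b;
    double Y = 0.2126 * r + 0.7152 * g + 0.0722 * b;
    double Z = 0.0193 * r + 0.1192 * g + 0.9505 * b;

    double sum = X + Y + Z;
    if (sum <= 0.0) return 0.0;
    double x = X / sum;
    double y = Y / sum;

    // McCamy's approximation for CCT from the xy chromaticity
    double n = (x - 0.3320) / (0.1858 - y);
    return 449.0 * n * n * n + 3525.0 * n * n + 6823.3 * n + 5520.33;
}

int main() {
    // Example values only
    printf("CCT estimate: %.0f K\n", estimateCCT(0.8, 0.9, 1.0));
    return 0;
}
```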

I would prefer having the sensor parts as an open source HW Arduino shield.
Maybe that would encourage this community to write fantastic applications for battery/flashlight analysis?

I would also like to bring up the topic of absolute and relative accuracy.
What is good enough? Personally I am really not interested in extreme accuracy or resolution.
Is it really necessary to have millivolt resolution?

ATMEGA328 with an external Vref is fine with me.

Using an Arduino basically adds $30 to the cost. Forget it. I am planning on using a ’328 CPU with the Arduino bootloader in it. Think of it as an Arduino plus shield on one board. You should still be able to program it using that horrid piece ’o crap Arduino development environment.

I understand the cost impact, but a Chinese UNO at DX is around 17 USD, which is a fairly good price.
Making an “Arduino shield” would also attract others besides just us flashaholics.
It would be versatile and could be used by many others in the Arduino community.

You get what you pay for…

This board would be the equal of 8 logging DVMs, two logging lux meters, etc, etc, etc. Figure at least $1000 to do it using off-the-shelf instruments. Plus over 30,000 lines of code in the host app. Target cost here is in the $100 range…

I ordered a 16” sphere from Barnards yesterday (http://www.barnardltd.com/product.jsp?prodId=1878&catId=950). The owner called me back a couple of times to confirm some shipping address details, and when I returned their calls she answered the phone within the first ring or two. So far, their customer service can’t be beat. The sphere is going out today.

Supposedly some light and current sensor boards have arrived from China…

I got in the light sensor boards from Old Cathay. They spent two weeks being shuttled between USPS sort facilities in San Francisco.

They seem to work really well! Hooked one up and my code worked the first time (damn, I’m good :party: ) The readings agree very well (within 10) with my lux meters (which only agree with each other within 20).

With the default gain on the chip, it goes up to around 54,000 lux and down to 0.8 lux. By changing one register it can be made anywhere from up to 9 times more sensitive down to around half as sensitive.
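For anyone who wants to poke at that themselves, it’s just a one-register I2C write. Something like this (the address and register below are placeholders since I’m not naming the exact chip here; pull the real values from the datasheet for the board you have):

```
#include <Wire.h>

// Bump the light sensor's gain/integration setting over I2C.
// SENSOR_ADDR and GAIN_REG are PLACEHOLDERS (the exact chip isn't named
// here), so get the real address, register, and values from the datasheet
// for the board you have.
const uint8_t SENSOR_ADDR = 0x29;   // placeholder I2C address
const uint8_t GAIN_REG    = 0x01;   // placeholder gain/timing register

void setSensitivity(uint8_t value) {
  Wire.beginTransmission(SENSOR_ADDR);
  Wire.write(GAIN_REG);
  Wire.write(value);                // e.g. a high-gain vs. low-gain setting
  Wire.endTransmission();
}

void setup() {
  Wire.begin();
  setSensitivity(0x02);             // placeholder "more sensitive" value
}

void loop() {}
```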

I also got in some 20A and 30A current sensor boards. I haven’t played with those yet…

Turns out that I got 5A and 20A sensor boards. The 30A ones are on their way from another seller in Old Cathay. I hooked up one of the 20A units and it seems to work pretty well. I am getting milliamp resolution and well under 1% absolute accuracy. There was a 30 mA offset in the reading at the zero amp point, which was easily removed by the calibration code.
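The zero-point calibration is about as simple as it gets: average a pile of readings with nothing connected, store that as the offset, and subtract it from everything afterwards. Roughly like this (the pin and the mV-per-amp scale are placeholders for whatever board you’re using):

```
// Zero-current offset calibration for a hall-effect current sensor board.
// Pin and scale factor are placeholders; many 20A-class boards run around
// 100 mV per amp on a 5 V supply, but check the module you actually have.
const int   SENSE_PIN  = A1;      // placeholder analog input
const float VREF       = 5.0;     // ADC reference, volts
const float MV_PER_AMP = 100.0;   // placeholder sensitivity

float zeroOffsetVolts = 0.0;

float readSensorVolts() {
  return analogRead(SENSE_PIN) * VREF / 1023.0;
}

void calibrateZero() {
  // Average many samples with NO current flowing to find the offset.
  float sum = 0.0;
  for (int i = 0; i < 200; i++) {
    sum += readSensorVolts();
    delay(2);
  }
  zeroOffsetVolts = sum / 200.0;
}

float readAmps() {
  return (readSensorVolts() - zeroOffsetVolts) * 1000.0 / MV_PER_AMP;
}

void setup() {
  Serial.begin(9600);
  calibrateZero();                // run this with the load disconnected
}

void loop() {
  // Single 10-bit reads are coarser than a milliamp; oversample/average
  // to approach the resolution mentioned above.
  Serial.println(readAmps(), 3);
  delay(500);
}
```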

I haven’t decided which current sensors to use… I’m leaning towards the +/- 50 amp ones. They have around 100 micro ohms of burden resistance (probably 1000 times lower than most DMMs). Even at a full 50 amps that is only 5 mV of drop, so it should introduce no funky effects when measuring LED or tailcap current.

I was playing with plotting the light sensor readings. I have the unit sitting on the floor of my kitchen. It was dark outside and I turned on some kitchen lights (12 Sylvania PAR20/10W 95CRI 550 lumen bulbs). The reading started out at 188 lux. Over 30 minutes the reading dropped to 180 lux as the bulbs warmed up.

It also makes for a good motion detector. You can’t move anywhere near it without seeing the lux readings change, even if you keep your shadow from falling on it (a shadow falling on it causes huge dips in the readings).

You can also use it as a wind detector. You see interesting changes in the readings during the day as wind blows the trees around.

Near sunset the light readings were falling until the sun got near the horizon and apparently started shining in the rear windows. Then there was a big bump in the incoming light.

The sensor readings are very stable and repeatable. There does not seem to be any random noise in the readings. With the sensor covered you get the expected 0 lux, but with just about any very dim light it shows 0.866 lux (assuming it is programmed with the default sensitivity values). You can fiddle with shading it and see readings down to 0.216 lux (which appears to be the minimum quantization step). You can program the chip for more or less sensitivity.

Texaspyro,

I've been watching/reading your progress so far with great interest. Do you have any pictures of the prototype unit so far?

Thanks for the updates and efforts on this project!

-Match

Nope, for now it is just a bunch of wires and small Chinese sensor boards hanging off an LCD touchscreen based microcontroller board that I am using for development.

I was going to do some tests on a couple of 6W Chinese PAR16 and PAR20 bulbs. They use 3 x 2W Cree LEDs. I hooked the thermocouple to the heat sink and, when I switched the bulb on, got a very bogus temperature reading (80 degrees C when the bulb was actually at 25 C). Same with the thermocouple electrically insulated from the heat sink. Looks like those bulbs are throwing off a lot of electrical noise that is corrupting the temperature readings. I am going to try some more robust insulation…

No problems measuring my Sylvania/Philips/LSG bulbs.

Well, don’t buy the 16” sphere from Barnards unless you like shaping styrofoam… the inside is not spherical! There are 6” diameter flat spots at the top of each half where the wall was made thicker. A little quick measuring shows you would need to remove about 0.56” of material at the peak if you were to sand the inside to a true spherical shape.
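If you want to check that figure against your own sphere, it’s just the sagitta of a spherical cap: depth = r - sqrt(r^2 - a^2), where r is the inside radius and a is half the width of the flat. A throwaway calculator (the radius below is only an example; plug in what you actually measure, since the answer moves a lot with the wall thickness):

```
#include <cmath>
#include <cstdio>

// Sagitta of a spherical cap: how deep a flat of half-width 'a'
// cuts into a sphere of inside radius 'r'.
double flatDepth(double r, double a) {
    return r - std::sqrt(r * r - a * a);
}

int main() {
    double insideRadius  = 7.0;   // inches - example only, measure your sphere
    double flatHalfWidth = 3.0;   // inches (6" diameter flat spot)
    printf("Depth to sand off: %.2f in\n", flatDepth(insideRadius, flatHalfWidth));
    return 0;
}
```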

Also, the sphere is actually 15.625” in outside diameter.

That's not good news. I wonder if all their sizes exhibit this trait?

Probably… I found a post on CPF that said that their 8” sphere had flats. And another that used their 24” sphere but did not mention any flats.

Plasteel 24” spheres are known to be OK… http://www.smoothfoam.com/foamshapes/smoothfoam_styrofoam/10073-1.html