Budget Friendly Voltage Reference (In search of)

Those $6 voltage references without 15V batteries will work with a lower voltage too, as long as you keep the output lower than the input. So just use 5V as the calibration voltage, and 2x18650 in series as the source would work fine. (You know you have a bunch of those, don’t deny it.) :slight_smile:

You do NEED batteries as input though. Don’t use a 15V AC adapter or anything like that; the regular ones are way too noisy for calibration use.

I ordered some LM336 reference chips last month from eBay, the 2.5V version; they haven’t arrived yet. Should be easy enough to build, but the price is so cheap I don’t know how accurate they will be. They were $0.99 for 10pcs.

EDIT: Add link to the 99c chip I bought.
http://www.ebay.com/itm/401024912217?_trksid=p2057872

Yeah, it would work, but the question is whether the input voltage and different power sources make a difference in the output… and with that, the whole calibration has yet another unknown value in it. That’s why all conditions are measured and written down on the certificates: to recreate the same conditions, like temperature, at home, or at least to know that there is probably some temperature drift.
Of course everything works, but it is hard to tell how well. For comparing several cheap multimeters, so that you can see that each one measures differently, these things will be fine, but the question is how precise they are in total. The only way to find out is to measure against a known, properly working meter and write the values down. They also have to be reasonably stable over time, so that if you measure again next year they show almost the same values.

The LM336 is not as precise as the other references mentioned earlier in this thread. It can be anywhere between 2.44V and 2.54V, judging from the second page of the datasheet.
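A quick sketch of what that 2.44V–2.54V spread means as a percentage, assuming a 2.5V nominal value (the exact nominal and limits should be checked against the LM336 datasheet):

```python
# Worst-case initial tolerance of an LM336-2.5, using the
# 2.44 V .. 2.54 V limits quoted above and a 2.5 V nominal.
NOMINAL = 2.5
V_MIN, V_MAX = 2.44, 2.54

tol_pct = max(NOMINAL - V_MIN, V_MAX - NOMINAL) / NOMINAL * 100
print(f"worst-case initial tolerance: +/-{tol_pct:.1f}%")  # +/-2.4%
```

So without individually measuring each chip, you only know it to a couple of percent, which is worse than many cheap DMMs already claim.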

I’ve re-read the original thread on EEVBlog, and it seems the module uses a single 4.2V Li-ion, not 15V.

A few of them swapped in a regular Samsung phone battery. There’s even a mod to charge it using a cheap USB module.

And even from a 4.2V input, the reference has full outputs from 2.5V up to 10V. EEVblog tests still show consistent output voltage regardless of battery charge state. And those guys don’t use cheap DMMs.

Very interesting.

Yes, it has a boost regulator and a normal LiIon charge chip.

The heat from the charge chip will affect the output voltage.

The jump in voltage happens because the charge chip starts charging and heats the reference; it is about 0.05mV on the 10 volt output.

This whole concept wades right into the deep end of what I call “The Third-Party Fallacy” — the belief that any 3rd party will make better decisions than you yourself* will.

Given the purpose of the DMM, “wouldn’t the manufacturer’s QC check verify proper calibration”??

How about a little DIY physics experiment ?

(Full disclosure, I just got one of those Harbor Fright Free DMMs & was, just this moment, working out just this question!)

I found this article to be interesting, if not ultimately “helpful”.

As for me, I tend to eschew “Absolute Precision” since I seldom work with absolutes (not a Rocket Scientist, me!). I’d suggest factoring in a little “trust” (“Initial Plausibility”), and measure a nominal “standard” like the power-supply connections in a “standard” PC. At the end of the day, even NIST or ASTM calibration requires a level of “trust” of an interested 3rd party…

*
A man should learn to detect and watch that gleam of light which flashes across his mind from within, more than the lustre of the firmament of bards and sages. Yet he dismisses without notice his thought, because it is his. In every work of genius we recognize our own rejected thoughts: they come back to us with a certain alienated majesty.

― Ralph Waldo Emerson, Self-Reliance

Proper calibration with a reference chip is not maximal precision; these chips do not have an internal calibration that can be adjusted.

When manufacturers want to make something very precise, they do not worry much about the initial value, but much more about how it changes over time and temperature. This is because it is easy to compensate for the initial error in any equipment (except the cheapest stuff). Today this type of calibration is done with the computer inside the equipment: you just connect it to a known voltage/value and tell the computer inside the equipment to save a calibration factor.
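The "save a calibration factor" step described above can be sketched very simply. All names here are illustrative, not any instrument's real firmware API:

```python
# Sketch of software calibration: connect the instrument to a known
# reference, store a correction factor, and apply it to every later
# reading. Function names are hypothetical.

def compute_cal_factor(known_voltage, raw_reading):
    """Factor that maps the instrument's raw reading onto the known value."""
    return known_voltage / raw_reading

def corrected(raw_reading, cal_factor):
    """Apply the stored factor to a raw reading."""
    return raw_reading * cal_factor

# Calibration step: the meter reads 4.981 V on a known 5.000 V source.
factor = compute_cal_factor(5.000, 4.981)

# Every subsequent measurement is scaled by the stored factor.
print(f"{corrected(4.981, factor):.3f}")  # 5.000
```

This is why the initial error of the ADC matters less than its drift: a single stored number corrects the initial error, but nothing corrects a value that wanders with time and temperature.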

I have a few of that reference type; the worst was 0.4mV off compared to the 7510.

Dimbo the Blinky wrote:

ImA4Wheelr wrote:
EDIT: Regarding the uncalibrated chip: Given the purpose of the chip, wouldn’t the manufacturer’s QC check verify proper calibration?

This whole concept wades right into the deep end of what I call “The Third-Party Fallacy” — the belief that any 3rd party will make better decisions than you yourself* will.

Given the purpose of the DMM, “wouldn’t the manufacturer’s QC check verify proper calibration”?? . . .

Interesting stuff. I may need to try the exercise someday. Thanks.

I should have referenced the chip I was referring to, which is the Texas Instruments REF5040 chip mentioned in the OP. The only purpose of the chip is to output a high precision reference voltage of 4.096 volts. It is supposed to have low temperature drift. So for my simple, non-critical needs, it seems like a great way to check and calibrate my meters.

“Measure with Micrometer, Mark with Chalk, Cut with Axe.”

That PDF claims Accuracy = +/- 0.05%. My Fluke 77-IV is only capable of +/- 0.3%, so, for me, that would be a bit of overkill.

OTOH, “overkill” is still “dead” (as in “dead-on balls accurate” … “It’s an industry term” ~~Mona Lisa Vitto), so that should work very well for you.

But how would you know if the chip was accurate?

(EDIT: for the record, the Harbor Fright “7 Function Digital Multimeter”, a Centech Model 90899, only claims +/- 0.5% accuracy. That means your chip could be off by a whole order of magnitude and the HF meter likely wouldn’t notice, the Fluke (still ~an order of magnitude less accurate than your chip) maybe slightly…
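Putting the three accuracy figures quoted in this thread side by side at the REF5040's 4.096 V output makes the gap concrete. This is a simplification: it uses only the percent-of-reading term and ignores the range/digit terms real DMM specs add, so it understates actual meter error:

```python
# Rough error bands at a 4.096 V reading for the accuracies quoted
# in this thread (percent of reading only; digit terms ignored).
V = 4.096
specs = [("REF5040 chip", 0.05), ("Fluke 77-IV", 0.3),
         ("HF Cen-Tech 90899", 0.5)]

for name, pct in specs:
    band = V * pct / 100  # half-width of the error band, in volts
    print(f"{name}: {V} V +/- {band * 1000:.1f} mV")
```

Roughly 2 mV for the chip versus about 12 mV and 20 mV for the meters, which is the "order of magnitude" being discussed.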

Having said that, when you build your Reference, I’d love to help you “test it” on these…)

When I was a calibration tech years ago (Army 35H) we had to maintain at least a 4:1 ratio of accuracy between the standard and the unit under test. A standard at +/- 0.05% would have been acceptable for a +/- 0.3% UUT and definitely not overkill.
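The 4:1 rule of thumb above is easy to check for the numbers in this thread:

```python
# 4:1 test accuracy ratio (TAR): the standard's spec should be at
# least 4x tighter than the unit under test's spec.
def tar_ok(standard_pct, uut_pct, ratio=4.0):
    return uut_pct / standard_pct >= ratio

print(tar_ok(0.05, 0.3))   # True  (6:1, acceptable)
print(tar_ok(0.05, 0.5))   # True  (10:1, the Harbor Freight meter)
print(tar_ok(0.3, 0.5))    # False (only ~1.7:1)
```

So by the old mil-cal standard, a 0.05% reference checking a 0.3% Fluke is not overkill at all; it barely clears the minimum ratio with room to spare.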

Back then this was the dc voltage standard we used:
Fluke 332D

Still trying to find the higher current standard but for some lower dc current meters:
Fluke 382A

disclaimer: random links found via google and I am not affiliated with either party.

Not to keep harping on it, but that’s still ~an order of magnitude difference — at least “a greater capacity than necessary” — which seems significant…

And you’d be correct to surmise that I didn’t consider that ImA4Wheelr had human lives on the line as you did.

Just trying to point out that there’s still an awful lot of faith involved.

This is a very interesting thread!

Your comment just now reminds me of an electronics shop class I took in college. Back then we only had analog meters and one day the instructor asked “When you hook up an ammeter into an electrical circuit to measure current, just what do you think you are measuring?”

If you start to think about it, that question should just about blow your mind! After all, the amp, the unit of electrical current, is really charge flow (an enormous number of electrons per second) and an ammeter certainly isn’t counting THOSE.
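Just how enormous that electron count is follows directly from the definition of the ampere (one coulomb per second) and the elementary charge:

```python
# One ampere is one coulomb per second; dividing by the elementary
# charge gives the electron count an ammeter would have to tally
# if it really were "counting electrons".
ELEMENTARY_CHARGE = 1.602176634e-19  # coulombs per electron (exact, SI 2019)

electrons_per_second = 1.0 / ELEMENTARY_CHARGE
print(f"~{electrons_per_second:.3e} electrons/s at 1 A")
```

About 6.24 x 10^18 electrons every second for a single amp, which is why every practical ammeter measures an effect of the current (a voltage drop, a magnetic deflection) rather than the thing itself.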

Here is a very irreverent review by Dave from EEVblog of a Harbor Freight meter that someone sent him to review. I indexed the video to start at the measurement tests. In my opinion his reaction is a mix of surprise and disdain.

If I did this correctly, this YouTube video should start at 37:30. If it doesn’t, that is where I intended it to start. In my opinion it’s a must watch (for about a minute).

Are you sure it’s not just a Very Short Individual sitting in there counting out electrons?? Really?? I could’ve sworn I read that on the Internet somewhere!

:smiley:

I am reading and studying this thread because I would love to build and own a reference source.
But as you alluded to in your post, we are so far removed from the reality of it when we try to measure an electrical quantity, that FAITH always has to enter the picture.

And, an analog ammeter doesn’t measure magnetic flux density either!

Here is maybe an ignorant question. If a highly accurate voltage reference is so easy to come by, why don’t the manufacturers include one in their DMMs and use it to self-calibrate, say, every time you turn it on? If that were possible, how much extra could it cost? I’m thinking next to nothing once the silicon is done.

I have been out of the field for quite a while but back then calibration was done by checking cardinal points on all ranges, commonly 20, 40, 60, 80, and 100%. Using one reference voltage and calling a meter calibrated is making a big assumption that the observed accuracy carries over to all voltages and range settings.
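The cardinal-point scheme described above is simple to lay out for any range; here is a sketch for a hypothetical 20 V DC range:

```python
# Cardinal calibration points: check each range at 20/40/60/80/100%
# of full scale, as described above. The 20 V range is an example.
def cardinal_points(full_scale, fractions=(0.2, 0.4, 0.6, 0.8, 1.0)):
    return [full_scale * f for f in fractions]

print(cardinal_points(20.0))  # 20/40/60/80/100% of 20 V
```

A single 4.096 V reference only exercises one point on one range, which is exactly the assumption being flagged: it tells you nothing directly about the other cardinal points or the other ranges.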

No one here is calling their meter calibrated after using a reference, but it certainly raises the level of possible precision to know, and get a sense of, how much your DMM is off from that reference.

I was thinking similarly; I’ve known analog meters to have scale errors. And even if voltage were calibrated perfectly all across the scale, would that carry over to current and resistance readings?

Plus how accurate do we need to be here? I can chop off a log for a cabin square enough with a chainsaw and my eyes, but that obviously wouldn’t work for making furniture. Perfection is nice but do we need accuracy beyond 1/1000th on voltage (or any) readings? I’m NOT wanting to start an argument over accuracy, just thinking aloud.

Phil

That is usually not necessary with digital meters; you only need a zero calibration and a near-full-scale calibration.

The construction of the ADC usually guarantees that it is linear, and if it isn’t, there is nothing you can do.
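The two-point (zero plus near-full-scale) calibration mentioned above amounts to a simple offset-and-gain correction, assuming the converter is linear in between. All names and readings here are illustrative:

```python
# Two-point calibration: one reading with the input zeroed and one
# near full scale fix the offset and gain; linearity covers the rest.
def two_point_cal(raw_zero, raw_full, true_full):
    """Return (offset, gain) from a zero reading and a full-scale reading."""
    offset = raw_zero
    gain = true_full / (raw_full - raw_zero)
    return offset, gain

def apply_cal(raw, offset, gain):
    return (raw - offset) * gain

# Example: meter reads 0.003 with input shorted, 19.95 on a true 20.00 V.
off, g = two_point_cal(0.003, 19.95, 20.00)
print(f"{apply_cal(19.95, off, g):.2f}")  # 20.00
```

Any reading between the two calibration points is then corrected by the same line, which is why a linear ADC only needs those two points.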

Yes, with very few exceptions you need to check all ranges, but knowing the most-used voltage range is correct gives some confidence in the meter (cheap meters will often give wrong readings when the battery is low).

Thanks for all the advice folks. Some of it is deeper than I understand at the moment, but I will revisit all of it when I have time to really dig into the concept.

I can get the REF5040s for $3.05 each, plus $7 shipping, directly from TI. That is more than I want to spend, so I ordered 3 for $4.53 off Ali. I'm probably being penny wise but pound foolish. I won't know if these chips are any good, but I should be able to get a general idea by comparing them to each other and to other common references.

Based on the datasheet, it seems that powering them with a single Li-ion is the way to go. Ideally a 4.3V or 4.35V cell.
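A quick headroom check shows why a 4.3V or 4.35V cell is preferable to a standard 4.2V one. This assumes the reference needs roughly 0.2 V of supply above its output; the exact dropout figure should be verified against the REF5040 datasheet:

```python
# Headroom check for a 4.096 V reference from a single Li-ion cell,
# assuming ~0.2 V minimum supply headroom (verify in the datasheet).
V_REF = 4.096
HEADROOM = 0.2  # assumed minimum (supply - output), in volts

def enough_headroom(v_supply):
    return v_supply >= V_REF + HEADROOM

for v in (4.2, 4.35, 5.0):
    print(f"{v} V supply: {'OK' if enough_headroom(v) else 'marginal/low'}")
```

Under this assumption even a freshly charged 4.2V cell is marginal at the top and falls out of regulation as it discharges, while a 4.35V cell (or a boost regulator, as in the $6 modules) keeps the reference in its operating region longer.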

EDIT: I received the chips, but couldn't find any 4.3V or 4.35V cells in my collection. So I connected them directly (in parallel) to a couple of different 5V power supplies and got weird results: they all output 2.738 volts. Rereading the datasheet, it appears the chip needs various caps to stabilize the output. I need to rig up something like below. I don't know anything about high-frequency ("HF") capacitors. At least the datasheet seems to indicate that the HF cap is optional. If it works, I'll finalize the PCB below.

I have been following this thread with some interest as well.