Budget-Friendly Voltage Reference (In Search Of)

I found these on eBay:

http://www.ebay.com/itm/High-Precision-Voltage-Reference-Module-AD584kH-4-Channel-2-5V-7-5V-5V-10V-/351423344222?hash=item51d276ae5e:g:8YYAAOSwpDdVePQW $6.55
How does one connect a power source to this one?

KKMOON Ad584-m Voltage Reference Module 4-channel Single-button Operation YC for sale online | eBay $15.91
This one comes with a built-in cell, which is very convenient, but I am not sure it is worth two times more than the bottom one.

http://www.ebay.com/itm/High-Precision-Voltage-Reference-Module-AD584kH-4-Channel-2-5V-7-5V-5V-10V-/281866937606?hash=item41a0942506:g:gNgAAOSwcBhWVWiZ $5.96
Does anyone know what kind of cell we should be using with this one?

Are these the ones that should be decent?

For $6 I will gladly upgrade the precision of my cheap DMMs and the various voltmeters I use around the house :slight_smile:
Thanks, guys, for this thread; this is a great idea. I thought you had to spend thousands of dollars to have any hope of decent voltage precision, but now I know better.

OK, gotcha. Deleted that post because you were right: it did not include the cell.

So I have to decide between the convenience of a built-in lithium cell or “modding” a 15V 10F20 out of 5× CR2032 cells, then. Thanks for the tip, gauss163 :slight_smile:

This thread makes me wish I was still working in a cal lab.

Those $6 voltage references without 15V batteries will work with a lower voltage too, as long as you keep the output lower than the input. So just use 5V as the calibration voltage, and 2×18650 in series as the source would work fine. (You know you have a bunch of those, don’t deny it.) :slight_smile:

You do NEED batteries as the input though; don’t use a 15V AC adapter or anything like that, since the regular ones are way too noisy for calibration use.
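If you want to sanity-check the headroom before wiring cells up, something like this works. Rough sketch only; the ~2.5 V minimum input-output differential is my assumption about the AD584, so verify it in the datasheet for your module:

```python
# Headroom check for a bare AD584-style reference fed straight from cells.
# ASSUMPTION: the chip wants Vin >= Vout + ~2.5 V (verify in the datasheet).
# Boards with their own boost converter don't have this constraint.

HEADROOM_V = 2.5  # assumed minimum input-output differential

def vin_required(v_out: float) -> float:
    """Minimum supply voltage for a given reference output."""
    return v_out + HEADROOM_V

for v_out in (2.5, 5.0, 7.5, 10.0):
    print(f"{v_out:4.1f} V out needs >= {vin_required(v_out):4.1f} V in")

# 2x18650 in series is 8.4 V full down to ~6.0 V near empty, so the 5 V
# output is fine on a fresh charge but gets marginal as the cells sag.
```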

I ordered some LM336 reference chips from eBay last month, the 2.5V version; they haven’t arrived yet. Should be easy enough to build, but the price is suspiciously cheap, so I don’t know how accurate they will be. They were $0.99 for 10 pcs.

EDIT: Added a link to the 99¢ chips I bought.
http://www.ebay.com/itm/401024912217?_trksid=p2057872
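When they arrive, hookup should be simple, since the LM336 is a shunt reference: one series resistor sets the bias current. A rough sizing sketch; the 400 µA to 10 mA operating window is what I remember from the datasheet, so double-check it:

```python
# Series-resistor sizing for an LM336-2.5 shunt reference.
# ASSUMPTION to verify in the datasheet: the part regulates with
# roughly 400 uA to 10 mA flowing through it.

V_REF = 2.5                    # nominal LM336-2.5 output
I_MIN, I_MAX = 400e-6, 10e-3   # assumed operating current window, amps

def bias_current(v_supply: float, r_series: float) -> float:
    """Current through the reference for a given supply and resistor."""
    return (v_supply - V_REF) / r_series

i = bias_current(v_supply=5.0, r_series=1000.0)
print(f"{i * 1000:.2f} mA")    # 2.50 mA
print(I_MIN <= i <= I_MAX)     # True: comfortably inside the window
```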

Yeah, it would work, but the question is whether the input voltage and different power sources make a difference in the output… and with that, the whole calibration has yet another unknown value in it. That’s why all conditions are measured and written down on the certificates: so you can reproduce the same conditions (like temperature) at home, or at least know that there is probably some temperature drift.
Of course everything works, but it is hard to tell how well. For comparing different cheap multimeters, so you can see that each one measures differently, all these things will be fine; the question is how precise they are in absolute terms. The only way to find out is to measure against a known-good meter and write the values down. They also have to be reasonably stable over time, so that if you measure again next year they show almost the same values.

The LM336 is not as precise as the other references mentioned earlier in this thread. It can be anywhere between 2.44V and 2.54V (roughly ±2%), judging from the second page of the datasheet.

I’ve re-read the original thread on EEVblog, and it seems the module uses a single 4.2V Li-ion cell, not 15V.

A few of them swapped in a regular Samsung phone battery. There’s even a mod to charge it using a cheap USB module.

And even from a 4.2V input, the reference has full outputs from 2.5V up to 10V. EEVblog tests still show consistent output voltage regardless of battery charge state. And these guys don’t use cheap DMMs.

Very interesting.

Yes, it has a boost regulator and a normal Li-ion charge chip.

The heat from the charge chip will affect the output voltage.

The jump in voltage happens because the charge chip starts charging and heats the reference; it is about 0.05mV on the 10 volt output (around 5 ppm).

This whole concept wades right into the deep end of what I call “The Third-Party Fallacy” — the belief that any 3rd party will make better decisions than you yourself* will.

Given the purpose of the DMM, “wouldn’t the manufacturer’s QC check verify proper calibration”??

How about a little DIY physics experiment ?

(Full disclosure, I just got one of those Harbor Fright Free DMMs & was, just this moment, working out just this question!)

I found this article to be interesting, if not ultimately “helpful”.

As for me, I tend to eschew “Absolute Precision” since I seldom work with absolutes (not a Rocket Scientist, me!). I’d suggest factoring in a little “trust” (“Initial Plausibility”) and measuring a nominal “standard,” like the power-supply connections in a “standard” PC. At the end of the day, even NIST or ASTM calibration requires a level of “trust” in an interested 3rd party…

*
A man should learn to detect and watch that gleam of light which flashes across his mind from within, more than the lustre of the firmament of bards and sages. Yet he dismisses without notice his thought, because it is his. In every work of genius we recognize our own rejected thoughts: they come back to us with a certain alienated majesty.

― Ralph Waldo Emerson, Self-Reliance

Proper calibration of a reference chip does not mean maximal precision; these chips do not have an internal calibration that can be adjusted.

When manufacturers want to make something very precise, they do not worry much about the initial value, but much more about how it changes over time and temperature. This is because it is easy to compensate for the initial error in any equipment (except the cheapest stuff). Today this type of calibration is done by the computer inside the equipment: you just connect it to a known voltage/value and tell the computer to save a calibration factor.
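A minimal sketch of what that saved calibration factor amounts to (just the idea, not any real meter's firmware; the reference value and readings are made-up numbers):

```python
# Toy firmware-style gain calibration: measure a known reference once,
# store the correction factor, apply it to every later reading.

def compute_gain(reference_v: float, raw_reading_v: float) -> float:
    """Calibration factor that maps raw readings onto the known reference."""
    return reference_v / raw_reading_v

def corrected(raw_reading_v: float, gain: float) -> float:
    return raw_reading_v * gain

# Connect the known 10.0000 V standard; the meter reads 10.0040 V raw.
gain = compute_gain(reference_v=10.0000, raw_reading_v=10.0040)
print(f"gain = {gain:.6f}")               # ~0.999600, saved to nonvolatile memory
print(f"{corrected(5.0020, gain):.4f} V") # later reading, now ~5.0000 V
```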

I have a few of that reference type; the worst was 0.4mV off compared to the 7510.

Dimbo the Blinky wrote:

ImA4Wheelr wrote:
EDIT: Regarding the uncalibrated chip: Given the purpose of the chip, wouldn’t the manufacturer’s QC check verify proper calibration?

This whole concept wades right into the deep end of what I call “The Third-Party Fallacy” — the belief that any 3rd party will make better decisions than you yourself* will.

Given the purpose of the DMM, “wouldn’t the manufacturer’s QC check verify proper calibration”?? . . .

Interesting stuff. I may need to try the exercise someday. Thanks.

I should have referenced the chip I was referring to, which is the Texas Instruments REF5040 chip mentioned in the OP. The only purpose of the chip is to output a high-precision reference voltage of 4.096 volts. It is supposed to have low temperature drift. So for my simple, non-critical needs, it seems like a great way to check and calibrate my meters.

“Measure with Micrometer, Mark with Chalk, Cut with Axe.”

That PDF claims accuracy of ±0.05%. My Fluke 77-IV is only capable of ±0.3%, so for me that would be a bit of overkill.

OTOH, “overkill” is still “dead” (as in “dead-on balls accurate” … “It’s an industry term” ~~Mona Lisa Vito), so that should work very well for you.

But how would you know if the chip was accurate?

(EDIT: for the record, the Harbor Fright “7 Function Digital Multimeter”, a Cen-Tech Model 90899, only claims ±0.5% accuracy. That means your chip could be off by a whole order of magnitude and the HF meter likely wouldn’t notice; the Fluke (still ~an order of magnitude less accurate than your chip) maybe slightly…

Having said that, when you build your Reference, I’d love to help you “test it” on these…)

When I was a calibration tech years ago (Army 35H), we had to maintain at least a 4:1 accuracy ratio between the standard and the unit under test. A standard at ±0.05% would have been acceptable for a ±0.3% UUT (0.3 / 0.05 = 6, so that is actually a 6:1 ratio) and definitely not overkill.

Back then this was the DC voltage standard we used:
Fluke 332D

Still trying to find the higher-current standard, but for some of the lower DC current meters we used this one:
Fluke 382A

Disclaimer: random links found via Google; I am not affiliated with either party.

Not to keep harping on it, but that’s still ~an order of magnitude difference — at least “a greater capacity than necessary” — which seems significant…

And you’d be correct to surmise that I didn’t consider that ImA4Wheelr had human lives on the line as you did.

Just trying to point out that there’s still an awful lot of faith involved.

This is a very interesting thread!

Your comment just now reminds me of an electronics shop class I took in college. Back then we only had analog meters, and one day the instructor asked, “When you hook up an ammeter into an electrical circuit to measure current, just what do you think you are measuring?”

If you start to think about it, that question should just about blow your mind! After all, the amp, the unit of electrical current, is actually a coulomb of charge per second (something like 6.24 × 10^18 electrons per second), and an ammeter certainly isn’t measuring THAT.

Here is a very irreverent review by Dave from EEVblog of a Harbor Freight meter that someone sent him to review. I indexed the video to start at the measurement tests. In my opinion his reaction is a mix of surprise and disdain.

If I did this correctly, this YouTube video should start at 37:30. If it doesn’t, that is where I intended it to start. In my opinion, a must-watch (for about a minute).

Are you sure it’s not just a Very Short Individual sitting in there counting out electrons?? Really?? I could’ve sworn I read that on the Internet somewhere!

:smiley:

I am reading and studying this thread because I would love to build and own a reference source.
But as you alluded to in your post, we are so far removed from the reality of it when we try to measure an electrical quantity that FAITH always has to enter the picture.

And, an analog ammeter doesn’t measure magnetic flux density either!

Here is maybe an ignorant question. If a highly accurate voltage reference is so easy to come by, why don’t the manufacturers include one in their DMMs and use it to self-calibrate, say, every time you turn it on? If that were possible, how much extra could it cost? I’m thinking next to nothing once the silicon is done.

I have been out of the field for quite a while, but back then calibration was done by checking cardinal points on all ranges, commonly 20, 40, 60, 80, and 100%. Using one reference voltage and calling a meter calibrated makes a big assumption that the observed accuracy carries over to all voltages and range settings.
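To make that concrete, here is a toy checklist of the cardinal points on a few example DC ranges (the range values are just examples, not any particular meter's):

```python
# Cardinal-point checklist: 20/40/60/80/100% of each range.
# The full-scale values below are made-up example DC voltage ranges.

CARDINAL_FRACTIONS = (0.2, 0.4, 0.6, 0.8, 1.0)

def cardinal_points(full_scale: float) -> list[float]:
    """Test voltages at the usual cardinal fractions of a range."""
    return [round(full_scale * f, 4) for f in CARDINAL_FRACTIONS]

for fs in (0.2, 2.0, 20.0, 200.0):  # volts, full scale
    print(f"{fs:6.1f} V range: {cardinal_points(fs)}")

# A single 4.096 V or 10 V reference hits only one point on one or two
# of these ranges, which is exactly the caveat above.
```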

No one here is calling their meter calibrated after using a reference, but it certainly raises the level of attainable precision to know how much your DMM is off from that reference.

I was thinking similarly; I’ve known analog meters to have errors that vary across the scale. And even if voltage were calibrated perfectly all across the scale, would that carry over to amperage and resistance readings?

Plus, how accurate do we need to be here? I can chop a log for a cabin square enough with a chainsaw and my eye, but that obviously wouldn’t work for making furniture. Perfection is nice, but do we need accuracy beyond 1/1000th on voltage (or any other) readings? I’m NOT wanting to start an argument over accuracy, just thinking aloud.

Phil