What is the purpose of the voltage divider in nanjg105c?

Why not just bring +Vcc to pin 7 to monitor voltage?

I guess there’s a very good reason for it, otherwise our Chinese suppliers would have optimized the schematic and saved the cost of two resistors.

As far as I know, the internal reference voltage inside the ATtiny13a is only 1.1V, so it can only compare and measure voltages that are smaller than that.
The voltage divider just drops the voltage down into that range.

And by changing the resistor values one can still have voltage monitoring even with a Zener modded driver.
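To put some rough numbers on it, with the stock 19.1k/4.7k divider the output is Vin * 4.7 / (4.7 + 19.1), i.e. roughly a fifth of Vin: a full cell around 4.2V gives about 0.83V and 3.9V gives about 0.77V, both comfortably below the 1.1V reference. With an 8-bit left-adjusted ADC result, that 0.77V reads back as roughly 0.77 / 1.1 * 256 = 179 counts.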

thanks! I’m having issues with voltage monitoring on a tiny85: I'm getting 105 to 108 as the ADC conversion result regardless of input voltage, so I’m trying to figure out why that is.

Setting Vref to VCC can be done on the ATtiny13a, but then you wouldn’t get any difference between readings.
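Rough math on that, assuming the stock 19.1k/4.7k divider: with the divider fed from the same rail as VCC, the 8-bit result is just (VCC * 4.7 / 23.8) / VCC * 256, which is about 50 counts no matter what the battery does, so the reading carries no voltage information at all.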

You probably know this, but anyway… the 85 can have 1.1V or 2.56V as reference voltage (the latter requiring VCC to be at least 3V). Make sure you have the right reference voltage set.

Is this a 105c you bought or one you built? Verify the protection diode isn't giving you too high a voltage drop, since the voltage divider pulls off it, not batt+. What value resistors are you using?

It’s a modified 105c with an ATtiny85: a standard 105c board with the ATtiny13a taken out and an ATtiny85 put in its place. There are no shorts, all the pins are soldered correctly and have good solder joints. I have two boards and get the same result with both of them, so it shouldn't be a hardware issue.

The resistors are stock, 4.7k / 19.1k

The voltage at the middle of the divider is approx. 0.75V with the battery voltage at approx. 3.87V.

The problem I have is that the ATtiny85 ADC is, for whatever reason, always giving me an ADC conversion result of approx. 106, regardless of input voltage, instead of around 180 as it should for that voltage. The ADC conversion on the offtime capacitor is working fine. Everything else is working fine except that ADC conversion for voltage.

This is how the ADC is done

DIDR0 |= (1 << ADC_DIDR);                           // disable the digital input buffer on the ADC pin
ADMUX = (1 << REFS0) | (1 << ADLAR) | ADC_CHANNEL;  // select reference, left-adjust result, select channel
ADCSRA = (1 << ADEN) | (1 << ADSC) | ADC_PRSCL;     // enable ADC, start a conversion, set prescaler
ADCSRA |= (1 << ADSC);                              // start another conversion
while (ADCSRA & (1 << ADSC));                       // wait for it to finish
voltage_adc = ADCH;                                 // keep the top 8 bits

As MikeC pointed out, what do you have the vref set to?

hehe, to the wrong one? :slight_smile:

Tiny13a has only one bit setting for vref

it’s REFS0, and setting it to one means “use internal vref”

in attiny85, to use internal 1.1vref, you’re supposed to set REFS1=1 and REFS0=0 (edit: also REFS2=0)
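Side by side, the same internal 1.1V selection looks roughly like this on the two chips (my reading of the datasheets, so double-check it; ADC_CHANNEL and ADLAR as in the snippet above):

// ATtiny13a, internal 1.1V vref:
ADMUX = (1 << REFS0) | (1 << ADLAR) | ADC_CHANNEL;

// ATtiny85, internal 1.1V vref (note REFS2 is also an ADMUX bit):
ADMUX = (0 << REFS2) | (1 << REFS1) | (0 << REFS0) | (1 << ADLAR) | ADC_CHANNEL;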

so it seems I have actually been attempting to use an external vref on PB0, and there's nothing connected there (it's a star).

so this might be the problem, I’ll check in a few hours.

However, what puzzles me is that the offtime cap ADC was working well enough with this setting, so I never questioned the vref being wrong. I don't know how it could work at all, since I'm also using the same (wrong) external vref on PB0 there.

yep, that was it, thanks guys, this is the correct snippet now

ADMUX = (1 << REFS1) | (0 << REFS0) | (0 << REFS2) | (1 << ADLAR) | 0x01;  // internal 1.1V vref, left-adjust result, ADC channel 1
ADCSRA = (1 << ADEN) | (1 << ADSC) | ADC_PRSCL;                            // enable ADC, start conversion, set prescaler

so now I have 6.5k more code space to use :slight_smile:

I looked at the datasheet, and it looks like all you need to set it to 2.56v is this one line added. Not sure though, I've not started any 85-based code yet.

#define INTERNAL2V56_NO_CAP (6)
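For what it's worth, at the register level my reading of the ATtiny85 datasheet is that REFS2:REFS1:REFS0 = 1:1:0 selects the internal 2.56V reference without the bypass cap, so the direct-register equivalent should be something like this (untested):

// ATtiny85, internal 2.56V vref, no bypass cap on PB0 (AREF):
ADMUX = (1 << REFS2) | (1 << REFS1) | (0 << REFS0) | (1 << ADLAR) | ADC_CHANNEL;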

edit: you got that figured out but still be careful, this part still stands:

However, you need to be aware you might run into issues doing this: if VCC drops below 3v at any time, the 2.56v reference will then be off, and that could really screw your settings and make it never activate the low voltage warnings. Please verify your diode is only dropping at most 0.2v (I've gotten many 105c's that come with a diode marked S4 that drops 0.4v) and start out setting the set points higher than you think you need. If your diode drops that much, the reference will be screwed starting at just 3.4v batt voltage, and that's not gonna allow very accurate [if any] low voltage monitoring.

Note this stands whether your voltage divider is fed straight from batt+ or comes off after the diode; either way the chip bases the reference off Vin (downstream of the diode), so it will be affected by the diode regardless of where the divider feeds.

I would be calibrating every driver for offtime cap values, the voltage divider/ADC, and subsequently temperature. Since the voltage divider is set to drop the voltage below the 1.1V range, if I were to use the 2.56V reference I would lose a lot of precision. I'm also considering using all 10 bits of the ADC, as opposed to just 8, to get a bit more precision. Will keep you posted.
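If I do go 10-bit, something along these lines is what I have in mind (untested sketch, the function name is just a placeholder; ADC_CHANNEL and ADC_PRSCL are the same macros as in the snippets above):

#include <avr/io.h>
#include <stdint.h>

uint16_t read_voltage_10bit(void)
{
    ADMUX = (1 << REFS1) | (0 << REFS0) | (0 << REFS2) | ADC_CHANNEL;  // internal 1.1V vref, right-adjusted result (no ADLAR)
    ADCSRA = (1 << ADEN) | ADC_PRSCL;                                  // enable the ADC, set the prescaler
    ADCSRA |= (1 << ADSC);                                             // start a single conversion
    while (ADCSRA & (1 << ADSC));                                      // wait until it finishes
    uint16_t lo = ADCL;                                                // read ADCL first...
    uint16_t hi = ADCH;                                                // ...then ADCH
    return (hi << 8) | lo;                                             // 0..1023
}

The datasheet also warns that the first conversion after switching the reference can be inaccurate, so it's probably worth throwing away one reading after changing ADMUX.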