E-switch UI Development / FSM

For NarsilM I use: "avrdude -p t85 -c usbasp -Ulfuse:w:0xe2:m -Uhfuse:w:0xde:m -Uefuse:w:0xff:m". Those are my standard fuse values for the 85 with BOD enabled.

As for the clock speed, my guess is that changing it would cause problems.

Fuse values have always been a source of confusion, so I post/document the fuse values with the firmware source - no questions then. Some settings may be optional - BOD, for example: with it disabled in a Q8, the intermittent connections that occur when threading on the battery tube caused it to lock up - enabling BOD fixed all that. In other lights you might be able to disable BOD, reducing parasitic drain a bit.

Ohhh - my BOD setting is for 1.8V !!

Does BOD prevent lights from resetting during a super short power cycle, when the voltage drops?

Maybe? I know with the Q8, intermittent power glitches caused lockups with BOD disabled. It wasn't simply a reset. I do know guys and manufacturers are using caps, larger size ones, to prevent exactly that. I believe in parallel with our "C2" cap, which is between ground and Vcc of the MCU.

DrJones's H17f driver claims to be bump-proof, and from my testing it is.
It keeps the current mode through a super short power cycle.

It might not be a must-have, but it's really a want.

Hi TK, since I also took a look at this issue, what would be the correct ADC_22 and ADC_44 values?

I calculated them to be ADC_22 = 237 and ADC_44 = 473.

Calculation:
Voltage divider R1 = 360k, R2 = 47k; Factor = 0.115
VRef = 1.1V with factor 0.115 = max. measurable value of 9.53V
4.4V at the voltage divider yields 0.508V at PB2, which is 46% of the 1.1V reference
2.2V at the voltage divider yields 0.254V at PB2, which is 23% of the 1.1V reference
46% of 1024 is 473 => ADC_44
23% of 1024 is 237 => ADC_22

But with these values, I’m getting the following warning

Which makes sense, as all internal voltages are in tenths of volts (23 = 2.3V), and the factor would need to be around 1075

To calculate the other way around:

My testbench is set up with such a voltage divider, and I’m measuring a VBat of 4.7V and 0.54V at PB2.
0.54V at PB2 should be an ADC value of 505.
Bits/volt = (473-237)/(4.4-2.2) = 107
ADC / bits_per_volt = 505 / 107 = 4.7V

Can somebody check my calculations?
Because if I flash with those values, the readout is 1 blink / 2 blinks, which would indicate 1.2V

Ahhh, the constants are 8 bits, not 10 bits. All our previous voltage values were 8 bits. TK compensates/converts for the 8 bit constants.

10 bits shifted left by 7 is 17 bits -- too long for 16-bit math. In NarsilM I think I'm doing 32-bit math, but also throwing away the 2 LSBs, so I also lose the precision of the 10-bit ADC.

TK's method is probably saving precious code bytes.

I tried a bit more and I suspect that there may be a bug in the code for the voltage divider.

It works for me with the following code snippets:

At 4.75V supply voltage I get 4+8 or 4+7 blinks.

EDIT: The interesting fact is, for the BLF GT, my excel sheet for the ADC values outputs 184 / 92 which is exactly the same as configured.

When I use the old code with the GT config and the GT voltage divider (1M / 47k) it reads 4.7V as 1+2 blinks.
With the new code, it reads it as 4+8 to 5+0 blinks.

@TK: Did you test the BLF GT config for reading the battery explicitly?

I don't see a bug, but I see two differing methods. I think the bit math is done as signed 16-bit, so if true, your method would not work if a value was >=512, but your 4.4V value is 473 so it works. Her method uses 15 bits max with 8-bit numbers for the ADC values - the two seem equivalent in the math. I could be missing something or I could be wrong and not right... Wut?

I told them twice that there is a bug with the voltage divider, but they keep telling me it works.
I tried changing the values of ADC_44 and ADC_22 and it changed nothing; the readout at 6V on 2S always stays at 3+7 blinks.

Likely the code is in the wrong position in the GT file, or it isn't being read and code from somewhere else is used instead.

Clock speed changes are likely to cause issues. However, I think the CKDIV8 option might not matter, because Anduril messes with the clock speed divider anyway. It chooses a clock speed based on the brightness level, underclocking itself at moon to reduce power use and make moon mode more stable.

About where to get the fuse values… bin/flash*.sh

Those scripts are what I use to flash drivers, and they include working fuse values.

It’s not about different methods (<<4 vs <<7), the existing code mixes up the fixed-point values.

ADC_nn values are 16.0 values and get converted to a 9.7 value (<<7)
the ADC reading is a 16.0 value and gets converted to a 11.5 value (<<5)

Then, those two values are divided to yield the voltage (e.g. 47 for 4.7V)
But with those mixed fixed-point values, it produces a result that’s much too low.

I also did the measurements.
BLF GT (1M/47k) measuring 4.75V. Old code: 1+2 blinks, New code: 4+8 blinks
Lexel 2S (360k/47k) measuring 4.75V: Old code: doesn’t work, New code: 4+8 blinks

No, BOD has the opposite effect. It detects the interruption and shuts down to make sure nothing weird happens during the interruption.

The intended way to get these values is to flash ToyKeeper/battcheck and write down the numbers it blinks out. The values go from 0 to 255.

Yes. The BLF GT (original, not GT70) is the only light I have with Anduril and a voltage divider. It works fine for me.

The battcheck tool outputs 8-bit unsigned integers matching the most-significant 8 bits of the ADC measurements in FSM.

ADC_nn values are 8.0 fixed-point, or plain unsigned 8-bit integers. Same scale as battcheck.
The ADC measurement is a 10-bit integer without left-adjust, so it’s treated as an 8.2 fixed-point value.

Since one is 8.0 and the other is 8.2, they are shifted by different amounts to put them at the same precision before dividing. They both get converted to 9.7 fixed-point for division purposes.

In the end, the logical units used by UI code are “volts * 10”, so for 3.6V it would be a value of 36.

That’s bad, because battcheck currently doesn’t support the ATtiny85. But I’m trying to add support for that.

Interesting… Either my setup with the 1M/47k divider has a problem or there’s something else hidden somewhere.

"ADC_nn values are 16.0 values and get converted to a 9.7 value (<<7)
the ADC reading is a 16.0 value and gets converted to a 11.5 value (<<5)"

That's not quite accurate. ADC_nn values are 8 bits; shifted left by 7, they become 15-bit values.

The ADC reading is 10 bits; shifted left 5 bits, it becomes a 15-bit value -- so comparing 15 bit vs. 15 bit seems compatible to me.

You could take the reading, divide it by 4, then shift it left 7 bits -- that's the same thing, except you lose the resolution of the lower 2 bits.

The included battcheck-25.hex file works on tiny85.

Thanks.

Interesting, that outputs 128 for 4.4V and 62 for 2.2V

Looks like I missed something with my math - that’s why I asked for a check.

It’s worth mentioning that it means 4.4V and 2.2V per cell. So, the actual voltage needs to be scaled up to however many cells the light uses in series.

On my BLF GT, I measured at 8.8V and 17.6V.

Ish.

My PSU doesn’t go up to 17.6V, so I measured a bit lower and then used the battcheck.py script to calculate what it should have been at 17.6V. Then I plugged the resulting values into ADC_22 and ADC_44, compiled, flashed, and now I have a pretty accurate voltage readout on my BLF GT.

Oh, that’s something I messed up. I thought it measured the total voltage, not a per-cell scaled-down voltage…