Attiny25/45/85 FW Development Thread

Thanks DEL - yep, that’s what I’m thinking - something else that’s not getting shut down. Spoke last night to my EE buddy here and he said the same thing about the diode we are using. He also mentioned these diodes drop voltage when the MCU is drawing power, so I think I need to know exactly what the draw is. Using the “85” and not the “85V”, rated at 2.7v minimum, means you can’t afford to lose much voltage across the diode. Thinkin’ the drop was 0.1-0.2 volts from what I recall, which means our LVP needs to cut off no lower than 2.9v. I believe the losses caused by the diode are why we moved the voltage divider resistors before the diode, and are using a 22K instead of the 19.1K.

Haven’t had a chance yet, but I need to dive deep into the Atmel 25/45/85 specs to see if we are missing something on sleep mode. I am aware that you have control of several sub-systems for sleep/low power.
Yep, I heard about the brown-out detector drawing current in sleep mode - the details were posted here a while back.

You can assume a 0.25 V drop for the Schottky-type diodes we are using (or try to measure it while the MCU is running, it does vary with current and temperature.)

There is a discussion in one of the threads to replace D1 with a ‘boot-strapped’ PMOS FET - this would give practically zero voltage loss.

I don’t think it’s necessary to replace the diode to reduce the draw, it’s just that with an “85”, you have to be extra careful with your LVP cutoff value. The 2.7v is really 2.95v at the cell, so your cutoff point should be 3.0v or 3.1v for a little cushion. When I bought my ATtiny85’s from Mouser or DigiKey I bought the 85V’s, but the last batch I bought from RMM are the “85”’s.

What driver are you using? I always take the cell voltage measurement before the diode. I thought that’s what everyone else did too.

wight’s FET+1 and MntE’s FET+1 board, reflowed myself. Cell voltage before the diode? Not sure I understand - I’m just measuring at the same place I always measure amps - at the tail: between batt- and the edge of the battery tube.

A tiny85 won’t drop dead at 2.69v. Atmel just does not guarantee they will remain reliable outside the specs. The official specs are often conservative. People can and do run them a bit outside of spec and have no problems. Some even sell devices that use AVRs running outside of spec. IIRC, boards from jeelabs are (or were) running at a clock speed + voltage combination that is out of spec, and he said he never ran into a problem even among hundreds of chips.

Also, what speed are you running at? You can increase reliability by running at a lower clock speed (like 4 MHz), plus you’ll get a bit of power savings. I posted a while ago, somewhere :smiley: probably this thread, code to use the clock prescaler to adjust the speed beyond the options that the fuses give you.

~ edit ~
Yep, using our nifty new “search within thread” feature it comes up by searching “clkpr”.

Apparently, I was talking to some guy named Tom E in the above quote. So, I guess it’s nothing new to you? :-x
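In case it helps, here’s roughly what that CLKPR trick looks like on a tiny25/45/85 (a minimal sketch, not the exact code from that post; the function name is made up):

#include <avr/io.h>
#include <avr/interrupt.h>

/**************************************************************************************
* clock_div_2 - drop the core clock from 8 MHz to 4 MHz at run time (sketch)
* ===========
**************************************************************************************/
static inline void clock_div_2()
{
   // CLKPR needs a timed sequence: set CLKPCE, then write the new prescaler within 4 cycles
   cli();                   // the two writes must not be interrupted
   CLKPR = (1 << CLKPCE);   // enable a prescaler change
   CLKPR = (1 << CLKPS0);   // divide-by-2: 8 MHz internal RC -> 4 MHz
   sei();
}

One caveat: _delay_ms() is compiled against F_CPU, so halving the real clock without changing F_CPU makes those delays run twice as long, while the 16 msec WDT tick comes from the separate 128 kHz watchdog oscillator and shouldn’t change.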

I forget far more than I remember. It’s déjà vu all over again. I recall reading this and maybe other posts about it, but it was untested, and I thought incomplete at the time. There are two timed things I depend on - the 16 msec timer (or I need to know exactly what it is), and the built-in _delay_ms() routine - so any change that impacts these is a problem. I don’t have any decent development/debug capabilities with these parts, and not much time to explore things with unknown advantages. I am looking to lower sleep mode parasitic drain, and I don’t see the relationship between processor speed and parasitic drain. Sorry, I haven’t had time to do the crazy hours of research on this.

Don’t get me wrong - all good ideas, but don’t ask me to go off and spend the time for a possible benefit I’m not clear on. I am very handcuffed with this firmware development environment compared to what I’m used to, lacking single-step debugging, profiling, etc., so experimenting is very time consuming.

I am very committed to 8 MHz right now because I know it works. I haven’t seen anything posted in detail, fully working, or fully explained and tested to work, at lower speeds. Again, no time for the R&D. Early on I tried a lower speed via the fuses, and the timing of something was way off, even though I thought I made all the proper settings. This, of course, is a shame because others are doing this, I’m pretty sure, but I just can’t find the source code and/or fuse settings, etc., or they are unwilling/not allowed to post it.

I sometimes feel like I forget far more than I remember as well. Glad I’m not alone. :smiley:

ToyKeeper mentioned testing it here. Not sure how it affects _delay_ms() and the 16 msec timer you’re using, but it should be one of two possibilities: it throws the timing off by the prescale factor, or it has no effect. About power savings, I don’t have any links off hand, but I’ve also seen people test combinations of different sleep levels and clock speeds to verify what you get in the real world. I expect the savings to be minor. I brought it up more because you’re concerned about the tiny85 being rated at 2.7v minimum. Even with the tiny85v you’re not supposed to go below 2.7v if you’re running at 8 MHz; 4 MHz is the max (officially) for the 1.8-2.7v range.

Are you disabling the ADC before power-down sleep?
ADCSRA &= ~(1<<ADEN);  // disable ADC (before power-off)

ADCSRA |= (1<<ADEN);   // re-enable ADC

You’re not using the watchdog, right?

If the voltage divider gets the voltage before the diode, then the voltage drop of the diode is irrelevant for LVP readings. I design all my drivers this way and just assumed that everyone else did too. I guess I shouldn’t be assuming so much.

Before diode: [schematic image]

After diode: [schematic image]

Crap - not turning that off. Not sure who wrote the sleep mode function originally, probably Johnny C? So I do use the watchdog, but it’s being turned off for sleep; the ADC, however, is not being shut OFF then back ON. Below are the functions from Narsil, but again, originally ported over, probably from STAR Momentary:

/**************************************************************************************
* ADC_on - Turn the AtoD Converter ON
* ======
**************************************************************************************/
inline void ADC_on() {
   // For 13A: ADMUX = (1 << REFS0) | (1 << ADLAR) | ADC_CHANNEL; // 1.1v reference, left-adjust, ADC1/PB2

   ADMUX = (1 << REFS1) | (1 << ADLAR) | ADC_CHANNEL; // 1.1v reference, left-adjust, ADC1/PB2

   DIDR0 |= (1 << ADC_DIDR);                          // disable digital input on ADC pin to reduce power consumption
   ADCSRA = (1 << ADEN ) | (1 << ADSC ) | ADC_PRSCL;  // enable, start, pre-scale
}

/**************************************************************************************
* ADC_off - Turn the AtoD Converter OFF
* =======
**************************************************************************************/
inline void ADC_off() {
   ADCSRA &= ~(1<<7); // ADC off (bit 7 = ADEN)
}

void sleep_until_switch_press()
{
   // This routine takes up a lot of program memory :(
   // Turn the WDT off so it doesn't wake us from sleep
   // Will also ensure interrupts are on or we will never wake up
   WDT_off();
   // Need to reset press duration since a button release wasn't recorded
   pressDuration = 0;
   // Enable a pin change interrupt to wake us up
   // However, we have to make sure the switch is released otherwise we will wake when the user releases the switch
   while (is_pressed()) {
      _delay_ms(16);
   }
   PCINT_on();
   // Enable sleep mode set to Power Down that will be triggered by the sleep_mode() command.
   //set_sleep_mode(SLEEP_MODE_PWR_DOWN);
   // Now go to sleep
   sleep_mode();
   // Hey, someone must have pressed the switch!!
   // Disable pin change interrupt because it's only used to wake us up
   PCINT_off();
   // Turn the WDT back on to check for switch presses
   WDT_on();
   // Go back to main program
}

Mike C - yes, thanks for the info. Yep, I understand the impact of the diode on the voltage divider. Think someone said earlier the drop from the diode we use is 0.25v. This is why we used a 19.1K initially in Nanjg’s but use a 22K now (before the diode) to get about the same A-D values.
In the wight FET+1 thread, someone mentioned using 10X the resistors: 220K and 47K. Not sure if it’s possible to read them, but it would be great if we could. It would cut the parasitic drain to 10% of the 0.16 mA the voltage divider is taking now, if I understand it all correctly.
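(Quick sanity check: with the present divider at roughly 22K over 4.7K - 26.7K total - a 4.2v cell pushes about 4.2 V / 26.7 kΩ ≈ 0.157 mA through it, which matches that 0.16 mA figure; the same calculation with 220K + 47K gives about 0.016 mA.)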

So, I do have some marching orders from this - add turning off the AtoD converter for sleep (simple to try), and find out more about the voltage divider resistor values. Again, not sure if CPU speed affects parasitic drain, but if it does, it’s probably minor compared to the voltage divider problem. I’d like to have the time to research the proposed 1.1v reference reading solution - not sure I can do this without lots of time.

Tom E, have a look at 17.8 in the datasheet (page 129 in my PDF copy).

The analog input signal has to charge up a small internal sample+hold capacitor (through an internal series resistor) during the ADC conversion process. If you go too high with the impedance of the analog signal, this capacitor does not charge up all the way and you will get a low result from the ADC.

Atmel recommends keeping the output impedance of the analog signal below 10 kohm. The impedance of our signal is, for all practical purposes, the parallel combination of R1 and R2, or 1/(1/R1+1/R2). This means keeping R2 around 12 k or less.
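(Putting numbers on that: 22K in parallel with 4.7K is about 3.9 kΩ, comfortably under 10 kΩ, while the 10x values, 220K with 47K, come out to about 38.7 kΩ - roughly four times the recommended maximum. With the R1:R2 ratio fixed near 22:4.7, the parallel value is about 0.82 × R2, which is where the ~12 k limit comes from.)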

That said, we are normally only using 8-bit resolution anyway - do we need the maximum accuracy? MikeC is already using 10x R values; not sure if he verified ADC accuracy.

For E-switch drivers, I vote for the Vcc/1.1Vref trick without any resistors.
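In case it’s useful, here is roughly how that trick is usually coded on a tiny85 (a minimal sketch; the function name is made up, and the 1125300 constant assumes the nominal 1.1v bandgap, so the per-chip calibration discussed below still applies):

#include <avr/io.h>
#include <util/delay.h>

/**************************************************************************************
* read_vcc_mv - measure Vcc by reading the 1.1v bandgap with Vcc as the reference (sketch)
* ===========
**************************************************************************************/
uint16_t read_vcc_mv()
{
   ADMUX = (1 << MUX3) | (1 << MUX2);   // REFS bits = 0 -> Vcc reference, MUX = 1100 -> 1.1v bandgap
   _delay_ms(2);                        // let the reference/mux settle
   ADCSRA = (1 << ADEN) | (1 << ADSC) | (1 << ADPS2) | (1 << ADPS1); // enable, start, /64 prescale
   while (ADCSRA & (1 << ADSC)) ;       // wait for the conversion to finish
   uint16_t adc = ADC;                  // 10-bit result
   return (uint16_t)(1125300UL / adc);  // Vcc[mV] = 1.1v * 1023 * 1000 / ADC
}

No divider resistors, no divider drain - but the bandgap tolerance goes straight into the result, which is the same calibration issue discussed below.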

I’m way outside the specs. I’m using 1M2 as R1 and 300K as R2. I went this high because R2 acts as a bleed resistor for the off-time cap on my drivers. With calibration I have been accurate down to 0.02 volts across the cell range (tested from 2.8V up to 4.23V) on several drivers, so I’m fine with these resistor values.

However, I have yet to build a zener version of my latest drivers (I’m waiting for newly ordered resistors; I ordered the wrong ones before). I want to keep R2 the same, so this means R1 is going to be much higher (about 2M7), but that might just be too high. I won’t know until I’ve tested a few drivers, but if so I’ll have to use a lower R2 value to keep R1 from being too high.

Interesting stuff about using VCC for voltage monitoring but nothing I will be looking into. I’m already doing LVP, off time and E-switch on the same pin so I wouldn’t be gaining any pins by doing this, and I don’t really need the space either. All my drivers are made zener compatible from the start anyway, so I need the voltage divider resistors to measure double cell voltage levels.

Just a note on calibration though, maybe you already know: the ATtiny’s voltage readings vary and can be calibrated - I guess everyone knows that already - but remember that it’s the internal reference voltage that is different on each MCU. The actual ADC readings are very precise and do not need calibrating; it’s the varying internal reference voltage that needs calibrating. I know Toykeeper wrote somewhere that the mathematical calculations are not accurate, but I’ve found that I can use all of the mathematical calculations with very accurate results as long as I have calibrated the internal reference voltage. When the voltage level and ADC value are known you can calculate what the internal reference voltage is, save this value and use it later to calculate the voltage in the LVP routine. It’s the method I use and I’ve found it to be the most accurate single-point calibration method yet.
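A rough sketch of that idea (hypothetical names; assumes the divider is read against the 1.1v reference with 8-bit left-adjusted results like the ADC_on() code above, and uses the 220K/47K values from earlier as an example - storing the result in EEPROM is left out):

#include <stdint.h>

#define R1_KOHM 220
#define R2_KOHM  47

// Calibration step: the true cell voltage (mV, from a DMM) and the raw 8-bit ADC reading
// are both known, so solve for the actual reference voltage:
//   adc = (Vcell * R2/(R1+R2)) * 256 / Vref   =>   Vref = Vcell * R2 * 256 / ((R1+R2) * adc)
uint16_t calc_vref_mv(uint16_t vcell_mv, uint8_t adc)
{
   if (adc == 0) return 0;
   return (uint16_t)(((uint32_t)vcell_mv * R2_KOHM * 256UL) /
                     ((uint32_t)(R1_KOHM + R2_KOHM) * adc));
}

// LVP step: invert the same equation using the stored reference
uint16_t adc_to_vcell_mv(uint8_t adc, uint16_t vref_mv)
{
   return (uint16_t)(((uint32_t)adc * vref_mv * (R1_KOHM + R2_KOHM)) /
                     (256UL * R2_KOHM));
}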

Yea, I’ve read the 1.1v reference can be off by 10%. But manually calibrating each chip feels annoying. I think it’s possible to do automated calibration: measure the voltage provided by your programmer, write that into your calibration program, flash the calibration program, run it while powered from your programmer so it stores the result, then flash the final program.

Would you be using any kind of automated calibration?

Oh wow…I am surprised this works. So 10x should be no concern for Tom E.
How close are you to calculated values before calibration? Did you try different ADC clock dividers? Possibly a slower ADC clock can help in this case.
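For reference, the ADC clock divider is the ADPS field in ADCSRA (presumably what the ADC_PRSCL constant in the code above maps to). A slower ADC clock lengthens the sample window, giving the internal sample/hold cap more time to charge from a high-impedance divider. A one-liner sketch, assuming an 8 MHz core clock:

ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1); // prescale by 64: 8 MHz / 64 = 125 kHz ADC clock, within the recommended 50-200 kHz range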

I just use the standard calculated values for the ADC and do not bother to calibrate. Most of my drivers read around 5-8 mV low or high, but that is good enough for a 5-level flash-out. I do have a ‘cell-impedance’ mode that flashes out cell+circuit impedance. Great as a sanity check on a new light, or to compare cells. But there I use differential voltage (no-load vs. 1x7135 load, 10-bit resolution) and absolute accuracy is less important. I also do a primitive over-sample/filter running on an interrupt to get consistent readings.

Ok, did the test of turning off the AtoD during sleep mode. First of all, it functions fine - I can still do AtoD voltage measurements after sleep mode, so it restores properly (seems to, from testing) - and it cuts the parasitic drain to about half. Great news!

Using a modded SupFire M2-Z (from MtnE) running a wight 22mm FET+1 v013, on a good AWT IMR 2500 cell @4.13v:

Before the AtoD mod: 4.95 mA running the CPU, sleep mode: 0.314 mA

After the AtoD mod: 4.95 mA running the CPU, sleep mode: 0.154 mA

So thanks to Halo, a simple code mod helped big time.

At a drain rate of 0.314 mA, it would take 132.7 days (19 weeks - 4.4 months) to drain 1 amp-hour from a cell.

At a drain rate of 0.154 mA, it would take 270.6 days (38.7 weeks - 8.9 months) to drain 1 amp-hour from a cell.
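(Both figures are just 1 Ah divided by the drain current, converted to days: e.g. 1000 mAh / 0.154 mA ≈ 6494 hours ≈ 270 days.)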

The next easy thing to try would be the 10X factor on the resistors. Since I previously showed the voltage divider is responsible for about half the parasitic drain, this mod would reduce the parasitic drain to where it should be.

New code (cleaned up comments/formatting):

/**************************************************************************************
* sleep_until_switch_press - only called with the light OFF
* ========================
**************************************************************************************/
void sleep_until_switch_press()
{
   // This routine takes up a lot of program memory :(

   // Turn the WDT off so it doesn't wake us from sleep. Will also ensure interrupts
   // are on or we will never wake up.
   WDT_off();

   ADC_off();     // Save more power - turn the AtoD OFF

   // Need to reset press duration since a button release wasn't recorded
   pressDuration = 0;

   // Enable a pin change interrupt to wake us up. However, we have to make sure the switch
   // is released otherwise we will wake when the user releases the switch
   while (is_pressed()) {
      _delay_ms(16);
   }
   PCINT_on();

   //-----------------------------------------
   sleep_mode();  // Now go to sleep
   //-----------------------------------------
   // Alternate method? -> set_sleep_mode(SLEEP_MODE_PWR_DOWN);

   // Hey, someone must have pressed the switch!!

   PCINT_off();   // Disable pin change interrupt because it's only used to wake us up

   ADC_on();      // Turn the AtoD back ON

   WDT_on();      // Turn the WDT back on to check for switch presses

}  // Go back to main program

I did find 47K and 220K 0805 resistors, thanks to FastTech's cheap assortments.

Using the same M2-Z light/FET+1 driver, I swapped in the new R1 and R2 and it seems to work perfectly, exactly as predicted - 1/10 the parasitic drain, now 0.016 mA. I blink out the voltage; tried two cells, one at 3.6v and one at 4.1v, and it blinked out the proper voltage for both.

To tell you the truth, I'm not that concerned about the accuracy of the voltage reading - if the error is under 0.2v I can live with it, though I'd prefer it within 0.1v.

So I think 7 years or so on parasitic drain ain't so bad. A battery sitting on a shelf would probably self-discharge faster than that?

Nice going Tom :beer:

Generally they are close enough, and I was happy with that after flashing quite a few 85s… but then there was this one MCU that gave rather poor results. I just had to find out why. It turned out that the internal voltage reference was 1.0v, which is within the tolerance according to the datasheet but gave rather poor voltage measurements. I guess they were still usable, but it was a thorn in my side that I couldn’t let go, so I wrote a calibration routine to take care of it.