Attiny25/45/85 FW Development Thread

For single cell momentary applications, if we eliminate both resistors and the polarity protection diode, we should have enough space to use a MOSFET for polarity protection. That would eliminate almost all of the voltage drop we currently see from using a diode for polarity protection.

EDIT: At least with the older boards. The “new stuff” like A17DD-L v30+ is still in flux and already packed super tight, but I’d certainly like to implement a PCB to use this technique for momentary stuff.

I’d like to note that I didn’t come up with it. I heard about it, went looking for an example, then half forgot about it.
I originally heard about it in relation to a different AVR, back before the attiny25/45/85 came into use here, but saw that it’s also available on the attiny25/45/85. It’s not an option with the tiny13 though.

I am trying to imagine what a PID might be. As I understand it, there is a heat sensor either on the driver board or on the LED’s star. If it’s on the driver, then there is a delay and a steady-state reduction in temperature from the LED to the driver, and these might be more or less than the delay and the steady reduction from the LED to the flashlight body. There is also heat originating in the FET, but with no gate resistor, or only a small one, that is small. So I don’t see how to do much better than regulating the current to keep a constant sensor temperature. One will have to experiment to decide on that temperature, but it should be somewhere around the desired flashlight body temperature. It will be cooler when you hold the light or the wind blows than when you don’t, but not by a huge amount, because that will cool the driver through the star and increase the current.
The LED temperature will overshoot because of the delay and will have a higher steady temperature than the driver, but maybe it can take that. I don’t see a possibility of doing much better than just a simple thermostat.
On the other hand, you also need to know how much to reduce the current on the first step, so the light output doesn’t vary too much. After that, things will change more slowly.
Also, the gain of the thermostat can’t be too high, or with the delay, it will oscillate, turning the light off and on. That is, the table (or whatever implements the gain) has to be spread over some range of temperature.
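
To put that “simple thermostat with limited gain” idea in concrete terms, here is a rough sketch (all names and numbers here are made up, not from any actual driver firmware): proportional-only control with low gain and a clamped per-step change, so the first correction isn’t too visible and the transport delay can’t drive it into oscillation.

#include <stdint.h>

#define TEMP_TARGET  55   // raw sensor reading to regulate toward (pick by experiment)
#define MAX_STEP      8   // clamp each correction to limit visible flicker

static uint8_t level = 255;   // current output level (e.g. PWM duty)

// Call periodically (e.g. once per second) with the current sensor reading.
void thermal_tick(int16_t temp) {
   int16_t adjust = (temp - TEMP_TARGET) / 4;    // low proportional gain
   if (adjust >  MAX_STEP) adjust =  MAX_STEP;   // limit the step size so the
   if (adjust < -MAX_STEP) adjust = -MAX_STEP;   // delay can't cause big swings
   int16_t next = (int16_t)level - adjust;       // hotter -> reduce output
   if (next <  32) next =  32;                   // floor: never step down to off
   if (next > 255) next = 255;
   level = (uint8_t)next;
}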

Hmm - I should really try measuring the sleep mode amps on a board without the diode or resistors - fairly easy to do… Dunno if anyone has done that before.

Double posting this here as well: https://budgetlightforum.com/t/-/25032

I did the tests. First I removed the diode, bridged the pads with solder, and saw absolutely no difference in the parasitic drain at all - zero, nothing. Still drawing in the 0.30 to 0.32 mA range.

Then I pulled the 22K resistor. The light acted kind of weird, because I didn't re-burn the MCU with firmware that skips LVP. Anyway, it would settle down and go into its sleep state. There I measured in the 0.16 mA range, which is just about exactly what I expected - it went to half the draw, as calculated from the resistor values.

So, not sure why the diode had no effect - or did I misunderstand whether it was supposed to help?

Btw, in terms of draw over time, a 0.16 mA steady draw would drain a 1 Ah cell in 260 days, if I did the math correctly.
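
(Checking the arithmetic: 1 Ah = 1000 mAh, and 1000 mAh / 0.16 mA = 6250 hours ≈ 260 days, so yes, that’s right.)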

The diode is in series with the MCU. Bridging it cannot improve the parasitic drain. If anything that gives a slightly higher voltage to the MCU, which may increase the drain ever so slightly. If there was a diode+drain related discussion it may have been about the zener diode used on 6 V drivers? That arrangement forms a type of shunt regulator, with a huge parasitic drain by definition.

Still a mystery where the other 0.160 mA is going. Your fuse values indicate that you are not using BOD, so that is good.
Maybe some MCU peripherals are not shutting down properly during sleep?

Thanks DEL - yep, that’s what I’m thinking - something else that’s not getting shut down. Spoke last night to my EE buddy here and he said the same thing bout the diode we are using. He also mentioned these diodes will drop voltage when the MCU is drawing power, so I think I need to know exactly what that drop is. Using the “85”, and not the “85V” rated at 2.7v minimum, means you can’t afford to lose much voltage via the diode. Think’n the drop was 0.1-0.2 volts from what I recall, which means our LVP needs to cut off no less than 2.9v. I believe the losses caused by the diode are why we moved the voltage divider resistors before the diode, and are using a 22K instead of the 19.1K.

Haven’t had a chance yet, but I need to dive deep into the Atmel 25/45/85 specs to see if we are missing something on sleep mode. I am aware that you have control of several sub-systems for sleep/low power.
Yep, I heard bout the brown-out taking amps in sleep mode - the details were posted here a while back.

You can assume a 0.25 V drop for the Schottky-type diodes we are using (or try to measure it while the MCU is running, it does vary with current and temperature.)

There is a discussion in one of the threads to replace D1 with a ‘boot-strapped’ PMOS FET - this would give practically zero voltage loss.

I don’t think it’s necessary to replace the diode to reduce the draw; it’s just that with an “85”, you have to be extra careful with your LVP cutoff value. With the diode drop, the 2.7v minimum is effectively 2.95v at the cell, so your cutoff point should be 3.0v or 3.1v for a little cushion. When I bought my ATtiny85’s from Mouser or DigiKey, I bought the 85V’s, but the last batch I bought from RMM are the “85”’s.
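
For what it’s worth, here is one way to turn a chosen cutoff voltage into the 8-bit value the firmware compares ADC readings against - just a sketch with assumed values (22K/4.7K divider read before the diode, 1.1v internal reference, left-adjusted result), not anyone’s actual code:

#include <stdint.h>

#define R1_OHM   22000.0   // high-side divider resistor (assumed)
#define R2_OHM    4700.0   // low-side divider resistor (assumed)
#define VREF_MV   1100.0   // nominal internal reference

// 8-bit ADC threshold for a given cell cutoff in millivolts.
// The compiler folds this to a constant; no floating point at runtime.
#define ADC_CUTOFF(mv) ((uint8_t)((mv) * R2_OHM / (R1_OHM + R2_OHM) * 256.0 / VREF_MV))

// e.g. ADC_CUTOFF(3000) evaluates to 122

The diode drop disappears from this math entirely as long as the divider taps the cell side of the diode.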

What driver are you using? I always take the cell voltage measurement before the diode. I thought that’s what everyone else did too.

wight’s FET+1 and MntE’s FET+1 board, reflowed myself. Cell voltage before the diode? Not sure I understand - I’m just measuring at the same place I always measure amps - at the tail: between batt- and the edge of the battery tube.

A tiny85 won’t drop dead at 2.69v. Atmel just does not guarantee they will remain reliable outside the specs, and the official specs are often conservative. People can and do run them a bit outside of spec with no problems. Some even sell devices that use AVRs running outside of spec. IIRC, boards from jeelabs are (or were) running at a clock speed + voltage combination that is out of spec, and he said he never ran into a problem even across hundreds of chips.

Also, what speed are you running at? You can increase reliability by running at a lower clock speed (like 4 MHz), plus you’ll get a bit of power savings. I posted a while ago, somewhere :smiley: probably this thread, code to use the clock prescaler to adjust the speed beyond the options that the fuses give you.

~ edit ~
Yep, using our nifty new “search within thread” feature it comes up by searching “clkpr”.
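
For reference, the runtime change itself is just the standard two-write CLKPR sequence (the divide-by-2 choice here is my example; note the side effects in the comments):

#include <avr/io.h>
#include <util/atomic.h>

// Drop the system clock from 8 MHz to 4 MHz at runtime.
// The second write must land within 4 cycles of the first, so keep interrupts out.
static void clock_div2(void) {
   ATOMIC_BLOCK(ATOMIC_RESTORESTATE) {
      CLKPR = (1 << CLKPCE);   // unlock the prescaler for changes
      CLKPR = (1 << CLKPS0);   // prescaler = /2: 8 MHz internal RC -> 4 MHz
   }
   // _delay_ms(), compiled for F_CPU = 8 MHz, now runs 2x long; the WDT's
   // ~16 msec tick is unaffected, since it runs off the separate 128 kHz oscillator.
}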

Apparently, I was talking to some guy named Tom E in the above quote. So, I guess it’s nothing new to you? :-x

I forget far more than I remember. It’s déjà vu all over again. I recall reading this, and maybe other posts about it, but it was untested and, I thought, incomplete at the time. There are 2 timed things I depend on - the 16 msec timer (or I need to know exactly what it is), and the built-in _delay_ms() routine - so any change that impacts these is a problem. I don’t have any decent development/debug capabilities with these parts, and not much time to explore things with unknown advantages. I am looking to lower the sleep mode parasitic drain, and I don’t see the relationship between processor speed and parasitic drain. Sorry, I haven’t had time to do the crazy hours of research on this.

Don’t get me wrong - all good ideas, but don’t ask me to go off and take the risk of time for a possible benefit, which I’m not clear on. I am very handcuffed with this firmware development environment from what I’m used to, lacking single step level debugging, profiling, etc., so experimenting is very time consuming.

I am very committed to 8 MHz right now because I know it works. I haven’t seen anything posted in detail, fully working, or fully explained and tested to work, at lower speeds. Again, no time for the R&D. Early on I tried a lower speed via the fuses, and the timing of something was way off, even though I thought I made all the proper settings. This, of course, is a shame, because I’m pretty sure others are doing this, but I just can’t find the source code and/or fuse settings, etc., or they are unwilling/not allowed to post it.

I sometimes feel like I forget far more than I remember as well. Glad I’m not alone. :smiley:

ToyKeeper mentioned testing it here. Not sure how it affects _delay_ms() and the 16 msec timer you’re using, but it should be one of two possibilities: cuts it in half or no effect. About power savings, I don’t have any links off hand, but I’ve also seen people test combinations of different sleep levels with different clock speeds to verify what you get in the real world. I expect the savings to be minor. I brought it up more because you’re concerned about the tiny85 being rated at 2.7v minimum. Even with the tiny85v you’re not supposed to go below 2.7v if you’re running at 8 MHz. 4 MHz is the max (officially) for the 1.8-2.7v range.

Are you disabling the ADC before power-down sleep?

ADCSRA &= ~(1 << ADEN);   // disable ADC (before power-down)

ADCSRA |= (1 << ADEN);    // re-enable ADC (after waking)

You’re not using the watchdog, right?

If the voltage divider gets the voltage before the diode, then the voltage drop of the diode is irrelevant for LVP readings. I design all my drivers this way and just assumed that everyone else did too. I guess I shouldn’t be assuming so much.

Before diode: [schematic image]

After diode: [schematic image]

Crap - not turning that off. Not sure who wrote the sleep mode function originally, probably JonnyC? So I do use the watchdog, but it's being turned off; the ADC, however, is not being shut OFF then ON. Below are the functions from Narsil, but again, originally ported over, probably from STAR Momentary:

/**************************************************************************************
* ADC_on - Turn the AtoD Converter ON
* ======
**************************************************************************************/
inline void ADC_on() {
// For 13A: ADMUX  = (1 << REFS0) | (1 << ADLAR) | ADC_CHANNEL; // 1.1v reference, left-adjust, ADC1/PB2

ADMUX = (1 << REFS1) | (1 << ADLAR) | ADC_CHANNEL; // 1.1v reference, left-adjust, ADC1/PB2

DIDR0 |= (1 << ADC_DIDR); // disable digital input on ADC pin to reduce power consumption
ADCSRA = (1 << ADEN ) | (1 << ADSC ) | ADC_PRSCL; // enable, start, pre-scale
}

/**************************************************************************************
* ADC_off - Turn the AtoD Converter OFF
* =======
**************************************************************************************/
inline void ADC_off() {
   ADCSRA &= ~(1 << ADEN);   // ADC off
}

void sleep_until_switch_press()
{
   // This routine takes up a lot of program memory :(
   // Turn the WDT off so it doesn't wake us from sleep
   // Will also ensure interrupts are on or we will never wake up
   WDT_off();
   // Need to reset press duration since a button release wasn't recorded
   pressDuration = 0;
   // Enable a pin change interrupt to wake us up
   // However, we have to make sure the switch is released otherwise we will wake when the user releases the switch
   while (is_pressed()) {
      _delay_ms(16);
   }
   PCINT_on();
   // Enable sleep mode set to Power Down that will be triggered by the sleep_mode() command.
   //set_sleep_mode(SLEEP_MODE_PWR_DOWN);
   // Now go to sleep
   sleep_mode();
   // Hey, someone must have pressed the switch!!
   // Disable pin change interrupt because it's only used to wake us up
   PCINT_off();
   // Turn the WDT back on to check for switch presses
   WDT_on();
   // Go back to main program
}

Mike C - yes, thanks for the info. Yep, I understand the impact of the diode on the voltage divider. Think someone said earlier the drop from the diode we use is 0.25v. This is why we used a 19.1K initially in Nanjg’s but use a 22K now (before the diode) to get about the same A-D values.
In the wight FET+1 thread, someone mentioned using 10X the resistor values: 220K and 47K. Not sure if it’s possible to read them accurately, but it would be great if we could. It would cut the parasitic drain to 10% of the 0.16 mA the voltage divider is taking now, if I understand it all correctly.
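
(The divider drain math, for reference: a full cell at 4.2v across 22K + 4.7K ≈ 26.7K draws about 4.2/26700 ≈ 0.157 mA - right where the measurement landed. The same 4.2v across 220K + 47K would draw about 0.016 mA.)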

So, I do have some marching orders from this - add turning off the AtoD converter for sleep (simple to try), and gotta find out more bout the voltage divider resistor values. Again, not sure if CPU speed affects parasitic drain, but if it does, it might be minor compared to the voltage divider problem. I’d like to have the time to research the proposed 1.1v ref reading solution - not sure I can do this without lots of time.
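
Sketched out, the AtoD change is just a wrapper around the existing sleep call (untested, using the Narsil function names quoted above):

ADC_off();      // shut the ADC down so it can't draw current during sleep
sleep_mode();   // the existing power-down sleep in sleep_until_switch_press()
ADC_on();       // bring it back for voltage monitoring once we're awake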

Tom E, have a look at 17.8 in the datasheet (page 129 in my PDF copy).

The analog input signal has to charge up a small internal sample+hold capacitor (through an internal series resistor) during the ADC conversion process. If you go too high with the impedance of the analog signal, this capacitor does not charge up all the way and you will get a low result from the ADC.

Atmel recommends keeping the output impedance of the analog signal below 10 kohm. The impedance of our signal is, for all practical purposes, the parallel combination of R1 and R2, or 1/(1/R1+1/R2). This means keeping R2 around 12 k or less.
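
Worked through with the values mentioned in this thread (assuming R1 = 22K, R2 = 4.7K): R1||R2 = (22 × 4.7)/(22 + 4.7) ≈ 3.9K, comfortably under the 10K recommendation. The 10x values (220K/47K) give about 38.7K, well above it.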

That said, we are normally only using 8-bit resolution anyway - do we need the maximum accuracy? MikeC is already using 10x R values; not sure if he verified ADC accuracy.

For E-switch drivers, I vote for the Vcc/1.1Vref trick without any resistors.
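
For anyone who wants to try it, a minimal sketch of the trick (my code, untested on a real driver): select the internal 1.1v bandgap as the ADC input with Vcc itself as the reference, then back-calculate Vcc from the reading.

#include <avr/io.h>
#include <util/delay.h>
#include <stdint.h>

// Measure Vcc in millivolts on an attiny25/45/85 - no divider resistors.
uint16_t read_vcc_mv(void) {
   ADMUX = (1 << MUX3) | (1 << MUX2);   // input = 1.1v bandgap, reference = Vcc
   ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0);  // on, /128
   _delay_ms(2);                        // let the bandgap settle
   ADCSRA |= (1 << ADSC);               // start a conversion
   while (ADCSRA & (1 << ADSC)) ;       // wait for it to finish
   // reading = 1100 mV * 1024 / Vcc, so Vcc = 1100 * 1024 / reading
   return (uint16_t)(1126400UL / ADC);
}

Since it needs no external parts, the divider (and its ~0.16 mA drain) could go away entirely on single-cell lights.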

I’m way outside the specs. I’m using 1M2 as R1 and 300K as R2. I went this high because R2 acts as a bleed resistor for the off-time cap on my drivers. With calibration I have been accurate down to 0.02 volts across the cell range (tested 2.8V up to 4.23V) on several drivers, so I’m fine with these resistor values.

However, I have yet to build a zener version of my latest drivers (I’m waiting for newly ordered resistors; I ordered the wrong ones before). I want to keep R2 the same, so this means R1 is going to be much higher (about 2M7), but that might just be too high. I won’t know until I’ve tested a few drivers, but if so I’ll have to use a lower R2 value to keep R1 from being too high.

Interesting stuff about using VCC for voltage monitoring but nothing I will be looking into. I’m already doing LVP, off time and E-switch on the same pin so I wouldn’t be gaining any pins by doing this, and I don’t really need the space either. All my drivers are made zener compatible from the start anyway, so I need the voltage divider resistors to measure double cell voltage levels.

Just a note on calibration though, maybe you already know this: the ATtiny’s voltage readings vary from chip to chip and can be calibrated - I guess everyone knows that already - but remember that it’s the internal reference voltage that differs on each MCU. The actual ADC readings are very precise and do not need calibrating; it’s the varying internal reference voltage that needs calibrating. I know ToyKeeper wrote somewhere that mathematical calculations are not accurate, but I’ve found that I can use all of the mathematical calculations with very accurate results as long as I have calibrated the internal reference voltage. When the voltage level and ADC value are known, you can calculate what the internal reference voltage is, save that value and use it later to calculate the voltage in the LVP routine. It’s the method I use, and I’ve found it to be the most accurate single-point calibration method yet.
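
To make the method concrete, here is roughly what that single-point calibration looks like (a sketch with made-up names; the divider ratio is the 1M2/300K mentioned above, i.e. (R1+R2)/R2 = 5):

#include <stdint.h>

#define DIV_RATIO 5u   // (R1 + R2) / R2 for 1M2 / 300K

// Calibration, done once: with the cell at a known voltage (measured with a
// good DMM), take an 8-bit ADC reading and back-calculate this chip's actual
// internal reference in millivolts (nominally 1100). Store it in EEPROM.
uint16_t cal_vref_mv(uint16_t known_cell_mv, uint8_t adc8) {
   return (uint16_t)(((uint32_t)known_cell_mv * 256u) / (DIV_RATIO * adc8));
}

// Runtime LVP: convert a raw 8-bit ADC reading back to cell millivolts
// using the stored, calibrated reference.
uint16_t cell_mv(uint8_t adc8, uint16_t vref_mv) {
   return (uint16_t)(((uint32_t)adc8 * vref_mv * DIV_RATIO) / 256u);
}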