Attiny25/45/85 FW Development Thread

I forget far more than I remember. It’s déjà vu all over again. I recall reading this and maybe other posts about it, but it was untested, and I thought incomplete at the time. There are two timed things I need: the 16 msec timer (or I need to know exactly what it is), and the built-in _delay_ms() routine, so any change that impacts these is a problem. I don’t have any decent development/debug capabilities with these parts, and not much time to explore these things with unknown advantages. I am looking to lower sleep mode parasitic drain, and don’t see the relationship between processor speed and parasitic drain. Sorry, I haven’t had time to do the crazy hours of research on this.

Don’t get me wrong - all good ideas, but don’t ask me to go off and take the risk of spending time on a possible benefit I’m not clear on. I am very handcuffed with this firmware development environment compared to what I’m used to, lacking single-step debugging, profiling, etc., so experimenting is very time-consuming.

I am very committed to 8 MHz right now because I know it works. I haven’t seen anything posted in detail, fully working, or fully explained and tested to work, at lower speeds. Again, no time for the R&D. Early on I tried a lower speed via the fuses, and the timing of something was way off, even though I thought I made all the proper settings. This, of course, is a shame because I’m pretty sure others are doing this, but I just can’t find the source code and/or fuse settings, etc., or they are unwilling/not allowed to post it.

I sometimes feel like I forget far more than I remember as well. Glad I’m not alone. :smiley:

ToyKeeper mentioned testing it here. Not sure how it affects _delay_ms() and the 16 msec timer you’re using, but it should be one of two possibilities: it cuts it in half, or it has no effect. About power savings, I don’t have any links off hand, but I’ve also seen people test combinations of different sleep levels with different clock speeds to verify what you get in the real world. I expect the savings to be minor. I brought it up more because you’re concerned about the tiny85 being rated at 2.7v minimum. Even with the tiny85v you’re not supposed to go below 2.7v if you’re running at 8 MHz. 4 MHz is the max (officially) for the 1.8-2.7v range.

Are you disabling the ADC before power-down sleep?

ADCSRA &= ~(1 << ADEN); // disable ADC (before power-down)

ADCSRA |= (1 << ADEN); // re-enable ADC

You’re not using the watchdog, right?

If the voltage divider gets the voltage before the diode, then the voltage drop of the diode is irrelevant for LVP readings. I design all my drivers this way and just assumed that everyone else did too. I guess I shouldn’t be assuming so much.

Before diode: (schematic image)

After diode: (schematic image)

Crap - not turning that off. Not sure who wrote the sleep mode function originally, probably Johnny C? So I do use the watchdog, but it's being turned off; the ADC, however, is not being shut OFF then back ON. Below are the functions from Narsil, but again, originally ported over, probably from STAR Momentary:

/**************************************************************************************
* ADC_on - Turn the AtoD Converter ON
* ======
**************************************************************************************/
inline void ADC_on() {
   // For 13A: ADMUX = (1 << REFS0) | (1 << ADLAR) | ADC_CHANNEL; // 1.1v reference, left-adjust, ADC1/PB2
   ADMUX = (1 << REFS1) | (1 << ADLAR) | ADC_CHANNEL; // 1.1v reference, left-adjust, ADC1/PB2

   DIDR0 |= (1 << ADC_DIDR); // disable digital input on ADC pin to reduce power consumption
   ADCSRA = (1 << ADEN) | (1 << ADSC) | ADC_PRSCL;   // enable, start, pre-scale
}

/**************************************************************************************
* ADC_off - Turn the AtoD Converter OFF
* =======
**************************************************************************************/
inline void ADC_off() {
   ADCSRA &= ~(1 << ADEN); // ADC off (ADEN is bit 7 of ADCSRA)
}

void sleep_until_switch_press()
{
   // This routine takes up a lot of program memory :(
   // Turn the WDT off so it doesn't wake us from sleep
   // Will also ensure interrupts are on or we will never wake up
   WDT_off();
   // Need to reset press duration since a button release wasn't recorded
   pressDuration = 0;
   // Enable a pin change interrupt to wake us up
   // However, we have to make sure the switch is released otherwise we will wake when the user releases the switch
   while (is_pressed()) {
      _delay_ms(16);
   }
   PCINT_on();
   // Enable sleep mode set to Power Down that will be triggered by the sleep_mode() command.
   //set_sleep_mode(SLEEP_MODE_PWR_DOWN);
   // Now go to sleep
   sleep_mode();
   // Hey, someone must have pressed the switch!!
   // Disable pin change interrupt because it's only used to wake us up
   PCINT_off();
   // Turn the WDT back on to check for switch presses
   WDT_on();
   // Go back to main program
}

Mike C - yes, thanks for the info. Yep, I understand the impact of the diode on the voltage divider. I think someone said earlier the drop from the diode we use is 0.25v. This is why we used a 19.1K initially in Nanjg’s but use a 22K now (before the diode) to get about the same A-D values.
In the wight FET+1 thread, someone mentioned using 10X the resistors: 220K and 47K. Not sure if it’s possible to read them, but it would be great if we could. It would cut the parasitic drain to 10% of the 0.16 mA the voltage divider is taking now, if I understand it all correctly.

So, I do have some marching orders from this - add turning off the AtoD converter for sleep (simple to try), and find out more about the voltage divider resistor values. Again, not sure if CPU speed affects parasitic drain, but if it does, it might be minor compared to the voltage divider problem. I’d like to have the time to research the proposed 1.1v ref reading solution - not sure I can do this without lots of time.

Tom E, have a look at 17.8 in the datasheet (page 129 in my PDF copy).

The analog input signal has to charge up a small internal sample+hold capacitor (through an internal series resistor) during the ADC conversion process. If you go too high with the impedance of the analog signal, this capacitor does not charge up all the way and you will get a low result from the ADC.

Atmel recommends keeping the output impedance of the analog signal below 10 kohm. The impedance of our signal is, for all practical purposes, the parallel combination of R1 and R2, or 1/(1/R1+1/R2). This means keeping R2 around 12 k or less.

That said, we are normally only using 8-bit resolution anyway, do we need the maximum accuracy? MikeC is already using 10x R values, not sure if he verified ADC accuracy.

For E-switch drivers, I vote for the Vcc/1.1Vref trick without any resistors.

I’m way out of the specs. I’m using 1M2 as R1 and 300K as R2. I went this high because R2 acts as a bleed resistor for the off-time cap on my drivers. With calibration I have been accurate down to 0.02 volts across the cell range (tested 2.8V up to 4.23V) on several drivers, so I’m fine with these resistor values.

However, I have yet to build a zener version of my latest drivers (I’m waiting for newly ordered resistors; I ordered the wrong ones before). I want to keep R2 the same, so this means R1 is going to be much higher (about 2M7), but this might just be too high. I won’t know until I’ve tested a few drivers, but if so I’ll have to use a lower R2 value to keep R1 from being too high.

Interesting stuff about using VCC for voltage monitoring but nothing I will be looking into. I’m already doing LVP, off time and E-switch on the same pin so I wouldn’t be gaining any pins by doing this, and I don’t really need the space either. All my drivers are made zener compatible from the start anyway, so I need the voltage divider resistors to measure double cell voltage levels.

Just a note on calibration though, maybe you already know: the ATtiny’s voltage readings vary and can be calibrated - I guess everyone knows that already - but just remember that it’s the internal reference voltage that is different on each MCU. The actual ADC readings are very precise and do not need calibrating; it’s the varying internal reference voltage that needs calibrating. I know ToyKeeper wrote somewhere that mathematical calculations are not accurate, but I’ve found that I can use all of the mathematical calculations with very accurate results as long as I have calibrated the internal reference voltage. When the voltage level and ADC value are known, you can calculate what the internal reference voltage is, save this value, and use it later to calculate the voltage in the LVP routine. It’s the method I use, and I’ve found it to be the most accurate single-point calibration method yet.

Yea, I’ve read the 1.1v reference can be off by 10%. But manually calibrating each chip feels annoying. I think it’s possible to do automated calibration: measure the voltage provided by your programmer, write that into your calibration program, flash the calibration program, run it while powered from your programmer so it stores the result, then flash the final program.

Would you be using any kind of automated calibration?

Oh wow…I am surprised this works. So 10x should be no concern for Tom E.
How close are you to calculated values before calibration? Did you try different ADC clock dividers? Possibly a slower ADC clock can help in this case.

I just use the standard calculated values for the ADC and do not bother to calibrate. Most of my drivers read around 5-8 mV low or high, but that is good enough for a 5-level flash-out. I do have a ‘cell-impedance’ mode that flashes out cell+circuit impedance. Great as a sanity check on a new light, or to compare cells. But here I use differential voltage (no-load vs. 1x7135 load, 10-bit resolution) and absolute accuracy is less important. I also do a primitive over-sample/filter running on an interrupt to get consistent readings.

Ok, I did the test of turning off the AtoD during sleep mode. First of all, it functions fine - I can still do AtoD voltage measurements after sleep mode, so I know it restores properly (seems to from testing) - and it cuts the parasitic drain roughly in half -- great news!

Using a modded SupFire M2-Z (from MtnE) running a wight 22mm FET+1 v013, on a good AWT IMR 2500 cell @4.13v:

Before the AtoD mod: 4.95 mA running the CPU, sleep mode: 0.314 mA

After the AtoD mod: 4.95 mA running the CPU, sleep mode: 0.154 mA

So thanks to Halo, a simple code mod helped big time.

At a drain rate of 0.314 mA, it would take 132.7 days (19 weeks - 4.4 months) to drain 1 amp-hour from a cell.

At a drain rate of 0.154 mA, it would take 270.6 days (38.7 weeks - 8.9 months) to drain 1 amp-hour from a cell.

The next easiest thing to try would be the 10X factor on the resistors. Since I previously proved the voltage divider is responsible for about 1/2 the parasitic drain, this mod would greatly reduce parasitic drain to a point where it should be.

New code (cleaned up comments/formatting):

/**************************************************************************************
* sleep_until_switch_press - only called with the light OFF
* ========================
**************************************************************************************/
void sleep_until_switch_press()
{
   // This routine takes up a lot of program memory :(

   // Turn the WDT off so it doesn't wake us from sleep. Will also ensure interrupts
   // are on or we will never wake up.
   WDT_off();

   ADC_off(); // Save more power - turn the AtoD OFF

   // Need to reset press duration since a button release wasn't recorded
   pressDuration = 0;

   // Enable a pin change interrupt to wake us up. However, we have to make sure the switch
   // is released otherwise we will wake when the user releases the switch
   while (is_pressed()) {
      _delay_ms(16);
   }
   PCINT_on();

   //-----------------------------------------
   sleep_mode(); // Now go to sleep
   //-----------------------------------------
   // Alternate method? -> set_sleep_mode(SLEEP_MODE_PWR_DOWN);

   // Hey, someone must have pressed the switch!!

   PCINT_off(); // Disable pin change interrupt because it's only used to wake us up

   ADC_on(); // Turn the AtoD back ON

   WDT_on(); // Turn the WDT back on to check for switch presses

} // Go back to main program

I did find 47K and 220K 0805 resistors, thanks to FastTech's cheap assortments.

Using the same M2-Z light/FET+1 driver, I swapped in the new R1 and R2, and it seems to work perfectly, exactly as predicted - 1/10 the parasitic drain -- now 0.016 mA. I blink out the voltage; I tried two cells, one at 3.6v and one at 4.1v, and it blinked out the proper voltage for each.

To tell you the truth, I'm not that concerned about the accuracy of the voltage reading - if I'm within 0.2v I'm happy with it, though within 0.1v would be ideal.

So I think 7 years or so of parasitic drain time ain't so bad. Sitting on a shelf, a battery would probably self-discharge faster than that anyway?

Nice going Tom :beer:

Generally they are close enough, and I was happy with that after flashing quite a few 85s… but then there was this one MCU that gave rather poor results. I just had to find out why. It turned out that the internal voltage reference was 1.0v, which is within the tolerance according to the datasheet but gave rather poor voltage measurements. I guess they were still usable, but it was a thorn in my side that I couldn’t let go, so I wrote a calibration routine to take care of it.

That’s great news, Tom.

I’ll be putting a couple more drivers together soon (Fet+1), so will use the higher value resistors, & will re-compile the code with the ADC off/on values.

I’m glad I asked the question over in the other thread!

Ohh, yeah, once again I'm all discombobulated. There was some overlapping going on, but of course I forgot where/how/who this all originated.

Interesting, Sharpie. This seems like just another couple of cases of us missing some obvious, easy improvements. Many of us, myself included, are reluctant to make changes to our basic Atmel driver design because of unknown risks versus the proven history of what we have working.

For today, it seems like RMM drives the hardware, and TK drives the firmware directions. Rightfully so, really, because RMM is the one true reseller (and major designer/builder, etc.) of these budget custom drivers en masse, and TK has a proven track record of delivering stable, high-performing, feature-rich firmware.

The e-switch support, along with the 25/45/85 usage, hasn't been so readily adopted though, and for good reasons - it’s limited to specialty lights, usually not so budget friendly, and the modding is more complicated. But I feel the two go hand in hand - you can do a lot more with an e-switch for the UI, therefore you want more memory/storage to take advantage of it. The two main gripes with e-switch lights have been:

- lockout ability, because a side switch can be activated more easily than a tail switch

- parasitic drain concerns

I’ve solved the lockout by supporting a lockout sequence in the UI, while these mods really solve the parasitic drain issue, with calculated drain in the 7-year range for 1 amp-hour, which puts it right smack dab in the middle of name brand lights like the latest models from NiteCore and Sunwayman.

Yeah, so…

When is the newest model TomE light gonna hit the shelves? :partying_face:

Already has, hit my shelf that is... I'm more afraid of breaking the shelf from all the weight. 1,000's of lumens sure is heavy.

Like many of us, my family thinks I'm nuts for some reason... Showed my office (flashlight shop) to my sister over the weekend ("this is where I reflow the boards", "these are the parts", etc.), and she was at first stunned, then of course comes the question on selling them, making money, etc... I still can't figure out how to make money on these - too much time. Better to sell $6 lights for $50, and know how to advertise with a catchy name, like G700?

I use internal VCC measurement all the time with my ATtiny?5 builds. With my RGBW driver (ATtiny85) the standby current is about 0.3 µA, it survives a battery change running (well, sleeping that is) from the 10 µF 0805 buffer cap.

Halo: I only use my free time for firmware development, and (for the past 2 years, with 2 small kids at home) I mostly don’t have much of it. I only occasionally have enough time and leisure to do something new. H17F & Icarus took me about a year in the end. I did spend a lot of time on firmware, though - especially on optimizing code. But the money I get from my hobby mostly stays in my hobby (e.g. LD50, SL2, TM16GT, soon a Meteor…)

And of course I spent quite some hours on the PID algorithm. I tried a simple stepdown first, but which level to step down to depends on the host and environment, so I soon wanted to have the real thing, PID. I wrote a simplified thermal simulation (in PostScript :slight_smile: ) with PID regulation and a changing environment, and it was quite nice. Then I made an implementation with MCU limitations (integer math and limited accuracy), and it was quite bad - well, the thermal regulation was still fine (PID works under much worse conditions), but the light output was quite unsteady (PID is not designed to keep the ‘heating power’ steady, just the temperature). I spent several hours thinking of alternative PID-like approaches and finally made one I’m happy with. It also seems a bit more tolerant of a wide range of hosts (i.e. their thermal properties). So it’s actually not the normal PID algorithm, but one that combines the P-, I- and D-terms in a different way.