Attiny25/45/85 FW Development Thread

Ah crud... never mind. I thought I had progress on this 7135 thing, but I think I fooled myself. Anyway, I'll keep playing with it.

Anyway, I found a diagnostic tool. I can sneak in a Vcc ADC read in the interrupt, right between detecting the shutdown and powering off. It's working, although I have no time to interpret it now. It should give insight into how much time and charge is lost before the shutdown. I could actually use a couple, like one after the first wake, to get a read on how much drain is happening during sleep too.
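
The general shape of that diagnostic, as a sketch (this is not the actual bistro code; the power-sense pin on PB2, the register values, and the variable name are only examples for the ATtiny25/45/85):

```c
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint8_t vcc_catch;               // stashed here for later debug output

ISR(PCINT0_vect)
{
    if (!(PINB & _BV(PB2))) {             // example: power-sense pin on PB2 has gone low
        ADMUX  = _BV(ADLAR) | 0x0C;       // 1.1 V bandgap measured against Vcc as reference
        ADCSRA |= _BV(ADEN);              // ADC on (the bandgap wants a short settling time;
        ADCSRA |= _BV(ADSC);              //  the real code has to budget for that)
        while (ADCSRA & _BV(ADSC)) ;      // busy-wait; one conversion is well under the hold-up time
        vcc_catch = ADCH;                 // 8-bit result, ADLAR left-adjusts it
        // ...then park the output pins and drop into power-down sleep as usual.
    }
}
```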

So, some interesting results. Unless I'm getting completely fooled, like the interrupt firing at all the wrong times (TURBO works, so it can't be all broken), I think I'm measuring the voltage on C2 immediately after the pin fall is detected.

I first had to calibrate my Vcc read really well (which uncovered a slight bug in the calibration function, though that's almost irrelevant). Anyway:

In turbo I'm getting:

Input Voltage | Vcc Catch Voltage
4.0 V | 3.06 V
3.06 V | 2.71 V

On ALL7135s I'm getting almost the same:

Input Voltage | Vcc Catch Voltage
4.0 V | 3.00 V
3.06 V | 2.60 V

I find it interesting that the 4.0 V reading does relatively badly compared to the 3 V readings in both cases. But what's more interesting is that there doesn't seem to be any serious problem here with the 7135s, and yet there is. This helps though, because if it holds up it at least points to what isn't the problem: power-down detection time. Unless that 0.1 V is really a big deal, which I wouldn't think so.

This seems to be a good tool. I'll continue by trying to measure the voltage at the first power-on and see what that looks like.

(Just updated the numbers to remove the diode correction, so these are the real voltages on the Vcc pin now.)

SOLVED!!!

Yes. I got it. I just had to find the right pin state to put the driver pins into after shutdown detection. Apparently output low doesn't work (or output PWM 0, which should be the same thing). I have no idea why not. Input high is documented as a good option because it's high impedance and avoids supposedly problematic middle voltages (input low actually means input floating). And yet, the solution was input low. It maybe helped that I shut down the 7135s first, pulling them down with a quick output low; I don't know.
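
In code terms the pin parking ends up something like this (a sketch only, not the exact bistro-HD code; FET on PB1 and 7135 on PB0 are just example assignments):

```c
#include <avr/io.h>

// "Output low" = DDR bit 1, PORT bit 0; "input high" = DDR 0, PORT 1 (pull-up);
// "input low"  = DDR 0, PORT 0 (truly floating), which is what finally worked.
static void park_pins_for_otsm(void)
{
    // quick output-low blip to make sure the 7135s are pulled down first
    PORTB &= (uint8_t)~(_BV(PB0) | _BV(PB1));   // drive both channel pins low...
    DDRB  |=  _BV(PB0) | _BV(PB1);              // ...as outputs, momentarily

    // then float them: input low (DDR 0, PORT 0)
    DDRB  &= (uint8_t)~(_BV(PB0) | _BV(PB1));
    PORTB &= (uint8_t)~(_BV(PB0) | _BV(PB1));   // already low, kept here for clarity
}
```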

Anyway, it not only works, it works great. You can whistle a couple of lines of Dixie before you turn your light back on. I've got the code in a bit of a mess at the moment from all kinds of testing, so I'll probably start over from the last release and add things and test. It might take a couple of days, might not, but I think we'll have OTSM working great in bistro here shortly.

I keep recompiling and reflashing this same ROM and checking that it's really in ALL7135s, because I keep thinking I must be confused, but it's working.

Updated: It actually works for over 4 s at 3.0 V without a bleeder (~4K of divider resistance) and with a 160 °C heat gun pointed at it!

:D

I have no idea what you are talking about Flintrock, but your writing tells me you are excited and that can only mean one thing. You're onto a winner. Well done. :+1:

Thanks MRsDNF.

OTSM. (I think TA named it that.) I think it means off-time sleep mode or something like that. It's the replacement for OTC in bistro HD. It means that the length of a medium click is as steady as the watchdog timer, which as far as I can tell holds to about 10% even when hot, much better than many caps. It's also not sensitive to all the resistor tuning that the OTC is, so it should eliminate those headaches.
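
The general shape of the idea, sketched (this is not the actual bistro-HD implementation; the power-sense pin on PB2 and the ~16 ms watchdog period are just placeholders):

```c
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/sleep.h>
#include <avr/wdt.h>

volatile uint8_t wdt_ticks;              // watchdog periods elapsed while power was off

ISR(WDT_vect) { wdt_ticks++; }           // each wakeup is one tick of "off time"

static inline uint8_t power_is_still_off(void)
{
    return !(PINB & _BV(PB2));           // example: power-sense pin on PB2 still reads low
}

static void count_off_time(void)
{
    wdt_ticks = 0;
    cli();
    wdt_reset();
    WDTCR = _BV(WDCE) | _BV(WDE);        // timed sequence so the next write takes effect
    WDTCR = _BV(WDIE);                   // interrupt-only watchdog, ~16 ms period
    sei();

    set_sleep_mode(SLEEP_MODE_PWR_DOWN); // running off the capacitor, so sleep as deeply as possible
    while (power_is_still_off()) {
        sleep_mode();                    // wake on each WDT interrupt, then recheck the power pin
    }

    WDTCR = 0;                           // done timing; wdt_ticks now encodes the click length
}
```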

I've got the first low-capacitance bistro Texas Avenger OTSM light ever right here in my hand, working just fine. I thought the issues were worked out several weeks ago, but it hit a snag when applied to a real light. The snag is now defeated.

These fixes will of course get posted up to the bistro-HD thread shortly.

Thanks for the explanation. Now to remember it. :+1:

"12v jobbers" isn't a term I can google to fix my bricked chip/board. (I tried to do a flash and I think I bricked my chip.) What do I need to fix my driver board? It's a 17DD MTN FET + 7135.

I built a high voltage serial programmer to deal with that stuff: High Voltage programming/Unbricking for Attiny – Arduino, ESP8266, ESP32 & Raspberry Pi stuff

However, if it's one driver you're talking about, it's cheaper and less frustrating to just get a new one. I used the reset pin quite a lot on the ATtiny85, so I needed this in order to be able to flash after activating the reset pin as IO.

Edit: false alarm, read here.

Have to reanimate this thread, I guess.
Did anybody take the dependency between voltage and temperature readings on Attiny25/85 into account?

For my 2018 BLF Contest light I developed a Bluetooth remote control with telemetry. Voltage and temperature readings are sent to the app once per second. I noticed a significant dependency of the temperature reading on voltage. Here are a couple of readings for roughly the same temperature:

at 4.6 V temperature reading is 13 °C
at 3.6 V temperature reading is 27 °C
at 2.7 V temperature reading is 33 °C

As you can see, the dependency is not linear. When I find some time I'll do more research.
For now I'd say this dependency might interfere with temperature regulation.

I didn't care much about it. On the other hand I don't recall getting such wild variations, but since I haven't touched the 85 in ages my brain has purged most of that stuff. I'm sitting with my ATtiny1634 drivers right now so I just tested: 20 °C at 2.6 V and 22 °C at 4.4 V, nothing for me to worry about.

Do you use Noise Reduction Sleep Mode for temperature readings? I don't do this currently but might give it a try.

No, I don't. Have you allowed a settling time when switching between voltage and temperature readings? At least on the 1634 it's important if you are reading voltage by connecting the internal voltage reference to the ADC and using the MCU voltage as the reference. I read voltage on 1S lights this way, and if I don't have a delay I get funky results. The 1634 datasheet specifies a 1 ms settling time; I have a 2 ms delay just to be sure. I'm not in a hurry.
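
On the ATtiny25/45/85 the same trick looks roughly like this (sketch only; the register values are from the t25/45/85 datasheet, the 2 ms delay is just margin over the specified 1 ms, and the assumed F_CPU should match your build):

```c
#ifndef F_CPU
#define F_CPU 8000000UL                   // assumed clock; set to the build's real value
#endif
#include <avr/io.h>
#include <util/delay.h>

// Read Vcc in millivolts by measuring the internal 1.1 V bandgap with Vcc as the ADC reference.
static uint16_t read_vcc_mv(void)
{
    ADMUX = 0x0C;                         // REFS = Vcc, MUX[3:0] = 1100 -> 1.1 V bandgap input
    _delay_ms(2);                         // let the reference settle after switching the MUX
    ADCSRA |= _BV(ADEN);                  // ADC on
    ADCSRA |= _BV(ADSC);                  // start one conversion
    while (ADCSRA & _BV(ADSC)) ;          // wait for it to finish
    uint16_t raw = ADC;                   // 10-bit result, scales inversely with Vcc
    return (uint16_t)((1100UL * 1023) / raw);   // Vcc[mV] = 1.1 V * 1023 / raw
}
```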

This will probably not help if the results you posted are always consistent, but at the moment I can't think of anything else.

Yes, I have this 1 ms delay in my code. Anyway -

I have to apologize for crying wolf.

After some more research (including the noise reduction sleep mode) I checked the temperature readings again in my battery-driven final light setup, and the readings turned out normal (without noise reduction), without much deviation at different voltage levels. But the wrong readings are still there in my test setup on the breadboard. So the cause must be somewhere in this test setup; I don't know yet if it's the test setup itself or the USBASP connected to the breadboard that is causing the noise. And it must be noise causing the weird readings, since they are normal even in my test setup when I use the noise reduction sleep mode.
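
For reference, an ADC read using the ADC Noise Reduction sleep mode looks roughly like this (a generic sketch, not the code from this light; the channel and reference are assumed to be set up by the caller):

```c
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/sleep.h>

EMPTY_INTERRUPT(ADC_vect);                // the completion interrupt only has to wake the core

static uint16_t adc_read_quiet(void)
{
    ADCSRA |= _BV(ADEN) | _BV(ADIE);      // ADC on, conversion-complete interrupt on
    set_sleep_mode(SLEEP_MODE_ADC);
    sei();
    sleep_mode();                         // entering ADC Noise Reduction sleep starts the conversion
    while (ADCSRA & _BV(ADSC)) ;          // in case some other interrupt woke us early
    return ADC;                           // 10-bit result for whatever channel/reference was selected
}
```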

Thanks for your help, Mike!

Actually, I do this delay only when measuring voltage internally, since Atmel says in the specs it is required when measuring Vbg against Vcc. As I understand the specs, this delay is not required for measuring temperature against Vbg. (I'm referring to the ATtiny85 specs.)

Glad it's sorted. You're right about the delay; according to the datasheet it's only required when switching to the internal voltage reference, it says nothing about switching from it. For me it doesn't matter though, I have 2 ms to spend after each conversion regardless of whether I'm reading voltage or temperature.

Anyone ever release a candle flicker mode for a clicky firmware? Or just Anduril at this point?

If not, how easy would it be to port the Anduril code to Bistro (HD)? Nothing fancy, just the flickery output.

It shouldn't be too hard, assuming there's space in the ROM. Basically just take code from the candle mode's EV_tick handler, put it in a loop, and add a 16 ms delay between frames.

This, um, also assumes some other things though… like the existence of a function to set the output level in ~150 visually-linear steps, without the caller having to worry about how that's actually implemented in the hardware.
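
As a loose sketch of that structure (not Anduril's actual candle algorithm; set_level() and delay_ms() stand in for Anduril/FSM-style helpers that a bistro port would have to provide, pseudo_rand() is an assumed cheap PRNG, and the flicker math is only a placeholder; the real mode layers slow and fast random walks):

```c
#include <stdint.h>

extern void set_level(uint8_t level);     // assumed: ~150 visually-linear output steps
extern void delay_ms(uint16_t ms);        // assumed: millisecond delay helper
extern uint8_t pseudo_rand(void);         // assumed: cheap PRNG returning 0..255

// Free-running flicker loop; a real port would also need to watch for button input.
void candle_mode(uint8_t base_level)
{
    for (;;) {                            // one frame every ~16 ms, like the EV_tick handler
        uint8_t lvl = base_level + (pseudo_rand() & 0x0F);   // small random brightness wobble
        if (lvl > 149) lvl = 149;         // stay inside the assumed 150-step range
        set_level(lvl);
        delay_ms(16);
    }
}
```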

Thank you for the direction. Hopefully I'll get it :smiley:

Hey you Atmel wizards. I want to reduce the PWM frequency of the 7135 channel of a D4 driver.

From doing a bit of reading in this thread I gather that doing this might involve using a divider on the clock frequency. On the Engbedded page, if I check the divide-by-8 checkbox it changes the low fuse to 0x62. Would that work? Would that change the frequency for both channels, and if so, is there a way to just change the 7135 channel?

If I understand correctly that would lower the frequency to around 2 kHz, which would work for my application. But is there a way to adjust the frequency more finely, rather than by a factor of 8?

Thanks.

Making them run at separate speeds may be do-able but would take some coding. Essentially, instead of running the FET from OCR0B (Timer/Counter0, Match B Output), you'd need to set it up to run on OC1A (Timer/Counter1, Match A Output). Running them on separate counters would then give you the ability to set a prescaler for each timer separately.

If this is for Anduril, most of the relevant bits are in fsm-main.c in the hw_setup() function. We're setting TCCR0B to 0x01, which means use the main clock with no prescaler. The smallest prescaler is 8x; setting TCCR0B to 0x02 would use it. You'd then need to set up Timer/Counter1 for the FET channel. I've never done this with the t85, but it should be do-able. Godspeed, my friend!
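
An untested sketch of that two-timer arrangement (FET on PB1 and 7135 on PB0 assumed, as on the usual FET+1 layout; the register and bit names are from the ATtiny25/45/85 datasheet, so double-check them against your own driver before flashing):

```c
#include <avr/io.h>

static void pwm_setup(void)
{
    DDRB |= _BV(DDB1) | _BV(DDB0);           // PB1 = FET, PB0 = 7135 (example pinout)

    // Timer0: phase-correct PWM on OC0A only (7135), clk/8 -> roughly 2 kHz at 8 MHz.
    // COM0B is left clear so OC0B releases PB1 for Timer1 to drive instead.
    TCCR0A = _BV(WGM00) | _BV(COM0A1);
    TCCR0B = 0x02;                           // 8x prescaler

    // Timer1: PWM1A mode, non-inverting output on OC1A (FET, also PB1), clk/1.
    // Timer1's prescaler (CS1[3:0]) and TOP (OCR1C) allow finer frequency tuning if needed.
    OCR1C = 255;                             // TOP -> 8-bit duty range
    TCCR1 = _BV(PWM1A) | _BV(COM1A1) | _BV(CS10);
}

int main(void)
{
    pwm_setup();
    OCR0A = 64;                              // 7135 duty (example)
    OCR1A = 128;                             // FET duty (example)
    for (;;) { }
}
```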

Or, if you're not hung up on running the FET and 7135 PWM channels at separate speeds, I'd just set that TCCR0B to 0x02 to use the 8x prescaler for both and then just leave everything else alone. You don't even need to jack with the fuses, which would affect the clock speeds for everything; if you did that, you'd need to update tk-attiny.h, otherwise everything would be running in 8x slow motion!