Attiny25/45/85 FW Development Thread

My tests with BOD disabled and pin checking are not specific to my driver, they can be done on any. The principles are no different.

Your method of discussing things is so unlike mine… so I’ll just stop.

Has anyone got LVP working in Bistro using the internal voltage reference? I have some SK68’s I want to build with the 15mm drivers that do not have the voltage divider but I have no firmware to use with it that has LVP.

I've already written the inverted voltage-read function that can replace the LVP one in bistro, and I've used it that way; it works. The rest is essentially just the battcheck calibration table, which requires a battcheck.hex-style program that uses that voltage-read function. I wrote something for that too that I'm using, although at the moment it's mashed in with a bunch of other mess. I'll see what I can polish up. I was kind of thinking of going this route anyway for OTSM. Mostly I've been improving my diagnostic tools (like those routines) and my testing rig slightly in what little time I've spent on any of it recently.

Since the only variable is fluctuations in the internal reference itself, maybe I can make one universal calibration table and it won't require releasing a calibration tool.

Yes, I'm sure if you do exactly the same thing in another driver you'd get the same results as you have. Like I said, those results work well enough for you.

In case I'm slow getting around to doing anything, here's what I've done so far if you want to take it and run TA. See the next post for what should be just the parts that matter.

This is basically ADC_on and get_voltage combined and is cut and paste from original bistro voltage.h but with modifications for inverted Vcc reading. It could be split back out into the two routines easily enough. In fact probably only the ADC_on part needs to be replaced.

inline uint8_t read_vcc() {
    DIDR0 |= (1 << CAP_DIDR);
    // Vcc as reference, read the internal 1.1 V bandgap (MUX[3:0] = 0b1100)
    ADMUX = (0 << REFS2) | (0 << REFS1) | (0 << REFS0) | (1 << ADLAR) | (1 << MUX3) | (1 << MUX2);
    // enable, start, prescale
    ADCSRA = (1 << ADEN) | (1 << ADSC) | ADC_PRSCL;
    _delay_ms(2);
    // wait for completion
    while (ADCSRA & (1 << ADSC));
    // start again, as the datasheet says the first result is unreliable
    ADCSRA |= (1 << ADSC);
    // wait for completion
    while (ADCSRA & (1 << ADSC));

    // ADCH now holds the 8-bit result
    return 255 - ADCH;  // invert so higher voltage gives a higher value, matching the usual calibration table order
}

This is a routine to blink out values like battcheck.hex does.

void blink_value(uint8_t value) {
    // split the value into its decimal digits
    uint8_t ones = value % 10;
    uint8_t tens = (value / 10) % 10;
    uint8_t hundreds = value / 100;
    blink(10, 20);        // quick flutter to mark the start
    _delay_ms(500);
    blink(hundreds, 200);
    _delay_s(1);
    blink(tens, 200);
    _delay_s(1);
    blink(ones, 200);
    _delay_s(1);
    blink(10, 20);        // quick flutter to mark the end
    _delay_ms(500);
}

use it like:

blink_value(read_vcc());

to blink out the decimal pseudo-ADC value.

With a bench supply you can get values from read_vcc for different voltages and make a calibration table. I'll try to do that when I get a chance and rearrange read_vcc as needed to make it a true drop-in replacement. It's a pretty simple modification altogether.

Basically, it should just boil down to

1) replacing the ADMUX = line in ADC_on() with:

ADMUX = (0 << REFS2) | (0 << REFS1) | (0 << REFS0) | (1 << ADLAR) | (1 << MUX3) | (1 << MUX2);

2) changing the return line in get_voltage to:

return 255 - ADCH;

3) Changing the calibration table.

If I didn't overlook something off the top of my head.
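For step 3, a hypothetical fragment of what an inverted table might look like. The macro names follow bistro's tk-calibration.h convention, and the values are placeholders computed from 255 - (1.1 * 256 / Vcc) with a nominal 1.10 V bandgap, not bench-measured numbers:

```c
/* Hypothetical inverted-read thresholds -- placeholders only;
 * real values need bench calibration against a supply. */
#define ADC_42   188   /* ~4.2 V: full charge          */
#define ADC_LOW  161   /* ~3.0 V: start stepping down  */
#define ADC_CRIT 154   /* ~2.8 V: shut off             */
```

Note that the inverted read keeps the same ordering as the stock table (higher voltage, higher number), which is why only the values change, not the comparison logic.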

While I can follow along and nod at the right times with coding, I try to stay away from it myself as much as possible. There is one simple reason for this: syntax.

I can't say how much I hate working for hours trying to troubleshoot an issue, only to find that it is an extra space, comma, or misplaced semicolon that is breaking everything.

I don't have a clue where you'd start implementing that in the bistro code. :person_facepalming:

If anyone wants to inject it into the TA bistro here is the code as I have it right now: TA Bistro V1

I should be able to give it a go. I see now too that while what I did should be enough for battcheck to work (which you'll also want, I presume), the LVP voltage is actually read with a separate inline function, and the temperature read uses the same get_voltage, so I'll need to make a couple of other minor changes. Because of the temperature handling it may require a few extra bytes of code. It should probably be preprocessor-configurable too, I suppose.

I thought of a problem with detecting shutoff relative to Vcc. When going from low to turbo, the input voltage can sag suddenly, and that can also look like a shutoff. How much margin is needed depends on how fast it sags and whether Vcc follows, but in principle this means it doesn't help. To be very sure, you'd still want to wait to see the input voltage drop below the operational range, which in fact makes things worse, since that means something like 0.7 × max Vcc; with very low batteries that actually means you're waiting to drop even below the operational range. Oh well.

No rush, I somehow killed the drivers I was wanting to use the internal reference with, so it will be a while before I need it again.

Although I still want to get firmware released with that feature before I move on to other hobbies.

What? There’s such a thing as other hobbies? I suppose… hope you will still hang around!

Well, if TA is going to leave when this is done, I might take a very long time to finish it. Really, take it slow, don't burn out. Do your other hobbies and come to this stuff when you can. We'd rather have you slow and peripheral than gone. Anyway, the code is all written (and it was a little more involved to do it well). I just need to test it and calibrate it. Not sure if I'll package a new calibration tool or not.

The way I wrote things, you can switch between normal read mode and inverted-Vcc mode with a function call before starting the ADC. So I'm thinking of including a check at startup: if the voltage is less than, say, 4.6 V, use inverted mode. Basically this has no effect, except it means you could build a driver with a 5.0 V LDO and a voltage divider and it would work with the same code without changes. So all 1S lights would work with it, and any compliant (5.0 V LDO) multi-S lights would too, without needing a different code version. Of course the check adds a couple of bytes of code. But I have to think that getting rid of R2 in 1S lights (you'd still need a bleeder, and I'm not entirely sure you can get by without the full divider for OTSM) could ultimately mean 45's will fit on all the 1S designs, and in that case you'd get a bunch of bytes back. In the meantime, it's only a few bytes.

Making multiple versions isn't difficult either, though, as I included a preprocessor config to enable the inverted read. This all paves the way for a pin-change-based OTSM upgrade to bistro too, because it separates voltage reading from the power-off pin.
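The compile-time switch could look something like this. The macro names are my guesses, not necessarily what the actual code uses; the non-inverted ADMUX line follows stock bistro's pattern of the internal 1.1 V reference with a divider channel:

```c
#ifdef USE_INVERTED_READ
  /* Vcc as reference, 1.1 V bandgap as input; flip the result so
   * higher voltage still maps to a higher number */
  #define ADMUX_VOLTAGE    ((1 << ADLAR) | (1 << MUX3) | (1 << MUX2))
  #define SCALE_READING(x) (255 - (x))
#else
  /* stock-style: internal 1.1 V reference, divider on the ADC pin
   * (ADC_CHANNEL assumed to be the existing channel macro) */
  #define ADMUX_VOLTAGE    ((1 << REFS1) | (1 << ADLAR) | ADC_CHANNEL)
  #define SCALE_READING(x) (x)
#endif
```

ADC_on() would then set ADMUX = ADMUX_VOLTAGE, and get_voltage would return SCALE_READING(ADCH), so the rest of the code never needs to know which mode was built.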

Got the latest Atmel Studio 7 and the USBASP driver (at protostack) installed last night on my new i7 6800K 6-core system. Wow!! Super fast development now. All works fine on Win10 64-bit. Hopefully I can get more done on drivers. Very hard to keep up with everything going on in here though.

Was development slow before? What were you running on, a 486? We're talking about 8K worth of code here...

- No, I use the computer for a little more than just Atmel development. $2,500 is not much money for my biz. I spent $3,500 on the original (I do mean original) IBM PC back in '82 when I was making ~$20K/year, and $5,000 on a Gateway 2000 386 way back, plus untold $$$ on many more systems and laptops, so $2,500 for an incredibly fast system is a great deal.

It's got a 512 GB solid state Samsung Pro drive, high end all the way, but not top end. Could have spent more.

I got VPN direct in to my client's network and will install VS 2015 for C# .NET development for what we are doing lately across several projects/product development.

I was joking. I knew that attiny85 code wasn't bogging down your current PC...unless there was something very wrong.

Actually my previous computer had all sorts of problems - think the file system was corrupted, maybe hard drive problems. I was waiting for the thing to just not boot one day. Scans couldn't complete, Windows Update stopped working more than a year ago, things got super slow, reboots took forever. Could have re-installed Windows which would have helped big time. Might just keep the machine and do exactly that, but on a solid state drive - wow! These things are great! Super quiet, super quick. I'd keep it relatively small, and if needing more space, go with a USB 3.0 external 1 TB or 2 TB drive.

Nah, I am not going to disappear just because that code is done. I just hate leaving things unfinished, and that is one of the last things on my to-do list.

I have another driver in the works, hopefully I have time to finish that one up. I expect some major changes in my “real world” life shortly and time will become much more valuable to me at that point. So trying to finish up everything I can before that happens.

Nice code work if it can do both without any code changes!

Does it still fit on a tiny25?


Well, you asked the right question. After I posted, I realized the answer is no. I only added about 4 control bytes to the code and statements to set them. The function calls are just inline sets of these bytes and are actually being optimized out, since I'm not even calling them yet, just using the preprocessor-configured initial values at the moment (haven't put in the initial auto-detect read yet). And yet it's coming in 5% over, which is what, 100 bytes? Of course I'm not certain it was under that before I started. I'm using Atmel Studio to compile, not your script, so I'll have to check. But I guess zero free space minus a small number is still negative.

Since OTC_read is basically the same, just another ADC configuration, I might be able to combine it, but I'm not really sure that saves anything. And I'm not sure how I'd add an interrupt handler and wake loop on top of all this, even though they aren't big. By itself this allows removal of the entire divider on small drivers, so it should allow use of 45's anyway with a redesign, but leaving off the divider with OTSM is less certain. I think the chip is specced to allow a pin to have Vcc + 0.5 V on it as the absolute maximum (near-destruction) rating. The input voltage in 1S is Vcc plus the diode voltage, so close to that without a divider, assuming C1 absorbs all the noise well.

Of course, if I actually do the auto-detect I will need to separate the calibration tables, so that will definitely add a few bytes. If PB0 were used for the divider read, you could do the inverted read for everything, and if you select the resistors just right, the calibration would only change by some small integer factor, which might be doable without a whole new table; but that's getting a bit creative and far afield just to avoid recompiling.

As the code is now, it should compile to 2042 bytes IIRC, just a bit under the limit (actually the latest version may be 2040 bytes; for some reason mode groups over 23 don't work, so I deleted them).

Sadly, if it doesn't fit in a 25 it makes it far less useful. The 85 can have the pins bent to fit, but it still takes up more room and is harder to build.

TK has some updates to bistro that can save around 100 bytes that would be great to have but she is MIA.