The There Are No Stupid Questions Thread

Sprint layout. :wink:

Has anyone taken apart the dual channel D4V2? What’s the difference besides the driver and PCB?

I think those are the only differences.

Does anyone know the copper plate thickness of the FC11 and FC12?

I asked Hank for a dual channel driver and copper board because I was going to reflow emitters anyway. He said it’s too different. I wonder what else was changed.

At least my single channel and dual channel KR4s seem identical. I have also bought a dual channel driver and a 2-channel PCB from him separately for my projects, so I don’t know why he didn’t sell them to you.

Maybe you misunderstood each other; I’ve had no problem making weird requests to him.

Edit : I should have quoted, the comment I was replying to has been deleted.

Sorry about that deletion. I clarified with him; I just spent $5000 HKD ($650 USD) with his wholesaler in HK too, so it’s definitely not money related. Seeing the events happening around the world, I think it’s a good time to spend more money on your favourite material things before everything shuts down. Evergrande, for example.

Can somebody explain this apparent paradox? A tungsten or halogen bulb is quite inefficient at converting electrical energy into light, yet ironically, none of the flashlights using those bulbs ever worried about huge heat sinks or thermal regulation. However, all of the high power LED flashlights need to dissipate excess heat, even though for tungsten or halogen the emitted light was just a byproduct of the glowing filament!

It’s very simple: filament bulbs are made of metal and glass and can withstand extremely high temperatures.
LEDs are electronic components: diodes made of silicon, plastic and phosphor. They do not like extreme heat, and they need to stay at lower temperatures to last a long time.

Maybe you’re asking about the heatsinks? Almost all older flashlights using bulbs also used low power lead acid batteries that did not draw high amperage. They were all low amperage and relatively dim compared to LED. They also didn’t have complicated electronic drivers; they typically had an on/off switch, so if they got too hot you simply turned the light off.

Modern LED flashlights using lithium ion batteries can draw way more power (100+ watts). Older bulb-style flashlights typically drew around 2.5 watts, maybe a bit more, nowhere near modern flashlight wattage. Hence, the older flashlights never got very hot. There was no need for heatsinks, especially since the bulbs could easily handle their own heat.

I hope this answered your question.

Good explanation! If I want to get the same amount of lumens as an old incandescent flashlight, there is no need for a heat sink then?

Generally, yes… because incandescents are horribly inefficient compared to LED. Incans waste so much energy in the form of heat versus photons. You can seriously burn your hand on the head of some high power halogen/xenon incan flashlights.

Yes. For instance, I used to use a 2 D cell Maglite with 1.2V NiMH cells and a 2V bulb to get extra output. When LED bulbs came out I was able to get even more lumens, and the batteries lasted much longer. The original setup produced no noticeable heat, and the LED setup probably ran even cooler. It’s very easy to get more lumens with LED. Modern flashlights are many times brighter than older incandescents, to the point that they can get hot and need cooling. You can take a modern LED flashlight, run it at a lower setting, be brighter than an incandescent, and it won’t heat up much at all.
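Just to put ballpark numbers on it, here’s a quick Python sketch. The efficacy figures (roughly 15 lm/W for an incan bulb and 150 lm/W for a modern LED at moderate drive) are assumptions for illustration, not measurements:

```python
# Rough waste-heat comparison at equal brightness.
# Nearly all of the input power ends up as heat either way, so input
# power is a fair proxy for how much heat the host has to shed.

TARGET_LUMENS = 200        # same brightness for both lights

INCAN_EFFICACY = 15        # lm/W, assumed ballpark for a filament bulb
LED_EFFICACY = 150         # lm/W, assumed ballpark for a modern LED

def input_power_watts(lumens, efficacy_lm_per_w):
    """Electrical power needed to produce the requested lumens."""
    return lumens / efficacy_lm_per_w

print(f"Incan: ~{input_power_watts(TARGET_LUMENS, INCAN_EFFICACY):.1f} W to dissipate")
print(f"LED:   ~{input_power_watts(TARGET_LUMENS, LED_EFFICACY):.1f} W to dissipate")
# Incan: ~13.3 W, LED: ~1.3 W. An LED held to incan-level output barely
# warms up; it only needs real heatsinking once you push it far beyond
# what any filament bulb could do.
```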

My charger does 500mA and shows me the voltage of the battery as it reaches 4.20. I just need a ballpark number, e.g. for a cell listed at 2500, is it really 1200 or is it 2300? That would be enough.

Can I guesstimate the true capacity of a cell by noting how long it takes, in minutes, to go from say 4.00 to 4.20? I am assuming the voltage increase is linear and its rate is inversely proportional to the battery capacity.

I don’t think you can use that method since most li-ion chargers start in constant current mode and finish in constant voltage mode. So it will stay at 4.2v for maybe 20 minutes until the current goes to zero, but the time can vary a lot.

The only way I know is to use a battery charger that shows you the milliamp-hours as it’s charging. Since capacity is usually measured from 2.5V-2.6V to fully charged, and I tend to start around 2.8V-3.0V, if it charges 2300mAh I’ll add a little more and call it about 2500mAh. So even using a charger that measures it is not exact; it’s only pretty close, assuming the charger model has been tested and shown to measure fairly accurately.
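For what it’s worth, the mAh counter on those chargers is basically just current integrated over time. A minimal Python sketch of the idea, with a made-up sample log (the currents and durations are not from any real cell):

```python
# A charger's capacity counter: sum up current x time slices as the
# charge progresses. The log below is invented for illustration.

samples = [
    # (current_mA, duration_minutes)
    (500, 150),   # CC phase: steady 500 mA for 2.5 hours
    (350, 20),    # CV phase: current tapering off
    (200, 15),
    (80, 15),
]

mah = sum(current * minutes / 60 for current, minutes in samples)
print(f"Charged capacity: about {mah:.0f} mAh")   # ~1437 mAh in this example
# This only counts the charge put in from wherever the cell started
# (e.g. ~3.0 V), so it will read below the rated 2.5 V-to-full capacity.
```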

There’s totally such a thing as a stupid question… and I’ve asked lots of them since I joined the forum… lol. With that said, you’re not stupid if you ask a stupid question, but you can’t get upset about being clowned for said stupid question. Trust me, I’m living it every day. Lol.

Nope. Go to a gas station and put down 10bux worth.

It’ll fill at top speed ’til it gets to about 8bux then will slow down some.

When it gets to 9bux it’ll slow down even farther.

By the time it gets to like 9.70, it’ll just be a trickle, incrementing by pennies ’til it hits 10.00 and then stops.

Same way Li-ion cells are charged. CC is full-tilt at the rated current until the “pushing” voltage hits 4.20V; then it switches to CV mode, just applies that 4.20V and lets the cell sip more and more slowly as it “catches up” on its own, until the difference between the “pushing” voltage and the battery voltage is small enough that the current drops to whatever percentage of the CC rate determines it should cut off.
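A toy model of that CV tail shows why “time spent at 4.20V” tells you so little about capacity. I’m assuming the taper is roughly exponential, and the time constants below are invented for illustration, not real cell data:

```python
import math

# Toy CV-phase model: after the charger hits 4.20 V, current decays
# roughly exponentially as the cell catches up. The time constant (tau)
# depends on capacity, internal resistance, temperature and age, which is
# why two cells with the same rated capacity can spend very different
# amounts of time in CV. All numbers are illustrative assumptions.

CC_CURRENT = 0.5   # A, the charger's rated charge current
CUTOFF = 0.05      # A, terminate when current tapers to ~10% of CC

def cv_minutes(tau_minutes):
    """Minutes in CV before current decays from CC_CURRENT to CUTOFF,
    assuming i(t) = CC_CURRENT * exp(-t / tau)."""
    return tau_minutes * math.log(CC_CURRENT / CUTOFF)

for tau in (8, 15, 30):   # different cells / conditions
    print(f"tau = {tau:2d} min -> CV phase lasts ~{cv_minutes(tau):.0f} min")
# tau =  8 min -> ~18 min; tau = 15 min -> ~35 min; tau = 30 min -> ~69 min
```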

Then how about this one? Just look at the time taken to go from, say, 3.90 to 4.10 (assuming this happens during the CC part of the charging curve). Isn’t that part of the graph assumed to be linear? If it takes 20 minutes at 0.5A to raise 0.2V on one cell versus 30 minutes on another, can’t I claim that the other cell has 50% more capacity?

There are other factors that you’re not taking into consideration. Different cells have different amounts of internal resistance, which can affect the charge times. Even temperature can affect how well a cell takes a charge.
You can try all kinds of techniques, but who knows what results you’ll get. Like I was saying earlier, even with a known charger that can measure the mAh going into a cell, it’s still only going to be a ballpark estimate of the cell capacity. To get better accuracy you need a dedicated testing rig that’s calibrated, and you need to know how the manufacturer tested the capacity. Some might start at 2.5V, some might start at 2.65V. The test might be done at different room temperatures. They may measure it at a certain charge rate. All these factors go into their cell capacity rating.
The types of measurements you’re suggesting are way more variable than even the basic battery charger capacity tests (which aren’t very accurate).
I’m not sure what you’re trying to accomplish. If you’re just trying to see which of two cells has the higher capacity, you can do a load test. Put it under a set load, like a flashlight at a brightness that won’t overheat it, and measure how long it takes before the low voltage protection kicks in. That’s real world results there.
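The arithmetic behind that load test is just current times runtime. A quick Python sketch (the 1A load and the runtimes are made-up example numbers, not measurements):

```python
# Load-test capacity estimate: discharge at a known, steady current until
# low-voltage cutoff and multiply current by runtime.

def delivered_mah(load_ma, runtime_minutes):
    """Capacity delivered under this particular load. This won't match the
    datasheet rating exactly (usually measured at a gentler ~0.2C rate),
    but it's a fair apples-to-apples comparison between two cells."""
    return load_ma * runtime_minutes / 60

cell_a = delivered_mah(load_ma=1000, runtime_minutes=150)   # 2.5 h under 1 A
cell_b = delivered_mah(load_ma=1000, runtime_minutes=110)   # ~1.8 h under 1 A
print(f"Cell A delivered ~{cell_a:.0f} mAh, cell B ~{cell_b:.0f} mAh at a 1 A load")
# ~2500 mAh vs ~1833 mAh: same load, same cutoff, so the comparison is
# meaningful even if neither number is the cell's official rating.
```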

What batteries do you have?
Has anyone tested those models before?

How many amps will 4 XP-L HI emitters pull in a D4SV2? Looking for a suitable 26650