100Mcd LEP - Laser Phosphor Wheel

While a cold mirror like the original Wavien collar keeps temperatures lower by not reflecting infrared light, it still lets quite a bit of visible light through.
I’m curious how the intensity would compare if the reflector were solid electroformed instead…
More heat for sure, but maybe more output?

But yeah, a pinhole in a hemisphere would give the maximum output, just not a lot of lumens haha

This answers your question in a cool way :slight_smile:

Impressive project, really nice to find something that extraordinary on a flashlight forum :+1:

Can’t wait to see the beamshots, good luck with the project!

IIRC, the collar reflects only blue because that’s what excites the phosphor, at least in the LEDs that they used. They could design a cold mirror that would reflect all visible light just as well.
It would:

  • produce higher temperatures, reducing intensity
  • reflect some light off the silicone on top of the LED, increasing intensity

And the substrate on which you put the reflective coating doesn’t matter all that much. You can have a dichroic mirror (like the Wavien one) on metal as well. Or you can have an alu mirror on glass.

Are you sure about this? I suppose that assumes the coating is absorbing the other wavelengths. If it were doing that, then there shouldn’t be any visible light escaping from the outside of a Wavien collar, right? I thought they acted more or less transparent to the non-reflected wavelengths. If that’s the case, the substrate matters.

If you put the coating on a metal then the IR and red wavelengths will all be reflected back because they can’t go through metal…
That’s why all cold mirrors are on glass, so that the hot light can escape.

But yes, you can put aluminum on glass; that’s what Edmund Optics does with their parabolics.

You’re right, an alu substrate reflects IR, so it’s worse than glass in that regard. I guess it would be possible to add some black layer between the alu and the mirror coating that blocks it. But I don’t know whether such process customization would be economically reasonable.

BTW, does it matter a lot? Quick check:
http://www.firecalculators.com/radiantheatsimple
A 1 mm² surface with an emissivity of 0.82 (taken from here) heated to 150 °C radiates ~1.5 W. Alu has very high reflectivity in IR, so let’s omit that.
Is that 1.5 W a lot? More than my gut feeling would say. :wink: It may have a measurable effect.

It’s like 5-10% if the LED is being driven at ~6 A, so it will probably just make the output peak a bit sooner.
But maybe the amount of visible light reflected back would compensate for that; how much, I don’t know.
Some day I’ll buy an electroformed collar and see how it compares to my glass ones.

I did not take into account that some of the radiant heat escapes through the aperture. Also, some is reflected by the die, giving it another chance to escape. If the collar collects 75% of the energy (pretty much the worst case), it ends up at just over 1 W.

My gut feeling is that any mirror will work; the heat emitted is not significant.
In addition, I believe a larger portion of a collar’s gain comes not from the recycled blue but simply from the white light.

LEPs that use a reflective laser layout have a much higher blue spike, since there isn’t really anything stopping the laser from refracting/reflecting off the surface, whereas with an LED or a transmissive phosphor all of the blue light must pass through the phosphor. And yet, I keep getting the same ~2x intensity when using the Wavien-produced collars on LEDs and LEPs with all types of phosphors.
When I was testing the solid crystal static phosphors with a collar, they had a very poor absorption rate, so there was quite a bit of leftover blue; same result, about 2x intensity.

I was thinking of trying to make my own collars and sending them in to be silver/alu coated like the Chinese plastic reflectors are. Each batch costs like 150 USD and can fit around 40 collars. They said the reflectivity is around 97%.

I was never able to get a surface smooth or spherical enough to continue with coating. Maybe we could do injection molding? The coating would fill in the small surface imperfections.

All I can say, likevvii, is WOW WOW WOW. :beer:

Thank you for the kind words (:

I am currently working with a freelance electronics engineer to produce the controller.
The controller will have a 50 mm OLED, a buzzer, and buttons.

It will monitor and display every critical component’s sensors and provide warnings / automatic decisions when certain values are reached.
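
Roughly the kind of logic I have in mind is sketched below (a minimal Arduino example; the pins, sensor scaling, and temperature thresholds are placeholders, not the real values):

```cpp
// Minimal sketch of the warning / automatic-shutdown logic.
// Pins, sensor conversion and thresholds are placeholders.
const int BUZZER_PIN = 8;
const int LASER_ENABLE_PIN = 9;
const float PHOSPHOR_TEMP_WARN = 80.0;      // °C, placeholder
const float PHOSPHOR_TEMP_SHUTDOWN = 100.0; // °C, placeholder

float readPhosphorTempC() {
  // Placeholder: convert a 10-bit ADC reading to °C.
  // Replace with the real sensor conversion.
  return analogRead(A0) * (5.0 / 1023.0) * 100.0;
}

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
  pinMode(LASER_ENABLE_PIN, OUTPUT);
  digitalWrite(LASER_ENABLE_PIN, HIGH);     // laser allowed by default
  Serial.begin(115200);
}

void loop() {
  float t = readPhosphorTempC();

  if (t >= PHOSPHOR_TEMP_SHUTDOWN) {
    // Automatic decision: cut the laser and latch the buzzer on.
    digitalWrite(LASER_ENABLE_PIN, LOW);
    digitalWrite(BUZZER_PIN, HIGH);
    Serial.println(F("SHUTDOWN: phosphor over temperature"));
  } else if (t >= PHOSPHOR_TEMP_WARN) {
    // Warning only: beep and report, keep running.
    digitalWrite(BUZZER_PIN, HIGH);
    Serial.println(F("WARNING: phosphor temperature high"));
  } else {
    digitalWrite(BUZZER_PIN, LOW);
  }
  delay(100);   // 10 Hz monitoring loop
}
```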

Here are some features I plan to add (a rough code sketch follows the list):
(To control brightness, I have a 100 mm slide potentiometer.)

-Software-limited slow ramp-up. Example: +20% per second. Ramping down is unaffected.
-Potentiometer state check. If the output setting is 10% or higher at power-on or after a long idle, it will show an error message and tell the operator to lower the potentiometer to zero.
-Exponential brightness curve. A bit more brightness resolution at the low end.
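
A rough sketch of how those three features could look on the Arduino (pin assignments, ramp rate, and curve exponent are placeholders; the "exponential" curve is approximated here with a simple power curve):

```cpp
// Sketch of ramp limiting, pot state check, and a low-end-friendly
// brightness curve. Pins and constants are placeholders.
const int POT_PIN = A1;
const int OUT_PIN = 5;
const float RAMP_UP_PER_SEC = 0.20;   // +20% of full scale per second
const float CURVE_EXPONENT  = 2.0;    // power curve: more low-end resolution

float currentLevel = 0.0;             // 0.0 .. 1.0 actually driven
unsigned long lastUpdateMs = 0;

float readPotFraction() {
  return analogRead(POT_PIN) / 1023.0;   // 0.0 .. 1.0 requested
}

void setup() {
  Serial.begin(115200);
  pinMode(OUT_PIN, OUTPUT);

  // Potentiometer state check: refuse to start above 10%.
  while (readPotFraction() >= 0.10) {
    Serial.println(F("ERROR: lower the potentiometer to zero"));
    delay(500);
  }
  lastUpdateMs = millis();
}

void loop() {
  unsigned long now = millis();
  float dt = (now - lastUpdateMs) / 1000.0;
  lastUpdateMs = now;

  float requested = readPotFraction();

  if (requested > currentLevel) {
    // Software-limited ramp up: at most +20% per second.
    currentLevel = min(requested, currentLevel + RAMP_UP_PER_SEC * dt);
  } else {
    // Ramping down is unaffected: follow the pot immediately.
    currentLevel = requested;
  }

  // Power curve stands in for the "exponential" brightness mapping.
  float drive = pow(currentLevel, CURVE_EXPONENT);
  analogWrite(OUT_PIN, (int)(drive * 255.0));

  delay(20);    // ~50 Hz control loop
}
```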

“Warning. Core meltdown imminent. You now have 17 minutes to get to minimum safe distance!”

:smiling_imp:

I imagine you have to monitor the wheel RPM; if it stops spinning, bad things could happen quickly.

Useful ideas for your controller.

I would suggest adding circuitry for charging, including balancing and a capacity-left readout. Or, what I like: runtime left at the current level. Some buttons for special functions like momentary full power are nice. And a programming interface accessible without having to open the lamp: once you write your own software, there will always be improvement ideas, and you will want to reprogram often at first....

Yep, I am using an IR RPM sensor that directly reads the RPM of the wheel.
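
In sketch form it looks roughly like this (assuming one pulse per revolution; the pin numbers and the safe-RPM floor are placeholders, not the actual firmware values):

```cpp
// Interrupt-based RPM measurement with a low-RPM laser cutoff.
// Assumes one IR pulse per wheel revolution on pin 2 (placeholder).
const int SENSOR_PIN = 2;            // must be an external-interrupt pin
const int LASER_ENABLE_PIN = 9;
const unsigned int MIN_SAFE_RPM = 3000;   // placeholder threshold

volatile unsigned long pulseCount = 0;

void onPulse() {
  pulseCount++;                      // one pulse per revolution
}

void setup() {
  Serial.begin(115200);
  pinMode(SENSOR_PIN, INPUT_PULLUP);
  pinMode(LASER_ENABLE_PIN, OUTPUT);
  digitalWrite(LASER_ENABLE_PIN, LOW);    // laser off until the wheel spins
  attachInterrupt(digitalPinToInterrupt(SENSOR_PIN), onPulse, FALLING);
}

void loop() {
  // Count pulses over a 250 ms window, then scale to revolutions/minute.
  noInterrupts();
  pulseCount = 0;
  interrupts();
  delay(250);
  noInterrupts();
  unsigned long pulses = pulseCount;
  interrupts();

  unsigned long rpm = pulses * 4UL * 60UL;   // 250 ms window -> per minute

  if (rpm < MIN_SAFE_RPM) {
    // Wheel too slow or stopped: cut the laser before the phosphor burns.
    digitalWrite(LASER_ENABLE_PIN, LOW);
    Serial.println(F("WARNING: wheel below safe RPM, laser disabled"));
  } else {
    digitalWrite(LASER_ENABLE_PIN, HIGH);
  }
  Serial.print(F("RPM: "));
  Serial.println(rpm);
}
```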

What would be some ways to monitor/measure capacity? Are there BMS systems that can output that data? Since I am running these cells at 70% of their maximum continuous discharge rating, they voltage-sag quite a bit, so measuring voltage is not optimal.
I could add a current sensor, but it seems like it would not remember the capacity after disconnecting power.
My programmer will be using an Arduino, so I only need to edit certain variables to my liking through USB.

If you used it in cold temps [winter], could it do even more?

How bright are all those candelas compared to the sun at noon in summer?

wle

Regarding battery capacity: on my 1500 W build I implemented the following calibration procedure (a rough code sketch of the lookup follows the list):

  • Battery is fully charged
  • A complete discharge is performed
  • During this discharge, once every 30 seconds the lamp is driven through no-load, mid-load, and heavy-load steps
  • The voltage is stored for all three load cases, so three values every 30 seconds
  • Once the battery reaches its end voltage (3 V per cell), the total discharge time is also stored
  • Now the voltage level corresponding to a given battery fill level can be calculated
  • Internal battery resistance can be calculated from the voltage drop between the load steps
  • During operation I always look up the no-load voltage curve. Depending on the lamp power level, I then add a value calculated from the battery current and the internal battery resistance
  • This works pretty well; when ramping lamp power up and down, my capacity estimate stays within ±5%
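
The lookup side of this could look roughly like the sketch below; the table values, internal resistance, and example numbers are made up for illustration:

```cpp
// Estimate remaining capacity from the loaded pack voltage: compensate
// the sag with I * R_internal, then interpolate against a calibrated
// no-load voltage-vs-capacity table. All numbers here are placeholders.
#include <stdio.h>

const float R_INTERNAL_OHMS = 0.015f;   // per pack, from the load steps
const int   TABLE_SIZE = 6;

// Calibrated no-load pack voltage at 0, 20, ... 100% remaining (made up).
const float socPercent[TABLE_SIZE] = {   0,   20,   40,   60,   80,  100 };
const float noLoadVolt[TABLE_SIZE] = { 9.0, 10.4, 10.9, 11.3, 11.8, 12.5 };

// Returns estimated remaining capacity in percent.
float estimateSoc(float packVoltage, float packCurrentA) {
  // V_noload ≈ V_measured + I * R_internal
  float v = packVoltage + packCurrentA * R_INTERNAL_OHMS;

  if (v <= noLoadVolt[0]) return 0.0f;
  if (v >= noLoadVolt[TABLE_SIZE - 1]) return 100.0f;

  // Linear interpolation between the two surrounding calibration points.
  for (int i = 1; i < TABLE_SIZE; i++) {
    if (v <= noLoadVolt[i]) {
      float f = (v - noLoadVolt[i - 1]) / (noLoadVolt[i] - noLoadVolt[i - 1]);
      return socPercent[i - 1] + f * (socPercent[i] - socPercent[i - 1]);
    }
  }
  return 100.0f;
}

int main() {
  // Example: 11.0 V measured on the pack while drawing 40 A.
  printf("Estimated capacity left: %.1f %%\n", estimateSoc(11.0f, 40.0f));
  return 0;
}
```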

Knowing the battery capacity in mAh or Wh, it is possible to use a Hall current sensor and simply integrate the current to find the drained capacity percentage.
That’s how all these cheap battery capacity monitors work.
https://www.amazon.ca/Yeeco-Voltmeter-Multimeter-Capacity-Detector/dp/B073W6453F
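
The integration itself is only a few lines. A minimal Arduino-style sketch, assuming a hypothetical Hall sensor centered at 2.5 V with 20 mV/A scaling and a placeholder pack capacity:

```cpp
// Coulomb counting: integrate the Hall-sensor current over time and
// subtract the drained Ah from a known pack capacity. Sensor scaling
// and capacity are placeholder values.
const int   HALL_PIN = A2;
const float SENSOR_ZERO_V    = 2.5;    // sensor output at 0 A
const float SENSOR_V_PER_A   = 0.020;  // 20 mV per amp, placeholder
const float PACK_CAPACITY_AH = 20.0;   // placeholder pack capacity

float drainedAh = 0.0;
unsigned long lastMs = 0;

float readCurrentA() {
  float volts = analogRead(HALL_PIN) * (5.0 / 1023.0);
  return (volts - SENSOR_ZERO_V) / SENSOR_V_PER_A;
}

void setup() {
  Serial.begin(115200);
  lastMs = millis();
}

void loop() {
  unsigned long now = millis();
  float dtHours = (now - lastMs) / 3600000.0;
  lastMs = now;

  // Ah drained = sum of I * dt over the run.
  drainedAh += readCurrentA() * dtHours;

  float percentLeft = 100.0 * (1.0 - drainedAh / PACK_CAPACITY_AH);
  Serial.print(F("Capacity left: "));
  Serial.print(percentLeft);
  Serial.println(F(" %"));

  delay(200);
}
```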

These are all great ideas!

It seems that with an integrated battery, by far the cheapest and most immediately effective option would be to map the voltage profiles.
I will use this method once I have a permanent pack.

If I used capacity tracking, what would be some effective ways to store the remaining-capacity value? The auxiliary pack I use to power the controller and other components will be removed, since those are my general-purpose batteries. Based on my limited knowledge, if there is no continuous power supply, the controller would need to write the capacity value to flash periodically. Due to the limited life cycle of flash writes, that may not be the best option.

Why do you plan to use a separate battery pack for the controller? Isn't it simpler to derive that from the main battery as well?

I also thought about integrating the current and voltage going in and out of the battery, but at >100 A the current sensing would have consumed too much space, and space is something I had none of left....

Further, there is the effect of slow battery self-discharge during non-use. Not capturing that in the integration would cause error, and a lamp can sometimes lie around for days or weeks without use.

So with that in mind, I found the calibrated voltage method well suited for this application.

By the way, if you have an application needing permanent, non-volatile logging, using an MRAM can be an option.

It can be accessed like SRAM, but keeps its content after power loss.
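
As a rough illustration of the flash-wear point from earlier: if the remaining-capacity value is only written when the whole percent changes, a full discharge costs at most ~100 writes, which even a plain AVR Arduino EEPROM (rated ~100k writes per cell) tolerates for a long time; MRAM/FRAM removes the limit entirely. A minimal sketch with placeholder addresses and values:

```cpp
// Throttled non-volatile save of the remaining-capacity percent using
// the AVR Arduino's built-in EEPROM. Address and the demo value are
// placeholders; the real value would come from the capacity tracker.
#include <EEPROM.h>

const int SOC_ADDR = 0;            // EEPROM address holding capacity %
byte lastSavedPercent = 255;

void saveCapacityPercent(float percentLeft) {
  byte p = (byte)constrain(percentLeft, 0, 100);
  if (p != lastSavedPercent) {
    EEPROM.update(SOC_ADDR, p);    // update() skips the write if unchanged
    lastSavedPercent = p;
  }
}

byte loadCapacityPercent() {
  byte p = EEPROM.read(SOC_ADDR);
  return (p <= 100) ? p : 100;     // fall back to "full" on fresh EEPROM
}

void setup() {
  Serial.begin(115200);
  Serial.print(F("Capacity restored from EEPROM: "));
  Serial.println(loadCapacityPercent());
}

void loop() {
  // Placeholder: in the real controller this would come from the
  // coulomb counter or the calibrated-voltage lookup.
  float percentLeft = 87.3;
  saveCapacityPercent(percentLeft);
  delay(1000);
}
```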