Lumens and "luminous efficacy" - tech question....

ok
bear with me here

lumens are supposed to take into account the color of the light
a watt of light (not electricity) will make 683 lumens - IF the wavelength is 555nm - a certain green color
its 'luminous efficiency' is 100% (683 lm/W is the maximum possible 'luminous efficacy')
because that is the color that seems brightest to the human eye

the sun (a "class G" star, at 5800K color temperature) is only 13.6% efficient
https://en.wikipedia.org/wiki/Luminous_efficacy
so it takes 7.4 watts of sun power to make the same 683 lumens
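
(quick sanity check of that arithmetic in python - the 683 lm/W and 13.6% figures are the ones above:)

```python
# Sanity check of the numbers above: 683 lm/W at 555 nm is the maximum,
# and sunlight is ~13.6% of that.
K_MAX = 683.0  # lumens per radiant watt at 555 nm (100% efficiency)

def watts_for_lumens(lumens, efficiency):
    """Radiant watts needed to produce `lumens` at a given luminous
    efficiency (as a fraction of the 683 lm/W maximum)."""
    return lumens / (K_MAX * efficiency)

print(watts_for_lumens(683, 1.000))   # 555 nm green -> 1.0 W
print(watts_for_lumens(683, 0.136))   # sunlight     -> ~7.35 W
```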

my question is, what does a lux meter measure?
it isn't looking at color, right?
isn't that too complicated for a simple meter?

so how would a person really use it, to figure out lumens?
(assuming you integrate all the luxes and the square feet)

is there some factor, like the 13.6%, that you apply?
maybe if you knew the color temperature, there is a table (?) of (approximate?) correction factors?
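
(for what it's worth, here is a sketch of how such a table could be computed - assuming an ideal black body and a common Gaussian approximation of the eye's photopic curve, so the numbers are only approximate, but 5800K should land near the 13.6% above:)

```python
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann

def planck(lam, T):
    """Black-body spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

def v_photopic(lam):
    """Gaussian approximation to the CIE photopic eye-sensitivity curve."""
    lam_um = lam * 1e6                     # curve fit uses micrometers
    return 1.019 * np.exp(-285.4 * (lam_um - 0.559) ** 2)

def efficacy(T):
    lam = np.linspace(100e-9, 100e-6, 200_000)   # covers ~all emitted power
    dlam = lam[1] - lam[0]
    b = planck(lam, T)
    lumens = 683 * np.sum(v_photopic(lam) * b) * dlam
    watts = np.sum(b) * dlam
    return lumens / watts                  # lm per radiant watt

for T in (2700, 4000, 5800, 6500):
    K = efficacy(T)
    print(f"{T} K: ~{K:.0f} lm/W (~{100 * K / 683:.1f}% efficient)")
```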

or is this it..?

[the efficacy table, also from wiki]

all you 'integrating sphere' guys, how do you go from luxes and square feet, to lumens, ignoring color of the light?


wle

You shouldn't mix up wavelength and color temperature. LEDs output a wide range of wavelengths, and the same goes for a luxmeter's input.
The only way to test your luxmeter is to use a well-characterized light source (an incandescent lamp) with a set of filters that don't pass light above (or below) a listed wavelength.

P.S. If I remember right, djozz has tested UV LEDs with simple luxmeters and could even compare the measurements (even though the wavelength was far outside the regular LED range).

Luxmeters have a filter on their sensor designed to recreate the sensitivity of the human eye to different wavelengths.
So 500 lux of blue light and 500 lux of red light would appear as the same brightness to your eye.
Some cheaper luxmeters have worse filters and will give incorrect readings depending on how much of a certain wavelength there is.

For example, for this UT383 luxmeter you can see that a blue wavelength gives a much higher reading than on a high-end calibrated luxmeter.

oh
ok there is part of the answer
i didn't realize the lux also tries to account for eye sensitivity
so if the luxes are right, the lumens will also be right

wle

Generally cheap luxmeters have problems near the edges of the visible spectrum because their filter's sensitivity peak (in the green part of the spectrum) is shifted a bit compared to the real peak of the human eye's sensitivity. This shifts the entire sensitivity curve, causing large errors in the blue/purple and red parts of the spectrum.
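
To get a feel for the size of those errors, here is a toy sketch: take a Gaussian approximation of the eye's sensitivity curve, shift its peak by 10 nm (an assumed shift, purely for illustration), and compare the two curves at a blue, green and red wavelength:

```python
import numpy as np

def v(lam_nm, peak_nm=559.0):
    """Gaussian approximation to the eye's photopic sensitivity curve."""
    return 1.019 * np.exp(-285.4 * (lam_nm / 1000 - peak_nm / 1000) ** 2)

for lam in (450, 555, 650):                      # blue, green, red
    true, shifted = v(lam), v(lam, peak_nm=569)  # assumed 10 nm filter shift
    print(f"{lam} nm: reading off by {100 * (shifted / true - 1):+.0f}%")
```

The shifted curve barely changes the green reading but is off by roughly half at the blue end and more than half at the red end, which is exactly the pattern described above.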

A lux meter measures brightness, so basically it measures the power of the light that hits it, but it has a filter which weights the spectrum by the wavelength-dependent sensitivity of the human eye. The filter changes the measurement from watts/m^2 to lumens/m^2, which is lux.
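
Numerically, that weighting looks something like this (a sketch reusing the same Gaussian eye-curve approximation; the flat spectrum is made-up example data):

```python
import numpy as np

def v(lam_nm):  # Gaussian approximation to the eye-sensitivity curve again
    return 1.019 * np.exp(-285.4 * (lam_nm / 1000 - 0.559) ** 2)

lam = np.arange(400.0, 701.0)            # wavelengths in nm, 1 nm steps
spectrum = np.full_like(lam, 1e-3)       # hypothetical flat 1 mW/m^2 per nm

watts_per_m2 = np.sum(spectrum)          # radiometric: W/m^2
lux = 683 * np.sum(v(lam) * spectrum)    # photometric: lm/m^2 = lux
print(f"{watts_per_m2:.2f} W/m^2 -> {lux:.0f} lux")
```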

Lux is basically the light landing on one spot, per unit area (lumens/m^2). Lumens is the total amount of light emitted in all directions.

As for your other question, the luxmeter only samples the light at one small spot, so you can't really turn a single reading into a lumen number without using an integrating sphere.
If you know the area of your spot, and you know the lux at every point on the spot, lumens is simply the lux multiplied by the area in m^2.

The complicated part is that it cannot be a flat area, it needs to be the area on the surface of a sphere, where the distance from the light source is constant.

If you go far away, however, you can approximate the lumens based on a flat area, because a sphere with a large radius has very little curvature and that will only introduce a small error.
In that case, however, you need to use the lux measured at that farther distance, not candela (lux @ 1 m).
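
Putting that together, a minimal far-field sketch of "integrate all the luxes and the square feet": sample lux on a flat grid at a fixed large distance, multiply each reading by its patch area, and sum (the readings below are invented placeholders):

```python
import numpy as np

cell = 0.10                    # each grid patch is 0.10 m x 0.10 m
lux_grid = np.array([          # hypothetical meter readings (lux)
    [ 80, 120,  80],
    [120, 200, 120],
    [ 80, 120,  80],
])

# lux * m^2 = lumens, summed over the whole patch the beam covers
lumens = np.sum(lux_grid) * cell ** 2
print(f"~{lumens:.0f} lm through this 30 cm x 30 cm patch")
```

Note this only gives the lumens passing through that one patch; for the total output you need readings covering every direction the light is emitted in, which is the shortcut an integrating sphere provides.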