Introducing Ceilingbounce - flashlight testing and runtime graphs for Android

This is interesting. I’ve wondered in the past why an app like this doesn’t exist. Now it does, and it seems to work fine. I just need to make some kind of light box for my phone. A film can would be a handy mini integrating device, if it worked, but the reading changes too much when I move it just a hair for it to be of much use. Anyway, I’ll be following any developments of this app.

Here’s a shot of it working in my, um, lab. (The light is a Foursevens Atom with an almost empty battery.)

I’m working on a way to do integration without any extra hardware, but no promises. I think a film can will act as a diffuser more than an integrator and is probably only good for runtime graphs. An update with improvements to graphing, especially over very long periods of time, is coming soon.

Apps that display, graph and log outputs from all the sensors do exist. Physics Toolbox is my favorite of those.

Thanks for making this app. It should be quite useful.

This is a great initiative. Despite the inaccuracies that ambient light sensors (most of them, anyway; some are better than others) have compared to good lux meters, it is, as you say, a great way to get a pretty good ballpark measurement of light output, with the added convenience of automatic runtime graphs (which is way ahead of my own medieval way of collecting data and making graphs).

When I have time I will try it all out. The readme you made looks great.

(from the readme: “Is this app a flashlight? - NO! Stop using your phone as a flashlight. Why are you even here?” :party: )

Eventually, I want to put together a website for people to share data so that runtime graphs and such are easy to find for most lights. Studying the accuracy of various phones’ light sensors would also be interesting. My regular phone is a Nexus 5, and somebody donated a Galaxy S3 to the project. The S3 reads significantly high with high-CRI Nichias I’ve tested, but the Nexus 5 is consistent with what I would expect given tests of those emitters with calibrated instruments. The S3 is almost 30% higher with a 219B R9080 when both phones are calibrated such that they read nearly identically with several unspecified-CRI Crees.

The other important issue I’ve noticed is that phone light sensors report some fraction or multiple of lux. My two phones are quite far from each other, but both appear to have a linear response: with the same light, a mode that should be twice as bright according to independent tests with real meters reads twice as bright on the phone. That means that regardless of accuracy and calibration issues, the app will produce runtime graphs with the right shape.
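
To illustrate why that works, here’s a sketch (not the app’s actual code, and the numbers are made up): normalizing readings to the peak cancels any constant scale factor, so two sensors that disagree on absolute lux still produce the same curve.

```clojure
;; Sketch only: normalizing to the peak reading cancels any constant
;; calibration factor, so the graph shape survives even if absolute lux is off.
(defn relative-output [readings]
  (let [peak (apply max readings)]
    (mapv #(/ % (double peak)) readings)))

(relative-output [500 500 250])  ; => [1.0 1.0 0.5]
(relative-output [650 650 325])  ; same modes on a sensor reading 30% high
;; => [1.0 1.0 0.5], identical shape
```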

Sweet! You beat me to it (probably by a number of years). It seems to work great on my current phone (Droid Turbo 2), but I have a Droid RAZR and Galaxy S4 with 4g problems that I plan to actually use to measure runtimes. Now I just need to build an integrating shoe box.

And clojure! You must be one of the cool kids!

Thanks!

Nice! I just ordered a Google Pixel phone, so I could try this out in a few weeks. Android is mostly all new to me, though.

Well, I certainly wasn’t going to write Java when not forced and/or paid a lot.

While it’s fundamentally a pretty simple app that doesn’t require much in the way of fancy language features, it has been really nice to have watchers, futures and promise/deliver. And the REPL, of course. I don’t know how people get anything done in environments where they can’t interact with their running code in real time.
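
For the curious, here’s the flavor of what I mean. This is a made-up sketch rather than the app’s actual code, and read-lux-sensor is just a stand-in for the real Android sensor read.

```clojure
;; Sketch only, not ceilingbounce's real code. A future polls the sensor
;; in the background, a watcher reacts to each new sample, and
;; promise/deliver signals the end of the test.
(defn read-lux-sensor []
  ;; stand-in for the real Android sensor read
  (rand 500))

(def readings  (atom []))     ; samples collected so far
(def test-done (promise))     ; delivered when the runtime test ends

(add-watch readings :check-done
  (fn [_key _ref _old samples]
    ;; arbitrary stop condition for the sketch
    (when (>= (count samples) 1000)
      (deliver test-done :finished))))

(future
  (while (not (realized? test-done))
    (swap! readings conj (read-lux-sensor))
    (Thread/sleep 500)))

;; Elsewhere, block until the test finishes:
;; @test-done
```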

Cool idea. Any plans for an iPhone variant?

Try developing/testing flashlight drivers. Take your best shot, take apart your light, rig up the clip/USB dongle, download the firmware, reassemble the light, test, over and over again. You do get better and better at getting it right the first time.

Any plans to put this on the Google Play store? I’m not sure if this would incur any cost to you.

I’ve side-loaded stuff on Android before but for some reason, the app that I normally use to load APKs isn’t recognizing the file at all. I copied the APK into 3 different directories on my phone and it’s not recognizing the APK automatically or even if I manually navigate to the folder. I’m using “APK Installer” by “Mobile Manager” (this same app has been used before to side-load other stuff).

Maybe I’ll just try another side loader.

This is on a Samsung S7 Edge.

Okay, I just tried a different loader and it worked fine. Odd.

That’s already answered in the readme about iOS. The simple answer is no: iOS and the App Store have too many restrictions.

It’s not simply that there are too many restrictions. It’s that use of the API required to access the ambient light sensor is specifically restricted from being used in any app sold or given away in the app store (and there are no sane options for end-users to sideload). If you look at cross-platform general sensor-monitoring apps like Physics Toolbox, you’ll see they don’t support the light sensor on iOS while they do on Android.

As for the Play store, yes, I do plan to put it there eventually. It costs $25 to do so.

Well, I just thought one restriction is one too many; that’s why I included a link to your readme.

There are luxmeter apps on the App Store such as this one: https://itunes.apple.com/us/app/galactica-luxmeter/id666846635?mt=8

But maybe they use a different low-level method? I see references stating they use the camera.

There are some iOS apps that try to detect light levels using the camera. As far as I can tell, that method would only be accurate with really low-level control over the camera that isn’t available on iOS. So, yes, one restriction is too many when it’s the core functionality of the app that’s restricted.

Updated:

  • The sample rate is decreased to one sample per 10 seconds after 100 minutes, to stay responsive in tests lasting many hours (a rough sketch of this follows after the list).
  • The PNG image of the graph doesn’t start writing until the background rendering is complete, because concurrency is hard.
  • The output graph is scaled appropriately regardless of the user panning or scaling the live graph.
  • A number pad is used instead of a generic keyboard for the calibration fields.
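
The sample-rate change mentioned in the first item amounts to something like this. It’s a sketch rather than the actual code, and the 1-second early-test interval is an assumption for illustration.

```clojure
;; Sketch of the sample-rate behavior described above, not the app's actual code.
;; The 1 s interval before the 100-minute mark is assumed for illustration.
(defn sample-interval-ms
  "Sampling interval: 1 s for the first 100 minutes, 10 s after that."
  [elapsed-ms]
  (if (< elapsed-ms (* 100 60 1000))
    1000
    10000))
```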

bump!
(so I can find it faster on the new phone and check it out for some Q8 runtime graphs)
Thanks for making it, Zak!

Hi Zak,

Been tinkering a bit with your app on my rooted good ol' Moto G 2013. Nice it is, despite its simplicity, though I have to remember to avoid my usual screen auto-rotate setting, because the app resets if the screen rotates.

I'm trying to get some valid throw values for a Sofirn C8T; the thing does 6+ A at the tailcap, but I guess I need to make some more table room, since the spot is small at just 0.75 m. I've already got 102.4 kcd.
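
(For reference, the candela figure just comes from the lux reading and the distance via the usual inverse-square relation. The lux value below is back-calculated for illustration, not my actual reading.)

```clojure
;; Throw in candela from a lux reading at a known distance (cd = lux * d^2).
(defn candela [lux distance-m]
  (* lux distance-m distance-m))

(candela 182000 0.75)  ; => 102375.0, roughly the 102.4 kcd above
```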

OK, here's the main question I came here for: do you think a PWM-detector function could be implemented in the app? I believe this could be done by polling the light sensor at two non-multiple frequencies, with a custom selectable polling frequency up to whatever limit is practical (this is just in case we start polling at exactly the PWM frequency, LoL!), and looking for noticeable variations in the measurements over a very brief lapse. Any non-PWM beam values would be nearly identical, and it would only take a second of polling or much less. A graph window of up to half a second or one second would be plenty, I believe.

Cheers ^:)

The fastest the light sensor on my Nexus 5 seems to sample is once every 0.2 seconds. That’s not a viable way to detect PWM.

What could work is to use the camera. At a high shutter speed, with the electronic rolling shutters used in most smartphones, ripple and PWM look something like this. That kind of pattern should be detectable with an algorithm, and it may even be possible to estimate frequency.
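
For what it’s worth, here’s a rough sketch of how that detection might go, assuming you can get per-row luminance values out of a capture. The names and stop conditions are made up, and turning a band period into Hz would still need the sensor’s row readout time, which is camera-specific.

```clojure
;; Sketch only: look for horizontal banding in a rolling-shutter photo.
;; `pixel-rows` is a seq of rows, each a seq of luminance values (0-255).
(defn row-means [pixel-rows]
  (mapv (fn [row] (/ (reduce + row) (double (count row)))) pixel-rows))

(defn banding-score
  "Average row-to-row change relative to the overall mean. A steady light
   photographed at a fast shutter speed should score near zero; PWM or
   heavy ripple should score noticeably higher."
  [means]
  (let [mean  (/ (reduce + means) (count means))
        diffs (map (fn [a b] (Math/abs (double (- a b)))) means (rest means))]
    (/ (reduce + diffs) (* mean (count diffs)))))

(defn band-period-rows
  "Rough band period: average spacing, in rows, between local maxima.
   Converting this to Hz needs the sensor's row readout time."
  [means]
  (let [peaks (keep-indexed
                (fn [i [a b c]] (when (and (> b a) (> b c)) (inc i)))
                (partition 3 1 means))]
    (when (> (count peaks) 1)
      (/ (- (last peaks) (first peaks))
         (double (dec (count peaks)))))))
```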