Introducing Ceilingbounce - flashlight testing and runtime graphs for Android

This is a little rough and definitely expects the user to know a bit about both Android and photometrics. If sideloading and file managers are totally foreign concepts, it may not be ready for you just yet.

Please do read the README twice before asking for support.

Will code for food and/or lights.

Thanks to reddit user kaybi_, I’ve fixed a bug that caused the UI to hang some time into a runtime graph. I’ve also reduced the sample rate, resulting in smaller CSV files.

This app makes it easy to see how your flashlight behaves under a variety of conditions. For example, here’s the intersection of a partly discharged and somewhat worn out 18650 and a timed stepdown.

This is interesting. I’ve wondered in the past why an app like this doesn’t exist. Now it does, and it seems to work fine. I just need to make some kind of light box for my phone. A film can would be a handy mini integrating device, if it worked, but the reading changes too much when I move it just a hair for it to be of much use. Anyway, I’ll be following any developments of this app.

Here’s a shot of it working in my, um, lab. (The light is a Foursevens Atom with an almost empty battery.)

I’m working on a way to do integration without any extra hardware, but no promises. I think a film can will act as a diffuser more than an integrator and is probably only good for runtime graphs. An update with improvements to graphing, especially over very long periods of time, is coming soon.

Apps that display, graph and log outputs from all the sensors do exist. Physics Toolbox is my favorite of those.

Thanks for making this app. It should be quite useful.

This is a great initiative. Despite the inaccuracies that ambient light sensors have compared to good lux meters (most of them, anyway; some are better than others), as you say it is a great way to get a pretty good ballpark measurement of light output, with the added convenience of automatic runtime graphs (which is way ahead of my own medieval way of collecting data and making graphs).

When I have time I will try it all out. The readme you made looks great.

(from the readme: “Is this app a flashlight? - NO! Stop using your phone as a flashlight. Why are you even here?” :party: )

Eventually, I want to put together a website for people to share data so that runtime graphs and such are easy to find for most lights. Studying the accuracy of various phones’ light sensors would also be interesting. My regular phone is a Nexus 5, and somebody donated a Galaxy S3 to the project. The S3 reads significantly high with high-CRI Nichias I’ve tested, but the Nexus 5 is consistent with what I would expect given tests of those emitters with calibrated instruments. The S3 is almost 30% higher with a 219B R9080 when both phones are calibrated such that they read nearly identically with several unspecified-CRI Crees.
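Cross-calibrating phones like that amounts to choosing one scale factor per sensor so both agree on a reference light, then checking whether they still agree on other emitters. A hypothetical sketch of the arithmetic (in Python for illustration; the app itself is written in Clojure, and every number below is invented):

```python
def calibration_factor(reference_lux, raw_reading):
    """Scale factor mapping this sensor's raw reading onto a reference meter."""
    return reference_lux / raw_reading

# Calibrate both phones against the same unspecified-CRI Cree light
# (invented readings; the reference meter says 1000 lux).
nexus5_cal = calibration_factor(1000.0, 850.0)
s3_cal = calibration_factor(1000.0, 1100.0)

# Invented raw readings for a high-CRI 219B: even after calibration the
# phones disagree, which is the spectral-response discrepancy described
# above.
nexus5_lux = 425.0 * nexus5_cal
s3_lux = 715.0 * s3_cal
print(s3_lux / nexus5_lux)  # roughly 1.3 -- the S3 reads ~30% high
```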

The other important issue I’ve noticed is that phone light sensors report some fraction or multiple of true lux. My two phones are quite far from each other in absolute terms, but both appear to have a linear response: with the same light, a mode that should be twice as bright according to independent tests with real meters reads twice as bright on the phone. That means that regardless of accuracy and calibration issues, the app will produce runtime graphs with the right shape.
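That linearity argument is easy to demonstrate: normalize each sample to the first reading and any constant scale factor cancels out. A quick sketch with invented numbers (Python for illustration; the app itself is written in Clojure):

```python
# Invented lux logs of the same runtime from two linear sensors whose
# absolute scale factors differ by about 30%.
sensor_a = [500.0, 490.0, 250.0, 245.0, 60.0]
sensor_b = [650.0, 637.0, 325.0, 318.5, 78.0]

def relative_output(samples):
    """Normalize each reading to percent of the initial reading."""
    first = samples[0]
    return [100.0 * s / first for s in samples]

# Despite the calibration gap, both produce an identically shaped graph.
print(relative_output(sensor_a))  # [100.0, 98.0, 50.0, 49.0, 12.0]
print(relative_output(sensor_b))  # [100.0, 98.0, 50.0, 49.0, 12.0]
```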

Sweet! You beat me to it (probably by a number of years). It seems to work great on my current phone (a Droid Turbo 2), but I have a Droid RAZR and a Galaxy S4 with 4G problems that I plan to actually use to measure runtimes. Now I just need to build an integrating shoe box.

And clojure! You must be one of the cool kids!


Nice! I just ordered a Google Pixel phone, so I could try this out in a few weeks. Android is mostly new to me, though.

Well, I certainly wasn’t going to write Java when not forced and/or paid a lot.

While it’s fundamentally a pretty simple app that doesn’t require much in the way of fancy language features, it has been really nice to have watchers, futures and promise/deliver. And the REPL, of course. I don’t know how people get anything done in environments where they can’t interact with their running code in real time.

Cool idea. Any plans for an iPhone variant?

Try developing/testing flashlight drivers. Take your best shot, take apart your light, rig up the clip/USB dongle, download, assemble the light, test -- over and over again. You do get better and better at getting it right the first time.

Any plans to put this on the Google Play store? I’m not sure if this would incur any cost to you.

I’ve side-loaded stuff on Android before but for some reason, the app that I normally use to load APKs isn’t recognizing the file at all. I copied the APK into 3 different directories on my phone and it’s not recognizing the APK automatically or even if I manually navigate to the folder. I’m using “APK Installer” by “Mobile Manager” (this same app has been used before to side-load other stuff).

Maybe I’ll just try another side loader.

This is on a Samsung S7 Edge.

Okay, I just tried a different loader and it worked fine. Odd.

Already answered in the readme about iOS. The simple answer is no; iOS and the App Store have too many restrictions.

It’s not simply that there are too many restrictions. It’s that use of the API required to access the ambient light sensor is specifically restricted from being used in any app sold or given away in the app store (and there are no sane options for end-users to sideload). If you look at cross-platform general sensor-monitoring apps like Physics Toolbox, you’ll see they don’t support the light sensor on iOS while they do on Android.

As for the Play store, yes, I do plan to put it there eventually. It costs $25 to do so.

Well, I just thought one restriction is one too many; that's why I included a link to your readme.

There are luxmeter apps on the App Store such as this one:

But maybe they use a different low-level method? I see references stating they use the camera.

There are some iOS apps that try to detect light levels using the camera. As far as I can tell, that method would only be accurate with really low-level control over the camera that isn’t available on iOS. So, yes, one restriction is too many when it’s the core functionality of the app that’s restricted.


  • The sample rate is decreased to one sample per 10 seconds after 100 minutes to stay responsive in tests lasting many hours.
  • The PNG image of the graph doesn’t start writing until the background rendering is complete, because concurrency is hard.
  • The output graph is scaled appropriately regardless of the user panning or scaling the live graph.
  • A number pad is used instead of a generic keyboard for the calibration fields.
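The sample-rate change in the first bullet could look something like this (a hypothetical Python sketch, not the app's actual Clojure code; the 10-second interval and 100-minute cutoff come from the list above, while the initial one-second interval is an assumption):

```python
FAST_INTERVAL_S = 1.0    # assumed initial rate: one sample per second
SLOW_INTERVAL_S = 10.0   # one sample per 10 seconds after the cutoff
CUTOFF_S = 100 * 60      # switch rates 100 minutes into the test

def sample_interval(elapsed_s):
    """Pick the sampling interval for the current elapsed test time."""
    return FAST_INTERVAL_S if elapsed_s < CUTOFF_S else SLOW_INTERVAL_S

print(sample_interval(30 * 60))   # 1.0  -- half an hour in
print(sample_interval(3 * 3600))  # 10.0 -- three hours in
```

Dropping to one sample per 10 seconds keeps multi-hour CSV logs small without visibly changing the shape of a long, slowly declining runtime curve.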

(so I can find it faster on the new phone and check it out for some Q8 runtime graphs)
Thanks for making it, Zak!