Runtime Graph - Help Needed

This is the first time I have attempted a runtime graph and so I am having some difficulty with it.

The device I am using to log light levels is a Canon Powershot 620 running CHDK software.

The problem is that when the light level from the light drops, the numerical figure generated by the CHDK software becomes a minus figure. The light is still producing output, though, even if it's only around 5 lumens (a guess!).

When I try to import this data into a graph it proves troublesome, and I don't know how to go about handling these negative numbers. Ideally I want zero to be when the light shuts off completely... how do I coax these figures without distorting the graph?

Here is an example of what I mean

Thanks for any help with this.

Simplest method would be to add around 450 to all the values.

If one is being pretentious about it, that's called "normalisation".
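A minimal sketch of the offset trick in Python (the function name `normalise` and the `off_level` value of -450 are my own assumptions; use whatever figure your camera actually reports when the light is fully off):

```python
def normalise(readings, off_level=-450):
    """Shift readings so that off_level maps to zero.

    off_level is assumed to be the figure CHDK reports once the
    light has shut off completely (here guessed at about -450).
    """
    return [r - off_level for r in readings]

readings = [812, 640, 310, -120, -300, -450]
print(normalise(readings))  # [1262, 1090, 760, 330, 150, 0]
```

Because every value is shifted by the same amount, the shape of the curve is untouched; only the axis labels move, which is exactly what you want for a runtime graph.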

Thanks Don, that makes sense. It actually seems like a stupid question now that I think about it.
Sometimes I can't see the wood for the trees when it comes to mathematics!

I spent several years of my life doing this stuff in labs. When your results don't make sense, time to normalise it.

Once spent three months doing things to sewage samples. Basically looking for cadmium.

Did the stats on the numbers and got the tightest results I've ever seen. Since someone was going to prison if these numbers were right, this stuff was serious: it looked like someone was pumping seriously dangerous industrial waste down residential sewers 15 miles from the offender's factory. From the metals involved it was an electroplating outfit, and like most plating outfits the waste was loaded with cyanide.

With wet bench chemistry you are lucky to get statistical errors under 10%. Mine were under 2%. Didn't believe it.

My boss didn't believe it either. Nor did his boss (both PhDs).

So they grabbed my lab notebooks and did their own stats. Got the same answers. That was a bonus!

But at this point the lawyers got involved. For prosecution work you divide each sample into three parts. We spent a month analysing a second set of the same samples (by extracting them into chloroform; 30 years later I still get ill in the presence of chloroform), and sent a third set out to another lab using a totally different method of analysis.

To cut three months of work short, the results depended only on how long one piece of lab gear (that cost more than my house did) had been switched on. When we got the independent analysis back, I'd got a series of random numbers with errors larger than the actual values. Which reassured me about my technique, which I just knew wasn't that good.

For fun (I did say I used to be (?still am?) a geek, didn't I?), I tried to find a suitable equation to make sense of my data. Could just about normalise it with a two-page equation.

Merely adding numbers is a no-brainer. But if you haven't had to do this stuff it wouldn't necessarily occur to you. I've no doubt there are problems that come up in your day-to-day work that I wouldn't have a clue about and would have to ask. But they would be glaringly obvious once you told me.

Like the typesetter I was speaking to, whom I asked why the margins on book pages are the way they are.

100% - max

0% - min

First result = max

Last result = a little bit more than min.

Here's an example: http://www.light-reviews.com/jetbeam_bc10/runtime_high.gif