I don't know about you, but I mow the lawn quite often. In the summer months, it is a bit more often than I care for it to be. "Maybe you should just cut it shorter. That way you will have longer for it to grow and you won't have to mow as often." This is what my wife said one day. Maybe she's right, but that's not what I heard. I was told that if you cut the grass too short, it grows even faster. Also, I figured that if grass grows at a linear rate, then it would get as tall as an elephant's eye in a few weeks.
Well, which idea is right? There is nothing to stop me from collecting some data on the height of grass, right? So that's just what I did. I put a stick in the ground to use as a reference and then took a picture of it for a few days. Here is one of those pictures.
Yes, my method might not be perfect - but it's all I have. Oh, and one other thing: I took these pictures LAST summer. Yup. That's how long it took me to get to this growing grass problem. I blame Angry Birds for getting in the way. However, there is one significant problem with the data being so old - I don't remember how long the stick actually is. So, I just guessed. But the nice thing about taking images of the grass is that the metadata in each photo includes the date. Boom - there's your time data. Now I just need to estimate the height of the grass in each picture and I can get the following plot.
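(If you want to grab the dates yourself, here is a rough Python sketch of reading the timestamp out of a photo's EXIF metadata. The Pillow library and the file name grass_day1.jpg are just my choices for the example, not part of the original analysis.)

```python
# Sketch: read the date a photo was taken from its EXIF metadata.
# Assumes the Pillow library (pip install Pillow) and a JPEG straight from a camera.
from PIL import Image

def photo_date(filename):
    """Return the EXIF DateTime string, or None if the photo has no EXIF data."""
    exif = Image.open(filename).getexif()
    return exif.get(306)  # 306 is the EXIF "DateTime" tag, e.g. "2011:07:14 09:32:05"

print(photo_date("grass_day1.jpg"))  # hypothetical file name
```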
Someone untrained in the art of grass debates might make a mistake here. That person might say "Oh hey. That data looks quite linear! Let's fit a linear function to it." That would be bad (although you can see I did that anyway). First, how can you say that you should cut the grass at a longer length if the growth rate is constant? You can't. Plus, I still think it would have to reach some maximum height. Ok. Let's just find the slope of this line to make you happy. Doing so, I get a growth rate of 0.0098 meters per day (1.13 × 10⁻⁷ m/s). If this growth rate were constant for 2 months, the grass would be 0.6 meters taller than when it started. Maybe you could get grass that high, maybe. I guess it depends on the type of grass. Bonus note: it is better to plot the data and find the slope than to just take the height divided by the time. Here's why: a fit uses all of the data points instead of just two, and the intercept soaks up any offset in my height estimates (like my guess at the stick length).
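Here's the kind of fit I mean, as a short Python sketch. The heights below are placeholder numbers, not my actual measurements, so the slope won't come out to exactly 0.0098 m/day - it's just there to show the method:

```python
import numpy as np

# Placeholder data: time in days since the first photo, estimated heights in meters.
t = np.array([0.0, 2.0, 4.0, 7.0, 9.0, 11.0])
h = np.array([0.05, 0.07, 0.09, 0.12, 0.14, 0.16])

# Fit h = m*t + b; for the linear model the slope m is the growth rate.
m, b = np.polyfit(t, h, 1)
print(f"growth rate = {m:.4f} m/day = {m / 86400:.2e} m/s")
```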
But how else could I model the grass growth? What if the rate of growth is related to the height of the grass? I will call the height h and model the growth as dh/dt = k/h.
Here the rate of change (with respect to time) of height depends on the height. That means that we have a differential equation (with some constant in there - k). This one is pretty easy to solve.
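Since I'm claiming it's easy, here is the separation of variables spelled out, with h_0 as the height at t = 0:

```latex
\frac{dh}{dt} = \frac{k}{h}
\quad\Rightarrow\quad
h\,dh = k\,dt
\quad\Rightarrow\quad
\frac{h^2}{2} - \frac{h_0^2}{2} = k t
\quad\Rightarrow\quad
h^2 = h_0^2 + 2kt
```

So h² grows linearly in time with slope 2k - that's what makes the next plot useful.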
Notice that this model for height doesn't have a maximum grass limit. However, it would take a pretty darn long time to grow really high. If this model is accurate, then I could plot h² vs. t and it should be a straight line. Here is the data.
For this plot, the slope is not the growth rate (since the growth rate isn't constant anyway). Instead, the slope would be 2k, with a value of 0.0013 m²/day.
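The fit is the same trick as before, just done on h² instead of h (same placeholder numbers as above, so the slope here won't match the real 0.0013 m²/day):

```python
import numpy as np

# Same placeholder data as the linear fit above.
t = np.array([0.0, 2.0, 4.0, 7.0, 9.0, 11.0])
h = np.array([0.05, 0.07, 0.09, 0.12, 0.14, 0.16])

# Fit h^2 = (2k)*t + h0^2; the slope of this line is 2k.
slope, intercept = np.polyfit(t, h**2, 1)
k = slope / 2
print(f"slope (2k) = {slope:.5f} m^2/day, so k = {k:.5f} m^2/day")
```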
Which model should I use? Obviously, if I want an excuse to cut the grass at a longer height, I should go with the second model. Also, it's more interesting. Of course, both models seem to work fine with the data that I have. Really, I would need more data. If I had measured the grass height out to 20 days instead of just 11, that would help out a bunch.
Just as a comparison, here is a prediction from both of the above models over a longer time interval.
Although my non-linear model doesn't fit as well for the shorter grass, it seems more realistic for longer grass. But what do I know? Oh, and I cheated. Yes, I did. In order to make this easier to plot, I assumed a zero intercept for the non-linear model. It wasn't actually zero (it was close) but this makes it easier to solve for h as a function of time.
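If you want to redraw that comparison, here is a sketch using the two fitted slopes from above and the zero-intercept shortcut I just confessed to (matplotlib is just my choice for the plot):

```python
import numpy as np
import matplotlib.pyplot as plt

m = 0.0098      # linear model growth rate, m/day
two_k = 0.0013  # slope of the h^2 fit, m^2/day

t = np.linspace(0, 60, 200)       # about two months of growing
h_linear = m * t                  # linear model: h = m*t (zero intercept)
h_model2 = np.sqrt(two_k * t)     # h^2 model: h = sqrt(2*k*t), zero intercept assumed

plt.plot(t, h_linear, label="linear model")
plt.plot(t, h_model2, label="h^2 model")
plt.xlabel("time (days)")
plt.ylabel("grass height (m)")
plt.legend()
plt.show()
```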
Optimal Grass Height
Suppose I cut the grass to a height hc. How long before I have to cut the grass again? That is, if I let it grow to a height of hc + Δh, how long would that take? I will pick a constant Δh of 0.026 meters based on the data from the actual lawn. Here is a plot of the time to reach that height as a function of the starting grass height.
According to this plot, I win. If you cut the grass super short, you would have to cut it again in 1.3 days. If I cut the grass so it is twice as tall, I could wait about 2 days. Both of those are still pretty short, though. Letting the starting grass height be 6 times as tall gives me almost a week between cuts.
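For the curious, those times come straight from the h² model. Setting h² = hc² + 2kΔt and asking when the height reaches hc + Δh gives Δt = (2·hc·Δh + Δh²)/(2k). The specific cut heights below (0.02, 0.04, and 0.12 meters) are my guesses at what "super short", "twice as tall", and "6 times as tall" mean - they aren't measured values:

```python
two_k = 0.0013   # slope of the h^2 fit, m^2/day
dh = 0.026       # regrowth (in meters) that triggers the next mow

def days_until_next_mow(hc):
    """Time for grass cut to height hc (m) to grow by dh, under the h^2 model."""
    return (2 * hc * dh + dh**2) / two_k

for hc in (0.02, 0.04, 0.12):  # guessed cut heights in meters
    print(f"cut at {hc:.2f} m -> mow again in {days_until_next_mow(hc):.1f} days")
```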
Problems: If you think all of these calculations are kind of bogus, I mostly agree with you. Here are some problems that I can see (mostly dealing with not having enough data).
- I measured the growth of grass as a function of time. Of course, there were things that happened - like rain and stuff. This could possibly (likely) have a big impact on the growth rate.
- I made the assumption that the growth rate depends on the current height of the grass. What if the growth rate instead depends on the height the grass was CUT at? If that's the case, my data is worthless since I only ever cut the grass at one particular height.
- My model is bogus. I already admitted that my growth model might be less than realistic.
- What if it isn't the change in height of the grass that makes me need to mow it? What if it's that the grass height gets uneven? As the grass grows, different blades grow at different rates. If all the grass were the same height, it would look "neater" than grass with varying heights.
So, what should I do? I could collect a lot more data. That would help. Unfortunately, I don't have time to collect this data. I have to go mow the lawn.