le_bebna_kamni wrote 2009-01-26 12:18 pm
Fun with Recursion
I've been trying to learn Python, because despite my growing love for Ruby and Rails, it seems that in order to have fun at the cool kids' table I can't avoid it. ;P
Matt Arnold has provided me with the perfect opportunity to learn Python, because he's also learning it in parallel with his Intro to Java class. He brings home his assignments, writes them in Java first, and then rewrites them in Python. I check out his assignments and translate them into Python as well.
His most recent assignment just begged for a chance to test Python's recursion capabilities.
Imagine that you have a credit card that has an 18% APR, or 1.5% interest per month. If you have a starting balance of $1000, and every month you send a $50 dollar payment, how many payments will it take you to pay off the debt? Assume that interest is added before you make the payment, so that after your first payment of $50 the balance will be down to $965.
Here's the first version of the program:
def number_of_payments ( amount, count = 0 ):
    if amount * 1.015 <= 50:
        return count + 1
    else:
        return number_of_payments ( amount * 1.015 - 50, count + 1 )

print "Number of Payments: ", number_of_payments (1000)

Here's the program done again iteratively:

amount = 1000.00
number_of_payments = 0
while amount > 0:
    amount *= 1.015
    amount -= 50
    number_of_payments += 1

print "Number of Payments: ", number_of_payments
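As a sanity check of my own (not part of the assignment), the answer can also be computed in closed form: with monthly rate r, payment p, and starting balance b, the balance after n payments is b(1+r)^n − p((1+r)^n − 1)/r, and setting that to zero solves for n. A sketch, with `payments_closed_form` being a name I made up:

```python
import math

def payments_closed_form(balance, rate=0.015, payment=50.0):
    # Balance after n payments: balance*(1+rate)**n - payment*((1+rate)**n - 1)/rate.
    # Setting it to zero gives (1+rate)**n = payment / (payment - balance*rate),
    # which only has a solution when the payment beats the monthly interest.
    assert payment > balance * rate, "this balance can never be paid off"
    n = math.log(payment / (payment - balance * rate)) / math.log(1 + rate)
    return math.ceil(n)  # the last payment is partial, so round up

print(payments_closed_form(1000.0))  # → 24, matching the loop above
```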
Now for the Fun Problem
Both the iterative and the recursive programs failed for any initial balances over $3333.00. With recursion, Python failed immediately, giving me a stack overflow error, but with iteration, it would run forever until I Ctrl-C'ed it. I could understand the stack overflow, but what was going on with the iteration?
I tried the same code out in Java and assembly, and ran into the exact same problem every time. Clearly, it was not simply an issue with Python. After an hour of trying to figure out if it was something wrong with my code, I was beginning to wonder if there was some inexplicable problem in Ubuntu with memory management.
And then I had the magical "Doh!" moment.
You see, the errors/perpetual loops I was getting were not errors, but an indication that $3333 (more precisely, $50 / 0.015 ≈ $3333.33) is the threshold above which a $50 minimum payment will never pay off the balance. Above that point the monthly interest exceeds $50, and the balance spirals off into an infinite number of payments.
Now I'm wondering if I could write a program that, given any interest rate and minimum payment, could calculate the maximum balance that would allow the debt to be paid off. I'm not sure if I should include stipulations like "paid off during a normal lifetime", "paid off before the sun novas", or "paid off before the heat death (or contraction, depending on your model) of the universe". ;)
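That program turns out to be a one-liner, no stipulations needed: the debt shrinks only while the payment exceeds the monthly interest, so the break-even balance is payment / rate. A sketch (the function name is mine):

```python
def max_payoff_balance(rate, payment):
    """Largest starting balance a fixed payment can ever pay off.

    The balance decreases only while payment > balance * rate, so the
    break-even point is payment / rate. At or above it, the debt holds
    steady or grows forever (no lifetimes or heat death required).
    """
    return payment / rate

print(max_payoff_balance(0.015, 50))  # the ~$3333 threshold from above
```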
Some Thoughts
The exercise has given me some fun insight into recursion and iteration. I had always been told that with primarily iterative languages -- e.g. Java and Python, as opposed to, say, ML -- recursion is generally less efficient, even if it happens to be easier to write for certain problems like the Tower of Hanoi. While I can't detect a slower run-time, I can see that recursion eats up the stack like no tomorrow. Clearly, with these languages iteration is the best choice for most applications, even if it involves a little more thought on the programmer's part.
But I can also see where catching a stack overflow error may be of some use. My experiment revealed that there are many cases where unexpected values (from user input, for example?) could cause an iterative program to run to infinity. In the case of my program about calculating credit card payments, there are some minimum payments that just won't ever pay off the debt. In these cases I wonder if the stack overflow errors generated by recursion -- which can be caught -- might be a useful way to handle unknown, potentially infinite, values**.
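Here's my guess at how that might look in Python 3, where blowing the stack raises a catchable RecursionError (older Pythons surfaced it differently); the wrapper treats hitting the limit as "this probably never pays off":

```python
def number_of_payments(amount, count=0):
    # Same recursive version as above: add interest, make a payment, recurse.
    if amount * 1.015 <= 50:
        return count + 1
    return number_of_payments(amount * 1.015 - 50, count + 1)

def payments_or_none(amount):
    """Return the payment count, or None if the recursion blows the stack.

    Hitting Python's recursion limit (about 1000 frames by default) is
    treated as "this balance can't be paid off" -- a heuristic, since a
    payable-but-very-slow balance would also trip the limit.
    """
    try:
        return number_of_payments(amount)
    except RecursionError:
        return None

print(payments_or_none(1000))  # → 24
print(payments_or_none(5000))  # → None; the debt only grows
```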
All in all, a fun problem to solve. I'm rather enjoying making simple programs more complex than necessary simply for the sake of breaking and fixing things.
-----------------------------------------------------------
** That assumes that the programmer doesn't think to look for some sane value, like the expected lifetime of a human being, to set as a checkpoint. However, I can see such an artificial limit quickly becoming obsolete if we ever hit the singularity. ;P
(It also assumes that my meager, relatively useless program will still exist somewhere when the singularity occurs. lol)