Section 12.5 Key Concepts
- The call stack is a data structure that tracks all currently executing methods.
- A stack frame tracks the state of one executing method on the call stack.
- A stack frame contains all the information about its associated active method invocation, such as its parameters, local variables, and return address.
- Stack frames are pushed onto the top of the call stack when a method begins executing and popped off the top when the method finishes executing.
- When a method invokes itself, a new stack frame is pushed onto the call stack so that each active invocation of the method is tracked separately.
- The size of the call stack is limited by the available memory. If too many nested method calls are made, the call stack may overflow, causing a StackOverflowError to be thrown (see the stack-overflow sketch after this list).
- When a method invokes itself during execution, either directly or through other methods, the process is called recursion.
- When a recursively invoked method returns, the method call resolves to the returned value.
- Many algorithms may be implemented using recursion.
- Most recursive algorithms have alternative iterative implementations that tend to use less memory because they do not grow the call stack.
- Computing a factorial, a Fibonacci number, or the greatest common divisor has an elegant recursive implementation as well as an iterative one (see the factorial and GCD sketches after this list).
- Recursion may be classified as direct recursion, in which a method invokes itself, or indirect recursion, in which a method invokes itself through one or more other methods.
- All recursive programs must have at least one base case and at least one recursive case.
- A base case executes when the recursive algorithm can produce a result directly and no further recursion is required.
- A recursive case must move the state of the program closer to a base case in some way to ensure that the recursion terminates.
- There may be multiple base cases and multiple recursive cases in a recursive algorithm.
- Multi-case recursion occurs when a method has more than one base case and/or more than one recursive case.
- Multiple recursion occurs when the block of code that implements a single recursive case invokes multiple recursive methods or the same recursive method multiple times (the naive Fibonacci sketch after this list makes two recursive calls).
- Dynamic programming is a technique used to improve the performance of certain recursive methods by saving intermediate results that would otherwise be computed multiple times (see the memoized Fibonacci sketch after this list).
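
The following sketch illustrates the call-stack overflow described above. The class and method names (StackOverflowDemo, recurseForever) are invented for this illustration; the point is only that a recursive call that never reaches a base case keeps pushing stack frames until the JVM throws a StackOverflowError.

```java
// Illustrative sketch (names invented for this example): a recursive
// method with no base case keeps pushing stack frames until the call
// stack is exhausted and a StackOverflowError is thrown.
public class StackOverflowDemo {

    static void recurseForever(int depth) {
        recurseForever(depth + 1);  // no base case: a new frame is pushed on every call
    }

    public static void main(String[] args) {
        try {
            recurseForever(0);
        } catch (StackOverflowError e) {
            System.out.println("Call stack overflowed");
        }
    }
}
```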
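
A minimal sketch of the factorial and greatest-common-divisor examples mentioned in the list, showing the base case and recursive case in each recursive method, with an iterative factorial for comparison. The class and method names are chosen for this illustration only.

```java
// Illustrative sketch (names invented for this example): factorial and
// Euclid's greatest common divisor. Each recursive method has a base case
// that stops the recursion and a recursive case that moves the arguments
// toward the base case.
public class RecursionExamples {

    // Base case: 0! = 1! = 1. Recursive case: n! = n * (n - 1)!.
    public static long factorial(int n) {
        if (n <= 1) {
            return 1;                      // base case
        }
        return n * factorial(n - 1);       // recursive case: n shrinks toward 1
    }

    // Iterative factorial: same result without growing the call stack.
    public static long factorialIterative(int n) {
        long result = 1;
        for (int i = 2; i <= n; i++) {
            result *= i;
        }
        return result;
    }

    // Euclid's algorithm. Base case: gcd(a, 0) = a.
    // Recursive case: gcd(a, b) = gcd(b, a mod b), where a mod b shrinks toward 0.
    public static int gcd(int a, int b) {
        if (b == 0) {
            return a;                      // base case
        }
        return gcd(b, a % b);              // recursive case
    }

    public static void main(String[] args) {
        System.out.println(factorial(10));          // 3628800
        System.out.println(factorialIterative(10)); // 3628800
        System.out.println(gcd(48, 36));            // 12
    }
}
```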
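
A sketch of the classic Fibonacci method as an example of multiple recursion: its single recursive case makes two recursive calls, and it has two base cases. The class and method names are illustrative.

```java
// Illustrative sketch (names invented for this example): naive Fibonacci.
// The one recursive case invokes the same method twice, so this is
// multiple recursion; n == 0 and n == 1 are the two base cases.
public class NaiveFibonacci {

    public static long fib(int n) {
        if (n == 0) {
            return 0;                      // first base case
        }
        if (n == 1) {
            return 1;                      // second base case
        }
        return fib(n - 1) + fib(n - 2);    // recursive case with two recursive calls
    }

    public static void main(String[] args) {
        System.out.println(fib(10));       // 55
    }
}
```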
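
A sketch of dynamic programming applied to the Fibonacci method, assuming a top-down memoization approach: each intermediate result is stored in a map so it is computed only once. The class name, the memo field, and the choice of HashMap are assumptions made for this illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (names invented for this example): memoized Fibonacci.
// Saving each intermediate result means every fib(k) is computed only once,
// which is the idea behind applying dynamic programming to a recursive method.
public class MemoizedFibonacci {

    private static final Map<Integer, Long> memo = new HashMap<>();

    public static long fib(int n) {
        if (n <= 1) {
            return n;                        // base cases: fib(0) = 0, fib(1) = 1
        }
        if (memo.containsKey(n)) {
            return memo.get(n);              // reuse a previously computed result
        }
        long result = fib(n - 1) + fib(n - 2);
        memo.put(n, result);                 // save the intermediate result
        return result;
    }

    public static void main(String[] args) {
        System.out.println(fib(50));         // 12586269025; impractically slow without memoization
    }
}
```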