We'd like a method for taking a program and producing a
mathematical function representing the amount of time the program
takes to run.
We can't do this, in general.
However, we can begin to make simplifying assumptions that let us
do this approximately. In order of decreasing accuracy:
- Simplifying assumption #1: The time it takes to execute a simple
statement (e.g., an assignment such as x = x + 1) depends only on the
statement itself (not the processor load, the amount of memory in the
computer, the time of day, the number of users, the value of x, ...).
- Simplifying assumption #2: The time it takes to execute a pair of
statements is the sum of the times to execute the individual
statements (ignoring issues like pipelining, etc.).
- Simplifying assumption #3: Any simple statement takes "unit"
time (evaluating any arithmetic expression, assigning a variable,
initializing a loop, etc.).
- Simplifying assumption #4 (for later): Any constant-length (i.e.,
independent of the input) sequence of statements takes "unit" time.
This will be enough to get us started. We'll need to make
some even stronger assumptions later!
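As a rough illustration of how these assumptions are used (a hypothetical
sketch, not part of the original notes), consider counting the "unit"
steps in a small loop that sums a list of n numbers. Under assumptions
#1 through #4, the setup and return cost a constant number of units and
each pass through the loop body costs one unit, so the running time
becomes a function of n alone.

    # Hypothetical sketch: counting "unit" steps under assumptions #1-#4
    # for a loop that sums a list of n numbers.

    def sum_list(values):
        total = 0              # constant-length setup: 1 unit (assumption #4)
        for x in values:       # loop body executes n times
            total = total + x  # each simple statement: 1 unit (assumption #3)
        return total           # constant-length finish: 1 unit

    def estimated_units(n):
        # Summing the costs of the individual statements (assumption #2)
        # gives roughly T(n) = 2 + n: a constant for setup and return,
        # plus one unit per loop iteration.
        return 2 + n

    if __name__ == "__main__":
        data = [3, 1, 4, 1, 5]
        print(sum_list(data))              # 14
        print(estimated_units(len(data)))  # 7

The exact constants don't matter much; the point is that, under the
assumptions, the cost depends only on the length of the input.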