An algorithm is ...
a procedure (a sequence of steps) to accomplish a task.
- It's the idea behind any program.
To be interesting (and worth developing) it should
enjoy long(er)-term use/reuse.
What do we mean by "PROBLEM" versus instance? Consider our old friend sorting...
that is, solving a well-specified PROBLEM versus a particular instance of it
Solving the well-specified PROBLEM, expressed as:
- Input: a sequence of n keys
- Output: a permutation (reordering) of the input sequence s.t. the keys are in nondecreasing order
Versus solving an instance of it, expressed as:
- Reverse alphabetize the students currently enrolled in 157 by their middle names,
  using "Fred" for those students without middle names. YUCK!
Algorithms developed to solve the sorting problem:
Insertion, Bubble, Radix, Heap, Column, ????
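To make the PROBLEM/instance split concrete, here is a sketch: insertion sort solves the general sorting PROBLEM (any sequence of keys), and any correct sorting routine then handles the one-off instance by supplying the right comparison key. The student tuples are made up for illustration.

```python
def insertion_sort(keys):
    """Solve the general sorting PROBLEM: return a permutation of
    the input sequence with keys in nondecreasing order."""
    result = list(keys)
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        # Shift larger elements right to open a slot for key.
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

# The instance, by contrast, is a one-off job for any correct sorter:
# (first name, middle name) pairs; names are hypothetical.
students = [("Alice", "Quinn"), ("Bob", None), ("Carol", "Drew")]
by_middle = sorted(students,
                   key=lambda s: s[1] if s[1] is not None else "Fred",
                   reverse=True)
```

Note the instance bakes in course-specific quirks (the "Fred" default, reverse order) that don't belong in the general algorithm.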
How many are there?
How do you choose?
What are desirable properties?
How should we compare algorithms?
Take Home Lessons
Reasonable-looking algorithms can easily be incorrect; they can often be dead wrong.
Correctness is a property that must be demonstrated/proven.
"It's obvious" never suffices.
These were the naive 157 ways...but won't fly for the
more sophisticated 320 student!
Neither does a demo on an instance.
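The lessons above can be sketched with the classic greedy coin-change example: the algorithm looks reasonable and a demo on one instance "passes," yet it is wrong in general. The coin system here (no nickel) is chosen deliberately to expose the bug.

```python
def greedy_change(amount, coins=(25, 10, 1)):
    """Plausible greedy: repeatedly take the largest coin that fits.
    Looks obviously correct -- it isn't."""
    count = 0
    for c in coins:
        count += amount // c
        amount %= c
    return count

# A demo on one instance "proves" nothing:
greedy_change(37)   # 25 + 10 + 1 + 1 = 4 coins (happens to be optimal)
# ...but on another instance the greedy is dead wrong:
greedy_change(30)   # 25 + 1+1+1+1+1 = 6 coins; 10+10+10 needs only 3
```

Showing the greedy is (or isn't) optimal for a given coin system requires a proof, not a demo.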
We will differentiate between a good "heuristic" and a good "algorithm".
Algorithms can and should be understood & studied in a machine-independent way.
"big-Oh" notation & worst-case analysis simplify our ability to compare.
The Holy Grail: algorithms whose running times grow logarithmically WRT input size.
log_b(n) grows very slowly as n increases.
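A quick numeric sketch of how slowly the logarithm grows: doubling n adds just one to log_2(n), so the input can explode while an O(log n) algorithm barely works harder.

```python
import math

# n grows by factors of 1000; log2(n) creeps up by ~10 each time.
for n in (10, 1_000, 1_000_000, 1_000_000_000):
    print(f"n = {n:>13,}   log2(n) ~ {math.log2(n):5.1f}")
```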
The SINGLE MOST IMPORTANT STEP towards writing good algorithms is ...
thinking in terms of well-defined DATA structures and (sub)algorithms (i.e., methods/procedures).