Tuesday
Agenda
- How to measure 'goodness' of algorithms
- Analysis of algorithms: counting steps
- Big-O notation
How Good is my Algorithm?
We can measure how good an algorithm is in a number of ways:
- How correct it is. In this course, we usually require that our algorithms always return a correct result.
- Sometimes, when correct algorithms are too expensive, approximations must be used, and we can compare approximation algorithms according to how close they get to an optimal solution.
- How long it takes to execute.
- How much memory is required.
- How complex the algorithm is.
This last one is really a secondary concern: we usually prefer a complex fast algorithm to a simple slow one. However, if all other things are equal, the simpler solution is preferable.
In this course, we focus on algorithms that always return a correct solution, and we usually compare them only based on their running time – though there will be a few cases where we will compare the memory requirements of algorithms if their runtimes are similar.
Analysis of Algorithms
Measuring the time taken to execute an algorithm is not the proper way to analyse an algorithm:
- Different programming languages execute algorithms at different speeds.
- Different computers execute algorithms at different speeds.
We want a way of analysing algorithms that is independent of programming languages and hardware.
To analyse an algorithm, we determine what the basic step of the algorithm is, and count the number of times that step is executed.
Examples: is_sorted_1.cpp is_sorted_2.cpp is_sorted_3.cpp
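The files above are in the course repository; as one possible sketch in the same spirit (the function name and signature here are illustrative, not necessarily those used in the files), consider checking whether an array is sorted. The basic step is the comparison of two adjacent elements, and it executes at most n - 1 times for an array of n elements:

```cpp
#include <cstddef>
#include <vector>

// Returns true if a is sorted in non-decreasing order.
// Basic step: the comparison a[i] > a[i + 1].
// It runs at most a.size() - 1 times.
bool is_sorted(const std::vector<int>& a) {
    for (std::size_t i = 0; i + 1 < a.size(); i++) {
        if (a[i] > a[i + 1]) {
            return false;  // found an out-of-order pair
        }
    }
    return true;  // every adjacent pair was in order
}
```

Counting the comparisons rather than timing the function gives an analysis that holds on any machine and in any language.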
Big-O
The Big O notation is a mathematical notation that indicates an upper bound on the growth of a function for large enough values of its input.
Mathematically, we say that a function $f(n)$ is $O(g(n))$ if and only if
$\exists c>0, k \geq 1$ such that $\forall n \geq k$ $f(n) \leq c \cdot g(n)$
or in English,
There exist constants $c>0$ and $k \geq 1$ such that for all $n \geq k$, $f(n) \leq c \cdot g(n)$.
In simple terms, it means that eventually, within a constant factor, $f(n)$ will not grow any faster than $g(n)$.
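As a worked example (not from the notes): let $f(n) = 3n^2 + 10n$. Then $f(n)$ is $O(n^2)$, because for all $n \geq 1$ we have $10n \leq 10n^2$, so

$$3n^2 + 10n \leq 3n^2 + 10n^2 = 13n^2,$$

and the definition is satisfied with $c = 13$ and $k = 1$.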
Thursday
Agenda
- class Destructor
- More on Big-O
- Sequential Search
- Selection Sort
Destructor
The role of a destructor is to free all the memory that was dynamically allocated by an object. Every use of "new" by the object should be matched with a use of "delete" somewhere in that same object. The role of the destructor is to call "delete" on all the pieces that were not freed during the lifetime of the object.
If it helps you, think of the object as the owner of all the memory it created with "new". As the owner of those pieces of memory, it is its job to free them before it disappears, and it would be bad practice for the owner of a piece of memory to require anybody else to free it. So if an object does not use any dynamic memory, its destructor has no memory to free. If an object does use dynamic memory, it must free all of it before it disappears.
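A minimal sketch of the owner rule, using a made-up class (not from the course): the constructor allocates with "new[]", so the destructor must release that same memory with "delete[]".

```cpp
#include <cstddef>

// Hypothetical example class: owns a dynamically allocated array.
class IntArray {
public:
    // The constructor calls new[], so this object owns the memory.
    IntArray(std::size_t n) : size_(n), data_(new int[n]()) {}

    // The destructor matches the new[] with a delete[]:
    // the owner frees its memory before it disappears.
    ~IntArray() { delete[] data_; }

    std::size_t size() const { return size_; }
    int& operator[](std::size_t i) { return data_[i]; }

private:
    std::size_t size_;
    int* data_;
};
```

Note that a complete class like this would also need a copy constructor and assignment operator (otherwise two objects could end up "owning", and deleting, the same memory); they are omitted here to keep the sketch focused on the destructor.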
Sequential Search
Sequential search is an algorithm that attempts to find a value in an array by sequentially inspecting the contents of each bucket in the array. The runtime of the algorithm is $O(n)$.
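A minimal sketch (the name and return convention are assumptions, not from the notes): inspect each bucket in order, so the basic step, the equality comparison, runs at most n times.

```cpp
#include <cstddef>
#include <vector>

// Returns the index of target in a, or -1 if it is not present.
// At most a.size() comparisons are made: O(n).
int sequential_search(const std::vector<int>& a, int target) {
    for (std::size_t i = 0; i < a.size(); i++) {
        if (a[i] == target) {
            return static_cast<int>(i);  // found it
        }
    }
    return -1;  // inspected every bucket without finding target
}
```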
Selection Sort
Selection sort is a sorting algorithm that sorts an array by repeatedly finding the maximum value in the unsorted portion of the array, swapping it into the last position of that portion, and decreasing the size of the portion that remains to sort by 1. The running time of the algorithm is $O(n^2)$.
You can find an implementation of Selection Sort in your examples repository.
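The implementation in the examples repository is the reference; the following is just a minimal sketch consistent with the description above. The pass over an unsorted portion of length m costs m - 1 comparisons, so the total is (n-1) + (n-2) + ... + 1 = n(n-1)/2, which is $O(n^2)$.

```cpp
#include <cstddef>
#include <utility>  // std::swap
#include <vector>

// Sorts a in non-decreasing order by repeatedly moving the maximum
// of the unsorted prefix a[0..end] into position end.
void selection_sort(std::vector<int>& a) {
    if (a.empty()) return;
    for (std::size_t end = a.size() - 1; end > 0; end--) {
        // Find the index of the maximum in a[0..end].
        std::size_t max_idx = 0;
        for (std::size_t i = 1; i <= end; i++) {
            if (a[i] > a[max_idx]) {
                max_idx = i;
            }
        }
        // Swap the maximum into the last unsorted position;
        // the portion left to sort shrinks by one.
        std::swap(a[max_idx], a[end]);
    }
}
```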