Tuesday, 2 December 2014

big-oh

Seems like we did so much math, csc165 needs to get back to the world of computer science! When we want to write a program to solve some problem on a computer, we need to develop an algorithm first. For the same problem there can be different kinds of algorithms, and their complexities can be
quite different. The complexity of an algorithm is measured by the number of steps the algorithm takes to reach an output. Different inputs take different numbers of steps, so usually we just consider the worst case. In the eyes of computer scientists, there is no difference between certain functions for the number of steps, for example n^2 and 6n^2 + 2, because when n is very large the functions behave in almost the same way. They all belong to something pretty weird called O(n^2), and that is why a new concept, big-oh, is introduced.
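A quick numerical sketch of this idea (my own illustration, not from lecture): if we look at the ratio of f(n) = 6n^2 + 2 to g(n) = n^2, it settles toward a constant as n grows, which is exactly why both functions count as "the same" growth rate, O(n^2).

```python
def f(n):
    # a step-count function with extra constants: 6n^2 + 2
    return 6 * n ** 2 + 2

def g(n):
    # the "bare" quadratic: n^2
    return n ** 2

# as n gets large, f(n) / g(n) approaches the constant 6,
# so the two functions grow in almost the same way
for n in [10, 100, 1000, 10000]:
    print(n, f(n) / g(n))
```

Running this shows the ratio shrinking toward 6, so the constant factors stop mattering once n is big enough.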

The definition of big-oh is like this: f(n) in O(g(n)) is equivalent to "there exists a real number c > 0, there exists a natural number b, such that for all natural numbers n, if n >= b then f(n) <= c*g(n)." Because we get to pick some suitable c and b first, to be honest I didn't face any problem in this section. Everything seems to be fine, at least in this part.
All we have to do is build a chain of inequalities f(n) <= ... <= ... <= c*g(n), and thus we complete the proof.
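Here is the pick-then-check idea as a small sketch (my own worked example, with c and b chosen by me): to show 6n^2 + 2 is in O(n^2), pick c = 8 and b = 1. Then for any n >= 1 we have 2 <= 2n^2, so 6n^2 + 2 <= 6n^2 + 2n^2 = 8n^2 = c*g(n), which is exactly the chain of inequalities from the definition.

```python
# chosen constants: c = 8, breakpoint b = 1 (my picks, not unique)
c, b = 8, 1

def f(n):
    return 6 * n ** 2 + 2

def g(n):
    return n ** 2

# spot-check the definition's inequality f(n) <= c * g(n)
# for a range of n >= b (a sanity check, not the proof itself)
assert all(f(n) <= c * g(n) for n in range(b, 1000))
```

Of course the assert only checks finitely many n; the real proof is the algebraic chain 6n^2 + 2 <= 6n^2 + 2n^2 = 8n^2, which holds for every n >= 1.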
