I have the definition of in-situ algorithm from the professor, but I don't understand it.
In-situ algorithms refer to algorithms that operate with Θ(1) memory.
What does that mean?
Constant space complexity of algorithm
An algorithm is said to have constant space complexity if it uses a fixed amount of space: for example, $10$ variables or an array of exactly $10$ elements. In-situ algorithms go further: they perform their intended function on the input itself and thus require very little or no extra space. The input is usually overwritten by the output as the algorithm executes. (ref)
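To illustrate, here is a minimal Python sketch of an in-situ algorithm: reversing a list by overwriting the input, using only two index variables of extra space regardless of the list's length. (The function name is my own, not from the answer.)

```python
def reverse_in_place(a):
    """Reverse the list a in situ.

    Only two index variables are allocated, no matter how long
    a is, so the extra space used is constant.
    """
    i, j = 0, len(a) - 1
    while i < j:
        a[i], a[j] = a[j], a[i]  # the input is overwritten by the output
        i += 1
        j -= 1
    return a
```

Contrast this with `list(reversed(a))`, which builds a whole new list and therefore needs extra space proportional to the input size.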
First, let's unpack what $\Theta(1)$ means. Big $O$ and big $\Theta$ are classes of functions. There's a formal definition here, but for the purposes of this question, we say that a function $f$ is in $O(1)$ if there's a constant $c$ such that, for all $x$, $f(x) \leq c$. That is, $f$ grows at most as fast as a constant function. Big $\Theta$ doesn't add much for constant bounds, because when describing algorithm time or space usage, there isn't much below constant. But to spell it out: $f \in \Theta(1)$ if there are constants $c, d$ such that, for all $x$, $d \leq f(x) \leq c$. That is, $f$ grows at least as fast, and at most as fast, as a constant function.

Now what does this have to do with memory usage? Consider some algorithm $A$. There is some (mathematical) function which, given a size $n$, gives the maximum memory usage of $A$ on any input of size $n$. Let's call this function $mem$.

Now we combine our two concepts. If an algorithm uses $\Theta(1)$ memory, then its memory-usage function $mem$ is in $\Theta(1)$, meaning that there exist constants $d, c$ such that, for any input, the memory used is between $d$ and $c$. In short, the memory usage of the algorithm lies in some constant range, regardless of the input. Usually, $mem$ does not account for the memory used to store the input to the algorithm, since otherwise memory usage would always be at least $\Theta(n)$.
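The distinction between $\Theta(1)$ and $\Theta(n)$ extra memory can be made concrete with a small Python sketch (the function names are illustrative, not from the answer): one function touches the input through a single accumulator, the other allocates a new structure whose size grows with the input.

```python
def max_in_place(a):
    """Find the maximum of a non-empty list with Theta(1) extra memory.

    Only one accumulator and one loop index are allocated,
    no matter how large the input is.
    """
    best = a[0]
    for i in range(1, len(a)):  # index loop: no copies of the input
        if a[i] > best:
            best = a[i]
    return best


def sorted_copy(a):
    """Return a sorted copy of a: Theta(n) extra memory.

    sorted() builds an entirely new list of the same length,
    so the extra space grows linearly with the input size.
    """
    return sorted(a)
```

In both cases the input list itself is not counted, matching the convention at the end of the answer: only the *additional* memory the algorithm allocates is measured.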
That means the additional memory required by the algorithm is bounded by some constant that does not depend on the input size, for sufficiently large inputs.