Constant time
In computational complexity theory, constant time, or O(1) time, refers to the running time of an algorithm when the time needed to solve the problem does not depend on the size of the input; the running time is bounded by a fixed constant regardless of how large the input is.
For example, accessing any single element in an array or memory bank takes constant time, as only one indexing operation is needed to locate it. Finding the minimum value in an unordered array, however, is not a constant time operation, because every element must be examined to determine the minimum. It is therefore a linear time operation, taking O(n) time (unless a more suitable data structure is used; in a sorted array, for instance, the minimum is simply the first element and can be read in constant time). If the number of elements is known in advance and does not change, such a scan can nevertheless be said to run in constant time.
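The contrast can be illustrated with a short Python sketch (the function names and the sample list below are illustrative, not part of the article): indexing an array touches a single element no matter how large the array is, while finding the minimum must examine every element.

    # Minimal sketch contrasting constant-time and linear-time operations.

    def get_element(arr, i):
        """Constant time, O(1): a single indexing operation, independent of len(arr)."""
        return arr[i]

    def find_minimum(arr):
        """Linear time, O(n): every element is examined once."""
        smallest = arr[0]
        for value in arr[1:]:
            if value < smallest:
                smallest = value
        return smallest

    data = [7, 3, 9, 1, 4]
    print(get_element(data, 2))   # 9 -- cost does not grow with the list's size
    print(find_minimum(data))     # 1 -- cost grows in proportion to the list's size
    # If the number of elements is fixed in advance (say, always 5),
    # the scan in find_minimum performs a bounded number of steps and
    # can therefore be described as running in constant time.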