big-O notation
A theoretical measure of an algorithm's execution cost, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, writing f(n) = O(g(n)) means that, for large enough n, f(n) is at most some constant multiple of g(n). The notation is read, "f of n is big oh of g of n".

Formal definition: f(n) = O(g(n)) means there are positive constants c and k such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n.
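To make the formal definition concrete, here is a minimal sketch in Python (not part of the original entry) that spot-checks the inequality for the worked example f(n) = 3n² + 5n = O(n²); the witnesses c = 4 and k = 5 are assumptions chosen for this example, not values prescribed by the definition.

```python
# Spot-check the formal definition 0 <= f(n) <= c*g(n) for all n >= k,
# for the example f(n) = 3n^2 + 5n and g(n) = n^2.

def f(n: int) -> int:
    return 3 * n * n + 5 * n

def g(n: int) -> int:
    return n * n

# Hypothetical witnesses for this example: for n >= 5,
# 3n^2 + 5n <= 4n^2  <=>  5n <= n^2  <=>  n >= 5.
c, k = 4, 5

# A finite check over a sample range; an actual proof covers all n >= k.
assert all(0 <= f(n) <= c * g(n) for n in range(k, 10_000))
print("0 <= f(n) <= 4*g(n) holds for all sampled n in [5, 10000)")
```

The constants are not unique: any larger c, or the same c with a larger k, also witnesses f(n) = O(n²), since the definition only requires that some fixed pair exists.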
- Part of speech: noun
- Industry/Domain: Computer science
- Category: Algorithms & data structures
- Government Agency: NIST
Creator
- GeorgeV