Analysis of algorithms
In computer science, the analysis of algorithms is the process of finding the computational complexity of algorithms—the amount of time, storage, or other resources needed to execute them. Usually, this involves determining a function that relates the size of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity). An algorithm is said to be efficient when this function's values are small, or grow slowly compared to a growth in the size of the input. Different inputs of the same size may cause the algorithm to have different behavior, so best, worst and average case descriptions might all be of practical interest; a small counting sketch illustrating this is given below. When not otherwise specified, the function describing the performance of an algorithm is usually an upper bound, determined from the worst case inputs to the algorithm.
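As a minimal sketch of the best/worst-case distinction (not part of the source record; the function name linear_search and the chosen input size are illustrative assumptions), counting comparisons in a straightforward Python linear search shows how two inputs of the same size n can produce very different step counts:

```python
def linear_search(items, target):
    """Return (index or None, number of comparisons performed)."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1                     # one element comparison per step
        if value == target:
            return i, comparisons
    return None, comparisons

if __name__ == "__main__":
    data = list(range(100))                  # input size n = 100
    _, best = linear_search(data, 0)         # best case: target is the first element
    _, worst = linear_search(data, 99)       # worst case: target is the last element
    print(f"n=100: best case = {best} comparison, worst case = {worst} comparisons")
```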
- Depiction
- Has abstract
- In computer science, the analysis of algorithms is the process of finding the computational complexity of algorithms—the amount of time, storage, or other resources needed to execute them. Usually, this involves determining a function that relates the size of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity). An algorithm is said to be efficient when this function's values are small, or grow slowly compared to a growth in the size of the input. Different inputs of the same size may cause the algorithm to have different behavior, so best, worst and average case descriptions might all be of practical interest. When not otherwise specified, the function describing the performance of an algorithm is usually an upper bound, determined from the worst case inputs to the algorithm. The term "analysis of algorithms" was coined by Donald Knuth. Algorithm analysis is an important part of a broader computational complexity theory, which provides theoretical estimates for the resources needed by any algorithm which solves a given computational problem. These estimates provide an insight into reasonable directions of search for efficient algorithms. In theoretical analysis of algorithms it is common to estimate their complexity in the asymptotic sense, i.e., to estimate the complexity function for arbitrarily large input. Big O notation, Big-omega notation and Big-theta notation are used to this end. For instance, binary search is said to run in a number of steps proportional to the logarithm of the size n of the sorted list being searched, or in O(log n), colloquially "in logarithmic time". Usually asymptotic estimates are used because different implementations of the same algorithm may differ in efficiency. However, the efficiencies of any two "reasonable" implementations of a given algorithm are related by a constant multiplicative factor called a hidden constant. Exact (not asymptotic) measures of efficiency can sometimes be computed, but they usually require certain assumptions concerning the particular implementation of the algorithm, called a model of computation. A model of computation may be defined in terms of an abstract computer, e.g. a Turing machine, and/or by postulating that certain operations are executed in unit time. For example, if the sorted list to which we apply binary search has n elements, and we can guarantee that each lookup of an element in the list can be done in unit time, then at most log2(n) + 1 time units are needed to return an answer.
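To illustrate the log2(n) + 1 bound mentioned in the abstract, here is a hedged sketch (the function name binary_search and the choice of n = 1000 are assumptions, not part of the source): a plain Python binary search that counts each list lookup as one unit-time step and checks a worst-case run against the bound.

```python
import math

def binary_search(sorted_list, target):
    """Return (index or None, number of list lookups performed)."""
    low, high = 0, len(sorted_list) - 1
    lookups = 0
    while low <= high:
        mid = (low + high) // 2
        lookups += 1                          # one unit-time lookup of sorted_list[mid]
        if sorted_list[mid] == target:
            return mid, lookups
        elif sorted_list[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return None, lookups

if __name__ == "__main__":
    n = 1000
    data = list(range(n))
    # Searching for a value larger than every element forces a worst-case run.
    _, lookups = binary_search(data, n)
    bound = math.floor(math.log2(n)) + 1      # the log2(n) + 1 bound from the abstract
    print(f"n = {n}: {lookups} lookups, bound = {bound}")
    assert lookups <= bound
```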
- Hypernym
- Determination
- Is primary topic of
- Analysis of algorithms
- Label
- Analysis of algorithms
- Link from a Wikipage to an external page
- archive.org/details/algorithmsinc00sedg
- Link from a Wikipage to another Wikipage
- Abstract machine
- Algorithm
- Algorithmic efficiency
- Amortized analysis
- Analysis of parallel algorithms
- Arbitrary-precision arithmetic
- Arithmetic progression
- Asymptotic analysis
- Asymptotic computational complexity
- Benchmark (computing)
- Best, worst and average case
- Big-omega notation
- Big O notation
- Big-theta notation
- Binary search
- Binary search algorithm
- Cambridge University Press
- Category:Analysis of algorithms
- Category:Computational complexity theory
- Collation
- Computational complexity
- Computational complexity theory
- Computational problem
- Computer
- Computer file
- Computer program
- Computer science
- Cryptography
- Deterministic system (mathematics)
- Donald Knuth
- DSPACE
- DTIME
- Elegance
- Empirical
- Exponential growth
- Factorization
- File:Binary search vs Linear search example svg.svg
- File:Comparison computational complexity.svg
- Function (mathematics)
- Hybrid algorithm
- Implementation
- Information
- Information-based complexity
- Insertion sort
- Instruction (computer science)
- Introduction to Algorithms
- Iterated logarithm
- Iteration
- Kilobyte
- Linear
- Linear search
- List (computing)
- Logarithm
- Logarithmic time
- Log–log plot
- Master theorem (analysis of algorithms)
- Merge sort
- Model of computation
- Nanosecond
- NP-Complete
- Numerical analysis
- Operating system
- Platform-independent
- Polynomial time
- Profiling (computer programming)
- Program loop
- Programming language
- Program optimization
- Pseudocode
- Quadratic growth
- Quicksort
- Resource (computer science)
- Rule-of-thumb
- Scalability
- Segmented memory
- Smoothed analysis
- Software profiling
- Space complexity
- Termination analysis
- The Art of Computer Programming
- Time complexity
- Timsort
- Turing machine
- Upper bound
- Wiktionary:Constant
- SameAs
- 359MG
- Algoritma analizi
- Algoritmeanalyse
- Análise de algoritmos
- Anàlisi d'algorismes
- Análisis de algoritmos
- Analiza algoritama
- Analiza algorytmów
- Analyse de la complexité des algorithmes
- Analysis of algorithms
- Analýza algoritmů
- Časovna zahtevnost
- m.0xvr
- Phân tích thuật toán
- Q333464
- Анализа алгоритама
- Аналіз алгоритмів
- אנליזה של אלגוריתמים
- تحليل الخوارزميات
- تحلیل الگوریتمها
- कलनविधियों का विश्लेषण
- അൽഗൊരിതങ്ങളുടെ വിശകലനം
- การวิเคราะห์ขั้นตอนวิธี
- ალგორითმთა ანალიზი
- アルゴリズム解析
- 算法分析
- 알고리즘 분석
- Subject
- Category:Analysis of algorithms
- Category:Computational complexity theory
- Thumbnail
- WasDerivedFrom
- Analysis of algorithms?oldid=1122036611&ns=0
- WikiPageLength
- 26090
- Wikipage page ID
- 2230
- Wikipage revision ID
- 1122036611
- WikiPageUsesTemplate
- Template:Cite book
- Template:Color
- Template:Commons category-inline
- Template:Computer science
- Template:Main
- Template:Math
- Template:More footnotes
- Template:Mvar
- Template:Quote
- Template:Reflist
- Template:Short description