Complexity of Algorithm


Published on August 22, 2008

Author: nitinmishra10

Source: authorstream.com

Complexity of Algorithms
Nitin Mishra, M.Tech (NIT Allahabad), Asst. Professor, ITM-Bhilwara

The efficiency or complexity of an algorithm is stated as a function relating the input length to the number of steps (time complexity) or storage locations (space complexity).

Time Complexity
The running time of the program as a function of the size of the input.

Space Complexity
The amount of computer memory required during program execution, as a function of the input size.

Asymptotic Notation
Asymptotic notation helps us compare algorithms. Suppose we are considering two algorithms, A and B, for solving a given problem, and that a careful analysis of the running time of each algorithm has determined them to be Ta(n) and Tb(n), respectively, where n is a measure of the problem size. It should then be a fairly simple matter to compare the two functions and determine which algorithm is the better.

An Asymptotic Upper Bound: Big Oh
In 1892, P. Bachmann invented a notation for characterizing the asymptotic behavior of functions. His invention has come to be known as big oh notation.

Definition (Big Oh). Consider a function f(n) which is non-negative for all integers n ≥ 0. We say that "f(n) is big oh g(n)", which we write f(n) = O(g(n)), if there exists an integer n0 and a constant c > 0 such that for all integers n > n0, f(n) ≤ c·g(n).

An Asymptotic Lower Bound: Omega
The big oh notation introduced in the preceding section is an asymptotic upper bound. In this section, we introduce a similar notation for characterizing the asymptotic behavior of a function, but in this case it is a lower bound.

Definition (Omega). Consider a function f(n) which is non-negative for all integers n ≥ 0.
We say that "f(n) is omega g(n)", which we write f(n) = Ω(g(n)), if there exists an integer n0 and a constant c > 0 such that for all integers n > n0, f(n) ≥ c·g(n).

The definition of omega is almost identical to that of big oh. The only difference is the direction of the comparison: for big oh it is ≤; for omega, it is ≥. All of the same conventions and caveats apply to omega as they do to big oh.

Theta and Little Oh
Definition (Theta). Consider a function f(n) which is non-negative for all integers n ≥ 0. We say that "f(n) is theta g(n)", which we write f(n) = Θ(g(n)), if and only if f(n) is O(g(n)) and f(n) is Ω(g(n)).

Definition (Little Oh). We say that "f(n) is little oh g(n)", which we write f(n) = o(g(n)), if f(n) is O(g(n)) but f(n) is not Θ(g(n)).

Types of Analysis
- Worst case running time
- Average case running time
- Best case running time
- Amortized running time

Worst Case Running Time
The behavior of the algorithm with respect to the worst possible input instance. The worst-case running time of an algorithm is an upper bound on the running time for any input. Knowing it gives us a guarantee that the algorithm will never take any longer; there is no need to make an educated guess about the running time. Big-Oh notation is used to express the worst-case running time of an algorithm.

Average Case Running Time
The expected behavior when the input is randomly drawn from a given distribution. The average-case running time of an algorithm is an estimate of the running time for an "average" input. Computing the average-case running time entails knowing all possible input sequences, the probability distribution of occurrence of these sequences, and the running times for the individual sequences. Often it is assumed that all inputs of a given size are equally likely. Θ (Theta) notation is used to express the average-case running time of an algorithm.

Best Case Running Time
The behavior of the algorithm when the input is already in order.
For example, in sorting, the best case for some algorithms occurs when the elements are already sorted. The best-case running time rarely occurs in practice compared with the worst and average cases. Big-Omega (Ω) notation is used to express the best-case running time of an algorithm.

Amortized Running Time
Here the time required to perform a sequence of (related) operations is averaged over all the operations performed. Amortized analysis can be used to show that the average cost of an operation is small when averaged over a sequence of operations, even though a single operation might be expensive. Amortized analysis guarantees the average performance of each operation in the worst case.
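The amortized idea above can be sketched concretely. The following is a minimal illustration (the doubling-array model and the function name `append_costs` are my own, not from the slides): appending to a dynamic array that doubles its capacity when full makes an occasional append expensive, yet the cost averaged over the whole sequence of appends stays constant.

```python
def append_costs(n_appends):
    """Return the element-write cost of each append into a doubling array."""
    capacity, size, costs = 1, 0, []
    for _ in range(n_appends):
        if size == capacity:          # array is full: double it, copy everything
            costs.append(size + 1)    # copy `size` old items, then write the new one
            capacity *= 2
        else:
            costs.append(1)           # room available: just write the new element
        size += 1
    return costs

costs = append_costs(1_000)
print(max(costs))                # 513: the worst single append copies ~n elements
print(sum(costs) / len(costs))   # ~2.02: the amortized cost per append stays below 3
```

Although one append in the sequence costs Θ(n), the total cost of n appends is less than 3n, so each operation is O(1) amortized even in the worst case.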

