Wednesday, October 28, 2009
Input Size and Running Time
In general, the time and/or memory an algorithm consumes grows with the size of its input, so it is traditional to describe the running time of a program as a function of the size of its input. To do so, we need to define the terms “running time” and “size of input”. The immediate goal of analysis is to find a means of expression, a measuring yardstick, that characterizes the time or space requirements of a running algorithm while suppressing tedious details. This leads to the well-known concept of measuring the complexity of an algorithm. The complexity of an algorithm is the function T(n) that gives the running time and/or storage space requirement of the algorithm in terms of the size n of the input data. Frequently, the storage space required by an algorithm is simply a multiple of the data size n. Accordingly, the term “complexity” here shall refer to the running time of the algorithm only.
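The idea that T(n) grows with the input size n can be made concrete with a small experiment. The sketch below (illustrative only; the function names `linear_search` and `measure_running_time` are my own, not from any particular library) times a worst-case linear search at several input sizes, giving an empirical estimate of T(n):

```python
import time

def linear_search(data, target):
    """Scan a list for target; running time grows linearly with len(data)."""
    for i, value in enumerate(data):
        if value == target:
            return i
    return -1

def measure_running_time(n):
    """Empirically estimate T(n): seconds for a worst-case search on input of size n."""
    data = list(range(n))
    start = time.perf_counter()
    linear_search(data, -1)  # target absent, so all n elements are examined
    return time.perf_counter() - start

if __name__ == "__main__":
    for n in (10_000, 100_000, 1_000_000):
        print(f"n = {n:>9}: T(n) \u2248 {measure_running_time(n):.6f} s")
```

For this algorithm the measured times should grow roughly in proportion to n, which is exactly the relationship the function T(n) is meant to capture.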