Introduction to Algorithm Analysis and Brute Force Programming Techniques

CMPE371 Analysis of Algorithms, Fall 2021-2022, N.W.


  1. CMPE371 Analysis of Algorithms FALL 2021-2022 Lecture 1

  2. Lecture 1: Course Outline
     Programming techniques: the brute force approach
     Running time analysis of algorithms: introduction; pseudo-code representation of an algorithm; analyzing algorithms; measuring the running time and memory size of an algorithm; calculating the running time and memory size of an iterative algorithm; order of growth of a function, asymptotic notations; calculating the algorithmic complexity of an iterative algorithm: practical rules; calculating the algorithmic complexity of a recursive algorithm: the recursion tree method
     Divide and conquer
     Greedy algorithms: the activity selection problem; the fractional knapsack problem
     Dynamic programming: the matrix chain multiplication problem; the longest common subsequence problem
     Graph algorithms

  3. Introduction. What is a computer program? A set of instructions, written in a specific programming language, complete enough for a computer to execute it automatically. What is an algorithm? The fundamental structure of a computer program: a set of instructions written in an abstract form (pseudo-code or flowchart), independent of any specific computer or programming language, that explains clearly to a human the general procedure for solving a problem.

  4. Input -> Algorithm -> Output. Example: sorting. Input: a sequence of numbers. Output: an ordered permutation of the input. Issues: correctness, efficiency, storage, etc.

  5. The problem of sorting

  6. Insertion Sort

  7. What are the costs of running a computer program on a computer? Some time and some memory. Cost analysis: we must have an idea of the running time and memory size required to solve a problem of a certain size. Resolution may become intractable as the problem size increases (it would require a PC to run longer than the age of the universe, about 13.8 billion years). Intractable: possible in theory but impossible in practice. There is no need to write a program if we realize it will not be usable, so we should analyze the algorithm to know the costs of running it in advance.

  8. Algorithm Analysis. Determining performance characteristics (predicting the resource requirements): time, memory, communication bandwidth, etc. Computation time (running time) is of primary concern. Why analyze algorithms? To choose the most efficient of several possible algorithms for the same problem; to ask whether the best possible running time for a problem is reasonably finite for practical purposes; to ask whether the algorithm is optimal (best in some sense), or whether something better is possible.

  9. Analysis of Algorithms. The theoretical study of computer-program performance and resource usage. What's more important than performance? Modularity, correctness, maintainability, functionality, robustness, user-friendliness, programmer time, simplicity, extensibility, reliability.

  10. Two ways of analyzing an algorithm: a precise calculation of the running time and memory size as a function of the problem size, or just an idea of how fast the running time and memory size grow as the problem size increases. The order of growth of the running time is the algorithmic complexity. Example: "this algorithm runs in n^2 time" means its running time grows with n^2, i.e. with the square of n, quadratically with n; its running time has the same order of growth as n^2.

  11. Running Time. Definition: call each simple instruction and each access to a word of memory a primitive operation, or step. The running time of an algorithm for a given input is the number of steps executed by the algorithm on that input. It is often referred to as the complexity of the algorithm.

  12. Complexity and Input. The complexity of an algorithm generally depends on: the size of the input, which depends on the problem (examples: the number of items to be sorted; the number of vertices and edges in a graph), and other characteristics of the input data (are the items already sorted? are there cycles in the graph?).

  13. Worst-, Average-, and Best-case Complexity. Worst-case complexity: the maximum number of steps the algorithm takes for any possible input; the most tractable measure. Average-case complexity: the average of the running times over all possible inputs; it demands a definition of the probability of each input, which is usually difficult to provide and to analyze. Best-case complexity: the minimum number of steps for any possible input; not a useful measure. Why?

  14. Pseudo-code representation of an algorithm. To analyze an algorithm easily, we need to represent it in a way that is: clear and straightforward (the brain must easily comprehend its structure); free of implementation detail (unnecessary for analysis); independent of any programming language (meant for the brain, not for a computer). The flowchart representation of an algorithm is mostly outdated and not suitable for analysis.

  15. Pseudo-code conventions: based on the principles of structured programming (sequential structure, conditional structure, iterative structure); based on data abstraction: no variable declarations, no types, no specified memory allocation, no pointers, no pointer arithmetic.

  16. No input-output operations. Assignment: variable ← value. The if-then-else construct and the while construct; a block of code is represented with indentation (originally with a vertically stretched square bracket). 1D array: A[1..n], with 1..n the range of the index; element: A[i]. 2D array: A[1..n, 1..m], with 1..n and 1..m the ranges of the indices; element: A[i, j]. Function calls: similar to C; arguments are passed by value, except for arrays, which are passed by reference.

  17. Example. Pseudo-code:
      max ← -∞
      for i ← 1 to n
          if max < x[i]
              max ← x[i]
      The equivalent C code:
      #include <stdio.h>
      #include <stdlib.h>
      #define N 100
      int main(void)
      {
          int i, n = 0;
          double v, x[N], max;
          while (scanf("%lf", &v) != EOF) {
              n++;
              if (n > N) exit(0);
              x[n - 1] = v;
          }
          max = -1e20;
          for (i = 0; i < n; i++)
              if (max < x[i]) max = x[i];
          printf("\n max = %f", max);
          return 0;
      }

  18. Analyzing algorithms. We must analyze an algorithm in order to predict the costs of executing it on a computer: the running time T(n) necessary to execute the code, where n is the input size, and the size of the memory necessary for the data structures of the code.

  19. Why we should know the costs of an algorithm before coding it: it is better to know in advance whether the algorithm can be executed with affordable costs (a running time of at most a few months; a memory size smaller than the RAM, or virtual memory, of our computer); to find the bottlenecks, i.e. the parts of the program that will take the most time; and to compare the efficiencies of two or more algorithms.

  20. We assume a generic model of computer: only one CPU; instructions are executed one after the other; no parallelism, no concurrency.

  21. Quantifying problem size and costs. Quantifying the problem size: in general, n is the number of values that are input to the algorithm; sometimes n is the size of a square matrix (so n^2 is the number of values in the matrix). Usually a single integer n suffices, but sometimes 2 integers are necessary to describe the problem size: the size of a rectangular matrix is (number of rows, number of columns); the size of a graph is (number of vertices, number of edges). Quantifying the running time: either real time in (nano)seconds, or the total number of instructions of a given type (e.g. floating point multiplications) carried out during the execution of the program, which can be roughly proportional to the physical running time.

  22. Insertion Sort

  23. Running time as a function. For a given algorithm, the running time generally depends on the size of the problem: the bigger the problem, the more time it takes to solve it. The running time T(n) is a function of n. T(n) may also depend on the input values themselves; even when T does not depend only on n, we will keep writing "T(n)".

  24. Best case, average case, worst case. For a given n, T(n) varies between its best-case value (when the code runs in the shortest possible time) and its worst-case value (when the code runs in the longest possible time). Most of the time, T(n) is around its average-case value.

  25. Best-case and worst-case analysis: find in the code the test(s) over the data, then choose for these test(s) the values that minimize or maximize T(n), which gives a lower bound or an upper bound for T(n). The worst case is the most significant case: we know things cannot be worse than that! Very often, the average case happens to be nearly as bad as the worst case.

  26. Measuring the running time and memory size of an algorithm (T(n)). Problem: to measure T(n) and the memory size, we must first write the code and run it! To measure the running time by a direct measure of physical time, use the function clock() in the C language:
      #include <time.h>
      . . . . . . .
      clock_t t0 = clock();
      for (i = 1; i <= n; i++)
          for (j = i + 1; j <= n; j++)
              if (a[i][j] < 0)
                  a[i][j] = a[i][j] * a[i][j];
      printf("\n running time = %f", (double)(clock() - t0) / CLOCKS_PER_SEC);
      . . . . . . .

  27. To measure the number of times a certain instruction is executed, insert a counter variable inside the code:
      int counter = 0;
      for (i = 1; i <= n; i++)
          for (j = i + 1; j <= n; j++)
              if (a[i][j] < 0) {
                  a[i][j] = a[i][j] * a[i][j];
                  counter++;
              }
      printf("\n operation count = %d", counter);
