CM10135 / Programming II:   Lecture 3


Introduction to Algorithms and Complexity


"In teoria, non c'è differenza tra teoria e pratica. Ma in pratica c'è."

(In theory, there is no difference between theory and practice. But, in practice, there is.)

  -- Jan L.A. van de Snepscheut


-I.  Housekeeping:

  1. How are the tutorials going?
  2. No recommended text for this course (yet).
    1. There may be no Java books that are also good CS texts.
    2. The ones Dr. De Vos suggested are fine.
    3. The on-line notes may link to more resources.
  3. New computing lab 2E 1.14 --- especially for computer science, open all night!  all locations.
  4. Java help sessions Friday 18th in 1W 2.25 from 14:15 to 15:15
  5. Books for you guys:
    1. Books only cover GUIs, threads, events, applets, networking (hard book), basic programming (easy book).
    2. Books don't cover algorithms, complexity, data structures, but see front web page for excellent notes on data structures, sorting (with animations!), searching and complexity.
    3. Books:
      1. easier: Java Programming Today, Barbara Johnston.
      2. harder: Learning Java (O'Reilly, tiger mum & cubs) Niemeyer & Knudsen.
      3. May also want to look at Object-Oriented Programming with Java by David Barnes (the guy who wrote the BlueJ book).  It has nice networking examples too.

I. Criteria for Evaluating an Algorithm

  1. Main Criteria
    1. Speed
    2. Size
    3. Risk of failure.
    4. Ease of maintenance.
  2. Speed is the main criterion that we'll talk about.
  3. Speed and Size are the two things Computer Scientists might be talking about when they talk about complexity formally.
  4. Of course, the conventional meaning of complexity (how complicated it is to understand the algorithm) affects both risk and maintenance.
    1. Often worth going with something slightly slower if it will be easier to maintain.
    2. Software development is often more of a bottleneck than processor speed.
      1. Good programmers are more expensive than fast computers.
      2. Brooks' Law:  Adding manpower to a late software project makes it later.
        1. This law is so old it's not even gender neutral (In fact, from "The Mythical Man-Month", 1975) but it's still true.
        2. Slashdot review of The Mythical Man-Month 20th Anniversary Edition.
      3. But sometimes, time really matters.
          1. Graphics, games engines.
          2. Database engines.
          3. Weather simulations.
          4. Social simulations of millions of agents.
          5. Accurate simulations in general:
            1. the evolution of life, the big bang, hydrogen atoms, brain cells etc.
            2. Right after the first time I gave this lecture (2004) I got a talk announcement on the importance of algorithms for molecular biology / drug discovery from Bruce R. Donald.  Whether you care about helping humanity or making money (not an xor) that's an important research field!
  5. How do you measure Speed?
    1. A stopwatch usually isn't a practical way to check (though see the quote above!)
    2. Speed of one instance of an algorithm depends on:
      1.  the processor it's run on
      2. other components (e.g. graphics card, bus)
      3. What else is happening on the computer.
      4. Amount of RAM available.
        1. This is only true because read & write operations take time, even to memory.
        2. But they take more time if they have to go to disk.
        3. If a computer runs out of space in RAM, it swaps memory onto disk.
        4. Very bad thing:  if most time is spent swapping, little is spent on the computation.
        5. Can happen if working with very large data sets, not processing the data efficiently.
      5. But this isn't what most computer scientists are talking about when they talk about time complexity.
  6. Algorithms are normally analyzed in terms of:
    1. The number of operations they perform.
    2. The types of operations they perform.
    3. How the number of operations they perform changes if parameters change.  The key point!!
      1. These criteria are the same for both time and space.
      2. Usually ignore most of the operations and focus on the few that are most significant.
        1. e.g. for time:  disk reads, hard arithmetic.
        2. e.g. for space:  `new' commands (things that allocate more memory).
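
To make this concrete, here is a minimal sketch (the class and counter names are made up for illustration) that tallies only the "significant" operations --- here, comparisons in a linear search --- while ignoring cheap bookkeeping like incrementing the loop counter:

```java
public class OperationCount {
    static int comparisons;  // tally of the operations we care about

    // Linear search that counts comparisons rather than wall-clock time.
    static int find(int[] data, int target) {
        for (int i = 0; i < data.length; i++) {
            comparisons++;               // one significant operation
            if (data[i] == target) {
                return i;                // found at index i
            }
        }
        return -1;                       // not found
    }

    public static void main(String[] args) {
        int[] data = {7, 2, 9, 4};
        find(data, 9);
        System.out.println("comparisons = " + comparisons);  // prints 3
    }
}
```

Counting comparisons this way gives the same answer on any machine, which is exactly why analysis prefers operation counts to stopwatch timings.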
  7. How does the number of operations they perform change if parameters change?
    1. This question is referred to as scaling.
    2. Scaling happens with respect to some parameter.
    3. Example:  As an animal gets taller, its weight typically scales as height^3.
      1. This is because weight is directly related to volume, not height.
      2. volume = height x width x depth.  If you assume width & depth are also correlated to height, then volume is correlated to height^3.
      3. Bone strength can grow this same way, but exoskeletons can't, so vertebrates can grow bigger than insects.
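
A quick numerical sketch of that cubic scaling (the proportionality constants 0.5 and 0.3 are made up; only the ratios matter):

```java
public class CubicScaling {
    // If width and depth track height, volume (and so weight) grows as height^3.
    static double volume(double height) {
        double width = 0.5 * height;   // assumed proportional to height
        double depth = 0.3 * height;   // assumed proportional to height
        return height * width * depth; // proportional to height^3
    }

    public static void main(String[] args) {
        // Doubling the height multiplies the volume by 2^3 = 8.
        System.out.println(volume(2.0) / volume(1.0));  // prints 8.0
    }
}
```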
  8. Example in algorithms:  finding the length of a collection:  How does this scale with the number of items in the collection?
    1. Just look up a variable in the collection object that tells you its length.
      1. Always takes the same number of steps however many items there are.
      2. This is called a constant time algorithm.
    2. Start at the beginning and count until you reach the last item (which must be marked somehow, as in linked lists.)
      1. The number of steps is dependent on the number of objects.
      2. This is a linear algorithm.
    3. If you are checking how many unique items are in the collection, then for each item you have to check whether any of the other items are the same, so you go through the list once for each item in the list.
      1. The number of steps is the square of the number of items in the collection.
      2. This is said to scale quadratically.
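
The three cases above can be sketched in Java (the method names are illustrative, not from any particular library):

```java
import java.util.List;

public class ScalingDemo {
    // Constant time: the list object stores its size, so one lookup suffices.
    static int sizeConstant(List<Integer> items) {
        return items.size();
    }

    // Linear time: walk the list and count each element in turn.
    static int sizeLinear(List<Integer> items) {
        int count = 0;
        for (Integer ignored : items) {
            count++;
        }
        return count;
    }

    // Quadratic time: for each item, scan the earlier items for duplicates.
    static int countUnique(List<Integer> items) {
        int unique = 0;
        for (int i = 0; i < items.size(); i++) {
            boolean seenBefore = false;
            for (int j = 0; j < i; j++) {
                if (items.get(j).equals(items.get(i))) {
                    seenBefore = true;
                    break;
                }
            }
            if (!seenBefore) {
                unique++;
            }
        }
        return unique;
    }

    public static void main(String[] args) {
        List<Integer> items = List.of(1, 2, 2, 3);
        System.out.println(sizeConstant(items));  // prints 4
        System.out.println(sizeLinear(items));    // prints 4
        System.out.println(countUnique(items));   // prints 3
    }
}
```

Note that all three give correct answers; the difference only shows up as the collection grows.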
  9. What do you need to know about algorithms and complexity?
    1. You'll get a longer list of these next week; for now I just want you to get a feel for them.
    2. You should be able to plot a graph with the number of items on the X axis, and time (or space) on the Y axis.
    3. You will be thinking about several different cases:
      1. Worst case,
      2. Best case,
      3. Average or expected case.
[drawing of constant, linear & quadratic time plotted with respect to # of objects]
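
The best/worst/average distinction can be seen in a linear search, sketched here (class and method names are my own, for illustration):

```java
public class SearchCases {
    // Linear search: the step count depends on where (or whether) the target appears.
    static int stepsToFind(int[] data, int target) {
        int steps = 0;
        for (int value : data) {
            steps++;
            if (value == target) {
                return steps;   // found: we can stop early
            }
        }
        return steps;           // not found: we looked at every item
    }

    public static void main(String[] args) {
        int[] data = {5, 8, 1, 9, 3};
        System.out.println(stepsToFind(data, 5));   // best case: 1 step (first item)
        System.out.println(stepsToFind(data, 3));   // worst case, present: 5 steps (last item)
        System.out.println(stepsToFind(data, 42));  // worst case, absent: 5 steps
        // On average, a target that is present is found in about (N+1)/2 steps.
    }
}
```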

Notice that an algorithm may look very good at a low N, but then turn out to be a nightmare at higher N!  On the other hand, if you know for certain that you will only have low N for an application, you may still want to consider that algorithm.

page author: Joanna Bryson
17 Feb 2005