Algorithms and Complexity
 


Lecture
"Average-Case Analysis"
Winter Semester 2008/2009
Prof. Dr. Susanne Albers
Dr. Alexander Souza



Lecture

  • Time: Wednesday, 11-13 and Friday, 11-13
  • First Lecture: 22.10.2008
  • Location: SR 01-018 Geb. 101
  • Language: English
  • Exercise Sessions: every other Friday. First session: 7.11.2008.
  • ECTS: 6 credits

Exam

Room: 00-006 Geb. 79

Date: 2.4.2009
  10:30 - 11:00   Benjamin Drayer
  11:05 - 11:35   Marius Heinzmann
  11:40 - 12:10   Claudius Korzen
  13:00 - 13:30   Tobias Langner
  13:35 - 14:05   Torsten Böttcher

Date: 16.4.2009
  11:00 - 11:30   Iftikar Ahmed
  11:35 - 12:05   Cameron Smith
  13:00 - 13:30   Kerstin Schmatzer


For remarks and questions, please email Alexander Souza.

News

24.9.2008
This page was put online.

27.10.2008
A preliminary version of Chapters 1-3 is available. Please mail me any errors you find.
Exercise sheet 1 is available.

28.10.2008
Slides of Chapter 2 are available.

11.11.2008
Exercise sheet 2 is available.

12.11.2008
Slides of Chapter 3 are available.

25.11.2008
Exercise sheet 3 is available.

27.11.2008
Updated versions of Chapters 1-3 and slides of Chapter 3 are available.

16.12.2008
Chapter 4 is available.

8.1.2009
Chapter 5 and exercise sheet 5 are available.

21.1.2009
Chapter 6 and exercise sheet 6 are available.

10.2.2009
Exercise sheet 7 is available (a little delayed).

11.2.2009
Chapter 7 is available.

16.3.2009
The exam date has been announced.

Contents

An algorithm is intended to solve problems of a mathematical kind. As the name suggests, the classical worst-case perspective asks how a given algorithm behaves on the worst possible input. One is usually interested in the algorithm's running time or approximation ratio.

In the average-case perspective, we are interested in the behaviour of an algorithm "in expectation", i.e., on "typical instances" (whatever this may mean).

Several algorithms admit bad worst-case behaviour that is, however, rarely observed in practice. One goal of average-case analysis is to understand why, and to close such gaps analytically. This requires additional assumptions on the input distribution.

One of the algorithms where a convincing average-case analysis could be carried out is the well-known Quicksort. It is not hard to see that it may require n² comparisons for sorting an n-element array. However, only n log n comparisons are needed with high probability.
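The gap between the quadratic worst case and the n log n typical case can be illustrated with a small sketch (a hypothetical counting variant, not code from the lecture notes). It uses the standard cost model in which each partition step charges |subarray| − 1 comparisons, since the pivot is compared against every other element:

```python
import random

def quicksort(a):
    """Randomized Quicksort; returns (sorted list, number of comparisons).

    With a uniformly random pivot, the expected number of comparisons
    is O(n log n), even though n^2 comparisons are possible in the worst case.
    """
    comparisons = 0

    def sort(xs):
        nonlocal comparisons
        if len(xs) <= 1:
            return xs
        pivot = random.choice(xs)       # random pivot choice drives the analysis
        comparisons += len(xs) - 1      # pivot is compared with every other element
        left = [x for x in xs if x < pivot]
        mid = [x for x in xs if x == pivot]
        right = [x for x in xs if x > pivot]
        return sort(left) + mid + sort(right)

    return sort(list(a)), comparisons
```

Running this on already-sorted inputs shows the point of randomization: unlike the deterministic "first element as pivot" rule, no fixed input forces the quadratic behaviour, since the bad event depends only on the random pivot choices.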

Highlights:

  • Running time analysis of Quicksort
  • Knapsack problem in expected polynomial time
  • Typical performance of memory paging strategies
  • Scheduling under uncertainty

Basic knowledge of algorithms, optimization, and probability theory is helpful but not a prerequisite.

Lecture Notes and Slides

Available when ready. Please mail me any typos and other errors you find.

Chapters   Version
1-6        21.1.2009 (preliminary)

Slides                  Version
Sorting and Searching   28.10.2008
Knapsack                27.11.2008
Paging                  7.1.2009

Exercise Sheets

Assignments are to be handed in at the lecture. You will get the corrected assignments back at the exercise sessions.

Exercise Sheet   Hand-In Date
1                5.11.2008
2                19.11.2008
3                3.12.2008
4                17.12.2008
5                14.1.2009
6                28.1.2009
7                13.2.2009

Further Material

  • Probability distributions
  • Pseudo-Random Number Generation: Knuth: The Art of Computer Programming - Volume 2 - Seminumerical Algorithms, Addison Wesley.
  • Simulating Random Variables: Ross: Introduction to Probability Models, Harcourt Academic Press.

Books

  • M. Mitzenmacher, E. Upfal: Probability and Computing, Cambridge University Press.
  • R. Motwani, P. Raghavan: Randomized Algorithms, Cambridge University Press.