CS449/549 Computational Learning

Prerequisites (CS449): CS344 and CS345
Pre- or co-requisites (CS549): CS541 and CS547, or consent of instructor
Instructor: Christino Tamon
Lectures: Tuesday and Thursday 13:00-14:15, SC 346
Office hours: SC373, Tuesday and Thursday 10:45-12:00 and 14:15-15:30


Computational learning studies algorithmic problems of inferring patterns and relations from data. This course describes the mathematical foundations of learning and explores important connections and applications to areas such as artificial intelligence, cryptography, statistics, and bioinformatics. Topics may include perceptron and online learning, graphical models and probabilistic inference, decision tree induction and boosting, analysis of Boolean functions, sample complexity bounds, cryptographic and complexity-theoretic hardness, and reinforcement learning. Basic tools from computer science and mathematics are used to describe the main ideas and major developments in computational learning. [CS549: Students are expected to learn and explore recent research ideas in the area.]
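To give a flavor of the online-learning topic above, here is a minimal sketch of the classic perceptron update rule for learning a halfspace; the toy data, epoch count, and function names are illustrative assumptions, not course-supplied code.

```python
# Illustrative perceptron sketch (hypothetical, not course material).
def perceptron_train(examples, dim, epochs=10):
    """Learn weights w so that sign(w . x) matches the labels.

    examples: list of (x, y) with x a list of floats and y in {-1, +1}.
    """
    w = [0.0] * dim
    for _ in range(epochs):
        for x, y in examples:
            dot = sum(wi * xi for wi, xi in zip(w, x))
            if y * dot <= 0:  # mistake: move w toward the correct label
                w = [wi + y * xi for wi, xi in zip(w, x)]
    return w

# Linearly separable toy data: label is the sign of the first coordinate.
data = [([1.0, 0.5], 1), ([0.8, -1.0], 1), ([-1.0, 0.2], -1), ([-0.5, -0.7], -1)]
w = perceptron_train(data, dim=2)
predictions = [1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1
               for x, _ in data]
```

On separable data like this, the classical mistake-bound analysis guarantees the update rule converges after finitely many mistakes.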

Objectives and Outcomes

The objective of this course is to introduce students to basic algorithmic ideas and methods in machine learning and to describe their important connections and applications to areas such as artificial intelligence, cryptography, and bioinformatics. The expected outcomes include familiarity with the problems of machine learning, the ability to design and implement learning algorithms, and exposure to relevant research articles in the area.

Requirements and Policies

Although attendance is not mandatory, students are responsible for all course material covered in lectures. Students who need to make up missed course work must provide the official Clarkson exempt form. All students must submit their own work; the exchange of ideas is encouraged, but ultimately the submitted work must be the student's own. Please refer to the Clarkson University Regulations for further guidelines on academic integrity and related matters.

Grading (tentative)


Schedule (tentative)

01/09/07 and 01/11/07: Introduction; Learning models; Examples
01/16/07: Online model; Disjunction/Halfspaces
01/18/07: Perceptron/WINNOW; Weighted Majority; Experts
01/23/07: Exact model; DFA; Angluin's algorithm
01/25/07: Exact model; DFA; other algorithms
01/30/07: PAC model; Chernoff bounds; Rivest's DL algorithm
02/01/07: Occam's razor
02/06/07: Lower bounds; VC dimension and fingerprints
02/08/07: Lower bounds; cryptography and complexity
02/13/07: short break
02/15/07: Boosting; AdaBoost
02/20/07: More boosting; C4.5
02/22/07: DNF; Verbeurgt's algorithm; Greedy
02/27/07: DNF; Harmonic Sieve
03/01/07: HMMs; Viterbi algorithm
03/06/07: More HMMs; Applications
03/08/07: Learning with noise; XOR problem
03/13/07: SQ model
03/15/07: Reinforcement learning
03/20/07 and 03/22/07: break
03/27/07
03/29/07
04/03/07
04/05/07
04/10/07
04/12/07
04/17/07
04/19/07
04/24/07
04/26/07
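To illustrate the boosting topic on the schedule, the following is a minimal sketch of AdaBoost using one-dimensional threshold stumps as weak learners; the toy dataset, round count, and helper names are illustrative assumptions, not course material.

```python
import math

def best_stump(xs, ys, d):
    """Pick the (threshold, sign) stump minimizing weighted error under d."""
    best = None
    for thresh in sorted(set(xs)):
        for sign in (1, -1):
            preds = [sign if x >= thresh else -sign for x in xs]
            err = sum(w for w, p, y in zip(d, preds, ys) if p != y)
            if best is None or err < best[0]:
                best = (err, thresh, sign)
    return best

def adaboost(xs, ys, rounds=3):
    n = len(xs)
    d = [1.0 / n] * n          # uniform initial distribution
    hypotheses = []
    for _ in range(rounds):
        err, thresh, sign = best_stump(xs, ys, d)
        err = max(err, 1e-10)  # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        hypotheses.append((alpha, thresh, sign))
        # Reweight: raise weight on mistakes, lower it on correct examples.
        d = [w * math.exp(-alpha * y * (sign if x >= thresh else -sign))
             for w, x, y in zip(d, xs, ys)]
        z = sum(d)
        d = [w / z for w in d]
    return hypotheses

def predict(hypotheses, x):
    vote = sum(a * (s if x >= t else -s) for a, t, s in hypotheses)
    return 1 if vote > 0 else -1

# Interval data no single stump classifies: +1 on [2, 4], -1 outside.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [-1, 1, 1, 1, -1]
h = adaboost(xs, ys, rounds=3)
preds = [predict(h, x) for x in xs]
```

The point of the toy interval data is that no single stump is consistent with the labels, yet a weighted vote of three stumps classifies every example correctly.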
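For the HMM topic on the schedule, here is a hedged sketch of the Viterbi dynamic-programming recurrence for the most likely hidden-state sequence; the toy weather model and all probability values are illustrative assumptions, not course material.

```python
# Illustrative Viterbi sketch (hypothetical model, not course material).
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return (best probability, best state path) for the observation sequence."""
    # v[t][s] = (probability of the best path ending in state s at time t, path)
    v = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        layer = {}
        for s in states:
            prob, path = max(
                (v[-1][prev][0] * trans_p[prev][s] * emit_p[s][o],
                 v[-1][prev][1] + [s])
                for prev in states)
            layer[s] = (prob, path)
        v.append(layer)
    return max(v[-1].values())

# Toy model: hidden weather states emit observed activities.
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
prob, path = viterbi(["walk", "shop", "clean"], states, start_p, trans_p, emit_p)
```

The recurrence keeps, for each state and time step, only the highest-probability path ending there, so the search over exponentially many state sequences runs in time linear in the sequence length.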

Optional readings (under construction)