CS 607: Topics in Computer Science
Course Syllabus, Spring 1998
Professor: C. Tamon
Office: 369 Science Center
Phone: 268-6521
Email: tino@sun.mcs.clarkson.edu
Office Hours: M, W, F 10:00-12:00 and by appointment
Bagging, Boosting, and Fourier Transform
Purpose: The goal of this project is to investigate several
open problems in machine learning that involve bootstrap aggregating (bagging),
boosting, and Fourier transform methods.
Background: Bagging is a well-known method in statistics for
improving the performance of predictors (see Efron & Tibshirani (1993),
Breiman (1996)). Boosting is a similar method that was developed in learning
theory (see Freund & Schapire (1995)) and is somewhat superior to bagging
because of its adaptive properties. Unfortunately, boosting is inherently
sequential in nature, whereas bagging admits a relatively simple parallel
implementation. The question is whether there is a parallel compromise
between these two methods.
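To illustrate why bagging parallelizes so easily, here is a minimal sketch of bootstrap aggregating for binary prediction. The base learner `learn`, the toy threshold "stump" used with it, and all parameter choices are illustrative assumptions, not part of the project description; the point is that each round is independent of the others.

```python
import random

def bagged_predict(train, learn, x, rounds=25, rng=None):
    # Bagging: train `rounds` predictors, each on a bootstrap resample
    # (drawn with replacement) of the training set, then combine them
    # by majority vote. The rounds are mutually independent, so this
    # loop parallelizes trivially -- unlike boosting, where round t+1
    # depends on the errors made in round t.
    rng = rng or random.Random(0)
    votes = 0
    for _ in range(rounds):
        sample = [rng.choice(train) for _ in train]  # bootstrap resample
        h = learn(sample)                            # base predictor: x -> {-1, +1}
        votes += h(x)
    return 1 if votes >= 0 else -1

# Hypothetical base learner: a 1D threshold stump placed at the
# mean of the resampled points (labels are ignored for simplicity).
def stump_learner(sample):
    thr = sum(v for v, _ in sample) / len(sample)
    return lambda x: 1 if x >= thr else -1
```

A usage sketch: on training points `[(v, 1 if v >= 0 else -1) for v in range(-5, 6)]`, the bagged vote of stumps predicts +1 for inputs well above zero and -1 well below, smoothing out the variance of any single resampled stump.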
A powerful theoretical learning algorithm based on the Fourier transform
and boosting was developed by Jackson (1994); it can learn expressive
Boolean representation classes. Although the algorithm runs in polynomial
time, in practice the time bound is still too high to be of any use.
We explore alternatives for improving Jackson's algorithm from both the
Fourier and the boosting aspects.
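A basic primitive in Fourier-based learning algorithms of this kind is estimating a single Fourier coefficient of a Boolean function f: {0,1}^n -> {-1,+1} by random sampling, since the coefficient hat-f(S) is just the expectation E_x[f(x) * chi_S(x)] under the uniform distribution. The following sketch shows that primitive only (function names and sample sizes are my own choices, not Jackson's); the full algorithm combines such estimates with boosting to locate all large coefficients.

```python
import random

def chi(S, x):
    # Parity character chi_S(x) = (-1)^(sum of x_i over i in S) on {0,1}^n.
    return -1 if sum(x[i] for i in S) % 2 else 1

def estimate_coefficient(f, S, n, samples=2000, rng=None):
    # Estimate hat-f(S) = E_x[f(x) * chi_S(x)] over uniform x in {0,1}^n
    # by averaging over random samples; a Chernoff bound controls the
    # estimation error as a function of `samples`.
    rng = rng or random.Random(0)
    total = 0
    for _ in range(samples):
        x = tuple(rng.randint(0, 1) for _ in range(n))
        total += f(x) * chi(S, x)
    return total / samples
```

For example, if f is itself the parity chi_{0,1}, the estimate for S = {0,1} is exactly 1, while the estimate for any other S concentrates near 0.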
The following specific questions are currently being investigated:
- Parallel implementations of boosting
- Using approximation algorithms for MAX-CUT to improve boosting algorithms
- Efficiency improvements on a boosting-based Fourier transform learning
algorithm
Participants: Anthony Moores, David Stracuzzi and Christino Tamon.
Last modified: 20 January 1998
