Boosting: Theory and Practice
In the 2009 winter quarter, CSE254 will be taught by Yoav Freund and will concentrate on boosting algorithms.
Time: Tue, Thu 12:30-1:50 PM (No class on Tue, Feb 24)
Room: Center Hall 206 (new location)
The topics covered will include:
- AdaBoost, LogitBoost, and NormalBoost
- Using JBoost
- The large-margin theory
- Understanding and using empirical score distributions
- Using boosting for problems other than binary classification
- The one-sided classification framework
- Multi-label classification
- Detecting rare events
- Active learning using boosting
- Resampling training examples to speed up training
- The cascade method for optimizing the speed of the scoring function
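As a concrete reference point for the first topic, here is a minimal sketch of the basic AdaBoost loop with decision stumps as base classifiers. This is an illustration only, not course code; the function names and the exhaustive stump search are my own choices.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """Minimal AdaBoost with decision stumps. Labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)   # example weights, kept normalized
    ensemble = []             # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # exhaustive search over stumps: feature j, threshold t, polarity s
        for j in range(d):
            for t in np.unique(X[:, j]):
                for s in (+1, -1):
                    pred = s * np.where(X[:, j] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, t, s)
        eps = max(best_err, 1e-10)
        if eps >= 0.5:        # no base classifier better than random: stop
            break
        alpha = 0.5 * np.log((1 - eps) / eps)
        j, t, s = best
        pred = s * np.where(X[:, j] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)   # up-weight mistakes, down-weight hits
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all stumps in the ensemble."""
    score = np.zeros(X.shape[0])
    for alpha, j, t, s in ensemble:
        score += alpha * s * np.where(X[:, j] <= t, 1, -1)
    return np.sign(score)
```

For example, labeling the interval [2, 4] as positive on a 1-D dataset is learned after a few rounds, even though no single stump can represent it.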
The emphasis in this class will be on the application of boosting to solving real-world problems. Students are expected to have a strong background in programming (Matlab, Java, Perl) and an interest in large-scale data analysis.
- --SunsernCheamanunkul 19:40, 28 January 2009 (UTC) You can find the code for RobustBoost [here]. To get the code, you need an account on the seed machine. Please send an email to scheaman-at-ucsd if you need one.
- --WilliamBeaver 18:17 25 January 2009 (PST) A pre-release of jboost-1.4.1 is now available here (not on SourceForge). These packages are builds of sf.net snapshots taken at the time of this post (please verify that the timestamp is 1/25/09 18:17). This release contains many small bug fixes, a new nfold.py script with a slightly modified output structure (the READMEs in /scripts are up to date), initial support for classifier output in Python, plus other changes I cannot recall right now. If you have problems, post to the jboost mailing lists or email me. Note: this is a pre-release, and the JBoost website does not reflect these changes. Thanks.
- CSE254_Lecture 1 Intro
- CSE254_Lecture 2 Boosting's resistance to over-fitting; the margins theory.
- CSE254_Lecture 3 Boosting for multi-class classification and for regression.
- CSE254_Lecture 4 Alternating decision trees and JBoost
- CSE254_Lecture 5 Boosting data with label noise - convex vs. non-convex loss functions.
- CSE254_Lecture 6 RobustBoost (this will probably take several lectures).
- CSE254_Lecture 7 Boosting the margin - why AdaBoost tends not to overfit.
- CSE254_Lecture 8 Calibrating scores.
- CSE254_Lecture 9 Automatic Cameraman.
- CSE254_Lecture 10 Boosting Image Retrieval
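The margins theory appearing in Lectures 2 and 7 revolves around one quantity: the normalized margin of an example, y_i * sum_t(alpha_t * h_t(x_i)) / sum_t(alpha_t). A short standalone sketch of that computation (a standard textbook definition, not course code; the variable names are my own):

```python
import numpy as np

def normalized_margins(alphas, H, y):
    """Normalized margin of each example under a boosted ensemble.

    alphas: (T,) base-classifier weights
    H:      (T, n) matrix, H[t, i] = h_t(x_i) in {-1, +1}
    y:      (n,) true labels in {-1, +1}

    Returns values in [-1, 1]; positive means correctly classified,
    and larger values mean the vote is won by a wider fraction.
    """
    score = alphas @ H              # combined score for each example
    return y * score / np.sum(alphas)
```

Plotting the empirical distribution of these values on the training set is the usual way to visualize why boosting keeps improving test error after the training error reaches zero: the margins keep growing.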
- CSE254:jamie alexandre
- CSE254:matt jacobsen
- CSE254:ilya valmianski
- CSE254:sunsern cheamanunkul
- CSE254:ben and mayank
- CSE254:Natalie Castellana
- CSE254:Andy Stout
- CSE254:Chris Barngrover and Brent Payne
- CSE254:Evan Ettinger and Samory Kpotufe
- CSE254:William Beaver
- CSE254:Iman Mostafavi
- CSE254:Arturo Flores
- CSE254:Matan Hofree
- CSE254: Paul Loriaux
- CSE254: Heejin Choi