
Trees for Regression and Classification

By Robert Nowak
Tree models are widely used for regression and classification problems, with interpretability and ease of implementation being among their chief attributes. Despite the widespread use of tree models, a comprehensive theoretical analysis of their performance has only begun to emerge in recent years. This lecture provides an overview of tree modeling theory and methods, with an emphasis on risk bounds, oracle inequalities, approximation theory, and rates of convergence in a variety of contexts. Special attention is devoted to decision trees and wavelet-based regression methods, two of the best-known examples of tree models. The choice of loss function (squared error, absolute error, 0/1 error) plays a pivotal role in both theory and methods. In particular, optimal tree selection rules vary dramatically depending on the loss function employed. Despite these differences, suitable tree-based models coupled with appropriate selection rules can provide fast algorithms and near-minimax optimal performance in a very broad range of regression and classification problems. Examples from image reconstruction and pattern classification will demonstrate the effectiveness of trees in practice.
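
The role of the loss function can be made concrete at the level of a single tree leaf. The short Python sketch below is not drawn from the lecture itself; it only assumes NumPy, and the function name best_leaf_prediction is illustrative. It shows that the constant prediction minimizing empirical squared error in a leaf is the mean of the responses, the minimizer of absolute error is a median, and the minimizer of 0/1 error is the majority label, which is one reason tree-fitting and selection rules differ across these losses.

    # Illustrative sketch: the optimal constant prediction within one tree leaf
    # depends on the loss: squared error -> mean, absolute error -> median,
    # 0/1 error -> majority label.
    import numpy as np

    def best_leaf_prediction(y, loss):
        """Return the constant prediction minimizing the given empirical loss."""
        if loss == "squared":      # argmin_c sum (y_i - c)^2 is the sample mean
            return float(np.mean(y))
        if loss == "absolute":     # argmin_c sum |y_i - c| is a sample median
            return float(np.median(y))
        if loss == "zero_one":     # argmin_c sum 1{y_i != c} is the mode (majority vote)
            values, counts = np.unique(y, return_counts=True)
            return values[np.argmax(counts)]
        raise ValueError(f"unknown loss: {loss}")

    # Regression responses falling into one leaf (note the outlier)
    y_reg = np.array([0.1, 0.2, 0.2, 5.0])
    print(best_leaf_prediction(y_reg, "squared"))   # 1.375 (mean, pulled by outlier)
    print(best_leaf_prediction(y_reg, "absolute"))  # 0.2   (median, robust)

    # Class labels falling into one leaf
    y_cls = np.array([0, 0, 1, 0, 1])
    print(best_leaf_prediction(y_cls, "zero_one"))  # 0     (majority class)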
Attribution: The Open Education Consortium
http://www.ocwconsortium.org/courses/view/fac0a97302632ee9623133ce8ae36c5b/
Course Home: http://videolectures.net/mlss05us_nowak_trc/