Reading 15: Classification II: Discriminative Classification, Ensemble Methods

For the class on Wednesday, March 27th

Reading assignments

  1. Read the following sections of [ICVG20]:

    • Sec. 9.5 “Discriminative Classification”

      • Sec. 9.5.1 “Logistic Regression”

    • Sec. 9.7 “Decision Trees”

      • (Optional) Sec. 9.7.1 “Defining the Split Criterion”

      • Sec. 9.7.2 “Building the Tree”

      • Sec. 9.7.3 “Bagging and Random Forests”

      • Sec. 9.7.4 “Boosting Classification”



Submit your answer on Canvas. Due at noon, Wednesday, March 27th.

  1. List anything from your reading that confuses you, and explain why it confuses you. You are strongly encouraged to think about what questions you have about the reading, but if you really have no questions at all, please briefly summarize what you have learned from this reading assignment.

  2. We often use random forest methods and rarely use a single decision tree. Based on what you read, briefly explain why.
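One way to build intuition for question 2 is a toy simulation. The sketch below is not a real random forest: each "tree" is modeled as a classifier that is correct with some fixed probability (0.7 here, a made-up number), and the trees are assumed to err independently — an idealization that bagging and random feature subsets (Sec. 9.7.3) only approximate. Under those assumptions, a majority vote over many such trees is far more accurate than any single tree:

```python
import random

random.seed(0)

P_CORRECT = 0.7   # assumed per-tree accuracy (hypothetical value)
N_TREES = 25      # ensemble size (hypothetical value)
N_TRIALS = 10_000

def tree_is_correct():
    # Stand-in for one high-variance decision tree: on any given
    # example it predicts the true label with probability P_CORRECT.
    return random.random() < P_CORRECT

def forest_is_correct():
    # Majority vote of N_TREES trees whose errors are independent --
    # the key idealization in this sketch.
    votes = sum(tree_is_correct() for _ in range(N_TREES))
    return votes > N_TREES // 2

acc_single = sum(tree_is_correct() for _ in range(N_TRIALS)) / N_TRIALS
acc_forest = sum(forest_is_correct() for _ in range(N_TRIALS)) / N_TRIALS
print(f"single tree: {acc_single:.3f}, forest of {N_TREES}: {acc_forest:.3f}")
```

In practice the gain is smaller than this simulation suggests, because real trees trained on the same data make correlated errors; decorrelating them via bootstrap samples and random feature subsets is exactly what random forests are designed to do.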

Discussion Preview

We will discuss the following questions in class. They are included here so that you have a chance to think about them before class. You need not submit your answers as part of this assignment.

  1. We will discuss the overall framework of discriminative methods.

  2. We will discuss how ensemble methods improve a classifier’s performance, and how to use cross-validation to verify such improvement.

  3. We will continue to compare a few different classification methods (and will focus on discriminative methods in this class).
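As background for discussion item 2, the core mechanics of k-fold cross-validation can be sketched in a few lines: split the n examples into k disjoint test folds, and for each fold train on the remainder. This is a minimal illustrative splitter (index bookkeeping only, no classifier); verifying an ensemble's improvement would then amount to comparing the mean held-out accuracy of, say, a single tree versus a forest across the same folds.

```python
def k_fold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross-validation."""
    indices = list(range(n))
    # Earlier folds absorb the remainder when k does not divide n evenly.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

# Each of the 10 examples lands in exactly one test fold.
folds = list(k_fold_indices(10, 3))
print([len(test) for _, test in folds])  # fold sizes: [4, 3, 3]
```

Real pipelines usually shuffle the indices before splitting (or stratify by class label) so that folds are representative; that step is omitted here to keep the sketch short.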