
1. This is an analytic problem to be done by hand. Parts (a)-(c) refer to a 2-class nearest-means classifier with class-mean vectors μ₁, μ₂. In all parts below, each feature vector x has D features.

(a) Give an algebraic expression for the decision boundary, the decision rule, and the discriminant function g(x). Simplify as much as possible. Is the classifier linear?

(b) If the classifier is linear, then starting from the general expression for a 2-class discriminant function g(x) of a linear classifier, find an expression for the weights of the nearest-means classifier in terms of the mean vectors μ₁, μ₂.

(c) If the classifier is not linear, then write an expression for a nonlinear function g(x) with a weight coefficient for each term. Then find expressions for the weights of the nearest-means classifier in terms of μ₁, μ₂.

Parts (d)-(f) refer to a C-class nearest-means classifier, with C > 2.

(d) Consider the following decision rule: x ∈ Γ_k iff k = argmax_m {g_m(x)}. Can you find expressions for the g_m(x), m = 1, 2, ..., C, such that this is the decision rule for a nearest-means classifier? If so, give your expression for g_m(x), and simplify it as much as possible. Hint: when comparing the g_m(x) only to each other (e.g., g_1(x) to g_2(x)), any additive term that does not depend on m, and that is common to g_m(x) ∀m, can be dropped from all g_m(x).

(e) Is g_m(x) linear? Justify your answer. If yes, give expressions for the weights of the nearest-means classifier in terms of the mean vectors μ_k.

(f) Is this multiclass nearest-means classifier an example of the MVM multiclass method? Justify your answer.
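Although the problem is to be done by hand, the simplification it asks for can be checked numerically. Expanding ||x − μ_m||² = ||x||² − 2μ_mᵀx + ||μ_m||² and dropping the ||x||² term common to all classes (per the hint) leaves the candidate linear discriminant g_m(x) = μ_mᵀx − ½||μ_m||². A minimal sketch, using made-up class means (not from the problem), confirming that the argmax of these discriminants reproduces the nearest-means rule:

```python
import numpy as np

rng = np.random.default_rng(0)
D, C = 4, 3                        # hypothetical feature dimension and class count
means = rng.normal(size=(C, D))    # hypothetical class-mean vectors mu_1..mu_C

def nearest_mean(x):
    # Direct nearest-means rule: pick the class whose mean is closest to x.
    return int(np.argmin([np.linalg.norm(x - m) for m in means]))

def argmax_linear(x):
    # Simplified linear discriminants: g_m(x) = mu_m^T x - 0.5 ||mu_m||^2.
    g = means @ x - 0.5 * np.sum(means**2, axis=1)
    return int(np.argmax(g))

# The two rules should agree on every point (ties occur with probability zero here).
xs = rng.normal(size=(1000, D))
assert all(nearest_mean(x) == argmax_linear(x) for x in xs)
```

The same expansion answers part (a) for C = 2: g(x) = g_1(x) − g_2(x) is affine in x, so the 2-class classifier is linear.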

Fig. 1