# Problem 3 (Cost-sensitive classification)

Suppose you face a binary classification problem with input space X = R and output space Y = {0, 1}, where it is c times as bad to commit a "false positive" as it is to commit a "false negative" (for some real number c >= 1). To make this concrete, let's say that if your classifier predicts 1 but the correct label is 0, you incur a penalty of $c; if your classifier predicts 0 but the correct label is 1, you incur a penalty of $1. (And you incur no penalty if your classifier predicts the correct label.) Assume the distribution you care about has class priors pi_0 = 2/3 and pi_1 = 1/3, and the class-conditional densities are N(0, 1) for class 0 and N(1, 1/4) for class 1. Let f* : R -> {0, 1} be the classifier with the smallest expected penalty.

(a) Assume 1 <= c <= 1.5. Specify precisely (and with a simple expression involving c) the region in which the classifier f* predicts 1.

(b) Now instead assume c >= 10. Specify precisely the region in which the classifier f* predicts 1.
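As a sanity check on whatever closed-form region you derive, the optimal rule can be evaluated numerically: f* predicts 1 exactly when the expected penalty of predicting 1, which is c * P(y=0 | x), is smaller than the expected penalty of predicting 0, which is 1 * P(y=1 | x) — equivalently, when pi_1 * p_1(x) > c * pi_0 * p_0(x). The sketch below (function names `predicts_one` and `region` are illustrative, not part of the problem) scans a grid of x values and reports the predict-1 interval for a given c; it does not substitute for the exact expression the problem asks for.

```python
import math

def predicts_one(x: float, c: float) -> bool:
    """Bayes-optimal cost-sensitive rule: predict 1 iff the expected
    penalty of predicting 1 (c * P(y=0|x)) is below the expected
    penalty of predicting 0 (P(y=1|x)), i.e. pi1*p1(x) > c*pi0*p0(x)."""
    pi0, pi1 = 2/3, 1/3
    # class-conditional densities: N(0, 1) for class 0, N(1, 1/4) for class 1
    p0 = math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)
    p1 = math.exp(-(x - 1)**2 / (2 * 0.25)) / math.sqrt(2 * math.pi * 0.25)
    return pi1 * p1 > c * pi0 * p0

def region(c: float, lo: float = -3.0, hi: float = 4.0, n: int = 70001):
    """Grid-scan [lo, hi]; return (approximate) endpoints of the
    predict-1 region, or None if f* never predicts 1."""
    xs = (lo + (hi - lo) * i / (n - 1) for i in range(n))
    ones = [x for x in xs if predicts_one(x, c)]
    return (min(ones), max(ones)) if ones else None
```

For c = 1 the scan returns an interval with endpoints near 2/3 and 2, and for large c the predict-1 region shrinks; comparing the scan's output against your expressions in (a) and (b) is a quick way to catch algebra slips.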