Consider a dataset with the following information:

Using the Naive Bayes classifier, what will be the predicted class for a new instance where X1 = Sunny and X2 = Cold?
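The dataset table for this question is not reproduced above, so the following is only an illustrative sketch: it uses a made-up weather table (hypothetical values, not the question's data) to show how a categorical Naive Bayes prediction for X1 = Sunny, X2 = Cold would be computed.

```python
from collections import Counter

# Hypothetical training data (the question's actual table is not shown):
# each row is (X1 = outlook, X2 = temperature, class).
data = [
    ("Sunny", "Hot",  "Yes"),
    ("Sunny", "Cold", "No"),
    ("Rainy", "Cold", "No"),
    ("Sunny", "Hot",  "Yes"),
    ("Rainy", "Hot",  "Yes"),
    ("Rainy", "Cold", "No"),
]

def predict(x1, x2):
    classes = Counter(row[2] for row in data)
    scores = {}
    for c, n_c in classes.items():
        prior = n_c / len(data)
        # Naive Bayes: multiply the prior by each per-feature likelihood,
        # treating X1 and X2 as conditionally independent given the class.
        p_x1 = sum(1 for r in data if r[0] == x1 and r[2] == c) / n_c
        p_x2 = sum(1 for r in data if r[1] == x2 and r[2] == c) / n_c
        scores[c] = prior * p_x1 * p_x2
    return max(scores, key=scores.get)

print(predict("Sunny", "Cold"))  # -> "No" for this made-up table
```

With real data, zero counts (here P(Cold | Yes) = 0, which zeroes out the "Yes" score entirely) are usually avoided with Laplace smoothing, i.e. adding 1 to every count.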


Ridge regression adds a penalty to the linear regression cost function to prevent overfitting. Which of the following correctly describes the effect of increasing the regularization parameter λ in ridge regression?
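As a quick numerical illustration of λ's effect (a sketch on synthetic data, not part of the question), the closed-form ridge solution shows the coefficient vector shrinking steadily as λ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=50)

def ridge(X, y, lam):
    # Closed-form ridge solution: (X'X + lam * I)^{-1} X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

norms = [float(np.linalg.norm(ridge(X, y, lam))) for lam in (0.0, 1.0, 10.0, 100.0)]
print(norms)  # decreasing: larger lam shrinks the coefficients toward zero
```

At λ = 0 the solution is exactly OLS; as λ increases the coefficients are pulled toward zero, which is the shrinkage effect the question asks about.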
Which of the following is a regression problem?

Compared to ordinary least squares (OLS) regression, ridge regression:
The Naive Bayes classifier assumes that the features are:

In ridge regression, the penalty term added to the cost function is:
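For reference, that cost function can be written out directly. This minimal sketch assumes no separate intercept column (in practice the intercept is usually left unpenalized):

```python
import numpy as np

def ridge_cost(X, y, beta, lam):
    # Ridge cost = residual sum of squares + lam * (sum of squared coefficients)
    rss = np.sum((y - X @ beta) ** 2)
    penalty = lam * np.sum(beta ** 2)  # the squared-L2 penalty term
    return rss + penalty
```

For example, with a perfect fit (zero residuals) the cost reduces to the penalty alone: `ridge_cost(np.eye(2), np.array([1., 2.]), np.array([1., 2.]), 1.0)` is `5.0`.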
You are using a Naive Bayes classifier for spam detection. Given the following dataset:

You are given a dataset in which multicollinearity among the predictors is high. Which of the following statements best describes the impact of ridge regression on the bias-variance tradeoff, compared to ordinary least squares (OLS) regression?
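A small simulation (synthetic, nearly collinear predictors; not the question's data) illustrates the tradeoff: under multicollinearity OLS coefficients have very high variance across resamples, while a modest ridge penalty accepts some bias in exchange for a large variance reduction:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit(X, y, lam):
    # Ridge estimate; lam = 0 reduces to OLS.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def coef_variance(lam, n_trials=200):
    # Refit on fresh samples and measure how much the coefficients fluctuate.
    betas = []
    for _ in range(n_trials):
        x1 = rng.normal(size=100)
        x2 = x1 + rng.normal(scale=0.01, size=100)  # nearly collinear with x1
        X = np.column_stack([x1, x2])
        y = x1 + x2 + rng.normal(scale=0.5, size=100)
        betas.append(fit(X, y, lam))
    return np.var(betas, axis=0).sum()

v_ols, v_ridge = coef_variance(0.0), coef_variance(1.0)
print(v_ols, v_ridge)  # OLS variance is far larger under multicollinearity
```

The collinearity makes X'X nearly singular, so OLS coefficients swing wildly between samples; adding λ·I stabilizes the inversion.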

Which of the following is a key assumption made by the Naive Bayes classifier?
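That key assumption, conditional independence of the features given the class, means the class-conditional joint probability factorizes. This sketch draws synthetic data that satisfies the assumption by construction and checks the factorization numerically:

```python
import random
from itertools import product

random.seed(0)

# Within each class, X1 and X2 are drawn independently, so the Naive Bayes
# assumption P(x1, x2 | c) = P(x1 | c) * P(x2 | c) holds by construction.
def sample(c):
    x1 = random.choice("AB" if c == 0 else "AAB")
    x2 = random.choice("CD" if c == 0 else "CDD")
    return (x1, x2, c)

data = [sample(c) for c in (0, 1) for _ in range(20000)]

def p(event, given):
    rows = [r for r in data if given(r)]
    return sum(1 for r in rows if event(r)) / len(rows)

pairs = []
for x1, x2 in product("AB", "CD"):
    in_class = lambda r: r[2] == 1
    joint = p(lambda r: (r[0], r[1]) == (x1, x2), in_class)
    factored = p(lambda r: r[0] == x1, in_class) * p(lambda r: r[1] == x2, in_class)
    pairs.append((joint, factored))
    print(x1, x2, round(joint, 3), round(factored, 3))
```

For each (x1, x2) pair the empirical joint probability matches the product of the per-feature marginals up to sampling noise, which is exactly what Naive Bayes assumes about real data.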