What is the first step in forward feature selection?
a) Removing the least important feature
b) Adding the feature that provides the most significant improvement to the model
c) Adding all features and then removing them one by one
d) Randomly selecting a feature
Answer: b) Adding the feature that provides the most significant improvement to the model
Explanation: Forward feature selection starts with no features and adds the most significant one at each step, based on improvement in model performance.
In forward feature selection, how is the best feature determined at each step?
a) Based on the smallest p-value
b) Based on the feature with the highest correlation to the target variable
c) Based on the feature that, when added, results in the greatest improvement in model performance
d) Based on the feature with the most missing values
Answer: c) Based on the feature that, when added, results in the greatest improvement in model performance
Explanation: The feature that offers the greatest improvement to the model’s performance (e.g., reduces error, increases R-squared) is added next.
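The greedy loop described above can be sketched in plain Python. The `score` function here is a hypothetical stand-in (a fixed utility table); in practice it would be a cross-validated model score such as R-squared.

```python
def score(features):
    # Hypothetical utility table standing in for cross-validated
    # model performance; real code would refit and score a model.
    utility = {"x1": 0.50, "x2": 0.30, "x3": 0.15}
    return sum(utility.get(f, 0.0) for f in features)

def forward_select(candidates, min_gain=0.01):
    """Greedy forward selection: start with no features, add the one
    whose addition improves the score most, and stop once the best
    available gain falls below min_gain."""
    selected, best_score = [], 0.0
    remaining = list(candidates)
    while remaining:
        # gain for each candidate if it were added next
        gains = [(score(selected + [f]) - best_score, f) for f in remaining]
        gain, best_f = max(gains)
        if gain < min_gain:
            break  # stopping criterion: no significant improvement
        selected.append(best_f)
        remaining.remove(best_f)
        best_score += gain
    return selected
```

For example, `forward_select(["x3", "x1", "x2"])` adds x1 first (largest gain), then x2, then x3; raising `min_gain` to 0.2 stops the procedure after x1 and x2.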
What is a potential drawback of forward feature selection?
a) It always finds the optimal set of features
b) It is computationally very expensive
c) It can lead to overfitting by adding too many features
d) It ignores interactions between features
Answer: d) It ignores interactions between features
Explanation: Forward selection may miss important interactions between features because it considers each feature individually without accounting for their combined effects.
What is the first step in backward feature selection?
a) Adding the most important feature
b) Removing the least significant feature
c) Adding all features at once
d) Removing all features at once
Answer: b) Removing the least significant feature
Explanation: Backward feature selection starts from a model containing all features; its first elimination step removes the feature whose removal has the least impact on model performance.
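The mirror-image procedure can be sketched as follows. As before, `cv_score` is a hypothetical stand-in for a cross-validated model score.

```python
def cv_score(features):
    # Hypothetical performance table standing in for cross-validated
    # model scores; real code would refit and score the model each time.
    utility = {"x1": 0.50, "x2": 0.30, "noise": 0.00}
    return sum(utility[f] for f in features)

def backward_eliminate(features, score, min_loss=0.05):
    """Greedy backward elimination: start with all features and repeatedly
    drop the one whose removal hurts the score least, stopping once every
    possible removal would cost more than min_loss."""
    selected = list(features)
    while len(selected) > 1:
        current = score(selected)
        # performance lost by dropping each remaining feature
        losses = [(current - score([f for f in selected if f != g]), g)
                  for g in selected]
        loss, worst = min(losses)
        if loss > min_loss:
            break  # every remaining feature is significant; stop
        selected.remove(worst)
    return selected
```

Here `backward_eliminate(["x1", "x2", "noise"], cv_score)` drops the zero-utility "noise" feature and then stops, since removing x1 or x2 would cost more than `min_loss`.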
What is the main difference between forward and backward feature selection?
a) Forward selection adds features while backward selection removes features
b) Forward selection removes features while backward selection adds features
c) Forward selection considers feature interactions while backward selection does not
d) Forward selection is only used for linear models while backward selection is used for non-linear models
Answer: a) Forward selection adds features while backward selection removes features
Explanation: Forward selection begins with no features and adds them, whereas backward selection starts with all features and removes them.
In which scenario is backward feature selection not practical?
a) When there are a large number of features
b) When the dataset is very small
c) When features are highly correlated
d) When the target variable is categorical
Answer: a) When there are a large number of features
Explanation: Backward selection must begin by fitting a model on the full feature set, which is computationally expensive with very many features and may be infeasible for some models when the number of features exceeds the number of samples.
What is a common stopping criterion for forward, backward, and stepwise feature selection?
a) When the model accuracy reaches 100%
b) When adding/removing a feature does not significantly improve model performance
c) When all features have been added/removed
d) When the computational resources are exhausted
Answer: b) When adding/removing a feature does not significantly improve model performance
Explanation: A common stopping criterion is when further addition or removal of features does not result in a significant improvement in model performance, indicating that an optimal subset has been found.