Explain Feature Scaling and Normalization
In this mock interview, Angie asks Raj (MLE @ Snapchat) to discuss "the importance of feature scaling and normalization in machine learning." Below is a supplemental written solution that shows how to approach the question.
Answer
Feature scaling is important for machine learning algorithms trained with gradient-based updates. Because features often differ by orders of magnitude, the derivatives of the loss with respect to the corresponding parameters will be on different scales as well. When the gradients are on different scales, gradient descent updates tend to be unstable and to converge more slowly. Feature scaling is therefore a way to help the algorithm converge faster.
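The effect described above can be demonstrated with a small sketch. The snippet below (an illustrative example, not part of the original answer; the data, step counts, and helper names are all assumptions) fits a linear model by batch gradient descent on two features whose magnitudes differ by four orders of magnitude, once on the raw features and once after z-score standardization. The step size is chosen from the curvature in each case, so the comparison isolates the conditioning problem: with unscaled inputs, the directions tied to the small-magnitude feature make almost no progress.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical data: two features with very different magnitudes
X = np.column_stack([
    rng.uniform(0.0, 1.0, n),       # feature 1: values around 1
    rng.uniform(0.0, 10_000.0, n),  # feature 2: values around 10^4
])
y = 3.0 * X[:, 0] + 0.002 * X[:, 1] + rng.normal(0.0, 0.1, n)

def standardize(X):
    # z-score scaling: subtract each column's mean, divide by its std
    return (X - X.mean(axis=0)) / X.std(axis=0)

def gd_mse(X, y, steps=200):
    """Batch gradient descent for linear regression (with intercept).

    The step size is set from the largest eigenvalue of the Hessian,
    so the iteration is stable for both scaled and unscaled inputs;
    what differs is how fast the poorly-scaled directions converge.
    """
    A = np.column_stack([np.ones(len(y)), X])  # add intercept column
    H = 2.0 * A.T @ A / len(y)                 # Hessian of the MSE
    lr = 1.0 / np.linalg.eigvalsh(H).max()
    w = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = 2.0 * A.T @ (A @ w - y) / len(y)
        w -= lr * grad
    return np.mean((A @ w - y) ** 2)

loss_scaled = gd_mse(standardize(X), y)
loss_raw = gd_mse(X, y)
print(loss_scaled, loss_raw)
```

After the same number of steps, the standardized run reaches a loss near the noise floor, while the raw run is left with most of the signal carried by the small-magnitude feature still unfit, which is the slow-convergence behavior the answer describes.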
What makes this answer effective
The answer simply and accurately explains the reason for feature scaling in machine learning: it identifies the setting in which scaling is useful (gradient-based training) and explains why it helps in that setting.