
Overfitting 2: training vs. future error - Victor Lavrenko - Machine Learning Open Course - Cupoy


Training error is something we can always compute for a (supervised) learning algorithm. But what we actually want is the error on future (unseen) data. We define the generalization error as the expected error over all possible data that could arrive in the future. We cannot compute it directly, but we can approximate it by measuring the error on a held-out test set.
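
As a minimal sketch of this idea (the synthetic data, the polynomial model, and the split size are illustrative assumptions, not part of the lecture): fit a model on a training split, compute the training error, and use the error on a held-out test split as an approximation of the generalization error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: y = sin(x) + noise (an assumption for this sketch).
x = rng.uniform(0, 2 * np.pi, 200)
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

# Hold out a test set to stand in for "future" (unseen) data.
split = 150
x_train, y_train = x[:split], y[:split]
x_test, y_test = x[split:], y[split:]

# Fit a degree-9 polynomial: flexible enough that training error can be
# optimistically low relative to error on unseen data.
coeffs = np.polyfit(x_train, y_train, deg=9)

def mse(x_vals, y_vals):
    """Mean squared error of the fitted polynomial on the given points."""
    pred = np.polyval(coeffs, x_vals)
    return np.mean((pred - y_vals) ** 2)

print("training error:", mse(x_train, y_train))          # always computable
print("test error (approximates generalization error):", mse(x_test, y_test))
```

The test error is only an estimate: it approximates the expectation over future data by an average over a finite sample that the model never saw during fitting.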