Overfitting
Latest revision as of 08:12, 7 January 2023
Introduction
Overfitting is a phenomenon that occurs when a machine learning model learns the training data too closely, capturing patterns that do not generalize to new data. As a result, the model performs well on the training set but poorly on unseen data.
In overfitting, the model becomes too complex and fits the noise in the training data rather than the underlying relationship. Because the model is sensitive to that noise, its predictions degrade on data it has not seen, which is poor generalization performance.
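This gap between training and test performance can be demonstrated with a small NumPy sketch (illustrative only; the sine-curve data, random seed, and polynomial degrees are assumptions, not from the article). Fitting polynomials of increasing degree to a few noisy samples shows the training error falling while the error on unseen points does not:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy samples of a simple underlying relationship: y = sin(2*pi*x).
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(scale=0.2, size=x_train.size)

# A dense, noise-free grid stands in for "unseen data".
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)

def fit_errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for degree in (1, 3, 9):
    train_mse, test_mse = fit_errors(degree)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```

A degree-9 polynomial passes through all ten training points (near-zero training error) yet oscillates between them, so its error on the dense test grid is far higher than its training error: the model has memorized the noise.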
One way to mitigate overfitting is regularization, which adds a penalty on model complexity and so constrains the learned parameters. Other techniques include using a larger training dataset, using cross-validation to tune hyperparameters, and early stopping, which halts training once performance on a held-out validation set stops improving.
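As a sketch of how regularization constrains a model, ridge regression adds a penalty λ‖w‖² to the least-squares objective, which has the closed-form solution w = (XᵀX + λI)⁻¹Xᵀy (this example and its chosen penalty values are illustrative assumptions, not from the article):

```python
import numpy as np

rng = np.random.default_rng(1)

# A few noisy samples of an underlying curve, as in the overfitting demo.
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(scale=0.2, size=x_train.size)

def design(x, degree=9):
    # Polynomial feature matrix [1, x, x^2, ..., x^degree].
    return np.vander(x, degree + 1, increasing=True)

def ridge_fit(lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^-1 X^T y.
    # A tiny lam approximates ordinary least squares; larger lam shrinks the weights.
    X = design(x_train)
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y_train)

for lam in (1e-6, 1e-3, 1.0):
    w = ridge_fit(lam)
    print(f"lambda={lam:g}: weight norm {np.linalg.norm(w):.2f}")
```

Increasing the penalty shrinks the weight vector, pulling the flexible degree-9 model back toward a smoother, less noise-sensitive fit.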