Overfitting
[[file:Studying.png|right|frame|Case study notes<ref>http://www.flaticon.com/</ref>]]
== Introduction ==
Overfitting occurs when a statistical model has too many parameters relative to the amount of data it is fitted to. Such a model matches its particular sample very closely but generalises poorly to new observations.
In machine learning, overfitting happens when a model takes too many random factors into account. The problem lies in those random factors, because they always carry some ''noise'': random variation in the data that reflects no real underlying pattern. When a model overfits, it ends up learning from the noise along with the signal, which can make its decisions considerably worse on unseen data.
== References ==
Revision as of 11:32, 16 March 2018