Cost function

== Introduction ==


In machine learning, a cost function (you may also see this referred to as a loss or error function) is a measure of how wrong a model is: how far its predicted outputs fall from the actual outputs. It is typically defined as a function of the model's parameters and the training data, and it can be estimated by repeatedly running the model and comparing its predictions against the "ground truth", the known values of y.<ref>https://towardsdatascience.com/machine-learning-fundamentals-via-linear-regression-41a5d11f5220</ref>

During training, the cost function guides the optimization process: the goal is to find the set of parameters that minimizes the cost, which corresponds to the model that best estimates the relationship between X and y.
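
As a concrete sketch (not taken from this page; it assumes a simple linear model and mean squared error as the cost), the example below computes the cost of a model's predictions against the known values of y, and then uses gradient descent so that each parameter update lowers the cost:

<syntaxhighlight lang="python">
import numpy as np

def predict(X, w, b):
    """Hypothetical linear model: predictions for inputs X given parameters w and b."""
    return w * X + b

def mse_cost(X, y, w, b):
    """Mean squared error cost: average squared difference between predictions and y."""
    errors = predict(X, w, b) - y
    return np.mean(errors ** 2)

# Toy "ground truth" data drawn from y = 2x + 1 plus a little noise.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 50)
y = 2 * X + 1 + rng.normal(scale=0.1, size=X.shape)

# Start from arbitrary parameters and let the cost guide the updates.
w, b = 0.0, 0.0
learning_rate = 0.5
for step in range(200):
    errors = predict(X, w, b) - y
    # Gradients of the MSE cost with respect to w and b.
    grad_w = 2 * np.mean(errors * X)
    grad_b = 2 * np.mean(errors)
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"final cost = {mse_cost(X, y, w, b):.4f}, w = {w:.2f}, b = {b:.2f}")
</syntaxhighlight>

Mean squared error is only one possible choice of cost function; classification models, for example, are usually trained with a cross-entropy cost instead.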


== A fairly decent video ==



== References ==
<references />