L2-norm loss


English transcript of the lesson

Earlier, we divided supervised learning into two types: regression and classification.

We will take the same approach here and consider two of the most common types of loss functions.

Each is used with one of the two types of supervised learning.

Note that the objective function is a block in our framework separate from the model.

That is to say, what we are going to discuss now is generally true for all models, regardless of their linearity.

OK.

First, we should define another concept, called the target, denoted by T.

The target is essentially the desired value at which we are aiming.

Generally, we want our output y to be as close as possible to the target T.

In the cats and dogs example we’ve been employing so far, the targets would be the labels we assign to each photo, so we are 100 percent sure these values are correct.

They are the values we aspire to.

The y values are the outputs of our model; the machine learning algorithm aims to find a function of x that outputs values as close to the targets as possible.

Using this new notation, the loss function evaluates the accuracy of the outputs with respect to the targets.

All right.

Let’s see the two common functions we talked about. First, we will talk about regressions.

I’d like to remind you that the outputs of a regression are continuous numbers.

A commonly used loss function is the squared loss, also called the L2-norm loss in the machine learning realm.

The method for calculating it is the same as the least squares method used in statistics.

Mathematically, it looks like this: the sum of the squared differences between the output values y and the targets T, that is, Σᵢ (yᵢ − Tᵢ)².

Naturally, the lower this sum is, the lower the error of prediction, and therefore the lower the cost function.
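The calculation described above can be sketched in a few lines of plain Python. This is an illustrative snippet, not part of the course materials; the function name `l2_norm_loss` and the sample values are my own choices.

```python
def l2_norm_loss(outputs, targets):
    # Sum of squared differences between model outputs y and targets T.
    return sum((y - t) ** 2 for y, t in zip(outputs, targets))

targets = [1.0, 2.0, 3.0]

# Predictions close to the targets give a small loss;
# predictions far from the targets give a larger one.
print(l2_norm_loss([1.1, 1.9, 3.2], targets))
print(l2_norm_loss([2.0, 0.5, 5.0], targets))

# A perfect prediction gives a loss of exactly 0.
print(l2_norm_loss(targets, targets))  # → 0.0
```

Note that the loss is a single number summarizing all observations, which is what lets the training algorithm compare candidate models by comparing their losses.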

OK we will check out a common loss function for classification in our next lesson.

Thanks for watching.
