# Layers

Lesson 1

### Brief overview

• Study time: 0 minutes
• Level: very hard

### English transcript

Hi again! It’s time to dig deeper.

The pun is intentional, as we will explore deep neural networks now.

In one of the earlier sections, we saw how to train a simple linear regression model.

It had two inputs and a single output.

That’s a type of neural network, but it had no depth.

We used a linear model that learned a function.

This function was the best fit of the data according to the L2-norm loss, reached after a couple of hundred iterations.
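Such a training loop might be sketched as follows. The data, learning rate, and iteration count here are illustrative assumptions, not the course's actual setup:

```python
import numpy as np

# Toy data: two inputs, one output, generated from a known linear rule.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                # 100 samples, 2 inputs
true_w, true_b = np.array([2.0, -3.0]), 0.5
y = X @ true_w + true_b

# Gradient descent on the L2-norm (squared-error) loss.
w, b = np.zeros(2), 0.0
lr = 0.1
for _ in range(200):                          # a couple of hundred iterations
    err = X @ w + b - y                       # prediction error per sample
    w -= lr * (X.T @ err) / len(y)            # gradient step for the weights
    b -= lr * err.mean()                      # gradient step for the bias
```

After the loop, `w` and `b` should closely recover the rule that generated the data.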

Now, most real-life dependencies cannot be modeled with a simple linear combination.

And because we want to be better forecasters, we’ll need better models.

Most of the time, this means working with a model that is more sophisticated than a linear one.

Such complexity is usually achieved by using both linear and non-linear operations.

Mixing linear combinations and non-linearities allows us to model arbitrary functions, or in other words, functions with strange, unconventional shapes like this one.

So, basically, our model changes from inputs that are linearly combined, resulting in outputs, to inputs that are linearly combined and then passed through some nonlinear transformation, resulting in outputs.

You will probably be happy to see an example of a non-linearity.

Well, here’s one: a commonly used nonlinearity is the sigmoid function, defined as σ(x) = 1 / (1 + e^(−x)).
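A minimal implementation of that formula, using only Python's standard `math` module:

```python
import math

def sigmoid(x):
    """Sigmoid nonlinearity: sigma(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

# The sigmoid squashes any real input into the open interval (0, 1),
# crossing 0.5 exactly at x = 0.
```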

Intriguing!

The initial linear combination and the added nonlinearity form what we call a layer.

The layer is the building block of neural networks.

When we have more than one layer, we are talking about a deep neural network.
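Putting the pieces together, a layer (linear combination followed by a sigmoid) and a two-layer "deep" stack might be sketched like this. The layer sizes and random weights are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    # Vectorized sigmoid: 1 / (1 + e^(-x)), applied elementwise.
    return 1.0 / (1.0 + np.exp(-x))

def layer(x, W, b):
    # One layer: a linear combination followed by a nonlinearity.
    return sigmoid(x @ W + b)

rng = np.random.default_rng(1)
x = rng.normal(size=(1, 2))                     # one sample with two inputs

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # hidden layer: 2 -> 4
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # output layer: 4 -> 1

hidden = layer(x, W1, b1)   # first layer
out = layer(hidden, W2, b2) # second layer: with two layers, the network is deep
```

Each call to `layer` is exactly the pattern described above: inputs are linearly combined and then passed through the nonlinearity.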

Henceforth, layers will be an important topic.

Thanks for watching.
