An overview of RNNs

Lesson transcript

While CNNs are great for modeling and dealing with visual data, RNNs, or recurrent neural networks, were specifically designed for sequential data. Examples are trading signals, so stocks, bonds, and forex, or music, where there is a certain flow. Speech recognition and handwriting recognition are also instances of RNN applications, as each consequent word often depends on what was said before it. It is probable that the suggestive keyboard on your phone was developed through an RNN. Similar to CNNs, RNNs are an extension on top of the NNs we have seen so far. The specific feature of RNNs is that they have memory.

Let's illustrate. We have an input layer, a hidden layer, and an output layer. We forward propagate, but we keep the information from the hidden layer for the second sample. We forward propagate again, but the hidden layer this time is a function of both the input layer, with the updated weights, and the hidden layer from the previous iteration. If we want to design a figure similar to that of deep neural networks, we would have the same picture as before, but this time there are additional weights that lead from the hidden units back to themselves. Essentially, these are connections between the hidden units and their own values for the previous inputs. Until now, every input-target pair was independent; with this new structure, an output is formed by the current input and the hidden units' values from previous forward propagations. Therefore, we must learn those weights as well.
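
To make that recurrence concrete, here is a minimal NumPy sketch of a single forward step; the weight names, layer sizes, and tanh activation are illustrative assumptions, since the lesson itself shows no code.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the lesson).
input_size, hidden_size, output_size = 4, 3, 2

rng = np.random.default_rng(0)
W_x = rng.normal(size=(input_size, hidden_size))   # input-to-hidden weights
W_h = rng.normal(size=(hidden_size, hidden_size))  # hidden-to-hidden weights: the "memory" connections
W_y = rng.normal(size=(hidden_size, output_size))  # hidden-to-output weights
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    # The new hidden state mixes the current input with the hidden
    # state kept from the previous sample -- this is the memory.
    h_t = np.tanh(x_t @ W_x + h_prev @ W_h + b_h)
    y_t = h_t @ W_y                              # output reflects current AND past inputs
    return h_t, y_t

h = np.zeros(hidden_size)                        # no memory before the first sample
for x_t in rng.normal(size=(5, input_size)):     # five consecutive samples
    h, y = rnn_step(x_t, h)                      # h carries information forward
```

Note that W_h here plays the role of those extra weights mentioned above, leading from the hidden units back to their own previous values, and it must be learned alongside the others.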

If you think about an RNN, you realize that it is basically a deep neural network; the net, though, is not just deep but extremely deep. This is called unrolling the RNN: if we unroll the added time dependencies, we will end up with a deep feedforward neural network.
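
Since this course uses TensorFlow, here is a sketch of the same idea with Keras' built-in SimpleRNN layer, which performs that unrolled loop internally while sharing the recurrent weights across all timesteps; the shapes and layer sizes are arbitrary assumptions, not taken from the lesson.

```python
import tensorflow as tf

# SimpleRNN loops the recurrence over every timestep -- effectively the
# unrolled deep feedforward net described above, with the recurrent
# weights shared across all the steps.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(5, 4)),  # sequences of 5 timesteps, 4 features each (assumed)
    tf.keras.layers.SimpleRNN(3),         # 3 hidden units; tanh activation by default
    tf.keras.layers.Dense(2),             # output layer
])

x = tf.random.normal((8, 5, 4))           # a batch of 8 example sequences
y = model(x)                              # shape: (8, 2)
```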

While RNNs seem very easy following this short description, they are computationally expensive, as they are extremely deep. Moreover, due to the high number of layers involved and their interconnection over time, one must have a very focused approach and a solid understanding of deep NNs before diving into RNNs.

All right. Well, I'm sure that you now have a much better idea of what comes next. Right after exploring CNNs and RNNs, it is time to tell you that there are scientific papers combining CNN and RNN networks, but that's an issue to be explored in the future, as it is not well established how the two can work well together.

In the next lesson, we will conclude by saying a few words about non-neural-network machine learning.

Thanks for watching.
