TensorFlow 2 intro
We know we’re going to work with TensorFlow to create neural networks.
However, TensorFlow is no longer what it used to be.
So let’s have a quick development-history overview. TensorFlow 1 is one of the most widely used deep learning packages.
That’s largely due to its great versatility, which makes it the preferred choice of many practitioners.
Unfortunately, it has one major drawback: it’s very hard to learn and use.
That’s why many people are disheartened after seeing just a couple of lines of TensorFlow code.
Not only are its methods strange, but the whole logic of coding is unlike most libraries out there.
This led to the development and popularization of higher-level packages such as PyTorch and Keras.
Keras is particularly interesting, as in 2017 it was integrated into the core of TensorFlow, a feat that may sound a bit strange at first.
In reality, though, both TensorFlow and Keras are open source, so it shouldn’t be surprising that such things happen in the programming world.
In fact, Keras’s author claims that Keras is conceived as an interface for TensorFlow rather than a different library, making this integration even easier to digest and implement.
So far so good.
Anyway, even with Keras as a part of TF, TensorFlow was still losing popularity.
This was addressed in 2019, when TensorFlow 2.0 came on the horizon, or at least its alpha version.
We can say that was TensorFlow’s effort to catch up with the current demand for higher-level programming.
Interestingly enough, instead of creating their own high-level syntax, the TF developers chose to borrow that of Keras.
Actually, this decision made sense, as Keras was already widely adopted and people generally love it.
On that note, you may hear people saying TensorFlow 2 is basically Keras, and that isn’t far from the truth.
In fact, TF 2 has the best of both worlds: most of the versatility of TF 1 and the high-level simplicity of Keras. Sounds great, right?
And that’s not all.
There are also other major advantages of TF 2 over TF 1: a simplified API, no duplicate or deprecated functions, and some new features in the core of TensorFlow.
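To give a feel for the Keras-style syntax that TF 2 adopted, here is a minimal sketch of defining a small network with `tf.keras` (the layer sizes and input shape are arbitrary, chosen only for illustration):

```python
import tensorflow as tf

# A tiny feed-forward network built with the high-level Keras API.
# Input shape (10,) and layer widths are illustrative placeholders.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),               # 10 input features
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),                  # single output unit
])

# Compiling attaches an optimizer and loss; the model is now ready to train.
model.compile(optimizer="adam", loss="mse")
model.summary()
```

Compare this with TF 1, where the same model would require manually defining placeholders, variables, and a session; this conciseness is exactly why the TF developers borrowed the Keras syntax.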
But what’s important for us is that TensorFlow 2 boasts eager execution, or, in other words, standard Python execution rules apply to it, rather than the complex computational graphs you don’t really want to work with.
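A quick sketch of what eager execution means in practice: operations run immediately and return concrete values, like ordinary Python, with no session or graph-building step (the tensor values below are arbitrary examples):

```python
import tensorflow as tf

# In TF 2, this multiplication executes immediately -- no tf.Session,
# no placeholders, no explicit graph construction.
a = tf.constant([[1.0, 2.0],
                 [3.0, 4.0]])
b = tf.constant([[1.0, 1.0],
                 [0.0, 1.0]])

c = tf.matmul(a, b)   # runs right away and holds a concrete result
print(c.numpy())      # the value is directly available as a NumPy array
```

In TF 1, the same code would only build a graph node; you would then need to open a session and call `run` to get the actual numbers.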
OK, so from here on we will code in TensorFlow 2. Enjoy, and thanks for watching.