Load the preprocessed data

Course: Deep Learning with TensorFlow / Chapter: Business case / Lesson 5


Brief description

  • Level: very hard


English transcript of the lesson

All right.

It’s high time we cracked the business case.

Let’s start by taking a look at the net we’re going to deal with here.

We have an input layer consisting of 10 units.

Those are the inputs from our CSV.

As you can see, there are only two output nodes, as there are only two possibilities: zero and one.

We will build a net with two hidden layers.

The number of units in each layer will be 50.

But, as we know very well, this is extremely easy to change.

Therefore, for a prototype of an algorithm, 50 is a good value.

Let me explain in more detail: 50 hidden units in the hidden layers provide enough complexity.

So we expect the algorithm to be much more sophisticated than, say, a linear or logistic regression.

At the same time, we don’t want to put too many units initially, as we want to complete the learning as fast as possible and see if anything is being learned at all.
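The architecture just described (10 inputs, two hidden layers of 50 units each, and two output nodes) could be sketched in Keras as follows. Note that this is an illustrative sketch only: the actual model is built in the next lesson, and the ReLU and softmax activations shown here are assumptions, not something this lesson specifies.

```python
import tensorflow as tf

# A sketch of the net described in the lesson:
# 10 inputs -> 50 hidden units -> 50 hidden units -> 2 outputs
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),                     # 10 inputs from the CSV
    tf.keras.layers.Dense(50, activation='relu'),    # first hidden layer
    tf.keras.layers.Dense(50, activation='relu'),    # second hidden layer
    tf.keras.layers.Dense(2, activation='softmax'),  # two classes: 0 and 1
])
```

Changing the hidden layer size later is a one-line edit, which is exactly why 50 is a comfortable prototyping value.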

All right.

It’s coding time.

First, we import NumPy once again, as we have not imported it in this notebook.

Then we should import TensorFlow as tf.

So far so good.

The next logical step is to load the data.

To make things easier, I’ll declare a temporary variable called npz that will store each of the three datasets as we load them.

Now, to load the train data, we use np.load and the name of the train dataset.

So, 'Audiobooks_data_train.npz'. You probably remember that we saved the .npz in two-tuple form, comprising inputs and targets.

Let’s start from the inputs and extract them into a new variable called train_inputs.

Until now, they were stored in the npz under the keyword 'inputs', so we call them as npz['inputs'].

As mentioned earlier in our TensorFlow intro, these keywords can be any string, like 'rad1', 'rad2', or even 'frog', 'mouse', etc.

Finally, to make sure our model learns correctly, we expect all inputs to be floats.

Therefore, we must ensure that by employing the method astype to indicate that the inputs are of type np.float.

Great. In a similar way, we extract the train targets from the npz, using the keyword 'targets'.

Now, our targets are zeros and ones, but we are not completely certain if they’ll be extracted as integers, floats, or Booleans.

It’s good practice to use the same method astype and make sure their data type will be np.int, even if we know in what format we saved them.
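The loading and casting steps described above can be sketched as follows. To keep the sketch self-contained, it first writes a tiny stand-in .npz with the same two keywords ('inputs' and 'targets'); in the course, the file is the preprocessed 'Audiobooks_data_train.npz' from the earlier lesson. The dummy shapes here are illustrative, and np.float64/np.int64 are used because the bare np.float and np.int aliases mentioned in the video were removed in NumPy 1.24.

```python
import numpy as np

# Stand-in file so the sketch runs on its own; the real file comes from
# the preprocessing lesson and has 10 input columns per sample.
np.savez('Audiobooks_data_train.npz',
         inputs=np.random.uniform(size=(5, 10)),
         targets=np.random.randint(0, 2, size=5))

# Temporary variable holding each dataset as we load it
npz = np.load('Audiobooks_data_train.npz')

# Inputs must be floats; targets are cast to integers, since we are not
# certain whether they were saved as ints, floats, or Booleans.
train_inputs = npz['inputs'].astype(np.float64)
train_targets = npz['targets'].astype(np.int64)
```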

All right we’ve got our train inputs and targets.

What about the validation and test data?

Well, we start by loading the next npz, namely 'Audiobooks_data_validation', into the temporary variable npz.

Then we proceed in a similar way to extract the validation inputs and targets, making sure of their data types.
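Since the same load-and-cast pattern repeats for every split, one way to sketch it is with a small helper; the helper function load_split is my own illustrative name, not something the lesson defines, and the stand-in file below exists only to make the sketch runnable.

```python
import numpy as np

def load_split(name):
    # Load one preprocessed split, following the lesson's file naming,
    # and cast inputs to floats and targets to integers.
    npz = np.load(f'Audiobooks_data_{name}.npz')
    return npz['inputs'].astype(np.float64), npz['targets'].astype(np.int64)

# Stand-in validation file so the sketch is self-contained
np.savez('Audiobooks_data_validation.npz',
         inputs=np.ones((4, 10)), targets=np.array([0, 1, 1, 0]))

validation_inputs, validation_targets = load_split('validation')
```

The test split would be loaded the same way, which is exactly the homework exercise.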

I’ll leave the test dataset for homework, so you can practice independently.

Finally, note that, unlike before, our train, validation, and test data is simply in array form, instead of the iterator we used for MNIST.

In this business example we will train our model with simple everyday arrays.

See you at the next lesson where we will create the model.

Thanks for watching.
