The linear model. Multiple inputs and multiple outputs
Course: Deep Learning with TensorFlow / Chapter: Introduction to neural networks / Lesson 6
English transcript of the lesson
OK, let's apply the linear model logic to multiple output variables.
Here’s our new situation.
We may be interested in predicting not only the price of the apartment when buying it, but also the price for which we can rent it out.
Our inputs are unchanged: size and proximity to the beach. This time, though, we have two outputs.
Therefore, we can create two linear models: the price as a function of the size and proximity to the beach, and the rent as a function of the size and proximity to the beach.
So y1 = x1 * w11 + x2 * w21 + b1, and y2 = x1 * w12 + x2 * w22 + b2. Notice the indices of the weights: the first number refers to the respective input, while the second refers to the output.
We have two outputs, two inputs, four weights, and two biases. The number of weights depends on the inputs and outputs.
There is a different weight for each input in each equation.
In general, if we have k inputs and m outputs, the number of weights would be k times m. The number of biases is equal to the number of outputs.
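As a quick sketch of this counting rule in NumPy (the variable names and random values here are mine, not the lesson's), the weight matrix gets one entry per input–output pair and the bias vector one entry per output:

```python
import numpy as np

k, m = 2, 2  # k inputs, m outputs (2 and 2 in our apartment example)

rng = np.random.default_rng(0)
W = rng.normal(size=(k, m))  # k * m weights: one per (input, output) pair
b = np.zeros(m)              # one bias per output

x = rng.normal(size=k)       # a single observation with k inputs
y = x @ W + b                # one value per output

print(W.size, b.size, y.shape)  # prints: 4 2 (2,)
```

Changing k or m changes only the shapes of W and b; the expression `x @ W + b` stays the same.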
Let's see this in linear algebraic terms: y = xw + b, with two outputs, two inputs, four weights, and two biases. Let the weights for the apartment price be equal to 403.77 and -15,512, as before, and the weights for the rent be equal to 13.9 and -484.75. So these are our w11 and w21 (for the price) and w12 and w22 (for the rent). The biases are 1,212.45 and 212.34.
The inputs are the same as before: 743 and 1.21 miles.
So the first output, y1, is found in the familiar way: 743 times 403.77, minus 1.21 times 15,512, plus the bias of 1,212.45. The result is $282,444.04, and that's the price of the apartment.
The second output, y2, is found by 743 times 13.9, minus 484.75 times 1.21, plus 212.34. The result is $9,953.49.
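The whole worked example can be checked in a few lines of NumPy. This is my own sketch using the lesson's numbers; the course itself does not necessarily use this code:

```python
import numpy as np

x = np.array([743.0, 1.21])           # size, distance to the beach

# Column 1 holds the price weights, column 2 the rent weights.
W = np.array([[   403.77,   13.9 ],   # w11, w12 (size weights)
              [-15512.0 , -484.75]])  # w21, w22 (beach weights)

b = np.array([1212.45, 212.34])       # one bias per output

y = x @ W + b
print(np.round(y, 2))                 # [price, rent]
```

The first entry comes out to 282,444.04, matching the apartment price computed above.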
So, basically, you can buy the apartment for around 282K and rent it out for around 10K.
That's how we use the same model, y = xw + b, to represent two linear relationships.
Notice how the previous example is actually part of this one. Here are the respective outputs, inputs, weights, and bias.
Finally, I would like to point out that this was only one observation. We could extend this example to many inputs, outputs, and observations.
The output matrix will be n by m, where n is the number of observations and m is the number of output variables. The input matrix will be n by k, where k is the number of input variables.
The weights matrix remains the same, as the weights don't change depending on the number of observations. The same applies to the biases.
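To illustrate the n-observation case, here is a sketch with a small batch; only the first row of inputs comes from the lesson, and the other two rows are made-up values for illustration:

```python
import numpy as np

# n = 3 observations, each with k = 2 inputs; rows 2 and 3 are invented.
X = np.array([[ 743.0, 1.21],
              [ 950.0, 0.80],
              [ 600.0, 2.50]])

# The same k-by-m weights and m biases as before: they do not depend on n.
W = np.array([[   403.77,   13.9 ],
              [-15512.0 , -484.75]])
b = np.array([1212.45, 212.34])

Y = X @ W + b   # n-by-m output; b is broadcast across the n rows
print(Y.shape)  # prints: (3, 2)
```

Feeding in more rows changes only the shape of X and Y, never of W and b, which is exactly the property the lesson highlights.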
This last bit is extremely important. It shows us we can feed as much data into our model as we want, and it won't change, as each model is determined solely by the weights and the biases.
This property will greatly help us when creating machine learning algorithms.
We vary only the values of the weights and the biases but the logic of the model stays the same.
Cool.
In the next lesson we will see why the linear model is useful for solving problems.
Thanks for watching.