Published: Sep 08, 2024
Duration: 00:18:49
Category: People & Blogs
Hello everyone. Today we are going to learn about deep neural networks, or DNNs. Deep neural networks are a type of artificial neural network with multiple layers between the input and the output layers. These layers are called hidden layers, and they allow the network to learn and model complex, nonlinear relationships in data. Here, this first layer is called the input layer, which receives the initial data. These intermediate layers are the hidden layers, where the computation is performed; the "deep" in DNN comes from having multiple hidden layers, so since this network has multiple hidden layers we can call it a deep neural network. Finally, after the hidden layers comes the output layer, which is the final layer and produces the output. For simplicity of understanding, we just skip the output layer here.

Let's move on to the first exercise. Here this is our input, X1. We are doing linear operations and using the ReLU activation function. For the linear operation we know the formula is Wx + b, which means we multiply the input sample by the weight and then add the bias. Following this, 2 × 1 = 2, plus a bias of 0, gives us 2, and passing it through the ReLU function the output is also 2, because ReLU returns a positive number (or zero) as-is and returns zero if it gets a negative number. So this is the first hidden layer for this exercise.

In the next hidden layer, the output of the first hidden layer, A1, gets multiplied by this weight and then we add the bias: 2 × 2 = 4, plus a bias of 0, and passing through the ReLU function we also get 4. So for the next exercise we have to find this value: the input gets multiplied by the weight, we add the bias, and passing through the ReLU function we get 2; then 2 × 2 = 4, and adding the bias of 0 gives us 4 as the output as well.

Now the output of this hidden layer, B1, gets multiplied by this 0, so 4 × 0 = 0, plus 1 gives us 1, and passing through the ReLU function it gives us 1 as well. Then the output of hidden layer C1 gets multiplied by 0 and we add the bias of 2, so passing through the ReLU function we also get 2, and the output of hidden layer D1 will be 2. So this one will be 1 and this one will be 2.

Next exercise, we have to find these values. The output of hidden layer D1 gets multiplied by this weight, so 2 × (-1) = -2, and adding the bias of 0 gives us -2; passing through the ReLU function, -2 returns as 0, so E1 will be 0. Then this 0 gets multiplied by this 2 for the next hidden layer, so 0 × 2 = 0, plus the bias of 5 gives us 5, so F1 will be 5. Next we have to find the value of hidden layer G1. For that, the value of F1 gets multiplied by this weight, so 5 × 1 = 5, plus a bias of 1, gives us 6, so G1 will be 6. Then the output of G1, which is 6, gets multiplied by this 1 and we add the bias of 1, so we get 7 from here, and H1 will be 7. You can see that the value of X1 changes into A1, then A1's value changes into B1: the value at each hidden layer changes by multiplying by the weights and adding the biases of that layer.
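Every step in this chain is the same operation, ReLU(w · x + b). Below is a minimal Python sketch that reproduces the walkthrough above; the weight and bias lists are just the numbers read out in the exercise, and the function names are my own.

```python
def relu(z):
    # ReLU passes positives (and zero) through unchanged, negatives become zero
    return z if z > 0 else 0

def neuron(x, w, b):
    # one hidden unit: linear step w*x + b followed by ReLU
    return relu(w * x + b)

x = 2
weights = [1, 2, 0, 0, -1, 2, 1, 1]   # into A1, B1, C1, D1, E1, F1, G1, H1
biases  = [0, 0, 1, 2, 0, 5, 1, 1]

value = x
for w, b in zip(weights, biases):
    value = neuron(value, w, b)
    print(value)                       # prints 2, 4, 1, 2, 0, 5, 6, 7
```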
Now for this value, the input simply gets multiplied by this weight, so 3 × 1 = 3, and adding the bias gives us 3 from the ReLU as well, so A1 will be 3. Then the output of this A1 gets multiplied by this 2, so 3 × 2 = 6, plus 0 gives us 6, and passing through the ReLU function we also get 6, so B1 will be 6. This value of B1 goes to the next layer, and multiplying by the weights and adding the biases we get the value of the next hidden layer; following the same logic we get the values of the hidden layers after that, so you can try filling those in on your own.

Now for the value of hidden layer C1, we multiply the value of B1 by this weight, so 6 × 0 = 0, and adding the bias gives us 1 from the ReLU. Then for D1, the value of C1 gets multiplied by this 0, so 1 × 0 = 0, plus 2 gives us 2. So the value of C1 will be 1 and the value of D1 will be 2. Next, the value of D1 gets multiplied by this -1, so 2 × (-1) + 0 gives us -2; passing through the ReLU function we get 0, since -2 is a negative number, so for E1 the output will be 0. Then for F1, this 0 gets multiplied by this 2, so 0 × 2 = 0, plus 5 gives us 5 from the ReLU, so the value of F1 will be 5. You can simply calculate E and F like this, and then using the value of F1 you get the value of G1 and then the value of H1. Notice that these layers are sequential: to find the value of G1 you need the value of F1, and to find the value of H1 you first need to find the value of G1.

Next exercise, we have to find the value of G1. We multiply the value of the previous layer, F1, by the weight and add the bias, so 5 × 1 + 1 gives us 6, and from the ReLU function we also get 6, so G1 will be 6. For H1 we multiply the value of G1, so 6 × 1 + 1 gives us 7, so H1 will be 7.

For the next exercise you can see there is a change in the pattern: now we have two input samples and two rows of weights and biases. Since we are doing a linear operation, we multiply the inputs with these weights, so 2 × 0 = 0 and 3 × 1 = 3, and adding the bias we get 3; passing through the ReLU function, the output of A1 will be 3. Then X1 gets multiplied by this 1 and X2 by this weight, so 2 × 1 = 2 and 3 × 1 = 3, 2 + 3 gives us 5, and 5 + 0 returns 5 as well.

If we try to fill up the diagram, X1 is 2 and X2 is 3. In the case of A1, X1 gets multiplied by 0, so the X1-to-A1 weight is 0; in the case of A2, X1 gets multiplied by this 1, so X1 to A2 is 1. Then X2 gets multiplied by 1 for A1, so X2 to A1 is 1, and X2 gets multiplied by this 1 for A2, so X2 to A2 is 1 as well. You can see that every input is connected to every neuron in the next layer. This is called a fully connected, or dense, layer, which means each neuron is connected to every neuron in the previous and the next layers. We can see another example of a dense layer here: each neuron is connected to every neuron in the previous layer and also to every neuron in the next layer.

Now this set of weights and biases gets multiplied by these inputs, and these intermediate values of the hidden layer get multiplied by these rows of weights and biases: 3 × 1 = 3 and 5 × (-1) = -5, which gives us -2, and passing through the ReLU function we get 0 here. Then we multiply this row with A1 and A2, so 3 × 1 = 3 and 5 × 0 = 0, and 3 plus the bias of 1 gives us 4; passing through the ReLU function, the output here is 4 as well.
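Once a layer has more than one input, each neuron takes a weighted sum of all the inputs, using one row of weights plus its own bias, before the ReLU. Here is a short numpy sketch of the two dense layers just computed; the matrices are read off the quoted arithmetic, and the variable names are my own.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

x = np.array([2.0, 3.0])             # inputs X1, X2

# Each row holds the weights feeding one neuron of the layer.
W_A = np.array([[0.0,  1.0],         # X1->A1, X2->A1
                [1.0,  1.0]])        # X1->A2, X2->A2
b_A = np.array([0.0, 0.0])
a = relu(W_A @ x + b_A)
print(a)                             # [3. 5.]  -> A1 = 3, A2 = 5

W_B = np.array([[1.0, -1.0],         # A1->B1, A2->B1
                [1.0,  0.0]])        # A1->B2, A2->B2
b_B = np.array([0.0, 1.0])
print(relu(W_B @ a + b_B))           # [0. 4.]  -> B1 = 0, B2 = 4
```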
So if we try to fill up the values: A1 is 3 and A2 is 5. From A1 to B1 we get this 1, so A1 to B1 is 1, and A1 gets multiplied by this 1 for B2, so A1 to B2 is also 1. From A2 to B1, A2 gets multiplied by -1, so A2 to B1 is -1, and from A2 to B2, A2 gets multiplied by 0, so A2 to B2 is 0. B1 and B2 will be 0 and 4 respectively.

For the next exercise we have to multiply the intermediate values of hidden layer B with these rows of weights and biases. B1 and B2 get multiplied by the first row for C1: 0 × 1 = 0 and 4 × 1 = 4, and the bias is 0, so the output is 4 from the ReLU function. Then for the next row, 0 × 1 = 0 and 4 × 0 = 0, and 0 - 1 gives us -1, so from the ReLU we get 0 as output. Filling in these values of the hidden layers: B1 is 0 and B2 is 4, and then C1 is 4 and C2 is 0. Again, you can work out the values of these connections yourself, like in the previous example.

Next we add one more hidden layer, and the pattern is getting more dense. We multiply C1 and C2 with these two rows: 4 × 0 = 0 and 0 × 1 = 0, and the bias is 1, so from the ReLU we get 1; then for the next row, 4 × 1 = 4 and 0 × (-1) = 0, plus a bias of 0, so from the ReLU we get 4. So D1 will be 1 and D2 will be 4, with C1 being 4 and C2 being 0, and you can work out the values of these connections here.

For the next exercise we have to find the values of E1 and E2, so D1 and D2 get multiplied with these rows: 1 × 1 = 1 and 4 × 0 = 0, plus 2, gives us 3, so from the ReLU we get 3; for the next row, 1 × 1 + 4 × 1 = 5, and adding 2 gives us 7. So E1 will be 3 and E2 will be 7.

Now the values of E1 and E2 go to the next hidden layer, which is F1 and F2, so E1 and E2 get multiplied by this row and this row: 3 × 0 = 0 and 7 × 1 = 7, plus 1, gives us 8, and from the ReLU function we get 8 here; then 3 × 1 + 7 × 0 = 3, and adding the bias of 1 gives us 4 from the ReLU function. So F1 will be 8 and F2 will be 4. As you can see, the input values were 2 and 3, in the next layer they changed to 3 and 5, the values kept changing through the intermediate hidden layers, and the 2 and 3 of the input became 8 and 4 in the last hidden layer.
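Since every layer in this example repeats the same dense-plus-ReLU step, the whole forward pass can be written as one loop. The sketch below chains all six layers using the weight rows and biases quoted in the walkthrough; the layer list and names are my own transcription, so treat it as an illustration rather than the exact slide contents.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# (weights, bias) for layers A..F, transcribed from the worked example;
# each row of a weight matrix feeds one neuron of that layer.
layers = [
    (np.array([[0,  1], [1,  1]]), np.array([0,  0])),   # A -> [3, 5]
    (np.array([[1, -1], [1,  0]]), np.array([0,  1])),   # B -> [0, 4]
    (np.array([[1,  1], [1,  0]]), np.array([0, -1])),   # C -> [4, 0]
    (np.array([[0,  1], [1, -1]]), np.array([1,  0])),   # D -> [1, 4]
    (np.array([[1,  0], [1,  1]]), np.array([2,  2])),   # E -> [3, 7]
    (np.array([[0,  1], [1,  0]]), np.array([1,  1])),   # F -> [8, 4]
]

h = np.array([2.0, 3.0])            # inputs X1, X2
for W, b in layers:
    h = relu(W @ h + b)             # one dense layer followed by ReLU
print(h)                            # [8. 4.]  -> F1 = 8, F2 = 4
```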
Now for the next exercise we have to find this value, so we multiply these inputs with the first row: 3 × 0 = 0 and 1 × 1 = 1, plus a bias of 0, so we get 1 from the ReLU, and A1 will be 1 here. Then for this particular value we have to go from A1 to B1. To get B1 we multiply A1 and A2 with the first row: 1 × 1 = 1 and 4 × (-1) = -4, which gives us -3; adding the bias and passing through the ReLU function, -3 returns as 0, so B1 will be 0. B1 and B2 are given, so you have to find C1, C2, D1, D2, E1, E2, and F1, F2; you can simply try these values out yourself. For C2 we multiply B1 and B2 with this row: 0 × 1 = 0 and 2 × 0 = 0, minus 1 gives us -1, and passing through the ReLU function this -1 returns as 0, so C2 will be 0 here. Then for D2, these two values get multiplied with this row: 2 × 1 = 2 and 0 × (-1) = 0, and adding the 0 gives us 2; passing through the ReLU function we also get 2, so D2 will be 2 here. Now for E2, we multiply these two values with this row: 1 × 1 + 2 × 1 = 3, then adding 2 gives us 5, so we get 5 from the ReLU function and E2 will be 5. Then for F1 we multiply these two values with this row: 3 × 0 = 0 and 5 × 1 = 5, and adding the bias gives us 6, so from the ReLU we get 6 and F1 will be 6. You can see that both E1 and E2 have some impact on the value of F1.

Now we have three inputs and three rows of weights and biases, so it is getting a bit more complex. For this A1 we multiply the three inputs with the first row: 3 × 0 + 1 × 1 + 2 × 1 = 3, and adding the bias gives us 3 as well, so from the ReLU we get 3. For A2, if we multiply the three input values with this row we get 2 × 1 = 2, and adding the bias gives us 5, so passing through the function the output of A2 will be 5. So A1 is 3 and A2 is 5.

Let's fill up the values for the connections of A1 and A2. For A1, X1 gets multiplied by 0, so X1 to A1 is 0, or we can simply cross out the connection; from X2 to A1 we get 1, and from X3 to A1 we get 1 as well. In the case of A2, X1 gets multiplied by 0, so X1 to A2 is 0, or again you can simply cross out the connection; X2 also gets multiplied by 0 for A2, so X2 to A2 is 0, and you can cross out that connection too; and X3 gets multiplied by 1 for A2, so X3 to A2 will be 1.
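With three inputs and two neurons, the layer is just a 2 × 3 weight matrix and a bias vector of length 2. Here is a small numpy sketch of that first layer; the rows come from the quoted arithmetic, and the A2 bias of 3 is inferred from 2 + bias = 5.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

x = np.array([3.0, 1.0, 2.0])            # inputs X1, X2, X3

# Rows transcribed from the walkthrough: A1 uses weights (0, 1, 1) with
# bias 0, and A2 uses weights (0, 0, 1) with bias 3 (inferred).
W_A = np.array([[0.0, 1.0, 1.0],
                [0.0, 0.0, 1.0]])
b_A = np.array([0.0, 3.0])

print(relu(W_A @ x + b_A))               # [3. 5.]  -> A1 = 3, A2 = 5
```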
For the next exercise we get one more hidden layer, which is B, so B1, B2, and B3 are the values of this hidden layer. Now we multiply the values of the previous layer, A1, A2, and A3, with these rows of weights and biases. For B2 we need to compute 3 × 0 + 5 × 1 + 4 × 0, and 5 - 1 gives us 4, so from the ReLU function we get 4. For B3 we need to compute 3 × (-1) = -3, then 5 × 0 = 0 and 4 × 1 = 4, so -3 + 4 gives us 1, and 1 + 0 also gives us 1 from the ReLU function. So B3 will be 1 and B2 will be 4; you can fill up the values of the connections following the previous example.

Now we add one more hidden layer, which is C. For C1 we multiply with this row: 7 × 0 + 4 × 0 + 1 × 1 = 1, and adding the bias and passing through the ReLU function we also get 1. For C3 we multiply this row with B1, B2, and B3, but only B1 gets multiplied by this 1, so 7 × 1, and passing through the ReLU function we get 7 here. So C1 will be 1 and C3 will be 7.

This example is getting more complex as we add one more hidden layer. For D1, C1, C2, and C3 get multiplied by this row: 1 × 1 + 4 × 1 = 5, so from the ReLU we get 5 and D1 will be 5. For D2, 1 × 0 + 4 × 1 + 7 × 1 gives us 4 + 7 = 11, so from the ReLU we get 11. So D1 will be 5 and D2 will be 11.

For the next exercise we have the same number of hidden layers, but now we have to find the values of A2 and A3. For A2, X1, X2, and X3 get multiplied by this row: 1 × 1 + 3 = 4, and passing through the ReLU function we get 4. For A3 we only multiply 2 × 1, because the other values will be 0, and adding the bias gives us 2; passing through the ReLU function we also get 2. So A2 will be 4 and A3 will be 2.

In the next example, using the values of A1, A2, and A3, we need to find the values of B1, B2, and B3, so we have to multiply these values with this row and this row. For B1 we multiply 1 × 1 + 2 × 1, which is 3, and adding the bias and passing through the function we get 3 as B1. For B2, 1 × 0 + 4 × 1 + 2 × 0 gives us 4, and 4 - 1 gives us 3; passing through the ReLU function we also get 3, so B2 will be 3 as well.

Now we have to find the values of C1 and C3 using the previous values of B1, B2, and B3. For C1 only this 1 and this 1 get multiplied, so 1 goes through the ReLU function and we get 1 as C1, so C1 will be 1. Then only this 3 and this 1 get multiplied in the case of C3, so 3 passes through the ReLU function and also gives us 3, so for C3 we get the value 3.

Now for the last exercise we have to find the values of D2 and D3 using the values of C1, C2, and C3. In the case of D2 we get 3 × 1 + 3 × 1, so 6, and passing through the ReLU function D2 will be 6. Then for D3, 1 × 1 + 3 × 1 gives us 4, and passing through the function we also get 4, so D3 will be 4.

So in this video we have learned how to calculate the values of the hidden layers. Deep neural networks are powerful tools in the field of AI, capable of learning from and making predictions on large and complex data sets. I hope you found this video helpful. Thank you.
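To tie the whole video together: every exercise above is one forward pass, repeatedly applying ReLU(W · h + b) layer after layer. Below is a minimal, general sketch of that pattern (the helper names are my own), demonstrated on the one fully specified layer of the three-input example; a deeper network just appends more (W, b) pairs to the list.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def forward(x, layers):
    """Run a forward pass: apply relu(W @ h + b) for each (W, b) in order."""
    h = np.asarray(x, dtype=float)
    for W, b in layers:
        h = relu(np.asarray(W, dtype=float) @ h + np.asarray(b, dtype=float))
    return h

# The fully specified dense layer from the three-input example above.
layers = [([[0, 1, 1],
            [0, 0, 1]], [0, 3])]

print(forward([3, 1, 2], layers))    # [3. 5.]
```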