How do we draw a matrix, by the way? This has nothing to do with the class, but some of you may not be sure how to draw or visualize a matrix in your Python notebook. Sometimes you want to draw a matrix and show it to others, so how can you do that? I struggled quite a bit to draw the brackets around the matrix, but apparently there is a nice language called LaTeX.

You can actually write LaTeX in a notebook to draw these brackets, so I just wanted to share it, because I struggled with it and I want to make sure you don't need to worry about how to draw them. Earlier I was drawing one line here, then a small line there, and those lines would never meet each other.
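For reference, a minimal sketch of the LaTeX syntax that draws those brackets in a Markdown cell of a Jupyter notebook; the entries here are placeholder symbols, not the lecture's actual data:

```latex
$$
X =
\begin{bmatrix}
x_{11} & x_{12} \\
x_{21} & x_{22} \\
\vdots & \vdots \\
x_{61} & x_{62}
\end{bmatrix}
$$
```

The `bmatrix` environment draws the square brackets automatically, so the two sides always meet.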

Yeah, so that was a struggle, but now it comes out nicely, so I thought I'd share. So this is a matrix. What is the size of this matrix? Six by two, right, a six-by-two matrix. This is the data that we are working with. Now, the first thing we do is calculate the mean: we calculate the average of each feature. I have made a small change to our earlier example; rather than 0.83 and so on, I got nice round means of 6 and 4. Basically, add all six values in a column and divide by 6. And once I knew how to draw matrices, I got a bit carried away with it.
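As a sketch of this step in the notebook, here is a hypothetical 6 × 2 data matrix (the lecture's actual values are not reproduced here), chosen so that the column means come out to 6 and 4 as in the example:

```python
import numpy as np

# Hypothetical 6 x 2 data matrix; values chosen so the
# feature means come out to 6 and 4 (not the lecture's real data).
X = np.array([
    [4.0, 2.0],
    [5.0, 4.0],
    [6.0, 3.0],
    [6.0, 4.0],
    [7.0, 6.0],
    [8.0, 5.0],
])

# Mean of each feature: add the six values in a column, divide by 6.
means = X.mean(axis=0)
print(means)  # [6. 4.]
```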

So this is how we will shift the mean: basically, subtract the average of each feature from its values. Somebody pointed out earlier that there is an error here, which I have not corrected; this value should be 6. So this is the new matrix that I will have, and this is where I need to rotate the line. What this means in math is that we have shifted the origin. Now, before we can rotate the line, we will need to learn something about variance and covariance. What is the variance of a particular feature? How spread out your data is: you take each value minus the mean, square it, sum those up, and divide by n − 1. The result is usually presented in a square matrix, but I have kept it laid out this way for a particular reason.
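The mean-shifting step above can be sketched like this, using the same hypothetical data with means 6 and 4 (not the lecture's actual numbers):

```python
import numpy as np

# Hypothetical 6 x 2 data with column means 6 and 4.
X = np.array([[4., 2.], [5., 4.], [6., 3.], [6., 4.], [7., 6.], [8., 5.]])

# Shift the origin to the mean: subtract each feature's average
# from every value in that column.
X_centered = X - X.mean(axis=0)

# After shifting, each centered column sums to (essentially) zero.
print(X_centered.sum(axis=0))
```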

Right, so variance tells me how varied a particular feature is. What does covariance tell us? Basically, how two features are related to each other: whether they move together or not at all. And the formula is this: if x and y are two features, you take the actual x minus the average of x, times the actual y minus the average of y, sum that up, and divide by n − 1. We have to work with variance and covariance because that's what PCA is taking advantage of, so we have to know the math behind it. Now, this is the matrix that we have.
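The two formulas above can be checked directly; this sketch uses hypothetical feature values (means 6 and 4, not the lecture's data) and compares the manual computation against NumPy's built-in `np.cov`:

```python
import numpy as np

x = np.array([4., 5., 6., 6., 7., 8.])   # feature f1 (hypothetical values)
y = np.array([2., 4., 3., 4., 6., 5.])   # feature f2 (hypothetical values)
n = len(x)

# Variance: how spread out one feature is.
var_x = ((x - x.mean()) ** 2).sum() / (n - 1)

# Covariance: how two features move together.
cov_xy = ((x - x.mean()) * (y - y.mean())).sum() / (n - 1)

print(var_x, cov_xy)

# np.cov also divides by n - 1 by default, so it should agree.
print(np.isclose(np.cov(x, y)[0, 1], cov_xy))  # True
```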

We can use matrix multiplication to get the covariance. How many of you are aware that if you want to calculate the covariance, you can actually use matrix multiplication? And how do we do that? It's very simple: we just multiply the transpose of the matrix by the matrix itself. What is the transpose? Columns become rows and rows become columns. So our matrix is six by two; when I transpose it, what will the dimension be? Two by six. Then I multiply that by the six-by-two matrix, so what will the output be? Two by two.

And that's exactly what I am interested in, because I want to know the covariance between x and x, which is nothing but the variance, and the covariance between x and y, where x and y are our two features. So the diagonal gives me the variances and the off-diagonal gives me the covariances. Of course, it repeats the covariance twice: the covariance of x with y and of y with x are the same, so it makes no difference. But it's an easy way to calculate covariance, and this is how we do it. When we multiply the mean-shifted matrix by its own transpose, we get the variance and covariance totals in our data. So that's the next thing we do, and once we have the covariance matrix, it becomes really simple. I'm just kidding.
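The dimensions and the symmetry described above can be sketched as follows, again with the hypothetical mean-6-and-4 data rather than the lecture's actual numbers:

```python
import numpy as np

# Hypothetical 6 x 2 data with column means 6 and 4, then mean-shifted.
X = np.array([[4., 2.], [5., 4.], [6., 3.], [6., 4.], [7., 6.], [8., 5.]])
Xc = X - X.mean(axis=0)

# Transpose swaps rows and columns: (6, 2) becomes (2, 6).
print(Xc.T.shape)          # (2, 6)

# (2 x 6) times (6 x 2) gives a 2 x 2 matrix of variance/covariance totals.
S = Xc.T @ Xc
print(S.shape)             # (2, 2)

# The off-diagonal entries are equal: cov(x, y) totals == cov(y, x) totals.
print(S[0, 1] == S[1, 0])  # True
```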

So okay, how do I get these values? I have to divide by n − 1, remember. n is 6 in our case, since we have 6 examples, so I divide by 5. That's the definition of covariance: the multiplication just gives me the totals, and I still have to divide by the number of examples minus 1. So this is my covariance matrix. Now, the idea is: how do I get the PCs? Basically, I want to get the eigenvectors and eigenvalues of the covariance matrix. The covariance matrix is coming from our data; I started with f1 and f2 and got to a covariance matrix. Now I want to calculate the eigenvectors and eigenvalues of the covariance matrix, because the covariance matrix holds the variance and covariance information, which represents the information in the data.
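Putting the last two steps together, a minimal sketch (same hypothetical data): multiply the shifted matrix by its transpose, divide by n − 1, and confirm against NumPy's `np.cov` (which expects features in rows, hence the transpose):

```python
import numpy as np

# Hypothetical 6 x 2 data with column means 6 and 4.
X = np.array([[4., 2.], [5., 4.], [6., 3.], [6., 4.], [7., 6.], [8., 5.]])
n = X.shape[0]
Xc = X - X.mean(axis=0)

# Covariance matrix: totals from Xc.T @ Xc, divided by n - 1 (= 5 here).
C = (Xc.T @ Xc) / (n - 1)
print(C)

# np.cov expects features in rows, so pass the transpose of X.
print(np.allclose(C, np.cov(X.T)))  # True
```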

I want to find these two things, the eigenvectors and eigenvalues of the covariance matrix. So how do I do that? There is a simple equation: Ae = λe. I mean, that sounds so easy, but not in the matrix world, right? That's the equation for eigenvectors. So what is A here? A is nothing but the covariance matrix that we have built. What is λ? λ is the eigenvalue, and e, of course, is the eigenvector. That's the property of an eigenvector: when you multiply it by the matrix, the output is the eigenvector times the eigenvalue. And if you multiply by A again, you still get the eigenvector, now with λ squared.
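That property can be verified numerically; this sketch uses the covariance matrix produced by the hypothetical data above (values [[2, 1.6], [1.6, 2]], not the lecture's actual matrix):

```python
import numpy as np

# Covariance matrix from the hypothetical data (not the lecture's numbers).
A = np.array([[2.0, 1.6], [1.6, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals[0]       # first eigenvalue
e = eigvecs[:, 0]      # matching eigenvector (a column of eigvecs)

# Defining property: multiplying by A only scales the eigenvector by lambda.
print(np.allclose(A @ e, lam * e))            # True

# Multiplying by A again scales by lambda once more: A A e == lambda^2 e.
print(np.allclose(A @ (A @ e), lam**2 * e))   # True
```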

So that's the property of eigenvectors. Again, some people have gone deep and really understand this; it was not invented for PCA. Somebody came up with it for other reasons, and we just ended up using it for PCA. If you're interested, just go to YouTube and learn more about the math of eigenvalues. So how do I solve this equation? What do I have here? I have the covariance matrix, and the other two, λ and e, I need to solve for. So what I will do is convert this into an equivalent equation, Ae = λIe. What is I? The identity matrix. I can use an identity matrix to make the work a little easier.
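The point of the identity-matrix rewrite is that Ae = λIe can be rearranged into (A − λI)e = 0, which is what actually gets solved. A sketch checking that on the same hypothetical covariance matrix:

```python
import numpy as np

# Covariance matrix from the hypothetical data (not the lecture's numbers).
A = np.array([[2.0, 1.6], [1.6, 2.0]])
I = np.eye(2)  # 2 x 2 identity matrix

eigvals, eigvecs = np.linalg.eig(A)

# Rewriting A e = lambda I e as (A - lambda I) e = 0: for every eigenpair,
# the matrix (A - lambda I) maps its eigenvector to the zero vector.
for lam, e in zip(eigvals, eigvecs.T):
    print(np.allclose((A - lam * I) @ e, np.zeros(2)))  # True, True
```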