
So anomaly detection is also used in security applications. If many very unusual transactions start to appear on my credit card, that's a sign to me that someone has stolen my credit card. What I want to do now is talk about a specific algorithm for density estimation, and in particular one that works with data sets like these, where the distribution doesn't really fall into any of the standard textbook distributions. This is not really a Gaussian or a [inaudible] distribution or anything. So can we come up with a model to estimate densities that may have these somewhat unusual shapes?

To describe the algorithm a bit more, I'm going to use a one-dimensional example rather than a two-dimensional example. In the example I'm going to describe, imagine a data set that looks like this, where the horizontal axis is the x axis and these dots represent the positions of the points in my data set. This data set looks like it may be coming from a density that looks like that, as if it were the sum of two Gaussian distributions, and so the specific model I'm going to describe is what's called a mixture of Gaussians model.
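(As a concrete illustration of the "sum of two Gaussians" picture just described, here is a minimal sketch in Python. The mixing weights, means, and standard deviations are made-up values for illustration only, not anything from the lecture.)

```python
import numpy as np
from scipy.stats import norm

# Hypothetical parameters: mixing weights phi, component means mu,
# and standard deviations sigma for a two-component 1-D mixture.
phi = np.array([0.6, 0.4])
mu = np.array([-2.0, 3.0])
sigma = np.array([1.0, 1.5])

def mixture_density(x):
    # p(x) = sum_j phi_j * N(x; mu_j, sigma_j^2): the weighted sum of
    # the two Gaussian densities described above.
    return sum(phi[j] * norm.pdf(x, loc=mu[j], scale=sigma[j])
               for j in range(len(phi)))

xs = np.linspace(-6.0, 8.0, 200)
density = mixture_density(xs)   # the bimodal curve the data appear to come from
```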

And just to be clear, the picture I have is that I'm envisioning that maybe there were two separate Gaussians that generated this data set, and if only I knew what the two Gaussians were, then I could fit a Gaussian to my crosses, fit a Gaussian to the Os, and then sum those up to get the overall density. But the problem is I don't actually have access to these labels. I don't actually know which of the two Gaussians each of my data points came from, and so what I'd like to do is come up with an algorithm to fit this mixture of Gaussians model even when I don't know which of the two Gaussians each of my data points came from.

So here's the idea. In this model, I'm going to imagine there's a latent random variable (latent is just synonymous with hidden or unobserved). So we're going to imagine there's a latent random variable z, and that x^(i) and z^(i) have a joint distribution given as follows. We have that p(x^(i), z^(i)) = p(x^(i) | z^(i)) p(z^(i)); by the chain rule of probability, this is always true. And moreover, the distribution of z^(i) is given by the following: z^(i) is distributed Multinomial(φ). In the special case where I have just two Gaussians, z^(i) will be Bernoulli, and these parameters φ_j are the parameters of a multinomial distribution.
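(Here is a rough sketch, in Python, of that generative story: first draw z^(i) from the multinomial, then draw x^(i) from the corresponding Gaussian. The specific parameter values and the helper name sample_joint are hypothetical, chosen just to illustrate the joint distribution written above.)

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical parameters for a k = 2 component mixture in two dimensions.
phi = np.array([0.6, 0.4])                          # multinomial parameters for z
mu = [np.array([0.0, 0.0]), np.array([4.0, 4.0])]   # component means
Sigma = [np.eye(2), 0.5 * np.eye(2)]                # component covariances

def sample_joint():
    # Draw (x, z) from the joint p(x^(i), z^(i)) = p(x^(i) | z^(i)) p(z^(i)).
    z = rng.choice(len(phi), p=phi)                  # z^(i) ~ Multinomial(phi)
    x = rng.multivariate_normal(mu[z], Sigma[z])     # x^(i) | z^(i) = j ~ N(mu_j, Sigma_j)
    return x, z

# In the unsupervised setting we observe only the x's; the z's stay hidden.
X = np.array([sample_joint()[0] for _ in range(500)])
```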

And the distribution of x^(i) conditioned on z^(i) being equal to j, that is p(x^(i) | z^(i) = j), is going to be a Gaussian distribution with mean μ_j and covariance Σ_j. So this should actually look extremely familiar to you. What I've written down are pretty much the same equations that I wrote down for the Gaussian Discriminant Analysis algorithm that we saw way back, except that instead of supervised learning, where we were given the class labels y, I've now replaced y in Gaussian Discriminant Analysis with these latent random variables, these unobserved random variables z, and we don't actually know what the values of z are. So just to make the link to Gaussian Discriminant Analysis a little more explicit: if we knew what the z's were, which we actually don't, but suppose for the sake of argument that we actually knew which of, say, the two Gaussians each of our data points came from, then you can write down the likelihood of the parameters, use maximum likelihood estimation, and you get exactly the same formulas as in Gaussian Discriminant Analysis. So if you knew the values of the z's, you could write down the log likelihood, do maximum likelihood estimation, and estimate all the parameters of your model. Does this make sense? Raise your hand if this makes sense. Cool. Some of you have questions? Some of you didn't raise your hands. Yeah.
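(To make the "if we knew the z's" case concrete, here is a minimal sketch of those maximum likelihood estimates; the function name is just illustrative. One caveat: this sketch fits a separate covariance Σ_j per component, which is the usual mixture-of-Gaussians choice, whereas GDA as presented earlier shared a single Σ across classes.)

```python
import numpy as np

def mle_with_known_z(X, z, k):
    # Maximum likelihood estimates of (phi, mu, Sigma) when each point's
    # component label z^(i) is observed, analogous to the GDA estimates:
    #   phi_j   = fraction of points with z^(i) = j
    #   mu_j    = mean of the points with z^(i) = j
    #   Sigma_j = covariance of the points with z^(i) = j
    # X is an (m, n) array of points, z an (m,) array of labels in {0, ..., k-1}.
    phi = np.array([(z == j).mean() for j in range(k)])
    mu = np.array([X[z == j].mean(axis=0) for j in range(k)])
    Sigma = np.array([np.cov(X[z == j].T, bias=True) for j in range(k)])
    return phi, mu, Sigma
```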

