
So in the problem set we're actually going to let you play around more with this algorithm, so I won't say too much more about it here. But before we finally move on to the next topic, let me take the questions you have. Yeah?

Student: It seems like you still have the same problem of overfitting and underfitting, like when you choose tau. Like if you make it too small in your –

Instructor (Andrew Ng): Yes, absolutely. Locally weighted regression is not a panacea for the problem of overfitting or underfitting; you can still run into the same problems with locally weighted regression. And some of these things I'll leave you to discover for yourself in the homework problem – you'll actually see what you just mentioned. Yeah?
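To make the bandwidth trade-off concrete, here is a minimal sketch of locally weighted linear regression for one-dimensional inputs (not from the lecture; the function name and numpy-only setup are illustrative assumptions). Each training point x(i) gets a bell-shaped weight w(i) = exp(-(x(i) - x)^2 / (2 tau^2)) centered at the query point x:

```python
import numpy as np

def lwr_predict(X, y, x_query, tau):
    """Locally weighted linear regression at a single query point.

    X: (m,) array of 1-D training inputs; y: (m,) array of targets.
    tau: bandwidth -- too small and the fit chases noise (overfitting),
    too large and it approaches ordinary least squares (underfitting).
    """
    m = X.shape[0]
    # Design matrix with an intercept column.
    A = np.column_stack([np.ones(m), X])
    # Bell-shaped weights centered at the query point.
    w = np.exp(-(X - x_query) ** 2 / (2.0 * tau ** 2))
    W = np.diag(w)
    # Weighted normal equations: theta = (A^T W A)^{-1} A^T W y.
    theta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return theta[0] + theta[1] * x_query
```

A very small tau concentrates the weight on a handful of nearby points, so the local fit tracks noise; a very large tau makes all weights nearly equal, recovering a single global least-squares line.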

Student: It almost seems like you're not even thoroughly [inaudible] with this locally weighted regression – you still have all the data that you originally had anyway.

Instructor (Andrew Ng): Yeah.

Student: I’m just trying to think of [inaudible] the original data points.

Instructor (Andrew Ng): Right. So the question is, it's almost as if you're not building a model, because you need the entire data set. And the other way of saying that is that this is a non-parametric learning algorithm. I won't debate whether we're really building a model or not, but this is perfectly fine – when you write code implementing locally weighted linear regression on a data set, I think of that code as a whole as building your model. And we've actually used this quite successfully to model the dynamics of an autonomous helicopter. Yeah?
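One concrete way to see the non-parametric point: every prediction has to be handed the entire training set, because there is no fixed-size parameter vector that summarizes the data once and for all. A short usage sketch, assuming the hypothetical lwr_predict helper above:

```python
import numpy as np

rng = np.random.default_rng(0)
X_train = np.linspace(0.0, 10.0, 200)
y_train = np.sin(X_train) + 0.1 * rng.standard_normal(200)

# Each prediction re-solves a weighted least-squares problem against
# the full training set -- the hallmark of a non-parametric method.
queries = np.linspace(0.0, 10.0, 50)
y_hat = [lwr_predict(X_train, y_train, xq, tau=0.5) for xq in queries]
```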

Student: Can this algorithm learn the weights based on the data?

Instructor (Andrew Ng): Learn what weights? Oh, the weights w(i).

Student: Instead of using [inaudible].

Instructor (Andrew Ng): I see, yes. So it turns out there are a few things you can do. One thing that's quite common is to choose this bandwidth parameter tau using the data. We'll actually talk about that a bit later, when we talk about model selection. Yes? One last question.
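As a preview of how tau might be chosen from the data, one common approach is cross-validation: evaluate a small grid of candidate bandwidths and keep the one with the lowest held-out error. A minimal leave-one-out sketch (an illustrative assumption, reusing the hypothetical lwr_predict helper above):

```python
import numpy as np

def choose_tau(X, y, candidates=(0.1, 0.3, 1.0, 3.0)):
    """Pick the bandwidth with the lowest leave-one-out squared error."""
    best_tau, best_err = None, np.inf
    for tau in candidates:
        errs = []
        for i in range(len(X)):
            mask = np.arange(len(X)) != i  # hold out example i
            pred = lwr_predict(X[mask], y[mask], X[i], tau)
            errs.append((pred - y[i]) ** 2)
        err = np.mean(errs)
        if err < best_err:
            best_tau, best_err = tau, err
    return best_tau
```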

Student: I used [inaudible] Gaussian sometimes if you [inaudible] Gaussian and then –

Instructor (Andrew Ng): Oh, I guess – let's see. Boy. The weights are not random variables, and for the purposes of this algorithm it's not useful to endow them with probabilistic semantics. So you could choose to define things as Gaussian, but it, sort of, doesn't lead anywhere. In fact, it turns out that I just happened to choose this, sort of, bell-shaped function to define my weights. It's actually fine to choose a weighting function that doesn't even integrate to one – that integrates to infinity, say. So in that sense, you could force in the definition of a Gaussian, but it's, sort of, not useful, especially since you can use other functions that integrate to infinity and don't integrate to one. Okay? That's the last question, so let's move on.
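One way to see why normalization of the weighting function is irrelevant: multiplying every weight by a positive constant c scales both A^T W A and A^T W y by c in the weighted normal equations, so theta (and hence the prediction) is unchanged. A small sketch (not from the lecture) making this explicit:

```python
import numpy as np

def lwr_predict_scaled(X, y, x_query, tau, c):
    """Same as lwr_predict, but with all weights scaled by c > 0."""
    m = X.shape[0]
    A = np.column_stack([np.ones(m), X])
    # c cancels out of (A^T W A)^{-1} A^T W y, so any positive
    # rescaling of the weighting function gives the identical fit --
    # it need not integrate to one (or integrate at all).
    w = c * np.exp(-(X - x_query) ** 2 / (2.0 * tau ** 2))
    W = np.diag(w)
    theta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return theta[0] + theta[1] * x_query

# These agree to numerical precision:
# lwr_predict_scaled(X_train, y_train, 3.0, tau=0.5, c=1.0)
# lwr_predict_scaled(X_train, y_train, 3.0, tau=0.5, c=1e6)
```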
