No-Free-Lunch Theorems for Transfer Learning

I will present a formal framework for transfer learning and investigate under which conditions it is possible to provide performance guarantees for such scenarios. I will address two key issues: 1) Which notions of task similarity suffice to provide meaningful error bounds on a target task for a predictor trained on a (different) source task? 2) Can we do better than simply training a hypothesis on the source task and analyzing its performance on the target task? In particular, can the use of unlabeled target samples reduce the target prediction error?
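The setup the abstract describes can be illustrated with a minimal sketch: a predictor is trained on labeled samples from a source task and its error is then measured on a different target task. The distributions, the 1-D threshold learner, and the shift used here are illustrative assumptions for the sketch, not the talk's construction.

```python
# Illustrative sketch of the transfer-learning setup (assumed example,
# not from the talk): train on a source task, evaluate on a shifted target task.
import random

random.seed(0)

def sample(mean_neg, mean_pos, n):
    """Draw n labeled points from two unit-variance Gaussians."""
    data = []
    for _ in range(n):
        y = random.choice([0, 1])
        x = random.gauss(mean_pos if y else mean_neg, 1.0)
        data.append((x, y))
    return data

def train_threshold(data):
    """Learn a 1-D threshold classifier: midpoint of the two class means."""
    neg = [x for x, y in data if y == 0]
    pos = [x for x, y in data if y == 1]
    return (sum(neg) / len(neg) + sum(pos) / len(pos)) / 2.0

def error(threshold, data):
    """Fraction of points whose predicted label (x > threshold) is wrong."""
    return sum((x > threshold) != bool(y) for x, y in data) / len(data)

# Source task: classes centered at -1 and +1.
source = sample(-1.0, 1.0, 2000)
# Target task: same labeling rule, but both classes shifted right by 1.5.
target = sample(0.5, 2.5, 2000)

t = train_threshold(source)
print("source error:", error(t, source))
print("target error:", error(t, target))  # larger: the source threshold is misplaced
```

The gap between the two printed errors is exactly the quantity a transfer bound must control: without some assumed task similarity, the source-trained threshold can land arbitrarily far from the target's optimal one.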
Attribution: The Open Education Consortium
http://www.ocwconsortium.org/courses/view/17d055114c5e4ea6891a03f8e08a8844/
Course Home http://videolectures.net/nipsworkshops09_ben_david_nflt/