Thesis Defense: Transfer Learning for Low-resource Natural Language Analysis
Speaker: Yuan Zhang, MIT CSAIL
Date: Monday, January 30, 2017
Time: 3:00 PM to 4:00 PM (all times are in the Eastern Time Zone)
Public: Yes
Location: 32-D463 (Stata Center - Star Conference Room)
Event Type:
Room Description:
Host: Regina Barzilay, MIT CSAIL
Contact: Marcia G. Davidson, 617-253-3049, marcia@csail.mit.edu
Speaker URL: None
Speaker Photo: None
Reminders to: seminars@csail.mit.edu, rbg@csail.mit.edu
Reminder Subject:
TALK: Thesis Defense: Transfer Learning for Low-resource Natural Language Analysis
Expressive machine learning models such as deep neural networks are highly effective when they can be trained with large amounts of in-domain labeled training data. While such annotations may not be readily available for the target task, it is often possible to find labeled data for another related task. The goal of this thesis is to develop novel transfer learning techniques that can effectively leverage annotations in source tasks to improve performance of the target low-resource task. In particular, we focus on two transfer learning scenarios: (1) transfer across languages and (2) transfer across tasks or domains in the same language.
In multilingual transfer, we tackle challenges from two perspectives. First, we show that linguistic prior knowledge can be utilized to guide syntactic parsing with little human intervention, by using a hierarchical low-rank tensor method. In both unsupervised and semi-supervised transfer scenarios, this method consistently outperforms state-of-the-art multilingual transfer parsers and the traditional tensor model across more than ten languages. Second, we study lexical-level multilingual transfer in low-resource settings. We demonstrate that only a few (e.g., ten) word translation pairs suffice for accurate transfer of part-of-speech (POS) tags. Averaged across six languages, our approach achieves a 37.5% improvement over the top-performing monolingual method when using a comparable amount of supervision.
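(The abstract leaves the lexical transfer mechanism at a high level. As a rough, hypothetical illustration of how a tiny seed dictionary can drive cross-lingual POS transfer, the Python sketch below aligns two monolingual embedding spaces with an orthogonal Procrustes mapping learned from a few translation pairs and copies the tag of the nearest source-language neighbour; the toy vectors, words, and the tag helper are invented for illustration and are not the thesis's actual method or data.)

import numpy as np

# Hypothetical toy embeddings standing in for pretrained monolingual word vectors.
src_vecs = {"dog": [0.9, 0.1, 0.0], "cat": [0.8, 0.2, 0.1],
            "run": [0.1, 0.9, 0.0], "eat": [0.2, 0.8, 0.1],
            "red": [0.0, 0.1, 0.9]}
src_pos = {"dog": "NOUN", "cat": "NOUN", "run": "VERB",
           "eat": "VERB", "red": "ADJ"}
tgt_vecs = {"hund": [0.1, 0.9, 0.05], "katze": [0.2, 0.8, 0.1],
            "laufen": [0.9, 0.1, 0.0], "essen": [0.8, 0.2, 0.1],
            "rot": [0.1, 0.0, 0.9]}

# A handful of seed translation pairs (target word, source word).
seed = [("hund", "dog"), ("laufen", "run"), ("rot", "red")]
X = np.array([tgt_vecs[t] for t, _ in seed])  # target side of the seeds
Y = np.array([src_vecs[s] for _, s in seed])  # source side of the seeds

# Orthogonal Procrustes: rotation W minimizing ||X W - Y||_F.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

def tag(target_word):
    # Project into source space and copy the POS tag of the nearest source neighbour.
    v = np.array(tgt_vecs[target_word]) @ W
    v /= np.linalg.norm(v)
    nearest = max(src_vecs, key=lambda w: v @ (np.array(src_vecs[w]) /
                                               np.linalg.norm(src_vecs[w])))
    return src_pos[nearest]

print(tag("katze"))  # NOUN, inherited from its projected neighbour "cat"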
In the second scenario, monolingual transfer, we propose an aspect-augmented adversarial network that enables transfer across aspects within the same domain. We use this method to transfer across different aspects of the same pathology reports, a setting where traditional domain adaptation approaches commonly fail. Experimental results demonstrate that our approach outperforms several baselines and model variants, yielding a 24% gain on this pathology dataset.
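(The abstract does not detail the network architecture. Assuming it builds on the standard adversarial adaptation pattern, the PyTorch sketch below shows the generic recipe: a shared encoder trained against an aspect discriminator through a gradient-reversal layer. The layer sizes, the AdversarialClassifier class, and the toy training step are hypothetical; the thesis's aspect-augmented model adds components beyond this.)

import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    # Identity in the forward pass; flips and scales the gradient on the way
    # back, pushing the encoder toward aspect-invariant representations.
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

class AdversarialClassifier(nn.Module):
    def __init__(self, in_dim=300, hidden=128, n_labels=2, n_aspects=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.label_head = nn.Linear(hidden, n_labels)    # task prediction
        self.aspect_head = nn.Linear(hidden, n_aspects)  # adversary

    def forward(self, x, lam=1.0):
        h = self.encoder(x)
        return self.label_head(h), self.aspect_head(GradReverse.apply(h, lam))

# One toy training step: labelled data from the source aspect, unlabelled
# data from the target aspect (random tensors stand in for real features).
model = AdversarialClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
xent = nn.CrossEntropyLoss()

x_src, y_src = torch.randn(8, 300), torch.randint(0, 2, (8,))
x_tgt = torch.randn(8, 300)
a_src = torch.zeros(8, dtype=torch.long)   # aspect id 0
a_tgt = torch.ones(8, dtype=torch.long)    # aspect id 1

label_logits, aspect_logits_src = model(x_src)
_, aspect_logits_tgt = model(x_tgt)
loss = (xent(label_logits, y_src)          # task loss, source labels only
        + xent(aspect_logits_src, a_src)   # adversary tries to identify the aspect...
        + xent(aspect_logits_tgt, a_tgt))  # ...while reversed gradients confuse it
opt.zero_grad()
loss.backward()
opt.step()

The reversed gradient drives the shared encoder toward features the aspect discriminator cannot separate, which is what lets labels learned on one aspect carry over to another.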
Thesis Advisor: Regina Barzilay
Thesis Committee: Jim Glass and Tommi Jaakkola
Research Areas:
Impact Areas:
Created by Marcia G. Davidson on Monday, January 23, 2017 at 2:24 PM.