Hi~ I am Tianyu Gao, a first-year PhD student at Princeton University, advised by Prof. Danqi Chen. Before joining Princeton, I received my bachelor's degree from Tsinghua University. During my time at Tsinghua, I was a member of THUNLP, advised by Prof. Zhiyuan Liu. Here is my CV.

Find me on Twitter, Google Scholar, and GitHub! Some of my projects are also hosted on the THUNLP GitHub.

Research


My research interests lie at the intersection of natural language processing and machine learning. More specifically, they include:

  • Training NLP models with fewer annotations. Language annotations are expensive to gather, so it is valuable to develop models and algorithms that learn more efficiently.
    • Humans can grasp new knowledge from only a handful of examples, and machines should be able to do the same. Few-shot learning aims to guide models to learn new tasks with limited data (see the toy sketch after this list).
    • There are huge amounts of unlabeled data on the Internet, which we can exploit with unsupervised or semi-supervised training, such as pre-training language models or bootstrapping from a few annotated seeds.
    • Existing structured information can serve as external knowledge for NLP models, as knowledge graphs do in distantly supervised relation extraction.
    • I explore these directions mainly in information extraction, an important area of NLP.
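
To make the few-shot setting above concrete, here is a minimal toy sketch in Python (illustrative only, not code from any of my projects) of how an N-way K-shot episode is sampled in few-shot relation classification: the model receives K labeled examples for each of N relations and must classify held-out query sentences.

    import random

    def sample_episode(data, n_way=3, k_shot=2, n_query=1):
        """data maps a relation name to a list of example sentences (toy data below)."""
        relations = random.sample(list(data.keys()), n_way)
        support, query = {}, {}
        for rel in relations:
            examples = random.sample(data[rel], k_shot + n_query)
            support[rel] = examples[:k_shot]   # K labeled examples per relation
            query[rel] = examples[k_shot:]     # held-out queries to classify
        return support, query

    # Hypothetical toy data for illustration only.
    toy_data = {
        "founder_of": ["Steve Jobs founded Apple.", "Larry Page co-founded Google.", "Bill Gates started Microsoft."],
        "capital_of": ["Paris is the capital of France.", "Tokyo is the capital of Japan.", "Ottawa is the capital of Canada."],
        "author_of": ["Tolkien wrote The Hobbit.", "Orwell wrote 1984.", "Austen wrote Emma."],
    }

    support, query = sample_episode(toy_data)

A few-shot model is then trained over many such episodes, so that it learns to classify the queries given only the K support examples of each relation.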

Highlighted Publications


Please refer to my publications for the full list.

Xu Han*, Tianyu Gao*, Yankai Lin*, Hao Peng, Yaoliang Yang, Chaojun Xiao, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou (* indicates equal contribution)
More Data, More Relations, More Context and More Openness: A Review and Outlook for Relation Extraction
arXiv preprint, 2020 pdf

Xiaozhi Wang, Tianyu Gao, Zhaocheng Zhu, Zhiyuan Liu, Juanzi Li, Jian Tang
KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation
arXiv preprint, 2020 pdf

Tianyu Gao, Xu Han, Ruobing Xie, Zhiyuan Liu, Fen Lin, Leyu Lin, Maosong Sun
Neural Snowball for Few-Shot Relation Learning
Proceedings of AAAI, 2020 pdf code

Tianyu Gao, Xu Han, Hao Zhu, Zhiyuan Liu, Peng Li, Maosong Sun, Jie Zhou
FewRel 2.0: Towards More Challenging Few-Shot Relation Classification
Proceedings of EMNLP (Short Paper), 2019 pdf code

Xu Han*, Tianyu Gao*, Yuan Yao, Deming Ye, Zhiyuan Liu, Maosong Sun (* indicates equal contribution)
OpenNRE: An Open and Extensible Toolkit for Neural Relation Extraction
Proceedings of EMNLP (Demonstration Track), 2019 pdf code

Experiences


Tsinghua NLP Lab. Research Assistant. Nov. 2017 - Present

  • Advised by Prof. Zhiyuan Liu.
  • Research on natural language processing and machine learning.

Mila-Quebec AI Institute. Research Intern. July 2019 - Sept. 2019

  • Advised by Prof. Jian Tang.
  • Research on knowledge graph embedding and pre-training language models.

WeChat AI, Tencent. Tencent Rhino-Bird Elite Training Program. May 2019 - Present

  • Advised by Dr. Peng Li.
  • Research on natural language processing and machine learning.

Momenta. Intern. May 2017 - May 2018

  • Advised by Ji Liang.
  • Research on semantic segmentation.