Haoyu He - 何灏宇

PhD Student

University of Tübingen


I am a first-year PhD student in the Autonomous Vision Group at the University of Tübingen, supervised by Prof. Andreas Geiger. My research interests span natural language processing (NLP), trustworthy machine learning, and causal inference in NLP. My research vision is to build systems that can interact with humans in a bidirectionally understandable way.

  • Natural Language Processing
  • Information Retrieval
  • Machine Learning
  • Msc in Artificial Intelligence, 2022

    Northeastern University

  • BEng in Computer Science and Technology, 2019

    Wuhan University of Science and Technology

Industrial Experience

Amazon Web Services
Software Dev Engineer Intern
Dec 2020 – Sep 2021 Shanghai
Worked as a research intern supervised by Dr. Xingjian Shi. Proposed a meta-learning framework for learning the factors that underpin knowledge distillation (KD). Based on this framework, we conducted a systematic experimental study of KD in NLP and proposed a novel objective function (MI-α) to boost knowledge transfer.
E-Capital Transfer Co., Ltd.
Research Intern
Aug 2020 – May 2020 Shanghai

Responsibilities include:

  • Studied semantic models and improved sentence similarity prediction in the company's RASA-based conversational agent product.
  • Improved the accuracy of the sentence similarity prediction task from 34% to 52% on the business dataset.
Ipsos (China) Consulting Co., Ltd.
Natural Language Processing Engineering Intern
Oct 2019 – Apr 2020 Shanghai
Used NLP techniques to develop models for analyzing surveys and marketing reports.

Recent Publications

[1] Haoyu He, Xingjian Shi, Jonas Mueller, Sheng Zha, Mu Li, and George Karypis. Distiller: A systematic study of model distillation methods in natural language processing. In Proceedings of the Second Workshop on Simple and Efficient Natural Language Processing, pages 119–133, Virtual, November 2021. Association for Computational Linguistics.