Overview

My research interests span a broad range of Natural Language Processing (NLP) problems. Natural language is a fundamental medium of information and communication. I design machine learning models (deep learning, unsupervised learning, transfer learning, reinforcement learning, multimodal learning) for NLP systems that understand language and communicate with people across different contexts, domains, text styles, and languages. Specifically, our ongoing research projects include:

Interactive and executable semantic parsing: Synthesize formal programs (e.g., SQL queries) from natural language so that machines can answer users' questions by executing them (see the sketch after this list)

Text summarization: Summarize information in news, emails, and scientific articles

Cross-lingual information retrieval: Retrieve information relevant to a user’s query from documents in multiple low-resource languages

Open-domain data-to-text generation: Generate accurate textual descriptions of data stored in tables, databases, knowledge bases, and knowledge graphs
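
To make the executable semantic parsing project concrete, here is a minimal sketch, not our system, showing what the input and output of such a parser look like: a natural-language question, a hypothetical predicted SQL program, and the answer obtained by executing that program against a toy in-memory database. The table schema, the example rows, and the predicted query are all illustrative assumptions.

```python
# Illustrative sketch only: a hand-written example of the NL question -> SQL
# program -> executed answer pipeline that an executable semantic parser targets.
import sqlite3

# Hypothetical toy database used purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE papers (title TEXT, venue TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO papers VALUES (?, ?, ?)",
    [("Paper A", "ACL", 2019), ("Paper B", "EMNLP", 2020), ("Paper C", "ACL", 2020)],
)

# A natural-language question and a SQL program a semantic parser might predict.
question = "How many papers were published at ACL in 2020?"
predicted_sql = "SELECT COUNT(*) FROM papers WHERE venue = 'ACL' AND year = 2020"

# Executing the predicted program yields the answer returned to the user.
answer = conn.execute(predicted_sql).fetchone()[0]
print(f"Q: {question}\nSQL: {predicted_sql}\nA: {answer}")  # A: 1
```

In an interactive setting, the system can additionally ask the user clarification questions before committing to a program, rather than executing the first prediction as this sketch does.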