I’m an applied scientist at AWS Translate.
I was a Data Science PhD candidate at New York University and a member of the ML² Group at CILVR, co-advised by Prof. Sam Bowman and Prof. Kyunghyun Cho. During my PhD, I worked on applying transfer learning and multi-task learning methods to NLP problems, and on analyzing these methods to understand why and when they work or fail.
Previously, I interned at AWS AI, Facebook AI Research, and Grammarly.
Prior to NYU, I developed information retrieval systems at the Institute of High Performance Computing in Singapore. Before that, I earned my bachelor’s degree in Computer Science at Nanyang Technological University.
Publications
* equal contribution
2022
BBQ: A Hand-Built Bias Benchmark for Question Answering
Alicia Parrish, Angelica Chen, Nikita Nangia, Vishakh Padmakumar, Jason Phang, Jana Thompson, Phu Mon Htut, Samuel R. Bowman.
ACL. 2022.
[ArXiv]

Clustering Examples in Multi-Dataset Benchmarks with Item Response Theory
Pedro Rodriguez, Phu Mon Htut, John P Lalor, João Sedoc.
The Workshop on Insights from Negative Results in NLP at ACL. 2022.
[Paper]
2021
Comparing Test Sets with Item Response Theory
Clara Vania*, Phu Mon Htut*, William Huang*, Dhara Mungra, Richard Yuanzhe Pang, Jason Phang, Haokun Liu, Kyunghyun Cho, Samuel R. Bowman.
ACL. 2021.
[Paper]
2020
Intermediate-Task Transfer Learning with Pretrained Models for Natural Language Understanding: When and Why Does It Work?
Yada Pruksachatkun*, Jason Phang*, Haokun Liu*, Phu Mon Htut*, Xiaoyi Zhang, Richard Yuanzhe Pang, Clara Vania, Katharina Kann, Samuel R. Bowman.
ACL. 2020.
[ArXiv]

English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too
Jason Phang*, Iacer Calixto*, Phu Mon Htut, Yada Pruksachatkun, Haokun Liu, Clara Vania, Katharina Kann, Samuel R. Bowman.
AACL-IJCNLP. 2020.
[ArXiv]

jiant: A Software Toolkit for Research on General-Purpose Text Understanding Models
Yada Pruksachatkun*, Phil Yeres*, Haokun Liu, Jason Phang, Phu Mon Htut, Alex Wang, Ian Tenney, Samuel R. Bowman.
ACL. 2020. (Demo track)
[ArXiv] [jiant library]
2019
Do Attention Heads in BERT Track Syntactic Dependencies?
Phu Mon Htut*, Jason Phang*, Shikha Bordia*, and Samuel R. Bowman.
Natural Language, Dialog and Speech (NDS) Symposium, The New York Academy of Sciences. 2019. (Extended Abstract).
[Paper] [Poster] [Blog]

Investigating BERT’s Knowledge of Language: Five Analysis Methods with NPIs.
Alex Warstadt*, Yu Cao*, Ioana Grosu*, Wei Peng*, Hagen Blix*, Yining Nie*, Anna Alsop*, Shikha Bordia*, Haokun Liu*, Alicia Parrish*, Sheng-Fu Wang*, Jason Phang*, Anhad Mohananey*, Phu Mon Htut*, Paloma Jeretic* and Samuel R. Bowman.
EMNLP. 2019.
[ArXiv]

The Unbearable Weight of Generating Artificial Errors for Grammatical Error Correction.
Phu Mon Htut, Joel Tetreault.
The Workshop on Innovative Use of NLP for Building Educational Applications (BEA), ACL. 2019.
[Paper]

Inducing Constituency Trees through Neural Machine Translation.
Phu Mon Htut, Kyunghyun Cho, Samuel R. Bowman.
Preprint. 2019.
[ArXiv] [Blog]

Generalized Inner Loop Meta-Learning.
Edward Grefenstette, Brandon Amos, Denis Yarats, Phu Mon Htut, Artem Molchanov, Franziska Meier, Douwe Kiela, Kyunghyun Cho, Soumith Chintala.
Preprint. 2019.
[ArXiv] [higher library]
2018
Grammar Induction with Neural Language Models: An Unusual Replication.
Phu Mon Htut, Kyunghyun Cho, Samuel R. Bowman.
EMNLP. 2018.
The Workshop on the Analysis and Interpretation of Neural Networks for NLP (Blackbox-NLP). 2018. (Extended abstract)
[Paper] [arXiv] [Code/Output-Parses]

Training a Ranking Function for Open-Domain Question Answering.
Phu Mon Htut, Samuel R. Bowman, Kyunghyun Cho.
NAACL: Student Research Workshop. 2018.
[Paper] [arXiv] [Poster]
Teaching
TA for:
- DS-GA 1012: Natural Language Understanding and Computational Semantics, NYU (Spring 2020, Spring 2021)
- Sequence-to-Sequence Learning Tutorial, African Master’s Program in Machine Intelligence (AMMI) (2020)
- DS-GA 1011: Natural Language Processing with Representation Learning, NYU (Fall 2018)
Service
Outreach
- Organizer and TA: NYU AI School 2019, 2021, 2022
- Organizer and Mentor: Myanmar NLP Reading Group
Miscellaneous
I’m originally from Yangon, Myanmar (Burma).
You can call me “Phu” (the “h” is silent in both my first and last names). Fun fact: I don’t have a family name.