Jason Lee

I am a Ph.D. student in Computer Science at New York University, where I work on generative modelling with a focus on text. I am a member of the CILVR group and work closely with Kyunghyun Cho.

Before joining NYU, I received my Bachelor's and Master's degrees from St John's College, University of Cambridge, and worked as a research assistant at ETH Zürich. I also spent time at Google Brain and Facebook AI Research as an intern.

My research is supported by the Qualcomm Innovation Fellowship (2016–2017).

Email | CV | Google Scholar | Github

Publications

Iterative Refinement in the Continuous Space for Non-Autoregressive Neural Machine Translation
J. Lee, R. Shu and K. Cho
Empirical Methods in Natural Language Processing (EMNLP), 2020
[arxiv] [bib]

On the Discrepancy between Density Estimation and Sequence Generation
J. Lee, D. Tran, O. Firat and K. Cho
In submission, 2020
[arxiv] [bib]

Latent-Variable Non-Autoregressive Neural Machine Translation with Deterministic Inference using a Delta Posterior
R. Shu, J. Lee, H. Nakayama and K. Cho
AAAI Conference on Artificial Intelligence (AAAI), 2020
[arxiv] [bib]

Multi-Turn Beam Search for Neural Dialogue Modeling
I. Kulikov*, J. Lee* and K. Cho
Neural Information Processing Systems (NeurIPS), Conversational AI Workshop, 2019
[arxiv] [bib]

Countering Language Drift via Visual Grounding
J. Lee, K. Cho and D. Kiela
Empirical Methods in Natural Language Processing (EMNLP), 2019
[arxiv] [bib]

Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement
J. Lee*, E. Mansimov* and K. Cho
Empirical Methods in Natural Language Processing (EMNLP), 2018
[arxiv] [code] [bib]

Emergent Translation in Multi-Agent Communication
J. Lee, K. Cho, J. Weston and D. Kiela
International Conference on Learning Representations (ICLR), 2018
[arxiv] [code] [bib]

Fully Character-Level Neural Machine Translation without Explicit Segmentation
J. Lee, K. Cho and T. Hofmann
Transactions of the Association for Computational Linguistics (TACL), 2017
[arxiv] [journal] [code] [bib]