I am a second-year master's student in Computer Science at Georgia Tech, where I work with Prof. Diyi Yang in the SALT lab. Before coming to Georgia Tech, I received my B.S. degrees in Computer Science and Biological Science from Peking University in 2019, where I worked with Prof. Sujian Li. In summer 2018, I was a research intern with Prof. Bonnie Webber at the University of Edinburgh. I have also interned at Microsoft Research Asia and Amazon.
I have broad interests in Natural Language Processing and Machine Learning. My research goals are to 1) address the data scarcity problem, 2) improve the generalization ability of models, and 3) design controllable and interpretable models, respectively by 1) using unlabeled, out-of-domain, or augmented text data, 2) incorporating external knowledge or inductive biases into models, and 3) designing models with intermediate abstractions and discrete structures.
Specifically, I pursue these goals in 1) text generation, 2) structured prediction, especially semantic parsing, and 3) cross-lingual tasks.
Nov 23rd, 2020: Two papers submitted to NAACL 2021.
Sep 15th, 2020: Our paper "Planning and Generating Natural and Diverse Disfluent Texts as Augmentation for Disfluency Detection" is accepted as a long paper at EMNLP 2020.
Master of Science: Computer Science, Georgia Tech, Atlanta, GA, USA. Aug 2019 - May 2021 (expected).
Bachelor of Science: Computer Science and Biological Science, Peking University, Beijing, China. Sep 2015 - July 2019.
- Jingfeng Yang, Federico Fancellu, Bonnie Webber, Diyi Yang. 2020. Frustratingly Simple but Surprisingly Strong: Using Language-Independent Features for Zero-shot Cross-lingual Semantic Parsing. Submitted to NAACL' 2021. (Available upon request)
- Yang Zhong, Jingfeng Yang, Wei Xu, Diyi Yang. 2020. MsBC: On Automatically Detecting Multi-Span Subjective Bias. Submitted to NAACL' 2021. (Available upon request)
- Jingfeng Yang, Zhaoran Ma, Diyi Yang. 2020. Planning and Generating Natural and Diverse Disfluent Texts as Augmentation for Disfluency Detection. In EMNLP' 2020. [Paper] [Code]
- Jingfeng Yang, Federico Fancellu, Bonnie Webber. 2019. A survey of cross-lingual features for zero-shot semantic parsing. Preprint. [Paper] [Code]
- Jingfeng Yang, Sujian Li. 2018. Chinese Discourse Segmentation Using Bilingual Discourse Commonality. Preprint. [Paper]
- Yizhong Wang, Sujian Li, Jingfeng Yang. 2018. Toward Fast and Accurate Neural Discourse Segmentation. EMNLP' 2018. [Paper]
- Yizhong Wang, Sujian Li, Jingfeng Yang, Xu Sun, Houfeng Wang. 2017. Tag-enhanced tree-structured neural networks for implicit discourse relation classification. In The 8th International Joint Conference on Natural Language Processing (IJCNLP' 2017). [Paper]
Reviewer: NAACL' 2021
Research Intern in College of Computing, Georgia Institute of Technology. Advisor: Diyi Yang. Aug 2019 - present.
Visiting Researcher in Institute for Language, Cognition and Computation, The University of Edinburgh. Advisor: Bonnie Webber. July 2018 - Sep 2018.
Research Intern in Department of Computational Linguistics, Peking University. Advisor: Sujian Li. July 2017 - June 2019.
Head Teaching Assistant, CS-4650 Natural Language Processing, Georgia Institute of Technology, Atlanta. Spring 2021.
Teaching Assistant, CS-4650/7650 Natural Language Processing, Georgia Institute of Technology, Atlanta. Fall 2020.
Software Development Engineer Intern, Amazon, San Francisco. May 2020 - July 2020.
Teaching Assistant, CS-4650/7650 Natural Language Processing, Georgia Institute of Technology, Atlanta. Spring 2020.
Research and Software Engineer Intern, Microsoft Research Asia, Beijing, China. December 2018 - March 2019.
May 4th Fellowship, 2016-2017.
Merit Student of Peking University, 2016-2017.
Kwang-Hua Fellowship, 2015-2016.
Merit Student of Peking University, 2015-2016.
Silver medalist in Chinese Mathematics Olympiad (CMO), 2015.