I am a second-year master's student majoring in Computer Science at Georgia Tech, working with Prof. Diyi Yang in the SALT lab. Before coming to Georgia Tech, I had wonderful research experiences at the University of Edinburgh, Peking University, and Microsoft Research Asia.
My research focuses on Natural Language Processing. My goals are to 1) alleviate data scarcity, 2) improve the generalization ability of models, and 3) design controllable models. I pursue these goals, respectively, by 1) using unlabeled, out-of-domain, or augmented text data, 2) incorporating external knowledge or inductive biases into models, and 3) designing models with intermediate abstractions or discrete structures.
I apply these ideas to 1) natural language generation, 2) structured prediction, especially semantic parsing, and 3) cross-lingual tasks.
Nov 23rd, 2020: Two papers were submitted to NAACL 2021.
Sep 15th, 2020: Our paper "Planning and Generating Natural and Diverse Disfluent Texts as Augmentation for Disfluency Detection" was accepted as a long paper at EMNLP 2020.
Master of Science: Computer Science, Georgia Tech, Atlanta, GA, USA. Aug 2019 - May 2021 (expected).
Bachelor of Science: Computer Science and Biological Science, Peking University, Beijing, China. Sep 2015 - July 2019.
Jingfeng Yang, Federico Fancellu, Bonnie Webber, Diyi Yang. 2020. Frustratingly Simple but Surprisingly Strong: Using Language-Independent Features for Zero-shot Cross-lingual Semantic Parsing. Submitted to NAACL 2021. (Available upon request)
Yang Zhong, Jingfeng Yang, Wei Xu, Diyi Yang. 2020. MsBC: On Automatically Detecting Multi-Span Subjective Bias. Submitted to NAACL 2021. (Available upon request)
Stephanie Schoch, Wanyu Du, Jingfeng Yang, Diyi Yang, Yangfeng Ji. 2020. It Takes Two - Tasks and Datasets! Linguistically-Informed Analysis of Style Transfer. Submitted to TACL. (Available upon request)
Jingfeng Yang, Zhaoran Ma, Diyi Yang. 2020. Planning and Generating Natural and Diverse Disfluent Texts as Augmentation for Disfluency Detection. In 2020 Empirical Methods in Natural Language Processing (EMNLP). [Long Paper] [Code]
Jingfeng Yang, Sujian Li. 2018. Chinese Discourse Segmentation Using Bilingual Discourse Commonality. Preprint. [Paper]
Yizhong Wang, Sujian Li, Jingfeng Yang. 2018. Toward Fast and Accurate Neural Discourse Segmentation. In 2018 Empirical Methods in Natural Language Processing (EMNLP). [Paper]
Yizhong Wang, Sujian Li, Jingfeng Yang, Xu Sun, Houfeng Wang. 2017. Tag-Enhanced Tree-Structured Neural Networks for Implicit Discourse Relation Classification. In The 8th International Joint Conference on Natural Language Processing (IJCNLP). [Paper]
Research Intern in College of Computing, Georgia Institute of Technology. Advisor: Diyi Yang. Aug 2019 - present.
Visiting Researcher in Institute for Language, Cognition and Computation, The University of Edinburgh. Advisor: Bonnie Webber. July 2018 - Sep 2018.
Research Intern in Department of Computational Linguistics, Peking University. Advisor: Sujian Li. July 2017 - June 2019.
Software Development Engineer Intern, Amazon, San Francisco. May 2020 - July 2020.
Teaching Assistant, CS7650-4650 Natural Language Processing, Georgia Institute of Technology, Atlanta. Fall 2020.
Teaching Assistant, CS7650-4650 Natural Language Processing, Georgia Institute of Technology, Atlanta. Spring 2020.
Research and Software Engineer Intern, Microsoft Research Asia, Beijing, China. Dec 2018 - Mar 2019.
May 4th Fellowship, 2016-2017.
Merit Student of Peking University, 2016-2017.
Kwang-Hua Fellowship, 2015-2016.
Merit Student of Peking University, 2015-2016.
Silver medalist in Chinese Mathematics Olympiad (CMO), 2015.