Personal Information

He is a principal researcher and team leader at WeChat, Tencent Inc., China.
He received his Ph.D. from the Institute of Computing Technology, Chinese Academy of Sciences.
His research interests include machine translation, natural language processing, and large language models.
E-mail: fandongmeng [at] tencent [dot] com
Publications
Please see [Full List] or [Google Scholar] for all my publications.
Recent Preprints
- Jiaan Wang, Fandong Meng and Jie Zhou. ExTrans: Multilingual Deep Reasoning Translation via Exemplar-Enhanced Reinforcement Learning. [arxiv]
- Zhengrui Ma, Yang Feng, Chenze Shao, Fandong Meng, Jie Zhou and Min Zhang. Efficient Speech Language Modeling via Energy Distance in Continuous Latent Space. [arxiv]
- Ziqing Qiao, Yongheng Deng, Jiali Zeng, Dong Wang, Lai Wei, Fandong Meng, Jie Zhou, Ju Ren and Yaoxue Zhang. ConCISE: Confidence-guided Compression in Step-by-step Efficient Reasoning. [arxiv]
- Xue Zhang, Songming Zhang, Yunlong Liang, Fandong Meng, Yufeng Chen, Jinan Xu and Jie Zhou. A Dual-Space Framework for General Knowledge Distillation of Large Language Models. [arxiv]
- Jiaan Wang, Fandong Meng and Jie Zhou. Deep Reasoning Translation via Reinforcement Learning. [arxiv]
- Zhibin Lan, Liqiang Niu, Fandong Meng, Jie Zhou and Jinsong Su. LLaVE: Large Language and Vision Embedding Models with Hardness-Weighted Contrastive Learning. [arxiv]
- Xinyan Guan, Jiali Zeng, Fandong Meng, Chunlei Xin, Yaojie Lu, Hongyu Lin, Xianpei Han, Le Sun and Jie Zhou. DeepRAG: Thinking to Retrieval Step by Step for Large Language Models. [arxiv]
- Jiaan Wang, Fandong Meng, Yingxue Zhang and Jie Zhou. Retrieval-Augmented Machine Translation with Unstructured Knowledge. [arxiv]
- Meiqi Chen, Fandong Meng, Yingxue Zhang, Yan Zhang and Jie Zhou. CRAT: A Multi-Agent Framework for Causality-Enhanced Reflective and Retrieval-Augmented Translation with Large Language Models. [arxiv]
- Xiangyu Hong, Che Jiang, Biqing Qi, Fandong Meng, Mo Yu, Bowen Zhou and Jie Zhou. On the token distance modeling ability of higher RoPE attention dimension. [arxiv]
- Chao Hu, Yitian Chai, Hao Zhou, Fandong Meng, Jie Zhou and Xiaodong Gu. How Effectively Do Code Language Models Understand Poor-Readability Code? [paper]
- Wenchao Chen, Liqiang Niu, Ziyao Lu, Fandong Meng and Jie Zhou. MaskMamba: A Hybrid Mamba-Transformer Model for Masked Image Generation. [arxiv]
Recent Publications
- Yunlong Liang, Fandong Meng and Jie Zhou. THOR-MoE: Hierarchical Task-Guided and Context-Responsive Routing for Neural Machine Translation. In Proceedings of ACL 2025.
- Jiaan Wang, Fandong Meng, Zengkui Sun, Yunlong Liang, Yuxuan Cao, Jiarong Xu, Haoxiang Shi and Jie Zhou. An Empirical Study of Many-to-Many Summarization with Large Language Models. In Proceedings of ACL 2025. [arxiv]
- Xue Zhang, Yunlong Liang, Fandong Meng, Songming Zhang, Yufeng Chen, Jinan Xu and Jie Zhou. Less, but Better: Efficient Multilingual Expansion for LLMs via Layer-wise Mixture-of-Experts. In Proceedings of ACL 2025.
- Liang Zhang, Ziyao Lu, Fandong Meng, Hui Li, Jie Zhou and Jinsong Su. Advancing SMoE for Continuous Domain Adaptation of MLLMs: Adaptive Router and Domain-Specific Loss. In Proceedings of ACL 2025.
- Liang Zhang, Yang Zhang, Ziyao Lu, Fandong Meng, Jie Zhou and Jinsong Su. A Self-Denoising Model for Robust Few-Shot Relation Extraction. In Proceedings of ACL 2025.
- Kun Ouyang, Yuanxin Liu, Shicheng Li, Yi Liu, Hao Zhou, Fandong Meng, Jie Zhou and Xu Sun. PunchBench: Benchmarking MLLMs in Multimodal Punchline Comprehension. In Proceedings of ACL 2025.
- Jiaan Wang, Fandong Meng, Yunlong Liang and Jie Zhou. DRT: Deep Reasoning Translation via Long Chain-of-Thought. In Findings of ACL 2025.
- Bowen Ping, Jiali Zeng, Fandong Meng, Shuo Wang, Jie Zhou and Shanghang Zhang. LongDPO: Unlock Better Long-form Generation Abilities for LLMs via Critique-augmented Stepwise Information. In Findings of ACL 2025.
- Yijie Chen, Yijin Liu, Fandong Meng, Yufeng Chen, Jinan Xu and Jie Zhou. Enhancing Cross-Tokenizer Knowledge Distillation with Contextual Dynamical Mapping. In Findings of ACL 2025.
- Zhibin Lan, Liqiang Niu, Fandong Meng, Wenbo Li, Jie Zhou and Jinsong Su. AVG-LLaVA: An Efficient Large Multimodal Model with Adaptive Visual Granularity. In Findings of ACL 2025.
- Jiaxin Shen, Jinan Xu, Huiqi Hu, Luyi Lin, Guoyang Ma, Fei Zheng, Fandong Meng, Jie Zhou and Wenjuan Han. A Law Reasoning Benchmark for LLM with Tree-Organized Structures including Factum Probandum, Evidence and Experiences. In Findings of ACL 2025.
- Chenze Shao, Fandong Meng and Jie Zhou. Continuous Visual Autoregressive Generation via Score Maximization. In Proceedings of ICML 2025.
- Chenze Shao, Fandong Meng and Jie Zhou. Patch-Level Training for Large Language Models. In Proceedings of ICLR 2025. [arxiv]
- Yuxian Gu, Hao Zhou, Fandong Meng, Jie Zhou and Minlie Huang. MiniPLM: Knowledge Distillation for Pre-Training Language Models. In Proceedings of ICLR 2025. [arxiv]
- Yutong Wang, Jiali Zeng, Xuebo Liu, Derek F. Wong, Fandong Meng, Jie Zhou and Min Zhang. DelTA: An Online Document-Level Translation Agent Based on Multi-Level Memory. In Proceedings of ICLR 2025. [arxiv]
- Yucheng Ding, Yangwenjian Tan, Xiangyu Liu, Chaoyue Niu, Fandong Meng, Jie Zhou, Ning Liu, Fan Wu and Guihai Chen. Personalized Language Model Learning on Text Data Without User Identifiers. In Proceedings of KDD 2025.
- Xue Zhang, Yunlong Liang, Fandong Meng, Songming Zhang, Yufeng Chen, Jinan Xu and Jie Zhou. Multilingual Knowledge Editing with Language-Agnostic Factual Neurons. In Proceedings of COLING 2025. [arxiv]
Professional Services
- Action Editor: ACL Rolling Review (2024)
- Standing Reviewer: TACL (2021-)
- Meta-Reviewer: NeurIPS 2024 (Area Chair); AAAI 2022 (Senior PC Member)
- PC Member & Reviewer: NeurIPS (2023); ACL (2014, 2017-2022); EMNLP (2018, 2020-2022); NAACL (2016, 2018, 2021); AAAI (2020, 2021); IJCAI (2019); COLING (2018)
Honours
- Best Long Paper Awards of ACL 2019 and EMNLP 2023.
- Champion of ICDAR 2023 Competition on Robust Layout Segmentation in Corporate Documents.
- Champion of WMT22 Chat Translation Task on English->German and German->English.
- Champion of WMT22 Biomedical Translation Task on Chinese->English.
- Champion of WMT21 News Translation Task on English->Chinese, English->Japanese, Japanese->English, and English->German (among constrained systems).
- Champion of WMT20 News Translation Task on Chinese->English.
- National Scholarship for Graduate Excellence of UCAS in 2015.
- UCAS Outstanding Student Awards in 2013 and 2014.
- Outstanding Winner (one of 14 teams worldwide) of COMAP's Mathematical Contest in Modeling (MCM/ICM) in 2010.
- First Prize Winner of the National Mathematical Contest in Modeling in 2009.