
Associate Senior Faculty

Heng Yu

Date: 2024-03-11


Basic Information

  • Title: Associate Professor

  • Research interests: large language models, representation learning, natural language processing

  • Email: hengyu@bnu.edu.cn

  • Homepage: https://hengyu-nlp.github.io/

Biography

Heng Yu is an Associate Professor at Beijing Normal University. From 2014 to 2023 he worked successively at Samsung R&D Institute Beijing, Sogou, Alibaba, and Shopee on multilingual information processing, and successfully applied these technologies in commercial settings for global e-commerce. He has published more than 20 academic papers and received honors including the ACL Outstanding Paper Award, the Second Prize of the Zhejiang Province Science and Technology Progress Award, and the Alibaba DAMO Award. He joined the faculty in 2024; his main research areas are large language models, representation learning, and natural language processing. Interested faculty members, graduate students, and undergraduates are welcome to get in touch by email or WeChat for academic exchange and collaboration.

Education

  • 2009–2014: Ph.D. in Computer Software and Theory, University of Chinese Academy of Sciences; advisor: Qun Liu

  • 2005–2009: B.S. in Computer Science and Technology, School of Computer Science, Tianjin Normal University

Work Experience

  • 2024.3 – present: Associate Professor, Beijing Normal University

  • 2022.2 – 2024.2: Principal Engineer, Shopee

  • 2017.3 – 2022.2: Senior Algorithm Expert, Alibaba

  • 2016.11 – 2017.3: Researcher, Sogou

  • 2014.7 – 2016.10: Researcher, Samsung R&D Institute China

Research Projects (as PI or Participant)

  • Samsung–Tsinghua University collaborative project: deep-learning-based machine translation for the tourism domain, 2015–2016

  • Alibaba–Soochow University collaborative project: intervention techniques for neural machine translation, 2017–2018

  • Subtopic of an 863 Major Program project: key technologies for multilingual web translation, 2011–2013

  • NSFC General Program: key machine translation technologies for scientific literature, 2009–2011

  • NSFC Key Program: machine translation methods integrating linguistic knowledge with statistical models, 2008–2011

  • 863 Key Project topic: key machine translation technologies for cross-lingual search, 2007–2010

Selected Publications and Patents

[1] Xiangpeng Wei, Heng Yu, Yue Hu, Rongxiang Weng, Weihua Luo, and Rong Jin. Learning to Generalize to More: Continuous Semantic Augmentation for Neural Machine Translation. In Proceedings of ACL 2022. Outstanding Paper Award. (CCF-A)

[2] Rongxiang Weng, Heng Yu, Weihua Luo, and Min Zhang. Deep Fusing Pre-trained Models into Neural Machine Translation. In Proceedings of AAAI 2022. (CCF-A)

[3] Xiangpeng Wei, Rongxiang Weng, Yue Hu, Luxi Xing, Heng Yu, and Weihua Luo. On Learning Universal Representations Across Languages. In Proceedings of ICLR 2021. (Google Scholar h5-index 303)

[4] Xiangpeng Wei, Heng Yu, Yue Hu, Rongxiang Weng, Luxi Xing, and Weihua Luo. Uncertainty-Aware Semantic Augmentation for Neural Machine Translation. In Proceedings of EMNLP 2020. (CCF-B)

[5] Rongxiang Weng, Heng Yu, Xiangpeng Wei, and Weihua Luo. Towards Enhancing Faithfulness for Neural Machine Translation. In Proceedings of EMNLP 2020. (CCF-B)

[6] Yongchao Deng, Hongfei Yu, Heng Yu, Xiangyu Duan, and Weihua Luo. Factorized Transformer for Multi-Domain Neural Machine Translation. In Proceedings of EMNLP 2020. (CCF-B)

[7] Kai Song, Xiaoqing Zhou, Heng Yu, Zhongqiang Huang, Yue Zhang, Weihua Luo, Xiangyu Duan, and Min Zhang. Towards Better Word Alignment in Transformer. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2020. (Impact Factor 4.364)

[8] Xiangpeng Wei, Heng Yu*, Yue Hu, Yue Zhang, Rongxiang Weng, and Weihua Luo. Multiscale Collaborative Deep Models for Neural Machine Translation. In Proceedings of ACL 2020. (CCF-A)

[9] Changfeng Zhu, Heng Yu, Shanbo Cheng, and Weihua Luo. Language-aware Interlingua for Multilingual Neural Machine Translation. In Proceedings of ACL 2020. (CCF-A)

[10] Kai Song, Kun Wang, Heng Yu, Yue Zhang, Zhongqiang Huang, Weihua Luo, Xiangyu Duan, and Min Zhang. Alignment-Enhanced Transformer for Constraining NMT with Pre-Specified Translations. In Proceedings of AAAI 2020. (CCF-A)

[11] Rongxiang Weng, Haoran Wei, Shujian Huang, Heng Yu, Lidong Bing, Weihua Luo, and Jiajun Chen. GRET: Global Representation Enhanced Transformer. In Proceedings of AAAI 2020. (CCF-A)

[12] Rongxiang Weng, Heng Yu, Shujian Huang, Weihua Luo, and Jiajun Chen. Acquiring Knowledge from Pre-trained Model to Neural Machine Translation. In Proceedings of AAAI 2020. (CCF-A)

[13] Long Zhou, Jiajun Zhang, Chengqing Zong, and Heng Yu. Sequence Generation: From Both Sides to the Middle. In Proceedings of IJCAI 2019. (CCF-A)

[14] Kai Song, Yue Zhang, Heng Yu, Weihua Luo, Kun Wang, and Min Zhang. Code-Switching for Enhancing NMT with Pre-Specified Translation. In Proceedings of NAACL 2019.

[15] Chunyang Liu, Yang Liu, Maosong Sun, and Heng Yu. Agreement-based Learning of Parallel Lexicons and Phrases from Non-Parallel Corpora. In Proceedings of ACL 2016. (CCF-A)

[16] Heng Yu and Xuan Zhu. Recurrent Neural Network based Rule Sequence Model for Statistical Machine Translation. In Proceedings of ACL 2015. (CCF-A)

[17] Chunyang Liu, Yang Liu, Huanbo Luan, Maosong Sun, and Heng Yu. Generalized Agreement for Bidirectional Word Alignment. In Proceedings of EMNLP 2015. (CCF-B)

[18] Heng Yu, Liang Huang, Haitao Mi, and Qun Liu. Max-Violation Perceptron and Forced Decoding for Scalable MT Training. In Proceedings of EMNLP 2013. (CCF-B)

[19] Generation Method, Translation Method, and Device for Multilingual Translation Models. Patent CN110874537B, 2018.

[20] Method and Device for Translating Object Information and Acquiring Derivative Information. Patent US10990768B2, 2017.

[21] Object Information Translation and Derivative Information Acquisition Method and Device. Patent CN107273106B, 2016.

[22] Method and System for Constructing a Structured Language Model for Incremental Translation. Patent CN102945231A, 2012.

Awards and Honors

  • ACL Outstanding Paper Award, 2022

  • Alibaba DAMO Academy top award (DAMO Award) for multilingual innovation technology, 2021

  • Second Prize, Zhejiang Province Science and Technology Progress Award, for multilingual processing technologies and platforms for global e-commerce, 2019

  • Samsung R&D Institute China Innovation Award, 2016

Academic and Professional Service

  • Member, Youth Working Committee of the CCF Technical Committee on Chinese Information Technology

  • Reviewer for ACL, NeurIPS, ICLR, EMNLP, AAAI, and IJCAI

Prospective Students

I recruit students to work on large language models with a focus on educational applications, together with related natural language processing techniques. For my philosophy on admissions and advising, please see Prof. Hua Huang's recruitment statement (https://vmcl.bnu.edu.cn/recruit/zpsm/index.htm). You can contact me at hengyu@bnu.edu.cn.

