Technology Innovation Forum: Recent Work on Natural Language Generation
2019-09-04
Department of Computer Science and Technology, Nanjing University
Collaborative Innovation Center of Novel Software Technology and Industrialization

Abstract:
Natural language generation is a fundamental technology in many applications such as machine writing, machine translation, and chatbots. In this talk, we will begin with a taxonomy of current deep generative models for text generation, and then introduce our recent work in the different branches. Because the exact density of sentences is intractable over the exponentially large sentence space, state-of-the-art text generation models employ neural networks such as RNNs and Transformers to parameterize the density of text in an auto-regressive fashion. We will first introduce some advanced approaches that factorize this density more effectively. We then turn to variational auto-encoders (VAEs), which approximate the density of sentences with variational inference. Our recent work incorporates syntactic latent variables to improve the quality of text generated by VAEs, and we also propose DGMVAE for interpretable text generation. Finally, unlike previous approaches that maintain an explicit density of sentences, we explore a novel Markov chain Monte Carlo approach called CGMH for constrained text generation, which keeps no explicit sentence density and generates sentences without committing to a left-to-right order. CGMH can also be used to generate fluent adversarial examples of text.
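The variational inference the abstract refers to is typically the standard VAE objective, maximizing the evidence lower bound (ELBO) on the sentence log-density (the generic form, not the talk's specific syntactic-latent-variable model):

```latex
\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right] \;-\; \mathrm{KL}\!\left(q_\phi(z \mid x)\,\|\,p(z)\right)
```

Here $q_\phi(z \mid x)$ is the approximate posterior over the latent code $z$; adding syntax-bearing latent variables, as in the work described above, changes what $z$ encodes but not this objective.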
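The auto-regressive factorization mentioned in the abstract writes the sentence probability as a product of per-token conditionals, p(x) = ∏_t p(x_t | x_<t). A minimal sketch of this idea, using a toy bigram table with made-up probabilities (not the neural models from the talk):

```python
import math

# Toy bigram log-probabilities standing in for a neural conditional
# p(x_t | x_<t); the numbers are illustrative, not from the talk.
bigram_logprob = {
    ("<s>", "the"): math.log(0.6), ("<s>", "a"): math.log(0.4),
    ("the", "cat"): math.log(0.5), ("the", "dog"): math.log(0.5),
    ("a", "cat"): math.log(0.7),   ("a", "dog"): math.log(0.3),
    ("cat", "</s>"): math.log(1.0), ("dog", "</s>"): math.log(1.0),
}

def sentence_logprob(tokens):
    """Auto-regressive factorization: sum log p(x_t | x_{t-1}) over the sentence."""
    total = 0.0
    prev = "<s>"
    for tok in tokens + ["</s>"]:
        total += bigram_logprob[(prev, tok)]
        prev = tok
    return total

print(round(math.exp(sentence_logprob(["the", "cat"])), 2))  # 0.6 * 0.5 * 1.0 = 0.3
```

An RNN or Transformer language model replaces the lookup table with a learned conditional over the full prefix, but the factorization of the density is the same.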
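The CGMH approach described above samples sentences by proposing local word-level edits and accepting them with a Metropolis–Hastings rule, rather than decoding left to right. A hedged toy sketch of one such sampler; the scoring function and replacement-only proposal are stand-ins, not the actual CGMH model:

```python
import random

random.seed(0)

VOCAB = ["the", "a", "cat", "dog", "sat", "ran"]

def score(sentence):
    # Toy unnormalized density: sentences containing "cat" are twice as likely.
    return 2.0 if "cat" in sentence else 1.0

def propose(sentence):
    # Local edit: replace one random position with a random vocabulary word.
    # (CGMH also uses insert and delete proposals; omitted here for brevity.)
    i = random.randrange(len(sentence))
    new = list(sentence)
    new[i] = random.choice(VOCAB)
    return new

def mh_step(sentence):
    # Symmetric proposal, so the acceptance ratio reduces to pi(x') / pi(x).
    candidate = propose(sentence)
    accept_prob = min(1.0, score(candidate) / score(sentence))
    return candidate if random.random() < accept_prob else sentence

state = ["a", "dog", "ran"]
samples = []
for _ in range(2000):
    state = mh_step(state)
    samples.append(state)

# In the stationary distribution, "cat"-containing sentences dominate.
frac_with_cat = sum("cat" in s for s in samples) / len(samples)
print(round(frac_with_cat, 2))
```

Because every state can be edited at any position, the chain explores sentence space without any left-to-right order, which is what lets CGMH impose hard constraints (e.g. required keywords) during generation.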
Speaker Bio:
Dr. Hao Zhou is a researcher at ByteDance AI Lab. His research interests are machine learning and its applications to natural language processing, including syntactic parsing, machine translation, and text generation. He currently focuses on deep generative models for NLP. He received his Ph.D. degree from Nanjing University in 2017. He has served on the program committees of ACL, EMNLP, IJCAI, AAAI, and NIPS, and has more than 20 publications in prestigious conferences and journals, including ACL, EMNLP, NAACL, TACL, AAAI, IJCAI, NIPS, and JAIR. He will give tutorials on deep generative models for text generation at NLPCC 2019 and on the discreteness of neural NLP at EMNLP 2019 (Homepage: https://zhouh.github.io/).
Time: 14:00, Friday, September 6
Venue: Room 230, Computer Science and Technology Building

Copyright: Jiangsu Computer Society