Let us face reality; let us be true to our ideals. (2020/01/26)

Bingning Wang (王炳宁)

Bingning is currently the head of pre-training at Baichuan Intelligence. He received his Ph.D. from the Institute of Automation, Chinese Academy of Sciences, in 2018, under the supervision of Prof. Jun Zhao and Prof. Kang Liu, with research focused on question answering systems and generative models.

Before joining Baichuan, he held senior research positions at Sogou and Tencent, where he accumulated extensive experience with large-scale generative models. He led the creation and release of several large-scale Chinese QA datasets, including ReCO, ComQA, ChiQA, and T2Ranking. At Tencent, he contributed to the development of the "Shenzhou" series of BERT-based language models, which achieved top rankings on the CLUE leaderboard.

He is the leading force behind the Baichuan series of pre-trained models, which have garnered over 10,000 stars on GitHub and more than 10 million downloads on Hugging Face worldwide. He has published 12 first-author papers at top-tier AI and NLP conferences such as ACL, SIGIR, and AAAI, and received the Best Paper Runner-Up Award at CIKM 2021. His Ph.D. dissertation, "Key Technologies for Machine Reading Comprehension," was recognized as an Excellent Doctoral Dissertation by the Chinese Information Processing Society in 2019. He also serves as an executive member of the Youth Working Committee of the Chinese Information Processing Society.

Research Interests and Thoughts

What's New

Projects

Publications [Google Scholar]

Awards

Service

Contact

Email: god@bingning.wang, daniel@baichuan-inc.com

Miscellaneous