
bat365 Online Platform and Institute 2020 Academic Lecture Series (No. 306): Ke Deng, Associate Professor, Tsinghua University

Posted: 2020-12-04

Title: Partition-Mallows Model and Its Inference for Rank Aggregation

Speaker: Ke Deng, Associate Professor, Tsinghua University

Time: December 8, 2020, 10:10-11:10

Venue: Tencent Meeting, ID: 869 870 123, Password: 654321

Campus contact: Fukang Zhu, fzhu@jlu.edu.cn


Abstract: Learning how to rank and how to aggregate ranking lists has been an active research area for many years, and its advances have played a vital role in applications ranging from bioinformatics to internet commerce. The problem of discerning the reliability of rankers based only on rank data is of great interest to many practitioners, but has received less attention from researchers. By dividing the ranked entities into two disjoint groups, i.e., the relevant ones and the irrelevant/background ones, and incorporating the Mallows model for the relative ranking of relevant entities, we propose a novel framework for rank aggregation that can not only distinguish quality differences among rankers, but also provide detailed ranking information for relevant entities. Theoretical properties of the proposed approach are established, and its advantages over existing approaches are demonstrated via simulation studies and real-data applications. Extensions of our method to handle partial ranking lists and to conduct covariate-assisted rank aggregation are also discussed.
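For readers unfamiliar with the Mallows model used above for the relevant entities, here is a minimal sketch in Python (not the authors' code): under Mallows(pi0, phi), a ranking pi has probability proportional to exp(-phi * d(pi, pi0)), where d is the Kendall tau distance and phi > 0 measures ranker quality. The function names are illustrative, and the relevant/background partition and per-ranker quality parameters of the full Partition-Mallows framework are omitted.

```python
import math
from itertools import combinations

def kendall_tau_distance(pi, pi0):
    """Number of discordant pairs between two rankings of the same items."""
    pos = {item: r for r, item in enumerate(pi0)}
    seq = [pos[item] for item in pi]
    # Count inversions: pairs ordered oppositely in pi and pi0.
    return sum(1 for i, j in combinations(range(len(seq)), 2) if seq[i] > seq[j])

def mallows_log_likelihood(pi, pi0, phi):
    """Log-likelihood of pi under Mallows(pi0, phi) with Kendall distance.

    The normalizing constant has the closed form
    Z(phi) = prod_{j=1..n} (1 - e^{-j*phi}) / (1 - e^{-phi}), phi > 0.
    """
    n = len(pi0)
    log_z = sum(math.log((1 - math.exp(-j * phi)) / (1 - math.exp(-phi)))
                for j in range(2, n + 1))
    return -phi * kendall_tau_distance(pi, pi0) - log_z

# A ranking close to the central ranking pi0 receives higher likelihood,
# which is what lets the framework grade ranker quality via phi.
pi0 = ["A", "B", "C", "D"]
print(mallows_log_likelihood(["A", "B", "D", "C"], pi0, phi=1.0))  # 1 discordant pair
print(mallows_log_likelihood(["D", "C", "B", "A"], pi0, phi=1.0))  # 6 discordant pairs
```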


Speaker bio: Ke Deng is Associate Professor of Statistics at Tsinghua University. He received his B.Sc. in Applied Mathematics in 2003 and his Ph.D. in Statistics in 2008, both from Peking University. He was a research associate at Harvard University before joining Tsinghua University in 2013. His research interests include Bayesian statistics and computation, bioinformatics, natural language processing, and digital humanities. He is the founding President of the Chinese Association of Statistical Computing, Vice President of the Chinese Association of Artificial Intelligence in Medicine, a Research Fellow of the Beijing Academy of Artificial Intelligence, and an Associate Editor of Statistica Sinica.