Published multiple papers in leading venues, including the conference ICML (International Conference on Machine Learning) and the journal IEEE Transactions on Information Theory. Representative papers include: "Exactly Tight Information-theoretic Generalization Bounds via Binary Jensen-Shannon Divergence," "How Does Distribution Matching Help Domain Generalization: An Information-theoretic Analysis," and "Towards Generalization beyond Pointwise Learning: A Unified Information-theoretic Perspective."
Research Experience
Postdoctoral Scholar at the School of Biomedical Informatics, The Ohio State University.
Education
Received the Ph.D. degree from the School of Computer Science and Technology, Xi'an Jiaotong University, in September 2024, advised by Prof. Chen Li and Prof. Tieliang Gong; previously obtained the B.E. degree in Computer Science and Technology from Xi'an Jiaotong University.
Background
Research interests: machine learning and statistical learning theory. Recent work focuses on information-theoretic generalization analysis and robust learning in supervised learning, contrastive learning, and domain generalization. Main research topics include: analyzing the generalization ability of randomized learning algorithms through the lens of information theory; designing effective and robust learning algorithms based on information-theoretic measures and analysis; and developing computationally efficient approximations of information-theoretic quantities.