Research work
I am currently an Assistant Professor at The Chinese University of Hong Kong, Shenzhen (CUHK-SZ).
My main research work consists in employing random matrix theory to predict the performance of machine learning algorithms such as ridge regression, robust regression, kernel methods, ELM, transfer learning, and softmax classification. Random matrix theory becomes essential when one studies high-dimensional data (such as images or sounds). My personal contribution is to start from a concentration of measure hypothesis on the data, in order to avoid Gaussian or strong independence assumptions. The concentration of measure hypothesis is a flexible property satisfied by a wide range of random vectors, possibly with complex dependencies between their entries (like images or sounds), and it allows us to obtain precise convergence results on the performance. Moreover, the validity of this assumption has been justified for artificial images produced by generative neural networks, since these are Lipschitz transformations of a Gaussian vector. Our practical experiments suggest that this hypothesis also holds for most real data.
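For concreteness, a standard way to state this property (the notation here is illustrative; the exact definitions used in my papers may differ slightly) is the following: a random vector $X \in \mathbb{R}^p$ is said to be exponentially concentrated if, for every $1$-Lipschitz function $f : \mathbb{R}^p \to \mathbb{R}$,

$$\mathbb{P}\big( \left| f(X) - \mathbb{E}[f(X)] \right| \ge t \big) \;\le\; C e^{-c\, t^2}, \qquad \forall\, t > 0,$$

for constants $C, c > 0$ independent of the dimension $p$. In particular, if $Z \sim \mathcal{N}(0, I_d)$ and $G : \mathbb{R}^d \to \mathbb{R}^p$ is $\lambda$-Lipschitz (for instance a generative neural network), then $G(Z)$ satisfies the same inequality with $c$ of order $1/\lambda^2$, which is why images produced by generative networks fit the framework.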
A secondary topic concerns a project I led when I was a data scientist at EDF Beijing. The goal is to model static physical systems, such as chillers or heating networks, with transformers. I recruited a PhD student, Wenrang Zhang, who has been working on the project since September 2023.
I am looking for PhD students and postdoctoral researchers to work on random matrices / concentration of measure topics. Please contact me if you are interested in these subjects AND have had a serious look at these publications: [operations and concentration] [MP non iid - in preparation] [Softmax]. I have also published the following [Internship offer] for students with a strong background in mathematics who are interested in machine learning applications.