Bin Yu

Chancellor's Distinguished Professor and Class of 1936 Second Chair 

Departments of Statistics and Electrical Engineering and Computer Sciences

 UC Berkeley 

Chan-Zuckerberg Biohub Investigator Alumnus • Weill Neurohub Investigator

mail: 367 Evans Hall #3860 • Berkeley, CA 94720

phone: 510-642-2781 • fax: 510-642-7892 • email: binyu@berkeley.edu

Welcome

I'm Bin Yu, head of the Yu Group at Berkeley, which consists of 12-15 students and postdocs from Statistics and EECS. I was formally trained as a statistician, but my research interests and achievements extend beyond the realm of statistics. Together with my group, I have leveraged new computational developments to solve important scientific problems by combining novel, often interpretable, statistical machine learning approaches with the domain expertise of my many collaborators in neuroscience, genomics, and precision medicine. We also develop relevant theory to understand random forests and deep learning, in order to gain insight into practice and provide guidance for it.

We have developed the PCS framework for veridical data science (or responsible, reliable, and transparent data analysis and decision-making). PCS stands for predictability, computability and stability, and it unifies, streamlines, and expands on ideas and best practices of machine learning and statistics.

To augment empirical evidence for decision-making, we are investigating statistical machine learning methods and algorithms (and the associated statistical inference problems) such as dictionary learning, non-negative matrix factorization (NMF), EM, deep learning (CNNs and LSTMs), and heterogeneous treatment effect estimation in randomized experiments (the X-learner). Our recent algorithms include staNMF for unsupervised learning; iterative random forests (iRF) and signed iRF (s-iRF) for discovering predictive and stable high-order interactions in supervised learning; next-generation tree-based methods such as fast interpretable greedy-tree sums (FIGS), hierarchical shrinkage (HS) for trees, and RF+; and contextual decomposition (CD), agglomerative contextual decomposition (ACD), and adaptive wavelet distillation (AWD) for interpreting deep neural networks (DNNs).
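For readers who want to experiment with the tree-based methods above, here is a minimal sketch, assuming the scikit-learn-style FIGSClassifier from the open-source imodels package; the dataset, split, and hyperparameter below are illustrative assumptions, not recommendations from this page.

```python
# Minimal, illustrative sketch: fitting a FIGS (fast interpretable greedy-tree sums)
# classifier via the scikit-learn-compatible `imodels` package.
# The dataset, train/test split, and max_rules value are assumptions for illustration.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

from imodels import FIGSClassifier  # pip install imodels

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# FIGS fits a small sum of shallow trees; max_rules caps the total number of splits,
# trading a little accuracy for interpretability.
model = FIGSClassifier(max_rules=10)
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print("test AUROC:", round(roc_auc_score(y_test, probs), 3))
print(model)  # the fitted model can be printed and inspected as a small sum of trees
```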

My vision for data science - papers & talks

In the news