About this Event
5200 N Lake Rd, Merced, CA 95343
https://eecs.ucmerced.edu/seminars
Fall 2021 Electrical Engineering and Computer Science (EECS) Seminar Series
"TAO: Efficient and Universal Algorithm to Train Decision Trees and Tree-Based Models"
University of California, Merced
Faculty Host: Miguel Carreira-Perpiñán
Decision trees and tree-based models are among the oldest and most well-known models in machine learning. In recent years, they have attracted a lot of attention from researchers and practitioners due to increasing interest in model interpretability and efficiency, which are the main advantages of decision trees. Despite this rich history, training such models is still considered a non-trivial task, and little progress has been made since the early 1990s. In this talk, I present a novel algorithm, Tree Alternating Optimization (TAO), which can be applied to efficiently train many types of trees. Although the resulting algorithm is simple and benefits from high parallelism, it rests on solid optimization principles supported by theoretical grounding. Empirically, we show that decision trees (either single trees or ensembles of them) trained with the TAO algorithm can outperform state-of-the-art tree-based models (such as XGBoost and Random Forests) in terms of accuracy, model size, and prediction runtime, often by a large margin.
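The core idea of alternating optimization over tree nodes can be illustrated with a simplified sketch: fix the tree structure, then repeatedly re-optimize one node at a time on the training points that reach it, holding all other nodes fixed. The sketch below is an assumption-laden toy (axis-aligned splits, exhaustive split search, misclassification loss); the actual TAO algorithm is more general and solves a reduced problem at each node more efficiently.

```python
# Illustrative TAO-style alternating optimization for a fixed-structure,
# axis-aligned classification tree. NOT the authors' implementation:
# a minimal sketch assuming axis-aligned splits and 0/1 loss.

class Leaf:
    def __init__(self, label=0):
        self.label = label
    def predict(self, x):
        return self.label

class Node:
    def __init__(self, feature, threshold, left, right):
        self.feature, self.threshold = feature, threshold
        self.left, self.right = left, right
    def predict(self, x):
        child = self.left if x[self.feature] <= self.threshold else self.right
        return child.predict(x)

def reaching_set(root, target, X, y):
    """Training points routed to `target` under the current parameters."""
    out = []
    for x, t in zip(X, y):
        node = root
        while True:
            if node is target:
                out.append((x, t))
                break
            if isinstance(node, Leaf):
                break
            node = node.left if x[node.feature] <= node.threshold else node.right
    return out

def optimize_node(root, node, X, y):
    """Re-optimize one node while every other node stays fixed."""
    pts = reaching_set(root, node, X, y)
    if not pts:
        return
    if isinstance(node, Leaf):
        labels = [t for _, t in pts]
        node.label = max(set(labels), key=labels.count)  # majority class
        return
    # Internal node: exhaustive search over (feature, threshold) splits,
    # scoring each by routing the reaching points through the fixed subtrees.
    best = None
    for f in range(len(X[0])):
        for thr in sorted({x[f] for x, _ in pts}):
            node.feature, node.threshold = f, thr
            err = sum(node.predict(x) != t for x, t in pts)
            if best is None or err < best[0]:
                best = (err, f, thr)
    node.feature, node.threshold = best[1], best[2]

def tao_train(root, X, y, passes=5):
    # Collect nodes and visit leaves first, then internal nodes
    # (one possible schedule; nodes at the same depth are independent).
    nodes, stack = [], [root]
    while stack:
        n = stack.pop()
        nodes.append(n)
        if isinstance(n, Node):
            stack += [n.left, n.right]
    order = sorted(nodes, key=lambda n: isinstance(n, Node))
    for _ in range(passes):
        for n in order:
            optimize_node(root, n, X, y)
    return root

# Toy 2D data: class is determined by whether x0 exceeds 0.5.
X = [(0.1, 0.9), (0.2, 0.1), (0.8, 0.3), (0.9, 0.7), (0.4, 0.5), (0.7, 0.2)]
y = [0, 0, 1, 1, 0, 1]
tree = Node(0, 0.0, Leaf(0), Leaf(1))  # arbitrary initial split
tao_train(tree, X, y)
acc = sum(tree.predict(x) == t for x, t in zip(X, y)) / len(X)
```

Because each node update can only lower (or keep) the tree's training loss on its reaching set, the passes monotonically improve the tree; on this toy data the single split converges to the true boundary.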
Arman Zharmagambetov is a PhD candidate in the EECS department at UC Merced, advised by Prof. Miguel Á. Carreira-Perpiñán. He holds BSc and MSc degrees in Mathematical and Computer Modeling from International Information Technology University (Almaty, Kazakhstan). His primary research interests are in machine learning. Specifically, his recent work includes developing learning algorithms for decision trees and tree-based models, and their applications in various domains such as supervised learning, neural net compression, dimensionality reduction, and model interpretability. Arman is the recipient of several graduate fellowships, and his publications have appeared in top venues including ICML, EMNLP, and ICASSP.