Thursday, June 12, 2025 2pm to 4pm
About this Event
5200 N Lake Rd, Merced, CA 95343
The Role of Alternating Optimization in Learning Decision Tree Ensembles
Abstract
Decision forests have become an important model class in machine learning, known for their strong predictive performance, robustness, and minimal need for hyper-parameter tuning. However, their training remains largely heuristic: individual trees and entire forests are typically built via greedy partitioning methods without optimizing a clearly defined objective function. This thesis presents a more principled, optimization-driven framework for learning decision forests. Building on the recently proposed Tree Alternating Optimization (TAO) algorithm, we demonstrate how explicitly optimized trees can significantly improve gradient boosting performance. We then extend this idea to the ensemble level, introducing Forest Alternating Optimization (FAO), a method that jointly optimizes all trees in a fixed-size forest, replacing the incremental, greedy nature of traditional boosting. Having demonstrated the superior accuracy of FAO forests, we further explore FAO in the context of decision stump ensembles, which correspond to generalized additive models (GAMs). With targeted regularization, these stump forests achieve state-of-the-art results in learning accurate GAMs.
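To illustrate the contrast the abstract draws, the sketch below shows greedy boosting (trees added one at a time and never revisited) next to a simple alternating scheme that cycles over a fixed-size forest, repeatedly re-fitting each tree to the residual left by all the others. This is only a conceptual sketch, not the thesis's FAO algorithm: it assumes squared-error loss and uses scikit-learn's DecisionTreeRegressor as a stand-in for the explicitly optimized (TAO) trees used in the actual work; the function names greedy_boosting and alternating_forest are illustrative.

    # Conceptual sketch only; assumes squared-error loss and uses
    # DecisionTreeRegressor as a stand-in for TAO-optimized trees.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def greedy_boosting(X, y, n_trees=10, depth=3):
        """Traditional boosting: each tree fits the current residual and is never revisited."""
        trees, residual = [], y.astype(float).copy()
        for _ in range(n_trees):
            t = DecisionTreeRegressor(max_depth=depth).fit(X, residual)
            trees.append(t)
            residual -= t.predict(X)   # remove what this tree already explains
        return trees

    def alternating_forest(X, y, n_trees=10, depth=3, n_passes=5):
        """Alternating optimization over a fixed-size forest: cycle over the trees,
        re-fitting each one to the residual left by all the other trees."""
        trees = greedy_boosting(X, y, n_trees, depth)        # start from a boosted forest
        preds = np.stack([t.predict(X) for t in trees])      # per-tree predictions
        for _ in range(n_passes):
            for i in range(n_trees):
                residual = y - (preds.sum(axis=0) - preds[i])    # leave tree i out
                trees[i] = DecisionTreeRegressor(max_depth=depth).fit(X, residual)
                preds[i] = trees[i].predict(X)                   # refresh its prediction
        return trees

    # Usage: the ensemble prediction is the sum of the individual trees, e.g.
    #   forest = alternating_forest(X_train, y_train)
    #   y_hat = sum(t.predict(X_test) for t in forest)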
Biography
Magzhan Gabidolla is a PhD candidate in the EECS department at UC Merced, advised by Prof. Miguel Á. Carreira-Perpiñán. He holds a BSc degree in Computer Science from Nazarbayev University (Astana, Kazakhstan). His primary research interests are in machine learning. Specifically, his recent work includes learning tree-based models, forests, and generalized additive models, and their application in various domains: supervised learning, clustering, neural network compression, image processing, and model interpretability. His publications have appeared in top venues, including ICML, CVPR, KDD, and EMNLP.