About this Event
5200 N Lake Rd, Merced, CA 95343
https://eecs.ucmerced.edu/
Decision trees are among the oldest machine learning models and have received considerable attention from both practitioners and the research community. They are conceptually simple yet powerful. State-of-the-art frameworks such as XGBoost and LightGBM rely on them as base learners, but they are also used as standalone predictors. Despite the rich history of decision trees and the existence of numerous training methods, their applicability beyond traditional supervised learning has been explored only to a limited extent. For instance, several fast-growing ML subfields, such as semi-supervised and self-supervised learning and nonlinear dimensionality reduction, have made little use of trees. What is common to most of these tasks is that the objective function takes a certain form, involving a "manifold regularization" term that exploits the geometry of the underlying data distribution. In this dissertation, we study decision trees and, more generally, tree-based models in this setting. We argue that problems of this type carry great practical importance. Furthermore, using semi-supervised learning and nonlinear dimensionality reduction as examples, we derive a generic algorithm to solve such optimization problems. It is based on a reformulation of the problem that requires iteratively solving two simpler problems. One problem always involves supervised learning of a tree. The second depends on the particular task and can be, for example, solving a linear system or optimizing nonlinear embeddings. We also show that the algorithm is scalable and highly effective on a number of tasks.
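The alternating scheme described in the abstract can be illustrated for the semi-supervised case. The sketch below is a minimal toy illustration, not the dissertation's actual algorithm: it assumes a dense RBF affinity graph with an unnormalized graph Laplacian as the manifold regularizer, scikit-learn's `DecisionTreeRegressor` for the supervised tree step, and a squared-error objective; all names and parameters (`rbf_laplacian`, `semisup_tree`, `lam`, `mu`, the choice of `gamma`, etc.) are hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def rbf_laplacian(X, gamma=1.0):
    # Dense RBF affinity matrix W and unnormalized Laplacian L = D - W.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-gamma * d2)
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def semisup_tree(X, y, labeled, lam=1.0, mu=1.0, iters=10, seed=0):
    """Alternate between the two simpler problems:
    (a) a Laplacian-regularized linear system for pseudo-labels z,
        min_z ||M(z - y)||^2 + lam * z'Lz + mu * ||z - tree(X)||^2;
    (b) supervised fitting of a tree to (X, z)."""
    labeled = np.asarray(labeled)
    n = len(X)
    L = rbf_laplacian(X)
    M = np.zeros((n, n))
    M[labeled, labeled] = 1.0            # mask selecting labeled points
    z = np.zeros(n)
    z[labeled] = y[labeled]              # initialize pseudo-labels
    tree = DecisionTreeRegressor(max_depth=3, random_state=seed).fit(X, z)
    for _ in range(iters):
        # (a) closed-form pseudo-label step: solve a linear system.
        A = M + lam * L + mu * np.eye(n)
        b = M @ y + mu * tree.predict(X)
        z = np.linalg.solve(A, b)
        # (b) supervised tree step on the current pseudo-labels.
        tree = DecisionTreeRegressor(max_depth=3, random_state=seed).fit(X, z)
    return tree
```

On two well-separated clusters with a single labeled point each, the Laplacian term spreads the labels along the data manifold, so the fitted tree assigns higher scores to the cluster containing the positively labeled point.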
Arman Zharmagambetov is a PhD candidate in the EECS department at UC Merced. He holds a BSc and an MSc in Mathematical and Computer Modeling from IITU (Almaty, Kazakhstan). His primary research interests are in machine learning. Specifically, his recent work includes developing learning algorithms for decision trees and tree-based models, and their applications in various domains such as (semi-)supervised learning, neural net compression, dimensionality reduction, and model interpretability. Arman is a recipient of several graduate fellowships, and his publications have appeared in top venues including NeurIPS, ICML, EMNLP, and ICASSP.