Thursday, May 1, 2025 2pm to 4pm
About this Event
5200 Lake Rd, Merced, CA 95343
Towards Uncertainty-Aware Model-Based Reinforcement Learning
Model-based Reinforcement Learning (MBRL) has garnered significant attention in artificial intelligence primarily due to its sample efficiency, requiring substantially fewer interactions with the environment than model-free approaches. However, this efficiency comes with an inherent challenge: the learned models of the environment inevitably contain imperfections and uncertainties that can significantly degrade agent performance. As MBRL systems transition from academic research to real-world applications, the need for robust uncertainty quantification becomes increasingly critical. This dissertation advances MBRL by developing novel methodologies for incorporating uncertainty awareness, thereby enhancing both performance and safety in real-world applications.
Zhiyu An is a final-year PhD candidate in the Electrical Engineering and Computer Science (EECS) program at the University of California, Merced. His research focuses on uncertainty-aware model-based reinforcement learning, with methods and applications for energy-efficient control of Heating, Ventilation, and Air-Conditioning (HVAC) systems. He is the first author of multiple publications in top venues including the Design Automation Conference (DAC), the International Conference on Learning Representations (ICLR), the Learning for Dynamics and Control Conference (L4DC), and the ACM International Conference on Energy-Efficient Buildings, Cities, and Transportation (BuildSys). His work was named Best Paper Runner-Up at ACM BuildSys'23. He has been a reviewer for the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), the ACM Conference on Embedded Networked Sensor Systems (SenSys), the IEEE Internet of Things Journal (IoT-J), and the IEEE Transactions on Network Science and Engineering (TNSE).