This paper describes an investigation of machine-learning control for the supervisory control of building active and passive thermal storage inventory. Previous studies show that the utilization of active or passive thermal storage, or both, can yield significant peak cooling load reduction and associated electrical demand and operational cost savings. In this study, a model-free learning control is investigated for the operation of electrically driven chilled water systems in heavy-mass commercial buildings. The reinforcement learning controller learns to operate the building and cooling plant optimally based on the feedback it receives from past control actions. The learning agent interacts with its environment by commanding the global zone temperature setpoints and the TES charging/discharging rate. The controller extracts cues about the environment solely from the reinforcement feedback it receives, which in this study is the monetary cost of each control action. No prediction or system model is required. Over time, by exploring the environment, the reinforcement learning controller builds a statistical summary of plant operation, which is continuously updated as operation continues. The presented analysis reveals that learning control is a feasible methodology for finding a near-optimal control strategy to exploit the active and passive building thermal storage capacity; it also shows that learning performance is affected by the dimensionality of the action and state spaces, the learning rate, and several other factors. Moreover, learning speed proved to be relatively low for tasks with large state and action spaces.
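The model-free controller the abstract describes — an agent that commands zone setpoints and a TES charge/discharge rate, receives the monetary cost of each action as feedback, and maintains a continuously updated statistical summary of plant operation — maps naturally onto tabular Q-learning. The following is a minimal illustrative sketch only; the discretizations, learning rate, and exploration scheme are assumptions for demonstration and are not taken from the paper itself.

```python
import random

# Hypothetical discretizations (illustrative only; the paper's actual
# state/action encoding is not given in the abstract).
SETPOINTS = [20.0, 22.0, 24.0]      # global zone temperature setpoints [deg C]
TES_RATES = [-1.0, 0.0, 1.0]        # TES discharge (-) / idle / charge (+)
ACTIONS = [(sp, r) for sp in SETPOINTS for r in TES_RATES]

ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1   # learning rate, discount, exploration

def q_update(Q, state, action, cost, next_state):
    """One tabular Q-learning step.

    The reward is the negative monetary cost of the action, so maximizing
    the Q-value minimizes accumulated operating cost.
    """
    best_next = max(Q.get((next_state, a), 0.0) for a in ACTIONS)
    old = Q.get((state, action), 0.0)
    Q[(state, action)] = old + ALPHA * (-cost + GAMMA * best_next - old)

def choose_action(Q, state, rng=random):
    """Epsilon-greedy selection over setpoint/TES-rate pairs: explore the
    environment occasionally, otherwise exploit the learned cost summary."""
    if rng.random() < EPSILON:
        return rng.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))
```

The Q-table here plays the role of the "statistical summary of plant operation": each entry estimates the discounted cost of taking a setpoint/TES action in a given state, and the epsilon parameter governs the exploration the abstract mentions. The slow learning the authors report for large state/action spaces follows directly from the table's size growing with the product of both discretizations.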
ASME 2005 International Solar Energy Conference
August 6–12, 2005
Orlando, Florida, USA
Conference Sponsors:
- Solar Energy Division
ISBN:
0-7918-4737-3
PROCEEDINGS PAPER
Evaluation of Reinforcement Learning for Optimal Control of Building Active and Passive Thermal Storage Inventory
Simeng Liu
University of Nebraska at Lincoln, Omaha, NE
Gregor P. Henze
University of Nebraska at Lincoln, Omaha, NE
Paper No:
ISEC2005-76085, pp. 301-311; 11 pages
Published Online:
October 15, 2008
Citation
Liu, S, & Henze, GP. "Evaluation of Reinforcement Learning for Optimal Control of Building Active and Passive Thermal Storage Inventory." Proceedings of the ASME 2005 International Solar Energy Conference. Solar Energy. Orlando, Florida, USA. August 6–12, 2005. pp. 301-311. ASME. https://doi.org/10.1115/ISEC2005-76085