Real-time maintenance decision making in large manufacturing systems is complex because it requires integrating heterogeneous information, including the degradation states of the machines and the inventory levels in the intermediate buffers. In this paper, we model manufacturing systems consisting of multiple machines and intermediate buffers as a discrete-time Markov chain (DTMC) and study real-time maintenance policies. The optimal policy is derived using a Markov decision process (MDP) approach and compared with a baseline policy in which the maintenance decision for each machine depends only on that machine's degradation state. The results show how the structure of the policies is affected by the buffer capacities and the real-time buffer levels.
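To make the MDP formulation concrete, the following is a minimal illustrative sketch, not the paper's actual model: a single machine with a few degradation levels and one downstream buffer, with value iteration used to compute a policy over the joint (degradation, buffer-level) state. All degradation levels, transition probabilities, costs, and the buffer capacity below are hypothetical assumptions chosen for illustration.

```python
import itertools

# Toy version of the setup in the abstract (assumptions, not the paper's
# data): one machine with degradation states 0 (new) .. 2 (failed) and
# one intermediate buffer of capacity 2.
DEG_LEVELS = 3
BUF_CAP = 2
ACTIONS = ("run", "maintain")
GAMMA = 0.95  # discount factor (assumed)

def transitions(state, action):
    """Return [(prob, next_state, reward), ...] for a toy DTMC."""
    deg, buf = state
    if action == "maintain" or deg == DEG_LEVELS - 1:
        # Maintenance (or failure) restores the machine to "new";
        # the buffer drains one unit while the machine is down.
        return [(1.0, (0, max(buf - 1, 0)), -5.0)]
    # Running: the machine may degrade; the buffer fills one unit.
    nxt_buf = min(buf + 1, BUF_CAP)
    return [(0.8, (deg, nxt_buf), 1.0),
            (0.2, (min(deg + 1, DEG_LEVELS - 1), nxt_buf), 1.0)]

def value_iteration(tol=1e-8):
    """Compute the optimal value function and a greedy policy."""
    states = list(itertools.product(range(DEG_LEVELS), range(BUF_CAP + 1)))
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(sum(p * (r + GAMMA * V[s2])
                           for p, s2, r in transitions(s, a))
                       for a in ACTIONS)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            break
    policy = {s: max(ACTIONS,
                     key=lambda a: sum(p * (r + GAMMA * V[s2])
                                       for p, s2, r in transitions(s, a)))
              for s in states}
    return V, policy

V, policy = value_iteration()
```

Because the state includes the buffer level, the resulting policy can prescribe different maintenance decisions for the same degradation level at different buffer levels, which is the structural dependence the paper examines. The baseline policy described in the abstract would instead condition only on the degradation component of the state.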
