Strategic capacity decision‐making in a stochastic manufacturing environment using real‐time approximate dynamic programming
Authors: Nikolaos E. Pratikakis, Matthew J. Realff, Jay H. Lee
Institution: School of Chemical and Biomolecular Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332-0100
Abstract: In this study, we illustrate a real-time approximate dynamic programming (RTADP) method for solving multistage capacity decision problems in a stochastic manufacturing environment, using an exemplary three-stage manufacturing system with recycle. The system is a moderate-size queueing network that experiences stochastic variations in demand and product yield. The dynamic capacity decision problem is formulated as a Markov decision process (MDP). The proposed RTADP method starts with a set of heuristics and learns a superior-quality solution by interacting with the stochastic system via simulation. The curse of dimensionality associated with DP methods is alleviated by adopting several notions, including an "evolving set of relevant states," over which the value function table is built and updated, an "adaptive action set" for keeping track of attractive action candidates, and a "nonparametric k-nearest-neighbor averager" for value function approximation. The performance of the learned solution is evaluated against (1) an "ideal" solution derived from a mixed integer programming (MIP) formulation that assumes full knowledge of the future realized values of the stochastic variables, (2) a myopic heuristic solution, and (3) a sample-path-based rolling horizon MIP solution. The policy learned through the RTADP method turned out to be superior to policies (2) and (3). © 2010 Wiley Periodicals, Inc. Naval Research Logistics 2010
Keywords: queueing networks; approximate dynamic programming; real-time dynamic programming; capacity planning
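The abstract names the main approximation ideas but not their implementation. The following is a minimal, hypothetical Python sketch of how an evolving table of "relevant states" with a k-nearest-neighbor value-function averager and a one-step real-time ADP update might fit together; the state encoding and the `simulate` and `cost` hooks are placeholder assumptions, not details drawn from the paper.

```python
# Minimal sketch (not the authors' implementation) of a k-nearest-neighbor
# value-function averager over an evolving set of visited ("relevant") states,
# plus a one-step RTADP-style greedy update. States are assumed to be
# fixed-length numeric tuples; simulate() and cost() are hypothetical hooks
# into a stochastic manufacturing simulator.
import math

class KNNValueFunction:
    """Value estimates stored only at visited states; queries at unvisited
    states are answered by averaging the k nearest stored states."""

    def __init__(self, k=5):
        self.k = k
        self.table = {}  # state tuple -> value estimate

    def estimate(self, state):
        if state in self.table:
            return self.table[state]
        if not self.table:
            return 0.0
        # Euclidean distance on the raw state vector (an assumption).
        neighbors = sorted(self.table.items(),
                           key=lambda kv: math.dist(kv[0], state))[: self.k]
        return sum(v for _, v in neighbors) / len(neighbors)

    def update(self, state, target, step=0.1):
        # First visit adds the state to the evolving relevant set; later
        # visits move its value toward the sampled Bellman target.
        old = self.table.get(state, self.estimate(state))
        self.table[state] = old + step * (target - old)


def rtadp_step(vf, state, actions, simulate, cost, gamma=0.95):
    """One real-time iteration: sample a one-step cost-to-go for each
    candidate action, update the value table at the current state, and
    return the greedy action plus a simulated successor state."""
    best_action, best_q = None, float("inf")
    for a in actions:
        next_state = simulate(state, a)  # one stochastic sample
        q = cost(state, a) + gamma * vf.estimate(next_state)
        if q < best_q:
            best_action, best_q = a, q
    vf.update(state, best_q)
    return best_action, simulate(state, best_action)
```

In this sketch the `actions` argument stands in for the abstract's "adaptive action set"; maintaining and pruning that set, and warm-starting from the initial heuristics, would sit around this loop and are not shown.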