Similar Documents
20 similar documents found.
1.
The existing literature on economic design of process control charts generally assumes perfect process adjustment, such that the process mean is returned to an exactly centered “in control” state following any real or false alarm control chart signal. This paper presents a model which demonstrates the effects of imperfect process adjustment on the economically designed control chart parameters. The model demonstrates that the optimal control limit width depends fundamentally on the precision with which the process can be adjusted. The greater the process adjustment error, all else constant, the wider will be the optimal control limits, in order to alleviate the potential for process overcontrol and tampering effects. By endogenously modeling these effects, the new model helps to rectify the problem of poor statistical properties for which the economic design approach has been criticized. © 1999 John Wiley & Sons, Inc. Naval Research Logistics 46: 597–612, 1999

2.
A new approximate method is proposed for the economic design of control charts based on an estimate of the power of the control chart at optimality. Multiple linear regression is employed for the derivation of the approximate formula expressing the power of the control chart as a function of the model parameters. A simple optimization procedure is then used to determine the economic design of the control chart for the predicted value of the chart's detection power. The application of the new approach is illustrated through Duncan's models for variables control charts for processes subject to single and multiple assignable causes. Evaluation of the performance of the approximate method indicates that the approximate control chart design is very close to the exact optimum while its implementation requirements are reduced.

3.
This article provides formulas for estimating the parameters to be used in the basic EOQ lot-size model. The analysis assumes that the true values of these parameters are unknown but confined to known ranges, and perhaps nonstationary over time. Two measures of estimator “goodness” are derived from EOQ sensitivity analysis. Formulas are given for computing the minimax choice and the minimum expected value choice for the parameter estimates using both measures of estimator “goodness”. A numerical example is included.
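As a rough illustration of the kind of sensitivity analysis such estimates rest on (not the article's own formulas), the sketch below uses the classical results TC(Q) = KD/Q + hQ/2 and TC(Q)/TC(Q*) = (Q/Q* + Q*/Q)/2 and grid-searches for a minimax demand estimate when the demand rate is only known to lie in a range; the cost figures and the range are invented.

```python
import numpy as np

def eoq(D, K, h):
    """Classic EOQ lot size for demand rate D, order cost K, holding cost h."""
    return np.sqrt(2.0 * K * D / h)

def cost_ratio(q, q_star):
    """Relative cost of using lot size q when q_star is optimal (always >= 1)."""
    return 0.5 * (q / q_star + q_star / q)

# Demand rate is only known to lie in [D_lo, D_hi]; K and h are assumed known.
K, h = 50.0, 2.0
D_lo, D_hi = 800.0, 1500.0

candidates = np.linspace(D_lo, D_hi, 201)     # candidate demand estimates
true_values = np.linspace(D_lo, D_hi, 201)    # possible true demand rates

# Worst-case cost penalty for each candidate estimate.
worst = [max(cost_ratio(eoq(d_hat, K, h), eoq(d, K, h)) for d in true_values)
         for d_hat in candidates]

d_minimax = candidates[int(np.argmin(worst))]
print(f"minimax demand estimate ~ {d_minimax:.1f}, "
      f"worst-case cost ratio ~ {min(worst):.4f}")
```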

4.
Accelerated degradation testing is widely used for reliability assessment of long-life products such as rubber seals, and the test results obtained at elevated stress levels must be extrapolated to the normal stress level. Accurate reliability estimates require that the degradation failure mechanism under accelerated stress be consistent with that under normal stress. Based on the likelihood ratio test, a method and procedure for judging the consistency of the failure mechanism in accelerated degradation tests are proposed. For the two cases of a consistent failure mechanism and a changed failure mechanism, log-linear and non-log-linear acceleration models are formulated, and a mixed-effects model is used to describe the product degradation process. The likelihood ratio test is then used to determine whether the acceleration-model parameters have changed, which completes the consistency judgment. A simulation example and a practical application show that the method can effectively detect whether the failure mechanism of rubber seals has changed and can identify the stress-level boundary within which the failure mechanism remains unchanged.
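A minimal sketch of the likelihood-ratio idea (not the paper's mixed-effects formulation): assume the log degradation rate is linear in a coded stress variable, fit one log-linear model to all stress levels (consistent mechanism) and a separate model for the highest level (changed mechanism), and compare twice the log-likelihood gap with a chi-square quantile. The data and the degrees-of-freedom count below are illustrative only.

```python
import numpy as np
from scipy import stats

def loglin_loglik(x, y):
    """Max log-likelihood of the log-linear model y = a + b*x + N(0, s^2)."""
    b, a = np.polyfit(x, y, 1)
    resid = y - (a + b * x)
    s2 = np.mean(resid ** 2)                     # MLE of the error variance
    return np.sum(stats.norm.logpdf(resid, scale=np.sqrt(s2)))

# Hypothetical data: log degradation rates y at coded stress levels x.
x = np.array([1.0] * 5 + [2.0] * 5 + [3.0] * 5)
y = np.array([0.90, 1.10, 1.00, 0.95, 1.05,
              1.90, 2.10, 2.00, 1.95, 2.05,
              3.60, 3.40, 3.50, 3.30, 3.70])     # highest stress drifts off the line

# H0: one log-linear model covers all stress levels (consistent mechanism).
ll_null = loglin_loglik(x, y)

# H1: the highest stress level follows its own model (mechanism change).
hi = x == x.max()
ll_alt = loglin_loglik(x[~hi], y[~hi]) + np.sum(
    stats.norm.logpdf(y[hi] - y[hi].mean(), scale=np.std(y[hi])))

lr = 2.0 * (ll_alt - ll_null)
df = 2                                           # extra mean and variance under H1, roughly
print(f"LR = {lr:.2f}, approximate p-value = {stats.chi2.sf(lr, df):.4f}")
```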

5.
SPC (statistical process control), an advanced quality-management method, is widely adopted by enterprises abroad, and many domestic enterprises have now begun to implement it. This paper analyzes the importance of implementing SPC and, based on the main tasks involved in promoting and applying the technique, presents how SPC is applied in the production of thick-film hybrid circuits for missiles, covering the selection of key process steps and process parameters, the collection of process-parameter data, the use of control charts, the evaluation of process capability, and the control techniques adopted.
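The abstract mentions control charts and process-capability evaluation; the snippet below illustrates those two generic calculations (X-bar chart limits from subgroup ranges, and Cp/Cpk), not the procedure used for the thick-film hybrid circuits. The measurements and specification limits are made up.

```python
import numpy as np

# Hypothetical subgrouped measurements of one process parameter (5 per subgroup).
rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=0.05, size=(25, 5))
LSL, USL = 9.85, 10.15                        # made-up specification limits

xbar = data.mean(axis=1)
rbar = (data.max(axis=1) - data.min(axis=1)).mean()

# Shewhart X-bar chart constants for subgroup size n = 5.
A2, d2 = 0.577, 2.326
center = xbar.mean()
ucl, lcl = center + A2 * rbar, center - A2 * rbar

# Process capability from the within-subgroup estimate sigma = R-bar / d2.
sigma = rbar / d2
cp = (USL - LSL) / (6 * sigma)
cpk = min(USL - center, center - LSL) / (3 * sigma)

print(f"X-bar chart: CL={center:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}")
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}")
print("out-of-control subgroups:", np.where((xbar > ucl) | (xbar < lcl))[0])
```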

6.
The traditional approach to economic design of control charts is based on the assumption that a process is monitored using only a performance variable. If, however, the performance variable is costly to measure and a less expensive surrogate variable is available, the process may be more efficiently controlled by using both performance and surrogate variables. In this article we propose a model for economic design of a two-stage control chart which uses a highly correlated surrogate variable together with a performance variable. The process is monitored by the surrogate variable until it signals out-of-control behavior, and then by the performance variable until it either signals out-of-control behavior or maintains in-control signals for a prespecified amount of time, so that the two variables are used in alternating fashion. An algorithm based on the direct search method of Hooke and Jeeves [6] is used to find the optimum values of the design parameters. The proposed model is applied to the end-closure welding process for nuclear fuel to compute the reduction in cost compared with the current control procedure. © 1999 John Wiley & Sons, Inc. Naval Research Logistics 46: 958–977, 1999
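For readers unfamiliar with the Hooke and Jeeves direct search used to optimize the design parameters, here is a bare-bones, general-purpose version of the pattern search (not the authors' implementation), applied to an arbitrary smooth stand-in for a cost surface.

```python
import numpy as np

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    """Minimize f by Hooke-Jeeves pattern search (derivative-free)."""
    def explore(base, s):
        x = base.copy()
        for i in range(len(x)):
            for delta in (+s, -s):
                trial = x.copy()
                trial[i] += delta
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        new = explore(x, step)
        if f(new) < f(x):
            # Pattern moves: keep stepping in the successful direction.
            for _ in range(100):
                candidate = explore(new + (new - x), step)
                if f(candidate) < f(new):
                    x, new = new, candidate
                else:
                    break
            x = new
        else:
            step *= shrink            # no improvement: refine the mesh
            if step < tol:
                break
    return x

# Example: a smooth 2-D surface standing in for an expected-cost function.
cost = lambda v: (v[0] - 3.0) ** 2 + 10.0 * (v[1] - 1.5) ** 2
print(hooke_jeeves(cost, [0.0, 0.0]))     # converges near [3.0, 1.5]
```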

7.
JIT (just-in-time) is widely regarded as an excellent tool for reducing costs and cycle times, and for improving quality in manufacturing operations. JIT follows a multistep procedure. First, it identifies and prioritizes wastes or non-value-adding activities. Second, it forces these wastes to be removed. MRP (materials requirements planning) can identify the same wastes and prioritize them in the same way that JIT does, by using data from the MRP database and master production schedule, and a waste identification model. In this article, a model is developed which describes the process by which the classic JIT system identifies and prioritizes waste. An equivalent MRP waste identification model is then developed for the production environment of the classic JIT system. (The classic JIT system was developed to produce many products having low to medium volumes.) The results developed here can be extended to other production environments where adaptations of the classic JIT system are used. An example, taken from an actual application, is presented to illustrate the models and the equivalence of JIT and MRP as systems for identifying and prioritizing wastes in manufacturing. © 1993 John Wiley & Sons, Inc.

8.
The classical Economic Order Quantity Model requires the parameters of the model to be constant. Some EOQ models allow a single parameter to change with time. We consider EOQ systems in which one or more of the cost or demand parameters will change at some time in the future. The system we examine has two distinct advantages over previous models. One obvious advantage is that a change in any of the costs is likely to affect the demand rate and we allow for this. The second advantage is that often, the times that prices will rise are fairly well known by announcement or previous experience. We present the optimal ordering policy for these inventory systems with anticipated changes and a simple method for computing the optimal policy. For cases where the changes are in the distant future we present a myopic policy that yields costs which are near-optimal. In cases where the changes will occur in the relatively near future the optimal policy is significantly better than the myopic policy.

9.
This paper introduces a special control chart procedure for exponentially distributed product life. Statistical control of product life in manufacturing requires continuing life tests of manufactured product so as to detect changes in product life and take appropriate corrective action. These life testing experiments can become exceedingly time consuming, making them impractical because of serious delays in implementing corrective action on the process when indicated, and quite uneconomical. It is therefore desirable to base the control chart procedure on the observed time to first failure within given samples. Monitoring the observed minimum life provides a considerable reduction in the duration of the testing procedure and in the number of specimens destroyed, yielding a considerable economy over Shewhart's X control chart.
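A hedged illustration of charting the time to first failure: if item lifetimes are exponential with rate λ, the minimum of n lifetimes is exponential with rate nλ, so probability limits follow directly from exponential quantiles. This is only the basic probability-limit construction, not the paper's specific procedure, and the numbers are invented.

```python
import numpy as np

def min_life_limits(lam0, n, alpha=0.0027):
    """Two-sided probability limits for the first failure time among n items.

    If item lifetimes are exponential with rate lam0, the minimum of n i.i.d.
    lifetimes is exponential with rate n * lam0, so the limits are simply its
    alpha/2 and 1 - alpha/2 quantiles (alpha = 0.0027 mimics 3-sigma coverage).
    """
    rate = n * lam0
    return -np.log(1 - alpha / 2) / rate, -np.log(alpha / 2) / rate

lam0, n = 1 / 1000.0, 10        # assumed in-control mean life 1000 h, 10 items on test
lcl, ucl = min_life_limits(lam0, n)
print(f"plot each observed first-failure time against LCL={lcl:.3f} h, UCL={ucl:.1f} h")

# Probability that a single sample signals after the mean life drops to 100 h.
rate_shift = n / 100.0
p_signal = (1 - np.exp(-rate_shift * lcl)) + np.exp(-rate_shift * ucl)
print(f"per-sample detection probability after the shift: {p_signal:.4f}")
```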

10.
This article examines the applicability of acceptance sampling and the effectiveness of Deming's kp rule in relation to the degree of process stability achieved through statistical process control techniques. A discrete-event simulation model is used to characterize the correlation between the number of defective units in a randomly drawn sample versus in the remainder of a lot, in response to a number of system and control chart parameters. The model reveals that such correlation is typically present when special causes of variation affect the production process from time to time, even though the process is tightly monitored through statistical process control. Comparison of these results to an analogous mixed binomial scenario reveals that the mixed binomial model overstates the correlation in question if the state of the process is not necessarily constant during lot production. A generalization of the kp analysis is presented that incorporates the possibility of dependence between a sample and the unsampled portion of the lot. This analysis demonstrates that acceptance sampling is generally ineffective for lots generated by a process subject to statistical process control, despite the fact that the number of defectives in the sample and in the remainder of the lot are not strictly independent. © 1994 John Wiley & Sons, Inc.
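Deming's kp rule itself is a short calculation; the version below is the textbook all-or-none form for a stable process (inspection cost k1 per item, downstream cost k2 per escaped defective), not the article's extension to dependent samples.

```python
def kp_rule(p, k1, k2):
    """Deming's all-or-none inspection rule for a stable process.

    p  : long-run fraction defective of the process
    k1 : cost to inspect one incoming item
    k2 : cost incurred downstream if a defective item is used
    Inspect 100% when p exceeds the breakeven ratio k1/k2; otherwise skip
    inspection (sampling is not worthwhile under these assumptions).
    """
    breakeven = k1 / k2
    return ("100% inspection" if p > breakeven else "no inspection"), breakeven

decision, breakeven = kp_rule(p=0.012, k1=0.50, k2=80.0)
print(f"breakeven fraction defective = {breakeven:.4f} -> {decision}")
# breakeven = 0.0063 here, and 0.012 > 0.0063, so the rule calls for 100% inspection.
```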

11.
A process control scheme is developed in which decisions as to the frequency of sampling are made based upon the choice of an Average Outgoing Quality Limit. The scheme utilizes plotted points on a U-control chart for defects and the theory of runs to determine when to switch among Reduced, Normal, Tightened, and 100 percent inspection. The scheme is formulated as a semi-Markov process to derive steady-state equations for the probabilities of being in Reduced, Normal, Tightened, or 100 percent inspection and for Average Outgoing Quality and Average Fraction Inspected. The resulting system and the computer programs used to derive it are discussed.

12.
This study investigates the statistical process control application for monitoring queue length data in M/G/1 systems. Specifically, we studied the average run length (ARL) characteristics of two different control charts for detecting changes in system utilization. First, the nL chart monitors the sums of successive queue length samples by subgrouping individual observations with sample size n. Next is the individual chart with a warning zone whose control scheme is specified by two pairs of parameters, (upper control limit, du) and (lower control limit, dl), as proposed by Bhat and Rao (Oper Res 20 (1972) 955–966). We will present approaches to calculate ARL for the two types of control charts using the Markov chain formulation and also investigate the effects of parameters of the control charts to provide useful design guidelines for better performance. Extensive numerical results are included for illustration. © 2011 Wiley Periodicals, Inc. Naval Research Logistics, 2011
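The Markov-chain route to ARL can be shown generically: discretize the chart statistic into transient in-control states, build the sub-stochastic transition matrix Q among them, and read the state-dependent ARLs from (I − Q)⁻¹1. The toy below applies this to a one-sided CUSUM for a normal mean (a Brook-and-Evans-style discretization), not to the nL or individual queue-length charts studied in the article.

```python
import numpy as np
from scipy.stats import norm

def cusum_arl(k, h, mu, m=60):
    """ARL of a one-sided CUSUM via a Markov-chain discretization.

    C_t = max(0, C_{t-1} + Z_t - k) with Z_t ~ N(mu, 1); the chart signals when
    C_t >= h.  The interval [0, h) is cut into m transient states and the
    state-dependent ARLs are the row sums of (I - Q)^-1.
    """
    w = 2.0 * h / (2 * m - 1)            # state width; state j stands for C = j*w
    r = np.arange(m) * w                 # representative values of the states
    edges = (np.arange(m) + 0.5) * w     # upper edges of the transient states
    Q = np.zeros((m, m))
    for i in range(m):
        cdf = norm.cdf(edges - r[i] + k, loc=mu)   # P(next C below each edge)
        Q[i, :] = np.diff(np.concatenate(([0.0], cdf)))
    arl = np.linalg.solve(np.eye(m) - Q, np.ones(m))
    return arl[0]                        # chart starts (and restarts) at C = 0

print(f"in-control ARL                 ~ {cusum_arl(k=0.5, h=5.0, mu=0.0):7.1f}")
print(f"ARL after a 1-sigma mean shift ~ {cusum_arl(k=0.5, h=5.0, mu=1.0):7.1f}")
```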

13.
Determination of the gunfire probability of kill against a target requires two parameters to be taken into consideration: the likelihood of hitting the target (susceptibility) and the conditional probability of kill given a hit (vulnerability). Two commonly used methods for calculating the latter probability are (1) treating each hit upon the target independently, and (2) setting an exact number of hits to obtain a target kill. Each of these methods contains an implicit assumption about the probability distribution of the number of hits-to-kill. Method (1) assumes that the most likely kill scenario occurs with exactly one hit, whereas (2) implies that achieving a precise number of hits always results in a kill. These methods can produce significant differences in the predicted gun effectiveness, even if the mean number of hits-to-kill for each distribution is the same. We therefore introduce a new modeling approach with a more general distribution for the number of hits-to-kill. The approach is configurable to various classes of damage mechanism and is able to match both methods (1) and (2) with a suitable choice of parameter. We use this new approach to explore the influence of various damage accumulation models on the predicted effectiveness of weapon-target engagements.
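To make the contrast between methods (1) and (2) concrete, the snippet below compares the kill probability after a given number of hits under an independent-hit model and an exact-m-hits model whose mean numbers of hits-to-kill are matched; the per-hit kill probability and hit counts are invented, and this is not the authors' more general distribution.

```python
def p_kill_independent(n_hits, p_k_given_hit):
    """Method (1): every hit kills independently with the same probability."""
    return 1.0 - (1.0 - p_k_given_hit) ** n_hits

def p_kill_exact(n_hits, m_required):
    """Method (2): the target is killed if and only if it receives >= m hits."""
    return 1.0 if n_hits >= m_required else 0.0

# Match the means: the independent-hit (geometric) hits-to-kill count has mean
# 1/p, so p = 1/m gives the same mean hits-to-kill as the exact-m model.
m = 4
p = 1.0 / m
for n in range(1, 9):
    print(f"{n} hits: independent {p_kill_independent(n, p):.3f}   "
          f"exact-{m} {p_kill_exact(n, m):.0f}")
```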

14.
Currently, both the hardware and software designs of many large computing systems aim at improved system performance through exploitation of parallelism in multiprocessor systems. In studying these systems, mathematical modelling and analysis constitute an important step towards providing design tools that can be used in building such systems. With this view, the present paper describes a queueing model of a multiprocessor system operating in a job-shop environment in which arriving jobs consist of a random number of segments (sub-jobs). Two service disciplines are considered: one assumes that the sub-jobs of a given job are capable of parallel operation on different processors, while the other assumes that the same sub-jobs must be processed in a strictly serial sequence. The results (in particular, the mean number in the system and the waiting time in queue) obtained for these two disciplines are shown to be bounds for more general job structures.

15.
This article gives a full analysis of a component-replacement model in which preventive replacements are only possible at maintenance opportunities. These opportunities arise according to a Poisson process, independently of failures of the component. Conditions for the existence of a unique average optimal control limit policy are established and an equation characterizing the optimal policy and minimal average costs is derived. An important result is that the optimal policy can be described as a so-called one-opportunity-look-ahead policy. Such policies play an important role as heuristics in more general models. It is shown that there is a correspondence with the well-known age-replacement model, which can be considered as an extreme case of the model. Finally, some numerical results are given.
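For reference, the classical age-replacement benchmark mentioned above has long-run cost rate g(T) = [c_p R(T) + c_f F(T)] / ∫₀ᵀ R(t) dt, where T is the preventive replacement age, c_p and c_f are the preventive and failure replacement costs, and R = 1 − F is the survival function. The sketch below minimizes g numerically for an assumed Weibull lifetime; it does not model the Poisson maintenance opportunities that are the article's main feature, and all parameter values are invented.

```python
import numpy as np

def age_replacement_cost_rate(T, cp, cf, shape, scale):
    """Long-run cost per unit time of age replacement at age T (Weibull life)."""
    R = lambda t: np.exp(-(t / scale) ** shape)          # survival function
    ts = np.linspace(0.0, T, 2000)
    # Expected cycle length E[min(X, T)] by the trapezoid rule.
    expected_cycle_length = np.sum((R(ts[:-1]) + R(ts[1:])) * np.diff(ts)) / 2.0
    expected_cycle_cost = cp * R(T) + cf * (1.0 - R(T))
    return expected_cycle_cost / expected_cycle_length

cp, cf, shape, scale = 1.0, 10.0, 2.5, 100.0             # assumed costs and Weibull life
grid = np.linspace(5.0, 200.0, 400)
rates = [age_replacement_cost_rate(T, cp, cf, shape, scale) for T in grid]
T_star = grid[int(np.argmin(rates))]
print(f"optimal preventive replacement age ~ {T_star:.1f}, "
      f"minimal cost rate ~ {min(rates):.4f}")
```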

16.
Conventional control charts are often designed to optimize out-of-control average run length (ARL), while constraining in-control ARL to a desired value. The widely employed grid search approach in statistical process control (SPC) is time-consuming with unsatisfactory accuracy. Although the simulation-based ARL gradient estimators proposed by Fu and Hu [Manag Sci 45 (1999), 395–413] can alleviate this issue, it still requires a large number of simulation runs to significantly reduce the variance of gradient estimators. This article proposes a novel ARL gradient estimation approach based on integral equation for efficient analysis and design of control charts. Although this article compares with the results of Fu and Hu [Manag Sci 45 (1999), 395–413] based on the exponentially weighted moving average (EWMA) control chart, the proposed approach has wide applicability as it can generally fit into any control chart with Markovian property under any distributions. It is shown that the proposed method is able to provide a fast, accurate, and easy-to-implement algorithm for the design and analysis of EWMA charts, as compared to the simulation-based gradient estimation method. Moreover, the proposed gradient estimation method facilitates the computation of high-order derivatives that are valuable in sensitivity analysis. The code is written in Matlab, which is available on request. © 2014 Wiley Periodicals, Inc. Naval Research Logistics 61: 223–237, 2014
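As a rough illustration of the integral-equation machinery (not the authors' gradient estimator), the ARL function of a two-sided EWMA chart satisfies a Fredholm equation that can be solved on a simple midpoint grid; the smoothing constant, limit width, and grid size below are arbitrary choices.

```python
import numpy as np
from scipy.stats import norm

def ewma_arl(lam, L, mu, m=101):
    """ARL of a two-sided EWMA chart by discretizing its ARL integral equation.

    Z_t = (1 - lam) * Z_{t-1} + lam * X_t with X_t ~ N(mu, 1); the chart signals
    when |Z_t| exceeds h = L * sqrt(lam / (2 - lam)) (asymptotic limits).
    ARL(z) = 1 + (1/lam) * Int_{-h}^{h} ARL(y) * phi((y - (1-lam)z)/lam - mu) dy,
    solved here on a midpoint grid (a Nystrom-type approximation).
    """
    h = L * np.sqrt(lam / (2.0 - lam))
    w = 2.0 * h / m
    y = -h + (np.arange(m) + 0.5) * w            # midpoints of the grid cells
    K = np.empty((m, m))
    for i in range(m):
        K[i, :] = norm.pdf((y - (1.0 - lam) * y[i]) / lam, loc=mu) * w / lam
    arl = np.linalg.solve(np.eye(m) - K, np.ones(m))
    return arl[m // 2]                           # chart starts at Z = 0

print(f"in-control ARL            ~ {ewma_arl(lam=0.1, L=2.7, mu=0.0):7.1f}")
print(f"ARL after a 1-sigma shift ~ {ewma_arl(lam=0.1, L=2.7, mu=1.0):7.1f}")
```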

17.
A model is developed taking into consideration all the costs (namely cost of sampling, cost of not detecting a change in the process, cost of a false indication of change, and the cost of readjusting detected changes) incurred when a production process, using an unscheduled setup policy, utilizes fraction-defective control charts to control current production. The model is based on the concept of the expected time between detection of changes calling for setups. It is shown that the combination of unscheduled setups and control charts can be utilized in an optimal way if those combinations of sample size, sampling interval, and extent of control limits from process average are used that provide the minimum expected total cost per unit of time. The costs of a production process that uses unscheduled setups in conjunction with the appropriate optimal control charts are compared to the costs of a production process that uses scheduled setups at optimum intervals in conjunction with its appropriate control charts. This comparison indicates the criteria for selecting production processes with scheduled setups using optimal setup intervals over unscheduled setups. Suggestions are made to evaluate the optimal process setup strategy and the accompanying optimal decision parameters, for any specific cost data, by use of computer enumeration. A numerical example for assumed cost and process data is provided.

18.
Economic screening procedures based on a continuous screening variable X in place of a dichotomous performance variable T are presented. Optimal critical values on the screening variable minimizing the expected cost are obtained for two models: it is assumed that X given T is normally distributed in the normal model, and that P[T = 1|X] is a logistic function of X in the logistic model, with costs incurred by screening inspection and misclassification errors. Cases where some parameters are unknown are also considered.
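A simplified version of the normal-model cutoff search (equal variances, only misclassification costs, made-up parameters), to show what an "optimal critical value on the screening variable" amounts to computationally; it is not the paper's cost model.

```python
import numpy as np
from scipy.stats import norm

# Assumed normal screening model: X | T=1 ~ N(mu1, s), X | T=0 ~ N(mu0, s),
# where T=1 means the item actually conforms.  Items with X >= cutoff are accepted.
mu1, mu0, s = 10.0, 8.5, 0.6
p_conform = 0.95                  # prior probability that an item conforms
c_fa = 5.0                        # cost of accepting a nonconforming item
c_fr = 1.0                        # cost of rejecting a conforming item

def expected_cost(cutoff):
    accept_bad = (1 - p_conform) * (1 - norm.cdf(cutoff, mu0, s))
    reject_good = p_conform * norm.cdf(cutoff, mu1, s)
    return c_fa * accept_bad + c_fr * reject_good

grid = np.linspace(7.0, 11.0, 2001)
costs = [expected_cost(c) for c in grid]
best = grid[int(np.argmin(costs))]
print(f"optimal screening cutoff ~ {best:.3f}, "
      f"expected cost per item ~ {min(costs):.4f}")
```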

19.
The authors extend the generalized von Neumann model they developed (with J. G. Kemeny) in 1956 to an open model by assuming that there are exogenously determined export and import prices and that any amount can be exported or imported at these prices. The open model is then characterized by means of seven axioms. It is shown, by applying the theory of linear programming, that if four economically reasonable assumptions hold, the open model has at least one solution in which at least one good with positive export price is exported and at least one good with positive import price is imported. It is also shown that, in general, a continuum of expansion rates can be achieved by varying certain control variables. The choice of these expansion rates indirectly determines a suitable sub-economy and also determines the exports and imports of the economy. Other results and examples are discussed.

20.
Observer-based compensation for the low-speed characteristics of reaction wheels
Reaction wheels play an important role in modern high-precision satellite attitude control. Because reaction wheels operate at low speed, the nonlinear friction torque that appears as the wheel speed crosses zero significantly degrades attitude-control accuracy and shortens the satellite's operational life. A mathematical model of a DC-motor-driven reaction wheel system is established based on the Dahl friction model; on this basis, a compensation observer is designed to improve the wheel's low-speed performance, and the observer is applied to a three-orthogonal-wheel attitude control system. Digital simulation shows that the method effectively suppresses the disturbance produced by low-speed friction, thereby substantially improving satellite attitude-control accuracy and attitude stability. Finally, the prospect of combining this observer method with variable-structure control is discussed.
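A minimal discrete-time sketch of the Dahl friction model referred to above (exponent 1, Euler integration), not the paper's motor model or compensation observer; all parameter values are invented.

```python
import numpy as np

def simulate_dahl_friction(omega, dt, sigma0=5.0, T_c=0.01):
    """Integrate the Dahl friction-torque model along a wheel-speed profile.

    dT_f/dt = sigma0 * omega * (1 - (T_f / T_c) * sign(omega)),
    so the friction torque T_f approaches +T_c or -T_c depending on the
    direction of rotation and changes smoothly (rather than discontinuously)
    as the speed crosses zero.  Parameters are illustrative, not identified
    from hardware.
    """
    T_f = np.zeros_like(omega)
    for k in range(1, len(omega)):
        dT = sigma0 * omega[k - 1] * (1.0 - (T_f[k - 1] / T_c) * np.sign(omega[k - 1]))
        T_f[k] = T_f[k - 1] + dT * dt
    return T_f

# Wheel speed that slowly crosses zero, where the friction nonlinearity matters most.
dt = 0.001
t = np.arange(0.0, 20.0, dt)
omega = 0.5 * np.sin(0.1 * np.pi * t)      # rad/s, sign change every 10 s
T_f = simulate_dahl_friction(omega, dt)
print(f"friction torque range: {T_f.min():.4f} to {T_f.max():.4f} N*m")
```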
