Similar Literature
20 similar articles found (search time: 46 ms)
1.
A new connection between the distribution of component failure times of a coherent system and (adaptive) progressively Type‐II censored order statistics is established. Utilizing this property, we develop inferential procedures when the data consist of all component failures until system failure, in two scenarios: in the case of complete information, we assume that the failed component is also observed, whereas in the case of incomplete information, we have information only about the failure times but not about which components have failed. In the first setting, we show that inferential methods for adaptive progressively Type‐II censored data can be applied directly to the problem. For incomplete information, we face the problem that the corresponding censoring plan is not observed and that the available inferential procedures depend on knowledge of the censoring plan used. To estimate distributional parameters, we propose maximum likelihood estimators, which can be obtained by solving the likelihood equations directly or via an Expectation‐Maximization‐type algorithm. For an exponential distribution, we also discuss a linear estimator of the mean. Moreover, we establish exact distributions for some estimators in the exponential case, which can be used, for example, to construct exact confidence intervals. The results are illustrated by a five‐component bridge system. © 2015 Wiley Periodicals, Inc. Naval Research Logistics 62: 512–530, 2015
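For the exponential case mentioned above, the standard closed-form MLE of the mean under progressive Type‐II censoring can be sketched as follows. This is the textbook total-time-on-test estimator, not necessarily the article's exact procedure, and the function name is illustrative.

```python
def exp_mean_mle(failures, removals):
    """MLE of the exponential mean under progressive Type-II censoring.

    failures: ordered observed failure times x_1 <= ... <= x_m
    removals: R_i, the number of surviving items withdrawn at the i-th failure
    Each failure time is weighted by the (1 + R_i) units known to have
    survived at least that long (total time on test), then divided by m.
    """
    m = len(failures)
    total_time_on_test = sum((1 + r) * x for x, r in zip(failures, removals))
    return total_time_on_test / m
```

With no removals, this reduces to the ordinary complete-sample mean.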

2.
In this article, a mixture of Type‐I censoring and Type‐II progressive censoring schemes, called an adaptive Type‐II progressive censoring scheme, is introduced for life testing or reliability experiments. For this censoring scheme, the effective sample size m is fixed in advance, and the progressive censoring scheme is provided but the number of items progressively removed from the experiment upon failure may change during the experiment. If the experimental time exceeds a prefixed time T but the number of observed failures does not reach m, we terminate the experiment as soon as possible by adjusting the number of items progressively removed from the experiment upon failure. Computational formulae for the expected total test time are provided. Point and interval estimation of the failure rate for exponentially distributed failure times are discussed for this censoring scheme. The various methods are compared using Monte Carlo simulation. © 2009 Wiley Periodicals, Inc. Naval Research Logistics, 2009
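The adaptive rule described above can be illustrated by simulation for exponential lifetimes, exploiting memorylessness: with n_i items on test, the gap to the next failure is exponential with rate n_i·λ. The sketch below is a simplified illustration (names are hypothetical), assuming a feasible removal plan and that the final failure always absorbs all survivors.

```python
import random

def simulate_adaptive_pc2(n, m, plan, lam, t_max, seed=0):
    """Simulate one adaptive progressive Type-II censored sample from an
    Exp(rate=lam) lifetime. Once the clock passes t_max, planned removals
    are cancelled (set to 0) so the remaining failures arrive as fast as
    possible; the last failure removes every surviving item."""
    rng = random.Random(seed)
    alive, t = n, 0.0
    times, applied = [], []
    for i in range(m):
        t += rng.expovariate(alive * lam)   # memoryless gap with alive items
        times.append(t)
        if i == m - 1:
            r = alive - 1                   # final failure: remove all survivors
        elif t <= t_max:
            r = min(plan[i], alive - 1)     # follow the plan while on schedule
        else:
            r = 0                           # past t_max: stop removing items
        applied.append(r)
        alive -= 1 + r
    return times, applied
```

The invariant n = m + sum(applied) always holds, since every item either fails or is removed.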

3.
Mixed censoring is a useful extension of Type I and Type II censoring that combines some advantages of both. This paper proposes a general Bayesian framework for designing a variable acceptance sampling scheme with mixed censoring. A general loss function, which includes the sampling cost, the time‐consuming cost, the salvage value, and the decision loss, is employed to determine the Bayes risk and the corresponding optimal sampling plan. An explicit expression for the Bayes risk is derived. The new model can easily be adapted to create life testing models for different distributions. Specifically, two commonly used distributions, the exponential and the Weibull, are considered with a special decision loss function. We demonstrate that the proposed model is superior to models with Type I or Type II censoring. Numerical examples are reported to illustrate the effectiveness of the proposed method. © 2004 Wiley Periodicals, Inc. Naval Research Logistics, 2004

4.
This article presents new tools and methods for finding optimum step‐stress accelerated life test plans. First, we present an approach to calculate the large‐sample approximate variance of the maximum likelihood estimator of a quantile of the failure time distribution at use conditions from a step‐stress accelerated life test. The approach allows for multistep stress changes and censoring for general log‐location‐scale distributions based on a cumulative exposure model. As an application of this approach, the optimum variance is studied as a function of shape parameter for both Weibull and lognormal distributions. Graphical comparisons among test plans using step‐up, step‐down, and constant‐stress patterns are also presented. The results show that depending on the values of the model parameters and quantile of interest, each of the three test plans can be preferable in terms of optimum variance. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008

5.
For various parameter combinations, the logistic–exponential survival distribution belongs to four common classes of survival distributions: increasing failure rate, decreasing failure rate, bathtub‐shaped failure rate, and upside‐down bathtub‐shaped failure rate. Graphical comparison of this new distribution with other common survival distributions is seen in a plot of the skewness versus the coefficient of variation. The distribution can be used as a survival model or as a device to determine the distribution class from which a particular data set is drawn. As the three‐parameter version is less mathematically tractable, our major results concern the two‐parameter version. Boundaries for the maximum likelihood estimators of the parameters are derived in this article. Also, a fixed‐point method to find the maximum likelihood estimators for complete and censored data sets has been developed. The two‐parameter and the three‐parameter versions of the logistic–exponential distribution are applied to two real‐life data sets. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
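The two-parameter survival function commonly written for this distribution is S(t) = 1/(1 + (e^{λt} − 1)^κ). A minimal sketch (function names illustrative) evaluates it along with a numerical hazard; for κ = 1 the model reduces to an exponential with constant failure rate λ, which serves as a quick check.

```python
import math

def le_survival(t, lam, kappa):
    """Survival function of the two-parameter logistic-exponential
    distribution: S(t) = 1 / (1 + (exp(lam*t) - 1)**kappa)."""
    return 1.0 / (1.0 + (math.exp(lam * t) - 1.0) ** kappa)

def le_hazard(t, lam, kappa, eps=1e-6):
    """Numerical hazard h(t) = -d/dt log S(t), by central differences."""
    s_lo = le_survival(t - eps, lam, kappa)
    s_hi = le_survival(t + eps, lam, kappa)
    return (math.log(s_lo) - math.log(s_hi)) / (2 * eps)
```

Plotting le_hazard over t for different κ is one way to visualize the four failure-rate classes the abstract lists.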

6.
Instead of measuring a Wiener degradation or performance process at predetermined time points to track degradation or performance of a product for estimating its lifetime, we propose to obtain the first‐passage times of the process over certain nonfailure thresholds. Based on only these intermediate data, we obtain the uniformly minimum variance unbiased estimator and uniformly most accurate confidence interval for the mean lifetime. For estimating the lifetime distribution function, we propose a modified maximum likelihood estimator and a new estimator and prove that, by increasing the sample size of the intermediate data, these estimators and the above‐mentioned estimator of the mean lifetime can achieve the same levels of accuracy as the estimators assuming one has failure times. Thus, our method of using only intermediate data is useful for highly reliable products when their failure times are difficult to obtain. Furthermore, we show that the proposed new estimator of the lifetime distribution function is more accurate than the standard and modified maximum likelihood estimators. We also obtain approximate confidence intervals for the lifetime distribution function and its percentiles. Finally, we use light‐emitting diodes as an example to illustrate our method and demonstrate how to validate the Wiener assumption during the testing. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
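The first-passage idea can be sketched numerically. For a Wiener process with drift μ, the mean first-passage time over level c is c/μ, so passages over a nonfailure threshold carry information about the time to reach the failure level. The code below is a crude moment-style illustration of that principle (not the paper's UMVUE), with hypothetical function names and a simple Euler simulation.

```python
import random

def simulate_passage_times(mu, sigma, threshold, n, dt=1e-3, seed=1):
    """Simulate n first-passage times of a Wiener process with drift mu
    and diffusion sigma over a (non-failure) threshold, by Euler steps."""
    rng = random.Random(seed)
    times = []
    for _ in range(n):
        x, t = 0.0, 0.0
        while x < threshold:
            x += mu * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
            t += dt
        times.append(t)
    return times

def mean_lifetime_from_passages(passages, threshold, failure_level):
    """Moment-style estimate of the mean lifetime: the mean first-passage
    time over level c is c/mu, so estimate mu from the observed passages
    and scale up to the failure level."""
    mu_hat = threshold / (sum(passages) / len(passages))
    return failure_level / mu_hat
```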

7.
This paper considers the statistical analysis of masked data in a series system, where the components are assumed to have Marshall‐Olkin Weibull distributions. Based on Type‐I progressive hybrid censored and masked data, we derive the maximum likelihood estimates, approximate confidence intervals, and bootstrap confidence intervals of the unknown parameters. As the maximum likelihood estimates do not exist for small sample sizes, Gibbs sampling is used to obtain the Bayesian estimates, and the Monte Carlo method is employed to construct credible intervals based on the Jeffreys prior with partial information. Numerical simulations are performed to compare the performances of the proposed methods, and one data set is analyzed.

8.
By running life tests at higher stress levels than normal operating conditions, accelerated life testing (ALT) quickly yields information on the lifetime distribution of a test unit. The lifetime at the design stress is then estimated through extrapolation using a regression model. In constant‐stress testing, a unit is tested at a fixed stress level until failure or the termination time point of the test, whereas step‐stress testing allows the experimenter to gradually increase the stress levels at prefixed time points during the test. In this work, the optimal k‐level constant‐stress and step‐stress ALTs are compared for exponential failure data under complete sampling and Type‐I censoring. The objective is to quantify the advantage of using step‐stress testing relative to constant‐stress testing. Assuming a log‐linear life–stress relationship with the cumulative exposure model for the effect of changing stress in step‐stress testing, the optimal design points are determined under C/D/A‐optimality criteria. The efficiency of step‐stress testing relative to constant‐stress testing is then discussed in terms of the ratio of optimal objective functions based on the information matrix. © 2013 Wiley Periodicals, Inc. Naval Research Logistics 00: 000–000, 2013
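For exponential lifetimes, the cumulative exposure model mentioned above takes a particularly simple form: exposure accrues at rate 1/θ_i on each stress level and the accumulated exposures add before exponentiating. A minimal two-step sketch (function name illustrative):

```python
import math

def cem_cdf_exponential(t, tau, theta1, theta2):
    """CDF of a two-step step-stress exponential lifetime under the
    cumulative exposure model, with stress change at time tau and mean
    lifetimes theta1 (before) and theta2 (after)."""
    if t <= tau:
        exposure = t / theta1
    else:
        exposure = tau / theta1 + (t - tau) / theta2
    return 1.0 - math.exp(-exposure)
```

By construction, the CDF is continuous at the stress-change time tau even though the hazard jumps there.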

9.
The Signal‐to‐Interference‐plus‐Noise Ratio (SINR) is an important metric of wireless communication link quality. SINR estimates have several important applications. These include optimizing the transmit power level for a target quality of service, assisting with handoff decisions and dynamically adapting the data rate for wireless Internet applications. Accurate SINR estimation provides for both a more efficient system and a higher user‐perceived quality of service. In this paper, we develop new SINR estimators and compare their mean squared error (MSE) performance. We show that our new estimators dominate estimators that have previously appeared in the literature with respect to MSE. The sequence of transmitted bits in wireless communication systems consists of both pilot bits (which are known both to the transmitter and receiver) and user bits (which are known only by the transmitter). The SINR estimators we consider alternatively depend exclusively on pilot bits, exclusively on user bits, or simultaneously use both pilot and user bits. In addition, we consider estimators that utilize smoothing and feedback mechanisms. Smoothed estimators are motivated by the fact that the interference component of the SINR changes relatively slowly with time, typically with the addition or departure of a user to the system. Feedback estimators are motivated by the fact that receivers typically decode bits correctly with a very high probability, and therefore user bits can be thought of as quasipilot bits. For each estimator discussed, we derive an exact or approximate formula for its MSE. Satterthwaite approximations, noncentral F distributions (singly and doubly) and distribution theory of quadratic forms are the key statistical tools used in developing the MSE formulas. In the case of approximate MSE formulas, we validate their accuracy using simulation techniques. 
The approximate MSE formulas, of interest in their own right for comparing the quality of the estimators, are also used for optimally combining estimators. In particular, we derive optimal weights for linearly combining an estimator based on pilot bits with an estimator based on user bits. The optimal weights depend on the MSE of the two estimators being combined, and thus the accurate approximate MSE formulas can conveniently be used. The optimal weights also depend on the unknown SINR, and therefore need to be estimated in order to construct a useable combined estimator. The impact on the MSE of the combined estimator due to estimating the weights is examined. © 2004 Wiley Periodicals, Inc. Naval Research Logistics, 2004
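A generic version of the linear-combination idea can be sketched as follows, assuming two unbiased, uncorrelated estimates weighted inversely to their MSEs; the paper's actual weights also depend on the unknown SINR and its derived MSE formulas, which this sketch does not reproduce.

```python
def combine_estimates(est_pilot, mse_pilot, est_user, mse_user):
    """MSE-optimal linear combination of two (assumed unbiased,
    uncorrelated) estimates: each is weighted inversely to its MSE,
    so the lower-MSE estimate receives the larger weight."""
    w = mse_user / (mse_pilot + mse_user)   # weight on the pilot-based estimate
    return w * est_pilot + (1.0 - w) * est_user
```

Under these assumptions the combined MSE is m1·m2/(m1 + m2), which never exceeds the smaller of the two input MSEs.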

10.
Lifetime experiments are common in many research areas and industrial applications. Recently, process monitoring for lifetime observations has received increasing attention. However, some existing methods are inadequate, as neither their in‐control (IC) nor out‐of‐control (OC) performance is satisfactory. In addition, the challenges associated with designing robust and flexible control schemes have yet to be fully addressed. To overcome these limitations, this article utilizes a newly developed weighted likelihood ratio test and proposes a novel monitoring strategy that automatically combines the likelihood of past samples with the exponentially weighted sum average scheme. The proposed Censored Observation‐based Weighted‐Likelihood (COWL) control chart gives desirable IC and OC performances and is robust under various scenarios. In addition, a self‐starting control chart is introduced to cope with the problem of insufficient reference samples. Our simulation shows a stronger power in detecting changes in censored lifetime data using our scheme than using other alternatives. A real industrial example based on the breaking strength of carbon fiber also demonstrates the effectiveness of the proposed method. © 2016 Wiley Periodicals, Inc. Naval Research Logistics 63: 631–646, 2017

11.
Based on Type‐II hybrid censored samples, the maximum likelihood estimates of the unknown parameters of the generalized inverted exponential distribution are obtained. Asymptotic confidence intervals for the parameters are constructed using the asymptotic normality of the maximum likelihood estimators, and Bayes estimates of the parameters are computed using Lindley's approximation and the Tierney–Kadane approximation. Finally, the results of these estimation methods are compared through a Monte Carlo simulation study.

12.
Log‐normal and Weibull distributions are the most popular distributions for modeling skewed data. In this paper, we consider the ratio of the maximized likelihoods in choosing between the two distributions. The asymptotic distribution of the logarithm of the maximized likelihood ratio is obtained and is observed to be independent of the unknown parameters. This asymptotic distribution is used to determine the minimum sample size required to discriminate between the two families of distributions for a user‐specified probability of correct selection. We perform numerical experiments to observe how the asymptotic methods work for different sample sizes, and find that the asymptotic results work quite well even for small samples. Two real data sets are analyzed. © 2004 Wiley Periodicals, Inc. Naval Research Logistics, 2004
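The ratio-of-maximized-likelihoods statistic can be sketched directly: fit each family by maximum likelihood and compare the maximized log-likelihoods. The lognormal MLE is closed form; the Weibull shape solves a profile score equation, handled here by bisection (a simple, not fastest, choice). Function names are illustrative.

```python
import math

def lognormal_loglik(x):
    """Maximized lognormal log-likelihood (MLE plugs in the mean and
    the /n variance of the log data)."""
    n = len(x)
    logs = [math.log(v) for v in x]
    mu = sum(logs) / n
    s2 = sum((l - mu) ** 2 for l in logs) / n
    return -n / 2 * math.log(2 * math.pi * s2) - sum(logs) - n / 2

def weibull_loglik(x):
    """Maximized Weibull log-likelihood; the shape k is found by
    bisection on the profile score equation."""
    n = len(x)
    logs = [math.log(v) for v in x]
    mean_log = sum(logs) / n

    def score(k):  # increasing in k; root is the shape MLE
        sk = sum(v ** k for v in x)
        skl = sum(v ** k * math.log(v) for v in x)
        return skl / sk - 1.0 / k - mean_log

    lo, hi = 1e-3, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / n) ** (1.0 / k)  # scale MLE given k
    return sum(math.log(k) - k * math.log(lam) + (k - 1) * l for l in logs) - n

def prefer_lognormal(x):
    """Sign of the log-RML statistic: True if the lognormal fits better."""
    return lognormal_loglik(x) > weibull_loglik(x)
```

Since both families are closed under scaling, the log-RML statistic is scale invariant, which gives a handy numerical sanity check.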

13.
In system reliability analysis, for an n‐component system, the estimation of the performance of the components in the system is not straightforward in practice, especially when the components are dependent. Here, by assuming the n components in the system to be identically distributed with a common distribution belonging to a scale‐family and the dependence structure between the components being known, we discuss the estimation of the lifetime distributions of the components in the system based on the lifetimes of systems with the same structure. We develop a general framework for inference on the scale parameter of the component lifetime distribution. Specifically, the method of moments estimator (MME) and the maximum likelihood estimator (MLE) are derived for the scale parameter, and the conditions for the existence of the MLE are also discussed. The asymptotic confidence intervals for the scale parameter are also developed based on the MME and the MLE. General simulation procedures for the system lifetime under this model are described. Finally, some examples of two‐ and three‐component systems are presented to illustrate all the inferential procedures developed here. © 2012 Wiley Periodicals, Inc. Naval Research Logistics, 2012

14.
The maximum likelihood estimator of the service distribution function of an M/G/∞ service system is obtained based on output time observations. This estimator is useful when observation of the service time of each customer could introduce bias or may be impossible. The maximum likelihood estimator is compared to the estimator proposed by Mark Brown [2]. Relative to each other, Brown's estimator is useful in light traffic while the maximum likelihood estimator is applicable in heavy traffic. Both estimators are compared to the empirical distribution function based on a sample of service times and are found to have drawbacks, although each may have applications in special circumstances.

15.
One important thrust in the reliability literature is the development of statistical procedures under various "restricted family" model assumptions, such as the increasing failure rate (IFR) and decreasing failure rate (DFR) distributions. However, relatively little work has been done on the problem of testing fit to such families as a null hypothesis. Barlow and Campo proposed graphical methods for assessing goodness of fit to the IFR model in single-sample problems. For the same problem with complete data, Tenga and Santner studied several analytic tests of the null hypothesis that the common underlying distribution is IFR versus the alternative that it is not. This article considers the same problem for four types of censored data: (i) Type I (time) censoring, (ii) Type II (order statistic) censoring, (iii) a hybrid of Type I and Type II censoring, and (iv) random censorship. The least favorable distributions of several intuitive test statistics are derived for each of the four types of censoring so that valid small-sample-size α tests can be constructed from them. Properties of these tests are investigated.

16.
Accelerated life testing (ALT) is widely used to determine the failure time distribution of a product and the associated life‐stress relationship in order to predict the product's reliability under normal operating conditions. Many types of stress loadings such as constant‐stress, step‐stress and cyclic‐stress can be utilized when conducting ALT. Extensive research has been conducted on the analysis of ALT data obtained under a specified stress loading. However, the equivalency of ALT experiments involving different stress loadings has not been investigated. In this article, a log‐location‐scale distribution under Type I censoring is considered in planning ALT. An idea is provided for the equivalency of various ALT plans involving different stress loadings. Based on this idea, general equivalent ALT plans and some special types of equivalent ALT plans are explored. For demonstration, a constant‐stress ALT and a ramp‐stress ALT for miniature lamps are presented and their equivalency is investigated. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010

17.
Modern technology is producing highly reliable products. Life testing for such products under normal use conditions takes a long time to yield a reasonable number of failures. In this situation, a step‐stress procedure is preferred for accelerated life testing. In this paper we assume Weibull and lognormal models whose scale parameter depends on the current stress level as well as the age at entry into that level. On this basis we propose a parametric model for the life distribution under step‐stress testing and suggest a suitable design to estimate the parameters involved in the model. A simulation study based on maximum likelihood estimation has been carried out for the proposed model. © 2003 Wiley Periodicals, Inc. Naval Research Logistics, 2003

18.
In this article, we study the Shewhart chart of Q statistics proposed for the detection of process mean shifts in start‐up processes and short runs. Exact expressions for the run‐length distribution of this chart are derived and evaluated using an efficient computational procedure. The procedure can be considerably faster than using direct simulation. We extend our work to analyze the practice of requiring multiple signals from the chart before responding, a practice sometimes followed with Shewhart charts. The results show that waiting to receive multiple signals severely reduces the probability of quickly detecting shifts in certain cases, and therefore may be considered a risky practice. Operational guidelines for practitioners implementing the chart are discussed. © 2009 Wiley Periodicals, Inc. Naval Research Logistics, 2009

19.
In progressive censoring, items are removed at certain times during the life test. Commonly, it is assumed that the removed items are used for further testing. In order to incorporate information about this additional testing into inferential procedures, we propose a two‐step model of stage life testing with one fixed stage‐change time, which uses information about both the removed items (further tested under different conditions) and those remaining in the current life test. We show that some marginal distributions in our model correspond either to progressive censoring with a fixed censoring time or to a simple step‐stress model. Furthermore, assuming a cumulative exposure model, we establish exact inferential results for the distribution parameters when the lifetimes are exponentially distributed. An extension to Weibull distributed lifetimes is also discussed.

20.
We consider the multitasking scheduling problem on unrelated parallel machines to minimize the total weighted completion time. In this problem, each machine processes a set of jobs, while the processing of a selected job on a machine may be interrupted by other available jobs scheduled on the same machine but unfinished. To solve this problem, we propose an exact branch‐and‐price algorithm, where the master problem at each search node is solved by a novel column generation scheme, called in‐out column generation, to maintain the stability of the dual variables. We use a greedy heuristic to obtain a set of initial columns to start the in‐out column generation, and a hybrid strategy combining a genetic algorithm and an exact dynamic programming algorithm to solve the pricing subproblems approximately and exactly, respectively. Using randomly generated data, we conduct numerical studies to evaluate the performance of the proposed solution approach. We also examine the effects of multitasking on the scheduling outcomes, which can help the decision maker justify investments to adopt or avoid multitasking.
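The abstract does not spell out its greedy heuristic. As an illustrative building block for total-weighted-completion-time objectives, the classical WSPT rule (sequence by nonincreasing weight-to-processing-time ratio, which is optimal for a single machine without multitasking) is sketched below; it is offered only as the kind of greedy rule that could seed initial columns, not as the paper's method.

```python
def wspt_schedule(jobs):
    """Classical WSPT rule for 1 || sum w_j C_j: sort jobs by
    nonincreasing w/p, then accumulate weighted completion times.
    jobs: list of (processing_time, weight) pairs."""
    order = sorted(jobs, key=lambda job: job[1] / job[0], reverse=True)
    t, total = 0.0, 0.0
    for p, w in order:
        t += p            # completion time of this job
        total += w * t    # its contribution to the objective
    return order, total
```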
