  Paid full text   205 papers
  Free   41 papers
  2021   2 papers
  2019   8 papers
  2018   3 papers
  2017   5 papers
  2016   13 papers
  2015   14 papers
  2014   9 papers
  2013   59 papers
  2012   9 papers
  2011   7 papers
  2010   4 papers
  2009   7 papers
  2008   9 papers
  2007   12 papers
  2006   8 papers
  2005   12 papers
  2004   10 papers
  2003   13 papers
  2002   10 papers
  2001   6 papers
  2000   5 papers
  1999   7 papers
  1995   2 papers
  1994   2 papers
  1993   2 papers
  1991   1 paper
  1990   2 papers
  1988   1 paper
  1987   1 paper
  1985   2 papers
  1981   1 paper
246 results in total; search time 546 ms
41.
In this paper, we derive new families of facet‐defining inequalities for the finite group problem and extreme inequalities for the infinite group problem using approximate lifting. The new valid inequalities for the finite group problem include two‐ and three‐slope facet‐defining inequalities as well as the first family of four‐slope facet‐defining inequalities. The new valid inequalities for the infinite group problem include families of two‐ and three‐slope extreme inequalities. These new inequalities not only illustrate the diversity of strong inequalities for the finite and infinite group problems, but also provide a large variety of new cutting planes for solving integer and mixed‐integer programming problems. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
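For orientation, the best-known two-slope extreme inequality for the infinite group problem is the classical Gomory function; the sketch below (an illustration of the inequality class, not of the paper's new families) evaluates it and numerically checks the subadditivity and symmetry conditions that extreme inequalities satisfy. The fractional value f is an arbitrary choice.

```python
import numpy as np

def gmi(u, f):
    """Classical two-slope Gomory function on [0, 1):
    phi(u) = u/f for u <= f, and (1 - u)/(1 - f) otherwise."""
    u = np.mod(u, 1.0)
    return np.where(u <= f, u / f, (1.0 - u) / (1.0 - f))

f = 0.35                                   # assumed fractional part
grid = np.linspace(0.0, 1.0, 200, endpoint=False)
U, V = np.meshgrid(grid, grid)

# Subadditivity: phi(u) + phi(v) >= phi(u + v mod 1) for all u, v.
print("subadditive:", bool(np.all(gmi(U, f) + gmi(V, f) >= gmi(U + V, f) - 1e-12)))
# Symmetry: phi(u) + phi(f - u) = 1, the condition extreme inequalities obey.
print("symmetric:  ", bool(np.allclose(gmi(grid, f) + gmi(f - grid, f), 1.0)))
```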
42.
Instead of measuring a Wiener degradation or performance process at predetermined time points to track degradation or performance of a product for estimating its lifetime, we propose to obtain the first‐passage times of the process over certain nonfailure thresholds. Based on only these intermediate data, we obtain the uniformly minimum variance unbiased estimator and uniformly most accurate confidence interval for the mean lifetime. For estimating the lifetime distribution function, we propose a modified maximum likelihood estimator and a new estimator and prove that, by increasing the sample size of the intermediate data, these estimators and the above‐mentioned estimator of the mean lifetime can achieve the same levels of accuracy as the estimators assuming one has failure times. Thus, our method of using only intermediate data is useful for highly reliable products when their failure times are difficult to obtain. Furthermore, we show that the proposed new estimator of the lifetime distribution function is more accurate than the standard and modified maximum likelihood estimators. We also obtain approximate confidence intervals for the lifetime distribution function and its percentiles. Finally, we use light‐emitting diodes as an example to illustrate our method and demonstrate how to validate the Wiener assumption during the testing. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
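A minimal simulation of the data-collection idea (all parameters assumed for illustration; the simple moment estimator below is a stand-in, not the paper's UMVUE): units are observed only until their degradation path first crosses a nonfailure threshold a, and the mean lifetime at the failure threshold D is then extrapolated from those first-passage times.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 0.5, 0.2              # true drift and diffusion (assumed)
a, D = 2.0, 10.0                  # nonfailure threshold a, failure threshold D
n, dt, horizon = 50, 0.01, 50.0   # units tested, step size, test length

def first_passage(threshold):
    """First time a simulated path W(t) = mu*t + sigma*B(t) reaches threshold."""
    steps = int(horizon / dt)
    incr = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(steps)
    path = np.cumsum(incr)
    hit = np.argmax(path >= threshold)        # first crossing index (0 if none)
    return (hit + 1) * dt if path[hit] >= threshold else np.nan

T = np.array([first_passage(a) for _ in range(n)])  # intermediate data only
mu_hat = a / np.nanmean(T)        # moment-style drift estimate (not the UMVUE)
print(f"estimated mean lifetime D/mu_hat = {D / mu_hat:.2f}  (true: {D / mu:.2f})")
```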
43.
This article is concerned with the scaling variant of Karmarkar's algorithm for linear programming problems. Several researchers have presented convergence analyses for this algorithm under various nondegeneracy types of assumptions, or under assumptions regarding the nature of the sequence of iterates generated by the algorithm. By employing a slight perturbation of the algorithm, which is computationally imperceptible, we are able to prove without using any special assumptions that the algorithm converges finitely to an ε-optimal solution for any chosen ε > 0, from which it can be (polynomially) rounded to an optimum, for ε > 0 small enough. The logarithmic barrier function is used as a construct for this analysis. A rounding scheme which produces an optimal extreme point solution is also suggested. Besides the non-negatively constrained case, we also present a convergence analysis for the case of bounded variables. An application in statistics to the L1 estimation problem and related computational results are presented.
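For context, the scaling variant in question is the primal affine-scaling method; the sketch below (a bare-bones version without the perturbation and rounding scheme the article introduces) runs it on a standard-form LP min cᵀx, Ax = b, x ≥ 0, with an assumed toy instance.

```python
import numpy as np

def affine_scaling(A, b, c, x, alpha=0.9, iters=60):
    """Primal affine-scaling steps from a strictly feasible x (Ax = b, x > 0)."""
    for _ in range(iters):
        X2 = np.diag(x**2)
        # Dual estimate and reduced costs from the scaled projection.
        w = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)
        r = c - A.T @ w
        d = -X2 @ r                    # descent direction; Ad = 0 by construction
        if np.all(d >= -1e-12):
            break                      # no blocking component: optimal (or unbounded)
        step = alpha * np.min(-x[d < 0] / d[d < 0])   # stay strictly positive
        x = x + step * d
    return x

# Tiny example: min -x1 - 2*x2  s.t.  x1 + x2 + s = 4, x >= 0.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([4.0])
c = np.array([-1.0, -2.0, 0.0])
x0 = np.array([1.0, 1.0, 2.0])         # strictly feasible interior point
print(affine_scaling(A, b, c, x0))      # approaches the optimum (0, 4, 0)
```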
44.
In this paper, we present an optimization model for coordinating inventory and transportation decisions at an outbound distribution warehouse that serves a group of customers located in a given market area. For the practical problems which motivated this paper, the warehouse is operated by a third party logistics provider. However, the models developed here may be applicable in a more general context where outbound distribution is managed by another supply chain member, e.g., a manufacturer. We consider the case where the aggregate demand of the market area is constant and known per period (e.g., per day). Under an immediate delivery policy, an outbound shipment is released each time a demand is realized (e.g., on a daily basis). On the other hand, if these shipments are consolidated over time, then larger (hence more economical) outbound freight quantities can be dispatched. In this case, the physical inventory requirements at the third party warehouse (TPW) are determined by the consolidated freight quantities. Thus, stock replenishment and outbound shipment release policies should be coordinated. By optimizing inventory and freight consolidation decisions simultaneously, we compute the parameters of an integrated inventory/outbound transportation policy. These parameters determine: (i) how often to dispatch a truck so that transportation scale economies are realized and timely delivery requirements are met, and (ii) how often, and in what quantities, the stock should be replenished at the TPW. We prove that the optimal shipment release timing policy is nonstationary, and we present algorithms for computing the policy parameters for both the uncapacitated and finite cargo capacity problems. The model presented in this study is considerably different from the existing inventory/transportation models in the literature. The classical inventory literature assumes that demands should be satisfied as they arrive so that outbound shipment costs are sunk costs, or else these costs are covered by the customer. Hence, the classical literature does not model outbound transportation costs. However, if a freight consolidation policy is in place then the outbound transportation costs can no longer be ignored in optimization. Relying on this observation, this paper models outbound transportation costs, freight consolidation decisions, and cargo capacity constraints explicitly. © 2002 Wiley Periodicals, Inc. Naval Research Logistics 49: 531–556, 2002; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/nav.10030
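To fix ideas, the stationary, uncapacitated special case has an EOQ-like closed form: with demand d per day, a fixed dispatch cost K, and a waiting (holding) cost w per unit per day, dispatching every T days costs K/T + wdT/2 per day, which is minimized at T* = √(2K/(wd)). The sketch below (parameter values assumed; the paper's optimal policy is nonstationary) also caps the consolidated load at cargo capacity.

```python
import math

def dispatch_interval(d, K, w, cargo_cap=None):
    """EOQ-style consolidation cycle: balance dispatch cost against waiting cost."""
    T = math.sqrt(2.0 * K / (w * d))          # unconstrained optimum (days)
    if cargo_cap is not None:
        T = min(T, cargo_cap / d)             # a full truck forces a dispatch
    q = d * T                                  # consolidated shipment size
    daily_cost = K / T + w * d * T / 2.0
    return T, q, daily_cost

# Demand of 40 units/day, $500 per dispatch, $0.25/unit/day waiting cost,
# truck capacity 600 units (all values assumed for illustration).
print(dispatch_interval(d=40, K=500, w=0.25, cargo_cap=600))   # T* = 10 days
```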
45.
For computing an optimal (Q, R) or kindred inventory policy, the current literature provides mixed signals on whether or when it is safe to approximate a nonnormal lead‐time‐demand (“LTD”) distribution by a normal distribution. The first part of this paper examines this literature critically to justify why the issue warrants further investigations, while the second part presents reliable evidence showing that the system‐cost penalty for using the normal approximation can be quite serious even when the LTD‐distribution's coefficient of variation is quite low—contrary to the prevalent view of the literature. We also identify situations that will most likely lead to large system‐cost penalty. Our results indicate that, given today's technology, it is worthwhile to estimate an LTD‐distribution's shape more accurately and to compute optimal inventory policies using statistical distributions that more accurately reflect the LTD‐distributions' actual shapes. © 2003 Wiley Periodicals, Inc. Naval Research Logistics, 2003
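The flavor of the comparison can be reproduced with a simple single-period base-stock sketch (assumed costs; not the paper's (Q, R) model): fit a normal to the mean and variance of a skewed LTD distribution, set the reorder point from the normal quantile, and evaluate the cost of that choice under the true distribution. The size of the penalty depends on the distribution's shape and the cost ratio.

```python
import numpy as np
from scipy import stats

h, p = 1.0, 19.0                        # holding / backorder cost (assumed)
beta = p / (p + h)                      # critical ratio: optimal R = F^{-1}(beta)

shape = 25.0                            # gamma shape -> cv = 1/sqrt(25) = 0.2
ltd = stats.gamma(a=shape, scale=4.0)   # a right-skewed LTD with low cv
mean, sd = ltd.mean(), ltd.std()

def cost(R):
    """Expected holding + backorder cost per period under the true LTD."""
    holding = ltd.expect(lambda x: R - x, ub=R)
    backorder = ltd.expect(lambda x: x - R, lb=R)
    return h * holding + p * backorder

R_true = ltd.ppf(beta)                      # optimal under the true shape
R_norm = stats.norm(mean, sd).ppf(beta)     # the normal-approximation choice
penalty = 100 * (cost(R_norm) - cost(R_true)) / cost(R_true)
print(f"cost penalty of the normal approximation: {penalty:.2f}%")
```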
46.
Today, many products are designed and manufactured to function for a long period of time before they fail. Determining product reliability is a great challenge to manufacturers of highly reliable products with only a relatively short period of time available for internal life testing. In particular, it may be difficult to determine optimal burn‐in parameters and characterize the residual life distribution. A promising alternative is to use data on a quality characteristic (QC) whose degradation over time can be related to product failure. Typically, product failure corresponds to the first passage time of the degradation path beyond a critical value. If degradation paths can be modeled properly, one can predict failure time and determine the life distribution without actually observing failures. In this paper, we first use a Wiener process to describe the continuous degradation path of the quality characteristic of the product. A Wiener process allows nonconstant variance and nonzero correlation among data collected at different time points. We propose a decision rule for classifying a unit as normal or weak, and give an economic model for determining the optimal termination time and other parameters of a burn‐in test. Next, we propose a method for assessing the product's lifetime distribution of the passed units. The proposed methodologies are all based only on the product's initial observed degradation data. Finally, an example of an electronic product, namely contact image scanner (CIS), is used to illustrate the proposed procedure. © 2002 Wiley Periodicals, Inc. Naval Research Logistics, 2003
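A toy version of the classification step (all parameters assumed): estimate each unit's degradation drift from its early Wiener increments, then flag units whose estimated drift exceeds a cutoff as weak. The midpoint cutoff below is a placeholder, not the economically optimal rule derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
m, dt, sigma = 20, 1.0, 0.3                       # early inspections, spacing, diffusion
mu_normal, mu_weak, weak_frac = 0.05, 0.40, 0.2   # assumed subpopulations

n = 500
is_weak = rng.random(n) < weak_frac
drift = np.where(is_weak, mu_weak, mu_normal)
# Wiener degradation increments observed during the burn-in window.
incr = drift[:, None] * dt + sigma * np.sqrt(dt) * rng.standard_normal((n, m))

mu_hat = incr.mean(axis=1) / dt              # per-unit drift estimate (the MLE)
cutoff = (mu_normal + mu_weak) / 2           # placeholder midpoint rule
flagged = mu_hat > cutoff                    # classified as weak
print("misclassification rate:", float(np.mean(flagged != is_weak)))
```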
47.
Burn‐in is a technique to enhance reliability by eliminating weak items from a population of items having heterogeneous lifetimes. System burn‐in can improve system reliability, but the conditions for system burn‐in to be performed after component burn‐in remain a little-understood mathematical challenge. To derive such conditions, we first introduce a general model of heterogeneous system lifetimes, in which the component burn‐in information and assembly problems are related to the prediction of system burn‐in. Many existing system burn‐in models become special cases and two important results are identified. First, heterogeneous system lifetimes can be understood naturally as a consequence of heterogeneous component lifetimes and heterogeneous assembly quality. Second, system burn‐in is effective if assembly quality variation in the components and connections which are arranged in series is greater than a threshold, where the threshold depends on the system structure and component failure rates. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 364–380, 2003.
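The qualitative message can be illustrated with a simple frailty sketch (a toy model, not the paper's): give a series system an exponential baseline lifetime whose rate is scaled by a random assembly-quality factor Z, and compare mean residual life (MRL) with and without burn-in as the variation of Z grows. With no variation, burn-in buys nothing (memorylessness); with large variation, it screens out weak assemblies.

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_residual_life(cv, burn_in, n=200_000):
    """System with exponential baseline lifetime whose failure rate is scaled
    by a gamma assembly-quality frailty Z (mean 1, coefficient of variation cv)."""
    if cv == 0.0:
        z = np.ones(n)
    else:
        k = 1.0 / cv**2                      # gamma shape giving the target cv
        z = rng.gamma(k, 1.0 / k, n)         # mean 1, variance cv^2
    t = rng.exponential(1.0 / z)             # lifetime given quality Z = z
    survivors = t[t > burn_in]
    return (survivors - burn_in).mean()

for cv in (0.0, 0.3, 0.6):                   # assembly-quality variation levels
    print(f"cv={cv}: MRL without burn-in = {mean_residual_life(cv, 0.0):.3f}, "
          f"after burn-in b=1 = {mean_residual_life(cv, 1.0):.3f}")
```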
48.
49.
In many practical manufacturing environments, jobs to be processed can be divided into different families such that a setup is required whenever there is a switch from processing a job of one family to another job of a different family. The setup time may be sequence independent or sequence dependent. We consider two particular scheduling problems relevant to such situations. In both problems, we are given a set of jobs to be processed on a set of identical parallel machines. The objective of the first problem is to minimize total weighted completion time of jobs, and that of the second problem is to minimize weighted number of tardy jobs. We propose exact column-generation-based branch-and-bound algorithms for both problems. Computational experiments show that the algorithms are capable of solving both problems of medium size to optimality within reasonable computational time. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 823–840, 2003.
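For a feel of the first objective, the sketch below evaluates a greedy WSPT-style list schedule on identical parallel machines with sequence-independent family setups (a baseline heuristic only, not the paper's exact column-generation branch-and-bound; the job data are assumed).

```python
import heapq

def greedy_twct(jobs, setup, m):
    """List-schedule jobs (family, proc_time, weight) on m identical machines
    in WSPT order; a sequence-independent setup is paid whenever a machine
    switches families. Returns the total weighted completion time."""
    order = sorted(jobs, key=lambda j: j[1] / j[2])     # shortest weighted first
    heap = [(0.0, i, -1) for i in range(m)]             # (finish time, machine, last family)
    heapq.heapify(heap)
    total = 0.0
    for fam, p, w in order:
        t, i, last = heapq.heappop(heap)                # machine free earliest
        if last != fam:
            t += setup[fam]                             # family switch: pay setup
        t += p
        total += w * t
        heapq.heappush(heap, (t, i, fam))
    return total

# Two job families with setup times 2 and 3; jobs are (family, p_j, w_j).
jobs = [(0, 3, 2), (0, 5, 1), (1, 2, 4), (1, 4, 2), (0, 1, 1)]
print(greedy_twct(jobs, setup={0: 2, 1: 3}, m=2))
```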
50.
The Markov assumption that transition probabilities remain constant over entire periods has been applied to economic and social structures, for example, in the analysis of income and wage distributions. In many cases, however, transition probabilities are nonstationary across periods. Based on the causative matrix technique, this study presents a binomial approximation for obtaining nonstationary interim transition probabilities, in the absence of disturbances, when the first and the last transition matrices are known.
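As a point of comparison, a constant-causative-matrix interpolation is sketched below (illustrative matrices; this geometric matrix-root scheme is a common baseline, not the paper's binomial approximation): assuming P_{k+1} = P_k C for a fixed causative matrix C, the known first and last per-period matrices pin down C = (P₁⁻¹Pₙ)^{1/(n-1)}, and the interim matrices follow.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

# Per-period transition matrices for the first and last periods of a
# 4-period span (illustrative numbers).
P1 = np.array([[0.90, 0.10],
               [0.20, 0.80]])
P4 = np.array([[0.80, 0.20],
               [0.35, 0.65]])
n = 4

# Constant causative matrix C consistent with the two endpoints:
# P4 = P1 @ C^(n-1), hence C = (P1^{-1} P4)^{1/(n-1)}.
C = fractional_matrix_power(np.linalg.solve(P1, P4), 1.0 / (n - 1))
for k in range(2, n):
    Pk = np.real_if_close(P1 @ fractional_matrix_power(C, k - 1))
    print(f"interim P_{k} ~\n{Pk.round(4)}")
```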