Subscription full text: 205 articles
Free full text: 41 articles

By publication year:
2021: 2 articles
2019: 8 articles
2018: 3 articles
2017: 5 articles
2016: 13 articles
2015: 14 articles
2014: 9 articles
2013: 59 articles
2012: 9 articles
2011: 7 articles
2010: 4 articles
2009: 7 articles
2008: 9 articles
2007: 12 articles
2006: 8 articles
2005: 12 articles
2004: 10 articles
2003: 13 articles
2002: 10 articles
2001: 6 articles
2000: 5 articles
1999: 7 articles
1995: 2 articles
1994: 2 articles
1993: 2 articles
1991: 1 article
1990: 2 articles
1988: 1 article
1987: 1 article
1985: 2 articles
1981: 1 article
246 query results in total; search took 61 ms.
31.
32.
Demand forecasting performance is subject to the uncertainty underlying the time series an organization is dealing with. Many approaches may be used to reduce this uncertainty and thus improve forecasting performance. One intuitively appealing approach is to aggregate demand into lower-frequency "time buckets." This approach is termed temporal aggregation, and in this article we investigate its impact on forecasting performance. We assume that the nonaggregated demand follows either a moving average process of order one or a first-order autoregressive process, and that a single exponential smoothing (SES) procedure is used to forecast demand. These demand processes are often encountered in practice, and SES is one of the standard estimators used in industry. Theoretical mean-squared error expressions are derived for the aggregated and nonaggregated demand to contrast the relevant forecasting performances. The theoretical analysis is supported by an extensive numerical investigation and experimentation with an empirical dataset. The results indicate that performance improvements achieved through the aggregation approach are a function of the aggregation level, the smoothing constant, and the process parameters. Valuable insights are offered to practitioners, and the article closes with an agenda for further research in this area. © 2013 Wiley Periodicals, Inc. Naval Research Logistics 60: 479–498, 2013
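Below is a minimal simulation sketch of this comparison, assuming an AR(1) demand process; the parameters (mu, phi, sigma, the smoothing constant alpha, and the aggregation level m) are arbitrary illustrative choices, not values from the article. It contrasts one-step-ahead SES errors at the original and aggregated frequencies on a per-period basis.

```python
import numpy as np

def ses_forecast(x, alpha):
    """One-step-ahead single exponential smoothing (SES) forecasts."""
    f = np.empty(len(x))
    f[0] = x[0]  # initialize with the first observation
    for t in range(1, len(x)):
        f[t] = alpha * x[t - 1] + (1 - alpha) * f[t - 1]
    return f

def aggregate(x, m):
    """Non-overlapping temporal aggregation into buckets of m periods."""
    n = (len(x) // m) * m
    return x[:n].reshape(-1, m).sum(axis=1)

# Simulate AR(1) demand: d_t = mu + phi*(d_{t-1} - mu) + eps_t
rng = np.random.default_rng(0)
mu, phi, sigma, T = 100.0, 0.5, 10.0, 20000
eps = rng.normal(0.0, sigma, T)
d = np.empty(T)
d[0] = mu
for t in range(1, T):
    d[t] = mu + phi * (d[t - 1] - mu) + eps[t]

alpha, m = 0.2, 3  # smoothing constant and aggregation level
mse_orig = np.mean((d[1:] - ses_forecast(d, alpha)[1:]) ** 2)
da = aggregate(d, m)
# MSE of forecasting the m-period *average* demand, expressed per period
mse_agg = np.mean((da[1:] - ses_forecast(da, alpha)[1:]) ** 2) / m ** 2
print(f"non-aggregated MSE: {mse_orig:.2f}, aggregated per-period MSE: {mse_agg:.2f}")
```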
33.
Many Markov chain models have very large state spaces, making the computation of stationary probabilities very difficult. Often the structure and numerical properties of the Markov chain allow for more efficient computation through state aggregation and disaggregation. In this article we develop an efficient exact single-pass aggregation/disaggregation algorithm that exploits structural properties of large finite irreducible mandatory set decomposable Markov chains. The required mandatory set decomposable structure is a generalization of several other Markov chain structures for which exact aggregation/disaggregation algorithms exist. © 1995 John Wiley & Sons, Inc.
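The article's exact single-pass algorithm depends on the mandatory set decomposable structure and is not reproduced here. For flavor only, this sketch implements the classic iterative aggregation/disaggregation scheme (KMS-style) on a small assumed chain with an arbitrary two-block partition.

```python
import numpy as np

def iad_stationary(P, blocks, tol=1e-12, max_iter=500):
    """Iterative aggregation/disaggregation for the stationary vector of an
    irreducible row-stochastic matrix P. `blocks` is a list of index arrays
    partitioning the state space. This is the classic iterative scheme, not
    the exact single-pass algorithm of the article."""
    n = P.shape[0]
    x = np.full(n, 1.0 / n)  # initial guess
    for _ in range(max_iter):
        # Aggregation: build the coupling matrix between blocks
        K = len(blocks)
        C = np.zeros((K, K))
        for I, bi in enumerate(blocks):
            w = x[bi] / x[bi].sum()  # conditional distribution within block I
            for J, bj in enumerate(blocks):
                C[I, J] = w @ P[np.ix_(bi, bj)].sum(axis=1)
        # Stationary vector of the aggregated chain (eigenvector for eigenvalue 1)
        vals, vecs = np.linalg.eig(C.T)
        xi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
        xi = np.abs(xi) / np.abs(xi).sum()
        # Disaggregation: scale within-block distributions by block masses
        z = x.copy()
        for I, bi in enumerate(blocks):
            z[bi] = xi[I] * x[bi] / x[bi].sum()
        # Smoothing: one power step, then normalize
        x_new = z @ P
        x_new /= x_new.sum()
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x

# Small illustrative chain with two blocks
P = np.array([[0.5, 0.3, 0.1, 0.1],
              [0.2, 0.6, 0.1, 0.1],
              [0.1, 0.1, 0.6, 0.2],
              [0.2, 0.1, 0.2, 0.5]])
pi = iad_stationary(P, [np.array([0, 1]), np.array([2, 3])])
print(pi, pi @ P)  # pi should satisfy pi = pi P
```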
34.
35.
Many manufacturers sell their products through retailers and share the revenue with those retailers. Given this phenomenon, we build a stylized model to investigate the role of revenue sharing schemes in supply chain coordination and product variety decisions. In our model, a monopolistic manufacturer serves two segments of consumers, distinguished by their willingness to pay for quality. In the scenario with exogenous revenue sharing ratios, when the potential gain from serving the low segment is substantial (e.g., the low-segment consumers' willingness to pay is high enough, or the low segment takes a large enough proportion of the market), the retailer is better off abandoning the revenue sharing scheme. Moreover, when the potential gain from serving the low (high) segment is substantial enough, the manufacturer finds it profitable to offer a single product. Furthermore, when revenue sharing ratios are endogenous, we divide our analysis into two cases, depending on the method of cooperation. When revenue sharing ratios are negotiated at the very beginning, the decentralized supply chain causes further distortion. This suggests that the central premise of revenue sharing, namely the coordination of supply chains, may be undermined if supply chain parties bargain meticulously over the ratios.
36.
Mean residual life is a useful dynamic characteristic for studying the reliability of a system. It has been widely considered in the literature not only for single-unit systems but also for coherent systems. This article is concerned with the mean residual life of a coherent system that consists of multiple types of dependent components. In particular, a survival signature-based generalized mixture representation is obtained for the survival function of a coherent system, and it is used to evaluate the mean residual life function. Furthermore, two mean residual life functions under different conditional events on components' lifetimes are also defined and studied.
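As a quick numerical illustration of the general definition m(t) = ∫_t^∞ S(u) du / S(t) (not the survival-signature representation developed in the article), the sketch below evaluates the mean residual life of an assumed two-component parallel system with independent exponential lifetimes and checks the numerics against the closed form.

```python
import numpy as np

lam = 0.5  # illustrative exponential rate

def S(t):
    """Survival function of a 2-component parallel system with iid Exp(lam) parts."""
    return 2.0 * np.exp(-lam * t) - np.exp(-2.0 * lam * t)

def mrl_numeric(t, upper=80.0, n=200_000):
    """m(t) = integral_t^inf S(u) du / S(t), via the trapezoid rule."""
    u = np.linspace(t, upper, n)
    s = S(u)
    integral = np.sum((s[:-1] + s[1:]) * np.diff(u)) / 2.0
    return integral / S(t)

def mrl_closed(t):
    """Closed form of the same integral for this survival function."""
    return ((2.0 / lam) * np.exp(-lam * t)
            - (1.0 / (2.0 * lam)) * np.exp(-2.0 * lam * t)) / S(t)

for t in (0.0, 1.0, 3.0):
    print(t, mrl_numeric(t), mrl_closed(t))  # the two should agree closely
```

At t = 0 this returns 3.0 for lam = 0.5, matching E[max of two Exp(0.5) lifetimes] = (1 + 1/2)/lam.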
37.
As a complex system with multiple components usually deteriorates with age, preventive maintenance (PM) is often performed to keep the system functioning in a good state and to prolong its effective age. In this study, a nonhomogeneous Poisson process with a power-law failure intensity is used to describe the deterioration of a repairable system, and the optimal nonperiodic PM schedule is determined by minimizing the expected total cost per unit time. However, determining such optimal PM policies involves numerous uncertainties, and the scarcity of data typically makes the analysis difficult; a Bayesian decision model that effectively utilizes all available information is therefore proposed for determining the optimal PM strategies. A numerical example with a real failure data set is used to illustrate the effectiveness of the proposed approach. The results show that the optimal schedules derived by the Bayesian approach are more conservative than those from the non-Bayesian approach because of the uncertainty in the intensity function; moreover, if the intensity function is updated with a collected data set that indicates more severe deterioration than the prior belief, replacing the entire system is suggested instead of frequent PM activities before serious deterioration. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010
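The article's nonperiodic, Bayesian formulation is more involved than can be shown here; the sketch below treats a simplified periodic variant, assumed purely for illustration. With minimal repairs between PMs and E[N(T)] = (T/theta)^beta under the power-law intensity, the cost rate g(T) = (c_p + c_m (T/theta)^beta)/T is minimized at T* = theta (c_p / (c_m (beta - 1)))^(1/beta) for beta > 1; all cost values are hypothetical.

```python
import numpy as np

theta, beta = 1000.0, 2.2  # power-law scale and shape (illustrative)
c_p, c_m = 500.0, 80.0     # PM cost and minimal-repair cost (hypothetical)

def cost_rate(T):
    """Expected cost per unit time: one PM every T time units plus
    minimal repairs in between; E[N(T)] = (T/theta)**beta."""
    return (c_p + c_m * (T / theta) ** beta) / T

# Closed-form optimum (valid for beta > 1)
T_star = theta * (c_p / (c_m * (beta - 1.0))) ** (1.0 / beta)

# Cross-check by grid search
grid = np.linspace(1.0, 5000.0, 200_000)
T_grid = grid[np.argmin(cost_rate(grid))]
print(T_star, T_grid, cost_rate(T_star))
```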
38.
This article studies a min-max path cover problem, which is to determine a set of paths for k capacitated vehicles to service all the customers in a given weighted graph so that the largest path cost is minimized. The problem has wide applications in vehicle routing, especially when the minimization of the latest service completion time is a critical performance measure. We analyze four typical variants of this problem, where the vehicles have either unlimited or limited capacities, and they start from either a given depot or any depot of a given depot set. We develop approximation algorithms for these four variants, which achieve approximation ratios of max{3 - 2/k, 2}, 5, max{5 - 2/k, 4}, and 7, respectively. We also analyze the approximation hardness of these variants by showing that, unless P = NP, it is impossible for them to achieve approximation ratios less than 4/3, 3/2, 3/2, and 2, respectively. We further extend the techniques and results developed for this problem to other min-max vehicle routing problems. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010
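None of the four ratio-guaranteed algorithms is reproduced here. As a rough illustrative heuristic for the uncapacitated variant in which vehicles may start anywhere, this sketch builds one nearest-neighbor Hamiltonian path over randomly placed customers and greedily splits it into k segments of roughly equal cost; the instance is an assumed random Euclidean one.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 30, 4
pts = rng.uniform(0.0, 100.0, (n, 2))  # random customer locations
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)

def nn_path(dist, start=0):
    """Nearest-neighbor Hamiltonian path over all customers."""
    unvisited = set(range(len(dist))) - {start}
    path, cur = [start], start
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist[cur, j])
        path.append(nxt)
        unvisited.remove(nxt)
        cur = nxt
    return path

def split_into_k(path, dist, k):
    """Greedily cut the path into k contiguous segments of roughly equal cost."""
    edges = [dist[path[i], path[i + 1]] for i in range(len(path) - 1)]
    target = sum(edges) / k
    segments, cur, acc = [], [path[0]], 0.0
    for i, e in enumerate(edges):
        if acc + e > target and len(segments) < k - 1:
            segments.append(cur)
            cur, acc = [path[i + 1]], 0.0
        else:
            cur.append(path[i + 1])
            acc += e
    segments.append(cur)
    return segments

def path_cost(seg):
    return sum(dist[seg[i], seg[i + 1]] for i in range(len(seg) - 1))

segs = split_into_k(nn_path(dist), dist, k)
print([round(path_cost(s), 1) for s in segs])  # the max of these is the objective
```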
39.
In this paper, we present an optimization model for coordinating inventory and transportation decisions at an outbound distribution warehouse that serves a group of customers located in a given market area. For the practical problems that motivated this paper, the warehouse is operated by a third-party logistics provider. However, the models developed here may be applicable in a more general context where outbound distribution is managed by another supply chain member, e.g., a manufacturer. We consider the case where the aggregate demand of the market area is constant and known per period (e.g., per day). Under an immediate delivery policy, an outbound shipment is released each time a demand is realized (e.g., on a daily basis). On the other hand, if these shipments are consolidated over time, then larger (hence more economical) outbound freight quantities can be dispatched. In this case, the physical inventory requirements at the third-party warehouse (TPW) are determined by the consolidated freight quantities. Thus, stock replenishment and outbound shipment release policies should be coordinated. By optimizing inventory and freight consolidation decisions simultaneously, we compute the parameters of an integrated inventory/outbound transportation policy. These parameters determine: (i) how often to dispatch a truck so that transportation scale economies are realized and timely delivery requirements are met, and (ii) how often, and in what quantities, the stock should be replenished at the TPW. We prove that the optimal shipment release timing policy is nonstationary, and we present algorithms for computing the policy parameters for both the uncapacitated and finite cargo capacity problems. The model presented in this study is considerably different from the existing inventory/transportation models in the literature. The classical inventory literature assumes that demands should be satisfied as they arrive, so that outbound shipment costs are sunk costs or else are covered by the customer. Hence, the classical literature does not model outbound transportation costs. However, if a freight consolidation policy is in place, then the outbound transportation costs can no longer be ignored in optimization. Relying on this observation, this paper models outbound transportation costs, freight consolidation decisions, and cargo capacity constraints explicitly. © 2002 Wiley Periodicals, Inc. Naval Research Logistics 49: 531–556, 2002; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/nav.10030
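Since the article proves the optimal shipment-release timing policy is nonstationary, the following is only a stationary toy version with hypothetical cost parameters: dispatch a consolidated load every T days, replenish the TPW every n dispatches, and grid-search the joint cost rate.

```python
import numpy as np

D = 40.0     # demand per day (assumed constant, as in the model)
K_R = 900.0  # fixed cost per TPW stock replenishment (hypothetical)
K_D = 150.0  # fixed cost per outbound dispatch (hypothetical)
h = 0.02     # holding cost per unit per day at the TPW (hypothetical)
w = 0.05     # customer waiting cost per unit per day (hypothetical)

def cost_rate(T, n):
    """Average cost per day when dispatching every T days and replenishing
    the TPW every n dispatches (on-hand stock drops by D*T at each dispatch,
    so the average TPW inventory over a cycle is D*T*(n-1)/2)."""
    replenish = K_R / (n * T)
    dispatch = K_D / T
    tpw_holding = h * D * T * (n - 1) / 2.0
    waiting = w * D * T / 2.0  # demand accumulating between dispatches
    return replenish + dispatch + tpw_holding + waiting

best = min(((cost_rate(T, n), T, n)
            for T in np.linspace(0.5, 30.0, 60)
            for n in range(1, 21)))
print(f"cost/day={best[0]:.1f}, dispatch every {best[1]:.1f} days, "
      f"replenish every {best[2]} dispatches")
```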
40.
Instead of measuring a Wiener degradation or performance process at predetermined time points to track the degradation or performance of a product for estimating its lifetime, we propose to obtain the first-passage times of the process over certain nonfailure thresholds. Based only on these intermediate data, we obtain the uniformly minimum variance unbiased estimator and the uniformly most accurate confidence interval for the mean lifetime. For estimating the lifetime distribution function, we propose a modified maximum likelihood estimator and a new estimator, and prove that, by increasing the sample size of the intermediate data, these estimators and the above-mentioned estimator of the mean lifetime can achieve the same levels of accuracy as estimators that assume failure times are available. Thus, our method of using only intermediate data is useful for highly reliable products whose failure times are difficult to obtain. Furthermore, we show that the proposed new estimator of the lifetime distribution function is more accurate than the standard and modified maximum likelihood estimators. We also obtain approximate confidence intervals for the lifetime distribution function and its percentiles. Finally, we use light-emitting diodes as an example to illustrate our method and demonstrate how to validate the Wiener assumption during testing. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
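A simulation sketch of the basic idea, with assumed drift, diffusion, and thresholds: record first-passage times tau over a nonfailure threshold a, then scale. Since E[tau] = a/mu for a Wiener process with drift mu, the moment estimator D*mean(tau)/a is unbiased for the mean lifetime D/mu; the article develops the UMVUE and confidence intervals rigorously.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 0.05, 0.2    # true drift and diffusion (assumed, unknown in practice)
a, D_f = 1.0, 5.0        # nonfailure threshold and failure threshold (assumed)
dt, n_units = 0.01, 300  # simulation step and number of test units

def first_passage_time(threshold):
    """Simulate W(t) = mu*t + sigma*B(t) until it first crosses `threshold`."""
    w, t = 0.0, 0.0
    while w < threshold:
        w += mu * dt + sigma * np.sqrt(dt) * rng.normal()
        t += dt
    return t

tau = np.array([first_passage_time(a) for _ in range(n_units)])

# E[tau] = a/mu, so D_f * mean(tau) / a estimates the mean lifetime D_f/mu
mean_life_hat = D_f * tau.mean() / a
print(f"estimated mean lifetime: {mean_life_hat:.1f}  (true: {D_f / mu:.1f})")
```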