11.
A stochastic optimization model for capacity expansion in a service industry that incorporates uncertainty in future demand is developed. Based on a weighted set of possible demand scenarios, the model generates a recommended schedule of capacity expansions and calculates the resulting sales under each scenario. The capacity schedule specifies the size, location, and timing of the expansions that maximize the company's expected profit. The model includes a budget constraint on available resources. By using Lagrangian relaxation and exploiting the special nested knapsack structure of the subproblems, an algorithm is developed for its solution. Initial computational results suggest that this algorithm is more efficient than linear programming for this special problem. © 1994 John Wiley & Sons, Inc.
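The abstract gives no explicit formulation; purely as an illustration of the objective it describes, the sketch below evaluates the scenario-weighted expected profit of a candidate expansion schedule under a budget constraint. All names (e.g., `Expansion`, `expected_profit`) and the simplified sales model are assumptions for illustration, not the paper's model.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Expansion:
    size: float    # added capacity
    cost: float    # capital cost of the expansion
    period: int    # period in which the expansion comes online


def expected_profit(schedule: List[Expansion],
                    scenarios: List[Dict],
                    base_capacity: float,
                    budget: float) -> float:
    """Scenario-weighted profit of a capacity-expansion schedule (illustrative only).

    Each scenario is a dict: {"weight": p, "demand": [d_1, ..., d_T], "margin": m}.
    """
    if sum(e.cost for e in schedule) > budget:
        raise ValueError("schedule violates the budget constraint")
    total = -sum(e.cost for e in schedule)
    for s in scenarios:
        profit_s = 0.0
        for t, demand in enumerate(s["demand"]):
            # Installed capacity at period t = base plus all expansions already online.
            capacity_t = base_capacity + sum(e.size for e in schedule if e.period <= t)
            sales = min(capacity_t, demand)   # sales are capped by installed capacity
            profit_s += s["margin"] * sales
        total += s["weight"] * profit_s
    return total
```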
12.
Mean residual life is a useful dynamic characteristic for studying the reliability of a system. It has been widely considered in the literature not only for single-unit systems but also for coherent systems. This article is concerned with the study of mean residual life for a coherent system that consists of multiple types of dependent components. In particular, a survival signature-based generalized mixture representation is obtained for the survival function of a coherent system, and it is used to evaluate the mean residual life function. Furthermore, two mean residual life functions under different conditional events on the components' lifetimes are also defined and studied.
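For reference, the mean residual life of a lifetime T with survival function \bar F_T is the standard conditional expectation below (a textbook definition; the article's survival-signature representation is not reproduced here):

\[
  m(t) \;=\; E[\,T - t \mid T > t\,] \;=\; \frac{\int_t^{\infty} \bar F_T(u)\, du}{\bar F_T(t)},
  \qquad \bar F_T(t) > 0 .
\]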
13.
As a complex system with multiple components usually deteriorates with age, preventive maintenance (PM) is often performed to keep the system functioning in a good state and to prolong its effective age. In this study, a nonhomogeneous Poisson process with a power-law failure intensity is used to describe the deterioration of a repairable system, and the optimal nonperiodic PM schedule is determined to minimize the expected total cost per unit time. However, since the determination of such optimal PM policies may involve numerous uncertainties, which typically make the analysis difficult to perform because of the scarcity of data, a Bayesian decision model that uses all available information effectively is also proposed for determining the optimal PM strategies. A numerical example with a real failure data set illustrates the effectiveness of the proposed approach. The results show that the optimal schedules derived by the Bayesian approach are more conservative than those of the non-Bayesian approach because of the uncertainty in the intensity function, and that if the updated intensity function indicates more severe deterioration than the prior belief, replacing the entire system is suggested instead of frequent PM activities before serious deterioration. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010
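The paper's nonperiodic Bayesian schedule is not reproduced here; as a hedged sketch of the building blocks it names, the code below implements the power-law NHPP intensity and a minimal-repair cost rate for the simpler periodic special case, then grid-searches the PM interval. The cost parameters `c_pm` and `c_repair` and all numbers are hypothetical.

```python
def power_law_intensity(t: float, beta: float, theta: float) -> float:
    """Power-law NHPP failure intensity: lambda(t) = (beta/theta) * (t/theta)**(beta - 1)."""
    return (beta / theta) * (t / theta) ** (beta - 1)


def expected_failures(t: float, beta: float, theta: float) -> float:
    """Mean value function Lambda(t) = (t/theta)**beta: expected failures in (0, t]."""
    return (t / theta) ** beta


def cost_rate(tau: float, beta: float, theta: float, c_pm: float, c_repair: float) -> float:
    """Expected cost per unit time when PM is performed every tau time units and
    failures in between are minimally repaired (periodic special case, illustrative)."""
    return (c_pm + c_repair * expected_failures(tau, beta, theta)) / tau


# Crude grid search for the cost-minimizing PM interval (illustrative parameter values).
best_tau = min((tau / 100 for tau in range(1, 5000)),
               key=lambda tau: cost_rate(tau, beta=2.2, theta=10.0, c_pm=50.0, c_repair=8.0))
```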
14.
This article studies a min-max path cover problem, which is to determine a set of paths for k capacitated vehicles to service all the customers in a given weighted graph so that the largest path cost is minimized. The problem has wide applications in vehicle routing, especially when the minimization of the latest service completion time is a critical performance measure. We have analyzed four typical variants of this problem, where the vehicles have either unlimited or limited capacities, and they start from either a given depot or any depot of a given depot set. We have developed approximation algorithms for these four variants, which achieve approximation ratios of max{3 - 2/k, 2}, 5, max{5 - 2/k, 4}, and 7, respectively. We have also analyzed the approximation hardness of these variants by showing that, unless P = NP, it is impossible for them to achieve approximation ratios less than 4/3, 3/2, 3/2, and 2, respectively. We have further extended the techniques and results developed for this problem to other min-max vehicle routing problems. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010
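The article's approximation algorithms are not reproduced here. Purely to illustrate the min-max objective, the sketch below splits a fixed customer sequence into k consecutive paths and returns the split that minimizes the largest path cost; it is a naive heuristic that ignores vehicle capacities and depots, not the paper's method. `dist` is an assumed symmetric distance function supplied by the caller.

```python
from itertools import combinations
from typing import Callable, List, Sequence, Tuple


def path_cost(path: Sequence, dist: Callable) -> float:
    """Total edge cost of visiting the customers of `path` in order."""
    return sum(dist(a, b) for a, b in zip(path, path[1:]))


def split_into_k_paths(order: List, k: int, dist: Callable) -> Tuple[float, List[List]]:
    """Enumerate all ways to cut a fixed customer order into k consecutive paths and
    return the split minimizing the largest path cost (polynomial for fixed k, but naive)."""
    n = len(order)
    best_val, best_split = float("inf"), None
    for cuts in combinations(range(1, n), k - 1):
        bounds = (0, *cuts, n)
        paths = [order[bounds[i]:bounds[i + 1]] for i in range(k)]
        val = max(path_cost(p, dist) for p in paths)
        if val < best_val:
            best_val, best_split = val, paths
    return best_val, best_split
```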
15.
Many manufacturers sell their products through retailers and share the revenue with those retailers. Given this phenomenon, we build a stylized model to investigate the role of revenue sharing schemes in supply chain coordination and product variety decisions. In our model, a monopolistic manufacturer serves two segments of consumers, which are distinguished by their willingness to pay for quality. In the scenario with exogenous revenue sharing ratios, when the potential gain from serving the low segment is substantial (e.g., the low-segment consumers' willingness to pay is high enough or the low segment takes a large enough proportion of the market), the retailer is better off abandoning the revenue sharing scheme. Moreover, when the potential gain from serving the low (high) segment is substantial enough, the manufacturer finds it profitable to offer a single product. Furthermore, when revenue sharing ratios are endogenous, we divide our analysis into two cases, depending on the methods of cooperation. When revenue sharing ratios are negotiated at the very beginning, the decentralized supply chain causes further distortion. This suggests that the central premise of revenue sharing, the coordination of supply chains, may be undermined if supply chain parties meticulously bargain over it.
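As a purely numeric illustration of a revenue-sharing split (not the paper's equilibrium or product-variety analysis), the function below computes the manufacturer's and retailer's profits for a given retail price, wholesale price, unit cost, demand level, and sharing ratio. All parameter names and numbers are hypothetical.

```python
def channel_profits(price: float, wholesale: float, unit_cost: float,
                    demand: float, share: float) -> tuple:
    """Split channel profit under a simple revenue-sharing contract:
    the retailer keeps a fraction `share` of sales revenue and pays the wholesale price;
    the manufacturer receives the remaining revenue plus the wholesale margin."""
    revenue = price * demand
    retailer = share * revenue - wholesale * demand
    manufacturer = (1 - share) * revenue + (wholesale - unit_cost) * demand
    return manufacturer, retailer


# Example: with a low wholesale price and a moderate revenue share, both parties earn
# a positive margin (illustrative numbers only).
m, r = channel_profits(price=10.0, wholesale=3.0, unit_cost=2.0, demand=100.0, share=0.6)
```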
16.
We seek dynamic server assignment policies in finite-capacity queueing systems with flexible and collaborative servers, which involve an assembly and/or a disassembly operation. The objective is to maximize the steady-state throughput. We completely characterize the optimal policy for a Markovian system with two servers, two feeder stations, and instantaneous assembly and disassembly operations. This optimal policy allocates one server per station unless one of the stations is blocked, in which case both servers work at the unblocked station. For Markovian systems with three stations and instantaneous assembly and/or disassembly operations, we consider similar policies that move a server away from his/her "primary" station only when that station is blocked or starving. We determine the optimal assignment of each server whose primary station is blocked or starving in systems with three stations and zero buffers by formulating the problem as a Markov decision process. Using this optimal assignment, we develop heuristic policies for systems with three or more stations and positive buffers, and show by means of a numerical study that these policies provide near-optimal throughput. Furthermore, our numerical study shows that these policies developed for assembly-type systems also work well in tandem systems. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
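The two-server policy characterized in the abstract can be stated compactly; the sketch below encodes it as described (one server per station unless a station is blocked, in which case both servers move to the unblocked station). The string encoding of station states is a hypothetical convenience, not from the paper.

```python
def assign_two_servers(state_station_1: str, state_station_2: str) -> dict:
    """Return the station (1 or 2) assigned to each server under the optimal policy
    for the two-server, two-feeder-station Markovian system described in the abstract.
    States are 'working' or 'blocked' (hypothetical encoding for illustration)."""
    if state_station_1 == "blocked" and state_station_2 != "blocked":
        return {"server_1": 2, "server_2": 2}   # both servers at the unblocked station 2
    if state_station_2 == "blocked" and state_station_1 != "blocked":
        return {"server_1": 1, "server_2": 1}   # both servers at the unblocked station 1
    return {"server_1": 1, "server_2": 2}       # default: one server per station
```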
17.
In this article, the Building Evacuation Problem with Shared Information (BEPSI) is formulated as a mixed integer linear program, where the objective is to determine the set of routes along which to send evacuees (supply) from multiple locations throughout a building (sources) to the exits (sinks) such that the total time until all evacuees reach the exits is minimized. The formulation explicitly incorporates the constraints of shared information in providing online instructions to evacuees, ensuring that evacuees departing from an intermediate or source location at a mutual point in time receive common instructions. Arc travel time and capacity, as well as supply at the nodes, are permitted to vary with time and capacity is assumed to be recaptured over time. The BEPSI is shown to be NP-hard. An exact technique based on Benders decomposition is proposed for its solution. Computational results from numerical experiments on a real-world network representing a four-story building are given. Results of experiments employing Benders cuts generated in solving a given problem instance as initial cuts in addressing an updated problem instance are also provided. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
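The BEPSI formulation and the Benders scheme are not reproduced here; as a hedged sketch of one modeling ingredient the abstract mentions (time-varying arc travel times and capacities), the code below builds the arcs of a time-expanded network over (location, time) nodes, including holdover arcs for waiting in place. The data structures are assumptions for illustration.

```python
from typing import Dict, List, Tuple

Node = Tuple[str, int]   # (location, time step)


def build_time_expanded_network(locations: List[str],
                                arcs: Dict[Tuple[str, str], Dict[int, Tuple[int, int]]],
                                horizon: int) -> List[Tuple[Node, Node, int]]:
    """Build the arcs of a time-expanded evacuation network.

    `arcs[(u, v)][t] = (travel_time, capacity)` gives the time-dependent travel time and
    capacity of corridor (u, v) when departure occurs at time t (illustrative encoding).
    Returns a list of (tail_node, head_node, capacity) triples.
    """
    expanded = []
    for (u, v), profile in arcs.items():
        for t, (travel, cap) in profile.items():
            if t + travel <= horizon:
                expanded.append(((u, t), (v, t + travel), cap))
    # Holdover arcs let evacuees wait in place for one time step.
    for loc in locations:
        for t in range(horizon):
            expanded.append(((loc, t), (loc, t + 1), 10**9))
    return expanded
```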
18.
In this paper, we derive new families of facet-defining inequalities for the finite group problem and extreme inequalities for the infinite group problem using approximate lifting. The new valid inequalities for the finite group problem include two- and three-slope facet-defining inequalities as well as the first family of four-slope facet-defining inequalities. The new valid inequalities for the infinite group problem include families of two- and three-slope extreme inequalities. These new inequalities not only illustrate the diversity of strong inequalities for the finite and infinite group problems, but also provide a large variety of new cutting planes for solving integer and mixed-integer programming problems. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
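The paper's new inequality families are not reproduced here. For context on what a "two-slope" inequality looks like, the sketch below evaluates the classical two-slope Gomory coefficient function for the group problem with fractional right-hand side f, which is a standard construction and not one of the article's new families.

```python
def gomory_two_slope(v: float, f: float) -> float:
    """Classical two-slope coefficient function for the group problem with
    right-hand side fraction f in (0, 1):
        pi(v) = v / f           if frac(v) <= f,
        pi(v) = (1 - v)/(1 - f) otherwise,
    evaluated at the fractional part of v."""
    v = v % 1.0
    return v / f if v <= f else (1.0 - v) / (1.0 - f)
```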
19.
Instead of measuring a Wiener degradation or performance process at predetermined time points to track degradation or performance of a product for estimating its lifetime, we propose to obtain the first-passage times of the process over certain nonfailure thresholds. Based on only these intermediate data, we obtain the uniformly minimum variance unbiased estimator and uniformly most accurate confidence interval for the mean lifetime. For estimating the lifetime distribution function, we propose a modified maximum likelihood estimator and a new estimator and prove that, by increasing the sample size of the intermediate data, these estimators and the above-mentioned estimator of the mean lifetime can achieve the same levels of accuracy as the estimators assuming one has failure times. Thus, our method of using only intermediate data is useful for highly reliable products when their failure times are difficult to obtain. Furthermore, we show that the proposed new estimator of the lifetime distribution function is more accurate than the standard and modified maximum likelihood estimators. We also obtain approximate confidence intervals for the lifetime distribution function and its percentiles. Finally, we use light-emitting diodes as an example to illustrate our method and demonstrate how to validate the Wiener assumption during the testing. © 2008 Wiley Periodicals, Inc. Naval Research Logistics, 2008
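The paper's UMVU estimator is not reproduced here. As a hedged illustration of the underlying idea, the sketch below simulates first-passage times of a drifted Wiener process over a nonfailure threshold and scales their average up to the failure threshold, using the standard fact that a Wiener process with drift mu > 0 has expected first-passage time l / mu over level l. This is a simple moment-type estimator, not the paper's estimator.

```python
import random


def simulate_first_passage(mu: float, sigma: float, level: float, dt: float = 0.01) -> float:
    """Simulate a Wiener degradation path W(t) = mu*t + sigma*B(t) via Euler steps and
    return the first time it crosses `level` (illustrative only; mu must be positive)."""
    t, w = 0.0, 0.0
    while w < level:
        w += mu * dt + sigma * (dt ** 0.5) * random.gauss(0.0, 1.0)
        t += dt
    return t


def estimate_mean_lifetime(passages_over_l: list, l: float, failure_threshold: float) -> float:
    """Estimate the mean time to reach the failure threshold D from first-passage times
    over a lower, nonfailure threshold l, using E[T_l] = l/mu and E[T_D] = D/mu."""
    return (failure_threshold / l) * (sum(passages_over_l) / len(passages_over_l))


# Example: 200 simulated passages over a nonfailure threshold l = 2, failure threshold D = 10.
samples = [simulate_first_passage(mu=0.5, sigma=0.3, level=2.0) for _ in range(200)]
mean_lifetime = estimate_mean_lifetime(samples, l=2.0, failure_threshold=10.0)
```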
20.
A machine-replacement problem is analyzed in a technological-development environment, in which a new type of machine (built with a new technology) may appear in the future. The solution of the replacement problem depends on purchasing, operating, and resale costs, and on the probability distribution of the market debut of the new technology; it indicates whether to replace the existing machine now with an available machine of a similar type, or to continue operating the existing machine for at least one more period. A dynamic discounted-cost model is presented, and a method is suggested for finding the optimal age for replacement of an existing machine under rather general conditions of a technological environment. A solution procedure and a numerical example are given.
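As a hedged sketch in the spirit of the abstract (the cost structure, parameter names, and the omission of the uncertain new-technology arrival are simplifications, not the paper's model), the recursion below is a finite-horizon discounted keep-or-replace dynamic program over machine age.

```python
from functools import lru_cache


def replacement_policy(horizon: int, max_age: int, discount: float,
                       purchase: float, operating, resale) -> float:
    """Minimum expected discounted cost of operating a machine over `horizon` periods:
    in each period, either keep the machine (pay the operating cost of its age) or
    replace it (pay the purchase cost net of its resale value). `operating(age)` and
    `resale(age)` are caller-supplied cost functions."""

    @lru_cache(maxsize=None)
    def value(period: int, age: int) -> float:
        if period == horizon:
            return -resale(age)                       # salvage the machine at the end
        keep = operating(age) + discount * value(period + 1, min(age + 1, max_age))
        replace = purchase - resale(age) + operating(0) + discount * value(period + 1, 1)
        return min(keep, replace)

    return value(0, 1)


# Example with illustrative cost functions: operating cost grows linearly with age,
# resale value decays geometrically.
total_cost = replacement_policy(horizon=10, max_age=15, discount=0.95, purchase=100.0,
                                operating=lambda a: 5.0 + 2.0 * a,
                                resale=lambda a: 60.0 * (0.8 ** a))
```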