Search results: 258 records.
91.
We consider a generalization of the well‐known generalized assignment problem (GAP) over discrete time periods encompassed within a finite planning horizon. The resulting model, MultiGAP, addresses the assignment of tasks to agents within each time period, with the attendant single‐period assignment costs and agent‐capacity constraint requirements, in conjunction with transition costs arising between any two consecutive periods in which a task is reassigned to a different agent. As is the case for its single‐period antecedent, MultiGAP offers a robust tool for modeling a wide range of capacity planning problems occurring within supply chain management. We provide two formulations for MultiGAP and establish that the second (alternative) formulation provides a tighter bound. We define a Lagrangian relaxation‐based heuristic as well as a branch‐and‐bound algorithm for MultiGAP. Computational experience with the heuristic and branch‐and‐bound algorithm on over 2500 test problems is reported. The Lagrangian heuristic consistently generates high‐quality and in many cases near‐optimal solutions. The branch‐and‐bound algorithm is also seen to constitute an effective means for solving MultiGAP problems of reasonable size to optimality. © 2012 Wiley Periodicals, Inc. Naval Research Logistics, 2012
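To make the MultiGAP objective concrete, the following sketch brute-forces a tiny instance: per-period assignment costs plus a transition cost each time a task changes agents between consecutive periods. This is only an illustration of the cost structure, not one of the paper's formulations or algorithms; the function name and all parameters are hypothetical.

```python
from itertools import product

def multigap_brute_force(assign_cost, capacity, demand, trans_cost):
    """assign_cost[t][task][agent] = period-t assignment cost;
    capacity[agent] and demand[task] give the per-period capacity check;
    trans_cost = cost per task reassigned between consecutive periods."""
    T = len(assign_cost)
    n_tasks = len(assign_cost[0])
    n_agents = len(capacity)
    # enumerate the capacity-feasible single-period assignments
    feasible = [a for a in product(range(n_agents), repeat=n_tasks)
                if all(sum(demand[j] for j in range(n_tasks) if a[j] == i)
                       <= capacity[i] for i in range(n_agents))]
    best_cost, best_plan = float("inf"), None
    for plan in product(feasible, repeat=T):
        cost = sum(assign_cost[t][j][plan[t][j]]
                   for t in range(T) for j in range(n_tasks))
        # transition costs between consecutive periods
        cost += trans_cost * sum(plan[t][j] != plan[t - 1][j]
                                 for t in range(1, T) for j in range(n_tasks))
        if cost < best_cost:
            best_cost, best_plan = cost, plan
    return best_cost, best_plan
```

With a high transition cost, the optimum keeps each task on the same agent across periods even when the per-period costs favor switching; this is the tension the Lagrangian heuristic and branch‐and‐bound methods exploit at realistic scale.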
92.
Alan Washburn, Naval Research Logistics, 2006, 53(4): 354-362
The Jelinski–Moranda model of software reliability is generalized by introducing a negative‐binomial prior distribution for the number of faults remaining, together with a Gamma distribution for the rate at which each fault is exposed. This model is well suited to sequential use, where a sequence of reliability forecasts is made in the process of testing or using the software. We also investigate replacing the Gamma distribution with a worst‐case assumption about failure rates (the worst‐case failure rate in models such as this is not infinite, since faults with large failure rates are immediately discovered and removed). © 2006 Wiley Periodicals, Inc. Naval Research Logistics, 2006
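One convenient consequence of the mixture structure described above is a closed form for the expected number of faults exposed by time t: with N ~ NegBin(r, p) faults remaining and each fault's rate R ~ Gamma(shape, scale), the Gamma mixture of exponential discovery times gives P(found by t) = 1 − (1 + scale·t)^(−shape). The sketch below is an illustrative computation under these standard distributional facts, not code from the paper; all parameter values are hypothetical.

```python
def expected_exposed(t, r_nb, p_nb, shape, scale):
    """Expected number of faults exposed by time t when the fault count
    is NegBin(r_nb, p_nb) (mean r_nb*(1-p_nb)/p_nb) and each fault's
    exposure rate is Gamma(shape, scale).  Mixing the exponential
    discovery time over the Gamma rate yields a Lomax (Pareto-II) tail:
    P(found by t) = 1 - (1 + scale*t)**(-shape)."""
    mean_faults = r_nb * (1 - p_nb) / p_nb
    p_found = 1 - (1 + scale * t) ** (-shape)
    return mean_faults * p_found
```

As t grows, the expectation approaches the prior mean number of residual faults, which is the quantity a sequential forecaster updates as failures are observed.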
93.
We deal with the problem of minimizing makespan on a single batch processing machine. In this problem, each job has both processing time and size (capacity requirement). The batch processing machine can process a number of jobs simultaneously as long as the total size of these jobs being processed does not exceed the machine capacity. The processing time of a batch is just the processing time of the longest job in the batch. An approximation algorithm with worst‐case ratio 3/2 is given for the version where the processing times of large jobs (with sizes greater than 1/2) are not less than those of small jobs (with sizes not greater than 1/2). This result is the best possible unless P = NP. For the general case, we propose an approximation algorithm with worst‐case ratio 7/4. A number of heuristics by Uzsoy are also analyzed and compared. © 2001 John Wiley & Sons, Inc. Naval Research Logistics 48: 226–240, 2001
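The batching structure above can be illustrated with a simple first-fit-decreasing style heuristic (shown here only as a baseline sketch, not one of the paper's 3/2- or 7/4-approximation algorithms): sort jobs by processing time, first-fit them into capacity-feasible batches, and sum each batch's longest processing time.

```python
def batch_makespan(jobs, capacity):
    """jobs: list of (processing_time, size) pairs.
    Greedily packs jobs (longest processing time first) into batches
    whose total size fits the machine capacity; on a single batch
    machine the makespan is the sum of each batch's longest job."""
    batches = []  # each entry: [remaining_capacity, [jobs in batch]]
    for p, s in sorted(jobs, reverse=True):  # longest job first
        for b in batches:
            if s <= b[0]:          # first batch with room
                b[0] -= s
                b[1].append((p, s))
                break
        else:
            batches.append([capacity - s, [(p, s)]])
    makespan = sum(max(p for p, _ in b[1]) for b in batches)
    return makespan, [b[1] for b in batches]
```

Grouping long jobs together is what keeps a batch's cost from being dominated by one long job packed among short ones, which is the intuition behind separating "large" and "small" jobs in the analysis.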
94.
Ebru K. Bish, Thin‐Yin Leong, Chung‐Lun Li, Jonathan W. C. Ng, David Simchi‐Levi, Naval Research Logistics, 2001, 48(5): 363-385
We consider a container terminal discharging containers from a ship and locating them in the terminal yard. Each container has a number of potential locations in the yard where it can be stored. Containers are moved from the ship to the yard using a fleet of vehicles, each of which can carry one container at a time. The problem is to assign each container to a yard location and dispatch vehicles to the containers so as to minimize the time it takes to download all the containers from the ship. We show that the problem is NP‐hard and develop a heuristic algorithm based on formulating the problem as an assignment problem. The effectiveness of the heuristic is analyzed from both worst‐case and computational points of view. © 2001 John Wiley & Sons, Inc. Naval Research Logistics 48: 363–385, 2001
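The assignment-problem view above can be sketched on a toy instance: each container has a subset of feasible yard locations with associated handling times, and we pick one location per container to minimize the total. The brute-force code below is purely illustrative (the paper's heuristic solves a real assignment formulation, not an enumeration); names and the cost matrix are hypothetical.

```python
from itertools import permutations

def min_time_assignment(time):
    """time[c][l] = handling time of container c at location l,
    or None if location l is infeasible for container c.
    Brute-forces the one-to-one container-to-location assignment
    minimizing total time (tiny instances only)."""
    n = len(time)
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        if any(time[c][perm[c]] is None for c in range(n)):
            continue  # some container sent to an infeasible location
        total = sum(time[c][perm[c]] for c in range(n))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm
```

For realistic instance sizes one would replace the enumeration with a polynomial assignment-problem solver (e.g. the Hungarian method), which is what makes the assignment formulation attractive as a heuristic building block.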
95.
In this paper, we present an O(nm log(U/n)) time maximum flow algorithm. If U = O(n), then this algorithm runs in O(nm) time for all values of m and n. This gives the best available running time to solve maximum flow problems satisfying U = O(n). Furthermore, for unit capacity networks the algorithm runs in O(n^(2/3)m) time. It is a two‐phase capacity scaling algorithm that is easy to implement and does not use complex data structures. © 2000 John Wiley & Sons, Inc. Naval Research Logistics 47: 511–520, 2000
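The capacity-scaling idea underlying such algorithms can be sketched compactly: only augment along paths whose residual capacity is at least a threshold Δ, halving Δ until it reaches 1. The code below is the textbook scaling algorithm, not the paper's two-phase O(nm log(U/n)) variant, and is meant only to show the mechanism.

```python
from collections import defaultdict

def max_flow(edges, s, t):
    """edges: list of (u, v, cap) with integer capacities.
    Capacity-scaling Ford-Fulkerson: augment only along paths with
    residual capacity >= delta, then halve delta."""
    cap = defaultdict(int)
    adj = defaultdict(set)
    for u, v, c in edges:
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)  # residual (reverse) arc
    delta = 1
    while delta * 2 <= max(c for _, _, c in edges):
        delta *= 2
    flow = 0
    while delta >= 1:
        while True:
            # DFS for an augmenting path using arcs with residual >= delta
            parent = {s: None}
            stack = [s]
            while stack and t not in parent:
                u = stack.pop()
                for v in adj[u]:
                    if v not in parent and cap[(u, v)] >= delta:
                        parent[v] = u
                        stack.append(v)
            if t not in parent:
                break  # no path at this scale; halve delta
            path, v = [], t
            while parent[v] is not None:
                path.append((parent[v], v))
                v = parent[v]
            aug = min(cap[e] for e in path)  # bottleneck residual
            for u, v in path:
                cap[(u, v)] -= aug
                cap[(v, u)] += aug
            flow += aug
        delta //= 2
    return flow
```

The final Δ = 1 phase is plain Ford–Fulkerson, so the result is exact; the earlier phases simply front-load the large augmentations, which is where the log(U) factor in scaling bounds comes from.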
96.
Degradation experiments are widely used to assess the reliability of highly reliable products which are not likely to fail under the traditional life tests. In order to conduct a degradation experiment efficiently, several factors, such as the inspection frequency, the sample size, and the termination time, need to be considered carefully. These factors not only affect the experimental cost, but also affect the precision of the estimate of a product's lifetime. In this paper, we deal with the optimal design of a degradation experiment. Under the constraint that the total experimental cost does not exceed a predetermined budget, the optimal decision variables are solved by minimizing the variance of the estimated 100pth percentile of the lifetime distribution of the product. An example is provided to illustrate the proposed method. Finally, a simulation study is conducted to investigate the robustness of this proposed method. © 1999 John Wiley & Sons, Inc. Naval Research Logistics 46: 689–706, 1999
97.
Alan R. Washburn, Naval Research Logistics, 1998, 45(3): 243-257
The problem of searching for randomly moving targets such as children and submarines is known to be fundamentally difficult, but finding efficient methods for generating optimal or near‐optimal solutions is nonetheless an important practical problem. This paper investigates the efficiency of Branch and Bound methods, with emphasis on the tradeoff between the accuracy of the bound employed and the time required to compute it. A variety of bounds are investigated, some of which are new. In most cases the best bounds turn out to be imprecise, but very easy to compute. © 1998 John Wiley & Sons, Inc. Naval Research Logistics 45: 243–257, 1998
98.
This paper introduces a general or “distribution‐free” model to analyze the lifetime of components under accelerated life testing. Unlike the accelerated failure time (AFT) models, the proposed model shares the advantage of being “distribution‐free” with the proportional hazard (PH) model and overcomes the deficiency of the PH model not allowing survival curves corresponding to different values of a covariate to cross. In this research, we extend and modify the extended hazard regression (EHR) model using the partial likelihood function to analyze failure data with time‐dependent covariates. The new model can be easily adopted to create an accelerated life testing model with different types of stress loading. For example, stress loading in accelerated life testing can be a step function, cyclic, or linear function with time. These types of stress loadings reduce the testing time and increase the number of failures of components under test. The proposed EHR model with time‐dependent covariates, which incorporates multiple stress loadings, requires further verification. Therefore, we conduct an accelerated life test in the laboratory by subjecting components to time‐dependent stresses, and we compare the reliability estimation based on the developed model with that obtained from experimental results. The combination of the theoretical development of the accelerated life testing model verified by laboratory experiments offers a unique perspective to reliability model building and verification. © 1999 John Wiley & Sons, Inc. Naval Research Logistics 46: 303–321, 1999
99.
In this paper we consider an inventory model in which the retailer does not know the exact distribution of demand and thus must use some observed demand data to forecast demand. We present an extension of the basic newsvendor model that allows us to quantify the value of the observed demand data and the impact of suboptimal forecasting on the expected costs at the retailer. We demonstrate the approach through an example in which the retailer employs a commonly used forecasting technique, exponential smoothing. The model is also used to quantify the value of information and information sharing for a decoupled supply chain in which both the retailer and the manufacturer must forecast demand. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 388–411, 2003
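The forecast-then-order pipeline described above can be sketched in a few lines: smooth the demand history exponentially, then set the order quantity at the critical fractile of a demand model centered on the forecast. This is a generic illustration under an assumed normal demand model, not the paper's extended newsvendor analysis; parameter names are hypothetical.

```python
from statistics import NormalDist

def smoothed_newsvendor(demands, alpha, sigma, underage, overage):
    """Simple exponential smoothing of the demand history, then the
    newsvendor critical-fractile order quantity assuming demand is
    Normal(forecast, sigma).  underage/overage are the per-unit costs
    of ordering too little / too much."""
    level = demands[0]
    for d in demands[1:]:
        level = alpha * d + (1 - alpha) * level  # exponential smoothing
    critical_fractile = underage / (underage + overage)
    order_qty = NormalDist(mu=level, sigma=sigma).inv_cdf(critical_fractile)
    return level, order_qty
```

When underage and overage costs are equal the order equals the forecast; a higher underage cost pushes the order above it. Quantifying the cost of using such a plug-in forecast in place of the true demand distribution is exactly what the paper's model measures.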
100.
R. Alan Bowman, Naval Research Logistics, 2003, 50(5): 481-497
A key problem in project management is to decide which activities are the most important to manage and how best to manage them. A considerable amount of literature has been devoted to assigning “importance” measures to activities to help with this important task. When activity times are modeled as random variables, these activity importance measures are more complex and difficult to use. A key problem with all existing measures is that they summarize the importance in a single number. The result is that it is difficult for managers to determine a range of times for an activity that might be acceptable or unacceptable. In this paper, we develop sensitivity curves that display the most useful measures of project performance (in terms of schedule) as a function of an activity's time. The structure of the networks allows us to efficiently estimate these curves for all desired activities, all desired time ranges, and all desired measures in a single set of simulation runs. The resulting curves provide insights that are not available when considering summarized measures alone. Chief among these insights is the ability to identify an acceptable range of times for an activity that will not lead to negative scheduling consequences. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 481–497, 2003
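The single-set-of-runs idea above can be sketched with common random numbers: simulate the durations of all *other* activities once, then reuse those draws to trace expected completion time across a whole range of one activity's durations. The toy 4-activity network and uniform distributions below are hypothetical, chosen only to illustrate the mechanism.

```python
import random

def sensitivity_curve(b_times, n_runs=2000, seed=7):
    """Toy project network: A -> B -> D and A -> C -> D, so the
    completion time is tA + max(tB, tC) + tD.  One set of simulated
    (tA, tC, tD) draws (common random numbers) is reused for every
    candidate duration of activity B, yielding the whole sensitivity
    curve [(b, E[completion time]), ...] from a single run set."""
    rng = random.Random(seed)
    runs = [(rng.uniform(1, 3), rng.uniform(2, 6), rng.uniform(1, 2))
            for _ in range(n_runs)]  # (tA, tC, tD) per replication
    curve = []
    for b in b_times:
        mean = sum(ta + max(b, tc) + td for ta, tc, td in runs) / n_runs
        curve.append((b, mean))
    return curve
```

The flat left-hand portion of such a curve (where B is not critical) is precisely the "acceptable range of times" the paper's curves are designed to reveal, something a single-number importance measure cannot show.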