511.
In this article we present a stochastic model for determining inventory rotation policies for a retail firm which must stock many hundreds of distinctive items having uncertain heterogeneous sales patterns. The model develops explicit decision rules for determining (1) the length of time that an item should remain in inventory before the decision is made on whether or not to rotate the item out of inventory and (2) the minimum sales level necessary for retaining the item in inventory. Two inventory rotation policies are developed, the first of which maximizes cumulative expected sales over a finite planning horizon and the second of which maximizes cumulative expected profit. We also consider the statistical behavior of items having uncertain, discrete, and heterogeneous sales patterns using a two-period prediction methodology where period 1 is used to accumulate information on individual sales rates and this knowledge is then used, in a Bayesian context, to make sales predictions for period 2. This methodology assumes that over an arbitrary time interval sales for each item are Poisson with unknown but stationary mean sales rates and the mean sales rates are distributed gamma across all items. We also report the application of the model to a retail firm which stocks many hundreds of distinctive unframed poster art titles. The application provides some useful insights into the behavior of the model as well as some interesting aspects pertaining to the implementation of the results in a “real-world” situation.
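The Poisson-gamma conjugacy behind the two-period methodology can be sketched as follows; the prior parameters, period lengths, and sales figures here are illustrative, not taken from the article:

```python
def posterior_params(alpha, beta, sales_p1, t1):
    """Gamma(alpha, beta) prior on the Poisson sales rate (beta = rate
    parameter); observing sales_p1 units over t1 time units in period 1
    yields a Gamma(alpha + sales_p1, beta + t1) posterior by conjugacy."""
    return alpha + sales_p1, beta + t1

def predicted_sales(alpha, beta, sales_p1, t1, t2):
    """Posterior-predictive mean number of sales over period 2 of length t2."""
    a, b = posterior_params(alpha, beta, sales_p1, t1)
    return t2 * a / b

# An item with a Gamma(2, 4) prior (mean rate 0.5 per week) that sold
# 6 units during an 8-week period 1:
print(predicted_sales(2, 4, 6, 8, t2=4))  # posterior mean rate 8/12, so 4 * 8/12 ≈ 2.67
```

A retention rule of the kind the article describes would then compare this predicted rate against a minimum sales threshold to decide whether the item stays in inventory.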
512.
We introduce an algorithm, called TMO (Two-Machine Optimal Scheduling), which minimizes the makespan for two identical processors. TMO employs lexicographic search in conjunction with the longest-processing-time sequence to derive an optimal schedule. For the m identical parallel processors problem, we propose an improvement algorithm, which improves the seed solution obtained by any existing heuristic. The improvement algorithm, called Extended TMO, breaks the original m-machine problem into a set of two-machine problems and solves them repeatedly by the TMO. A simulation study is performed to evaluate the effectiveness of the proposed algorithms by comparing them against three existing heuristics: LPT (Graham [11]), MULTIFIT (Coffman, Garey, and Johnson [6]), and RMG (Lee and Massey [17]). The simulation results show that: for the two processors case, the TMO performs significantly better than LPT, MULTIFIT, and RMG, and it generally takes considerably less CPU time than MULTIFIT and RMG. For the general parallel processors case, the Extended TMO algorithm is shown to be capable of greatly improving any seed solution. © 1995 John Wiley & Sons, Inc.
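The abstract does not specify TMO itself, but the LPT baseline it is compared against (Graham [11]) is a standard heuristic and can be sketched; the job list below is illustrative:

```python
def lpt_makespan(jobs, m=2):
    """Longest-Processing-Time heuristic: sort jobs in decreasing order of
    processing time and assign each to the currently least-loaded of the
    m identical machines; return the resulting makespan."""
    loads = [0] * m
    for p in sorted(jobs, reverse=True):
        i = loads.index(min(loads))  # least-loaded machine
        loads[i] += p
    return max(loads)

# LPT yields makespan 12 here, while the optimum is 11 (7+4 vs 5+3+3) --
# exactly the kind of seed solution an improvement step could repair.
print(lpt_makespan([7, 5, 4, 3, 3], m=2))
```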
513.
This article defines and develops a simulation optimization system based upon response surface classification and the integration of multiple search strategies. Response surfaces are classified according to characteristics that indicate which search technique will be most successful. Typical surface characteristics include statistical measures and topological features, while search techniques encompass response surface methodology, simulated annealing, random search, etc. The classify-then-search process flow and a knowledge-based architecture are developed and then demonstrated with a detailed computer example. The system is useful not only as an approach to optimizing simulations, but also as a means for integrating search techniques and thereby providing the user with the most promising path toward an optimal solution. © 1995 John Wiley & Sons, Inc.
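As a rough illustration of the classify-then-search flow (not the article's knowledge-based system), one might classify a surface by replicated evaluations at a point and dispatch to a search strategy accordingly; the noise threshold, the two strategies, and the search bounds are all assumptions:

```python
import random

def classify_surface(f, x0, eps=1e-3, samples=20):
    """Crude classification: replicate evaluations at one point to estimate
    response noise; a noisy surface suggests a robust global search, a
    near-deterministic one suggests local improvement."""
    ys = [f(x0) for _ in range(samples)]
    mean = sum(ys) / samples
    var = sum((y - mean) ** 2 for y in ys) / (samples - 1)
    return "noisy" if var > eps else "smooth"

def optimize(f, x0, steps=200):
    """Classify first, then minimize f with the matching search strategy."""
    kind = classify_surface(f, x0)
    best_x, best_y = x0, f(x0)
    for _ in range(steps):
        # pure random search for noisy surfaces, local Gaussian moves otherwise
        cand = random.uniform(-10, 10) if kind == "noisy" else best_x + random.gauss(0, 0.5)
        y = f(cand)
        if y < best_y:
            best_x, best_y = cand, y
    return best_x, best_y
```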
514.
In this note we describe a local-search heuristic (LSH) for large non-unicost set-covering problems (SCPs). The new heuristic is based on the simulated annealing algorithm and uses an improvement routine designed to provide low-cost solutions within a reasonable amount of CPU time. The solution costs associated with the LSH compared very favorably to the best previously published solution costs for 20 large SCPs taken from the literature. In particular, the LSH yielded new benchmark solutions for 17 of the 20 test problems. We also report that, for SCPs where column cost is correlated with column coverage, the new heuristic provides solution costs competitive with previously published results for comparable problems. © 1995 John Wiley & Sons, Inc.
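A minimal simulated-annealing sketch for non-unicost set covering, in the spirit of (but far simpler than) the LSH described above; the penalty scheme, cooling schedule, and instance format are assumptions rather than the paper's design:

```python
import math
import random

def sa_set_cover(rows, cols, costs, iters=5000, t0=5.0, cooling=0.999, seed=0):
    """Simulated annealing for set covering. rows: iterable of row ids;
    cols: dict col -> set of rows it covers; costs: dict col -> cost.
    Infeasible states are handled with a heavy penalty per uncovered row."""
    rng = random.Random(seed)
    names = list(cols)
    state = {c: True for c in names}  # start from the trivial all-columns cover
    penalty = sum(costs.values()) + 1.0

    def cost(s):
        covered = set().union(*[cols[c] for c in names if s[c]])
        uncovered = len(set(rows) - covered)
        return sum(costs[c] for c in names if s[c]) + penalty * uncovered

    cur = cost(state)
    best, best_cost = dict(state), cur
    t = t0
    for _ in range(iters):
        c = rng.choice(names)
        state[c] = not state[c]  # neighbor move: flip one column in or out
        new = cost(state)
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new
            if cur < best_cost:
                best, best_cost = dict(state), cur
        else:
            state[c] = not state[c]  # reject: undo the flip
        t *= cooling
    return [c for c in names if best[c]], best_cost
```

Since any feasible cover costs less than the penalty, the best state returned is guaranteed feasible whenever a feasible start exists.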
515.
This article presents the application of a simulated annealing heuristic to an NP-complete cyclic staff-scheduling problem. The new heuristic is compared to branch-and-bound integer programming algorithms, as well as construction and linear programming-based heuristics. It is designed for use in a continuously operating scheduling environment with the objective of minimizing the number of employees necessary to satisfy forecast demand. The results indicate that the simulated annealing-based method tends to dominate the branch-and-bound algorithms and the other heuristics in terms of solution quality. Moreover, the annealing algorithm exhibited rapid convergence to a low-cost solution. The simulated annealing heuristic is executed in a single program and does not require mathematical programming software. © 1993 John Wiley & Sons, Inc.
516.
Peter C. Fishburn, Naval Research Logistics, 1992, 39(6): 741–755
Necessary and sufficient conditions are specified for a general theory of additive measurement that presumes very little set-theoretic structure. The theory is illustrated for numerical representations in extensive, conjoint, difference, threshold, expected utility, probability, ambiguity, and subset measurement. © 1992 John Wiley & Sons, Inc.
517.
In a recent paper, Teng, Chern, and Yang consider four possible inventory replenishment models and determine the optimal replenishment policies for them. They compare these models to identify the best alternative on the basis of minimum total relevant inventory costs. The total cost functions for Model 1 and Model 4 as derived by them are not exact for the comparison. As a result, their conclusion on the least expensive replenishment policy is incorrect. The present article provides the actual total costs for Model 1 and Model 4 to make a correct comparison of the four models. © 2000 John Wiley & Sons, Inc. Naval Research Logistics 47: 602–606, 2000
518.
A general model for the failure of fibrous composite materials is described. It is shown to contain some of the well-known models in the literature. The composite material is viewed as a coherent system of independent identically distributed component strengths. Under the assumption that the applied load is redistributed “homotonically” to the unfailed components upon the failure of a component (an individual fiber segment) and that the distributions of component strengths are IFRA, it is shown that the system (composite) strength distribution is also IFRA. Examples are given using carbon-reinforced composite data to illustrate the IFRA property.
519.
Thomas W. Lucas, Naval Research Logistics, 2003, 50(4): 306–321
There are multiple damage functions in the literature to estimate the probability that a single weapon detonation destroys a point target. This paper addresses differences in the tails of four of the more popular damage functions. These four cover the asymptotic tail behaviors of all monotonically decreasing damage functions with well-behaved hazard functions. The differences in estimates of probability of kill are quite dramatic for large aim-point offsets. This is particularly important when balancing the number of threats that can be engaged with the chances of fratricide and collateral damage. In general, analysts substituting one damage function for another may badly estimate kill probabilities in offset-aiming, which could result in poor doctrine. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 306–321, 2003.
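Two commonly used monotone damage functions, a Gaussian (Carleton-type) and an exponential, illustrate how sharply tails can diverge at large aim-point offsets; the scale parameters below are illustrative and not taken from the paper:

```python
import math

def carleton(r, sigma=10.0):
    """Carleton (Gaussian) damage function: Pk decays like exp(-r^2),
    a very light tail at large miss distances r."""
    return math.exp(-r * r / (2 * sigma * sigma))

def exponential(r, lam=10.0):
    """Exponential damage function: Pk decays like exp(-r),
    a much heavier tail than the Gaussian form."""
    return math.exp(-r / lam)

# Both are matched to Pk = 1 at r = 0, yet at a 60-unit offset the
# exponential form predicts a kill probability orders of magnitude larger.
for r in (10, 30, 60):
    print(r, carleton(r), exponential(r))
```

This is the practical point of the abstract: at large offsets the choice of functional form, not just its fitted scale, dominates the estimated collateral-damage risk.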
520.