161.
Roger C. Vergin, Naval Research Logistics Quarterly, 1968, 15(4): 523-534
Most maintenance and replacement models for industrial equipment have been developed for independent single-component machines. Most equipment, however, consists of multiple components. Also, when the maintenance crew services several machines, the maintenance policy for each machine is not independent of the states of the other machines. In this paper, two dynamic programming replacement models are presented. The first is used to determine the optimal replacement policy for multi-component equipment. The second is used to determine the optimal replacement policy for a multi-machine system which uses one replacement crew to service several machines. In addition, an approach is suggested for developing an efficient replacement policy for a multi-component, multi-machine system.
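A minimal sketch of the kind of dynamic program the abstract describes, reduced to a single-component machine over a finite horizon (the paper's multi-component and multi-machine models are richer); the cost figures and function name are hypothetical illustration choices, not the paper's formulation.

```python
# Finite-horizon DP: each period, "keep" the machine (pay an age-dependent
# operating cost) or "replace" it (pay the purchase price plus the new
# machine's operating cost). Costs here are hypothetical.

def replacement_dp(horizon, max_age, op_cost, replace_cost):
    """op_cost[a] is the operating cost at age a; replace_cost buys a new machine."""
    # V[t][a] = minimal total cost from period t onward with a machine of age a
    V = [[0.0] * (max_age + 1) for _ in range(horizon + 1)]
    policy = [[None] * (max_age + 1) for _ in range(horizon)]
    for t in range(horizon - 1, -1, -1):
        for a in range(max_age + 1):
            keep = op_cost[a] + V[t + 1][a + 1] if a < max_age else float("inf")
            rep = replace_cost + op_cost[0] + V[t + 1][1]
            if keep <= rep:
                V[t][a], policy[t][a] = keep, "keep"
            else:
                V[t][a], policy[t][a] = rep, "replace"
    return V, policy
```

With operating costs that grow with age, the backward recursion yields a threshold-type policy: keep young machines, replace old ones.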
162.
163.
164.
In this journal in 1967, Szwarc presented an algorithm for the optimal routing of a common vehicle fleet between m sources and n sinks with p different types of commodities. The main premise of the formulation is that a truck may carry only one commodity at a time and must deliver the entire load to one demand area. This eliminates the problem of routing vehicles between sources or between sinks and limits the problem to the routing of loaded trucks between sources and sinks and empty trucks making the return trip. Szwarc considered only the transportation aspect of the problem (i.e., no intermediate points) and presented a very efficient algorithm for solution of the case he described. If the total supply is greater than the total demand, Szwarc shows that the problem is equivalent to a (mp + n) by (np + m) Hitchcock transportation problem. Digital computer codes for this algorithm require rapid access storage for a matrix of size (mp + n) by (np + m); therefore, computer storage required grows proportionally to p². This paper offers an extension of his work to a more general form: a transshipment network with capacity constraints on all arcs and facilities. The problem is shown to be solvable directly by Fulkerson's out-of-kilter algorithm. Digital computer codes for this formulation require rapid access storage proportional to p instead of p². Computational results indicate that, in addition to handling the extensions, the out-of-kilter algorithm is more efficient in the solution of the original problem when there is a moderate number of commodities and a computer of limited storage capacity.
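The out-of-kilter algorithm itself is involved; as a stand-in, here is a minimal successive-shortest-path min-cost-flow routine that solves the same class of capacitated transshipment instance. The routine and the three-node source/depot/sink network used to exercise it are illustrative assumptions, not the paper's implementation.

```python
# Successive shortest paths: repeatedly push flow along the cheapest
# augmenting path in the residual network until the demand is met.

def min_cost_flow(n, edges, s, t, demand):
    """edges: list of (u, v, capacity, unit_cost); returns (flow, total_cost)."""
    graph = [[] for _ in range(n)]
    def add(u, v, cap, cost):
        graph[u].append([v, cap, cost, len(graph[v])])
        graph[v].append([u, 0, -cost, len(graph[u]) - 1])  # residual arc
    for u, v, cap, cost in edges:
        add(u, v, cap, cost)
    flow = total_cost = 0
    while flow < demand:
        # Bellman-Ford shortest path in the residual network (costs may be negative)
        dist = [float("inf")] * n
        dist[s] = 0
        parent = [None] * n  # (predecessor node, edge index)
        for _ in range(n - 1):
            for u in range(n):
                if dist[u] == float("inf"):
                    continue
                for i, (v, cap, cost, _) in enumerate(graph[u]):
                    if cap > 0 and dist[u] + cost < dist[v]:
                        dist[v] = dist[u] + cost
                        parent[v] = (u, i)
        if dist[t] == float("inf"):
            break  # network cannot carry more flow
        push, v = demand - flow, t
        while v != s:                       # bottleneck capacity on the path
            u, i = parent[v]
            push = min(push, graph[u][i][1])
            v = u
        v = t
        while v != s:                       # augment along the path
            u, i = parent[v]
            graph[u][i][1] -= push
            graph[v][graph[u][i][3]][1] += push
            v = u
        flow += push
        total_cost += push * dist[t]
    return flow, total_cost
```

On a hypothetical network with a source, one transshipment depot, and a sink, the routine routes the cheap depot path to capacity before falling back to the costlier direct arc.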
165.
This paper analyzes the problem faced by a field commander who, confronted by an enemy on N battlefields, must determine an interdiction policy for the enemy's logistics system which minimizes the amount of war material flowing through this system per unit time. The resource utilized to achieve this interdiction is subject to constraint. It can be shown that this problem is equivalent to determining the set of arcs Z* to remove subject to constraint from a directed graph G such that the resulting maximal flow is minimized. A branch and bound algorithm for the solution to this problem is described, and a numerical example is provided.
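A brute-force sketch of the interdiction problem the abstract poses (enumeration rather than the paper's branch and bound, which prunes this search): remove up to k arcs from a directed graph so that the remaining maximum flow is minimized. The network data and function names are hypothetical.

```python
from itertools import combinations

def max_flow(n, arcs, s, t):
    """Edmonds-Karp maximum flow; arcs is a list of (u, v, capacity)."""
    cap = {}
    adj = [set() for _ in range(n)]
    for u, v, c in arcs:
        cap[(u, v)] = cap.get((u, v), 0) + c
        cap.setdefault((v, u), 0)          # residual capacity slot
        adj[u].add(v)
        adj[v].add(u)
    flow = 0
    while True:
        parent = {s: None}                 # BFS for an augmenting path
        queue = [s]
        while queue and t not in parent:
            u = queue.pop(0)
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        push, v = float("inf"), t
        while parent[v] is not None:       # bottleneck on the path
            push = min(push, cap[(parent[v], v)])
            v = parent[v]
        v = t
        while parent[v] is not None:       # augment
            cap[(parent[v], v)] -= push
            cap[(v, parent[v])] += push
            v = parent[v]
        flow += push

def best_interdiction(n, arcs, s, t, k):
    """Try every removal of at most k arcs; return (min max-flow, removed arcs)."""
    best = (max_flow(n, arcs, s, t), ())
    for r in range(1, k + 1):
        for removed in combinations(range(len(arcs)), r):
            kept = [a for i, a in enumerate(arcs) if i not in removed]
            residual = max_flow(n, kept, s, t)
            if residual < best[0]:
                best = (residual, tuple(arcs[i] for i in removed))
    return best
```

Branch and bound replaces the inner enumeration with pruning: a partial removal set whose best achievable flow reduction cannot beat the incumbent is discarded without expansion.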
166.
This paper discusses the operations analysis in the underwater search for the remains of the submarine Scorpion. The a priori target location probability distribution for the search was obtained by Monte Carlo procedures based upon nine different scenarios concerning the Scorpion loss and associated credibility weights. These scenarios and weights were postulated by others. Scorpion was found within 260 yards of the search grid cell having the largest a priori probability. Frequent computations of local effectiveness probabilities (LEPs) were carried out on scene during the search and were used to determine an updated (a posteriori) target location distribution. This distribution formed the basis for recommendation of the current high probability areas for search. The sum of LEPs weighted by the a priori target location probabilities is called search effectiveness probability (SEP) and was used as the overall measure of effectiveness for the operation. SEP and LEPs were used previously in the Mediterranean H-bomb search. On-scene and stateside operations analysis are discussed and the progress of the search is indicated by values of SEP for various periods during the operation.
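The update the abstract describes can be sketched directly: with prior cell probabilities p and LEPs b for an unsuccessful sweep, SEP is the prior-weighted sum of LEPs, and the posterior distribution follows from Bayes' rule. The function name and cell values are illustrative assumptions.

```python
# Bayesian update after an unsuccessful search sweep: cells that were
# searched effectively (high LEP) lose posterior probability to cells
# that were searched poorly or not at all.

def update_after_unsuccessful_search(prior, lep):
    """Return (posterior cell probabilities, search effectiveness probability)."""
    sep = sum(p * b for p, b in zip(prior, lep))        # overall effectiveness
    unnormalized = [p * (1 - b) for p, b in zip(prior, lep)]
    norm = sum(unnormalized)                            # equals 1 - sep
    return [q / norm for q in unnormalized], sep
```

Repeating this update after each sweep is what shifts the "current high probability areas" as the search progresses.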
167.
168.
Satya D. Dubey, Naval Research Logistics Quarterly, 1971, 18(4): 561-566
It is pointed out in this paper that Lomax's hyperbolic function is a special case of both Compound Gamma and Compound Weibull distributions, and both of these distributions provide better models for Lomax's business failure data than his hyperbolic and exponential functions. Since his exponential function fails to yield a valid distribution function, a necessary condition is established to remedy this drawback. In the light of this result, his exponential function is modified in several ways. It is further shown that a natural complement of Lomax's exponential function does not suffer from this drawback. 
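The compounding relationship can be checked numerically: mixing an exponential rate over a Gamma distribution yields the Lomax (Pareto II) survival function S(t) = (1 + t/β)^(−α). This Monte Carlo sketch is an illustrative verification with hypothetical parameter values, not the paper's derivation.

```python
import math
import random

# If the exponential rate lambda follows Gamma(alpha, scale = 1/beta), then
# S(t) = E[exp(-lambda * t)] = (1 + t/beta)^(-alpha), the Lomax survival.

def lomax_survival(t, alpha, beta):
    return (1.0 + t / beta) ** (-alpha)

def compound_survival(t, alpha, beta, n=200_000, seed=1):
    """Monte Carlo estimate of P(X > t) for the gamma-mixed exponential."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        lam = rng.gammavariate(alpha, 1.0 / beta)  # mixing distribution
        if rng.expovariate(lam) > t:
            hits += 1
    return hits / n
```

Note the parameterization assumption: Python's `gammavariate(alpha, beta)` treats its second argument as a scale, so `1.0 / beta` gives the rate-β mixing distribution used in the closed form.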
169.
Several approximate procedures are available in the literature for obtaining confidence intervals for the parameter λ of an exponential distribution based on time-truncated samples. This paper contains the results of an empirical study comparing three of these procedures. 
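One such approximate procedure (offered here as a sketch, not necessarily one of the three the paper compares) is the chi-square interval for the exponential rate from a time-truncated test with r observed failures in total accumulated test time T. To keep the example standard-library only, chi-square quantiles come from the Wilson-Hilferty approximation.

```python
from statistics import NormalDist

# Wilson-Hilferty approximation to the chi-square quantile: accurate to a
# few parts in a thousand for moderate degrees of freedom.
def chi2_quantile(p, df):
    z = NormalDist().inv_cdf(p)
    return df * (1.0 - 2.0 / (9.0 * df) + z * (2.0 / (9.0 * df)) ** 0.5) ** 3

def exponential_rate_ci(r, total_time, conf=0.90):
    """Approximate CI for lambda: r failures observed in total test time T."""
    a = (1.0 - conf) / 2.0
    lower = chi2_quantile(a, 2 * r) / (2.0 * total_time)
    upper = chi2_quantile(1.0 - a, 2 * r + 2) / (2.0 * total_time)
    return lower, upper
```

The point estimate r/T always falls inside the interval, which widens as r shrinks.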
170.
The present study is concerned with the determination of a few observations from a sufficiently large complete or censored sample from the extreme value distribution with location and scale parameters μ and σ, respectively, such that the asymptotically best linear unbiased estimators (ABLUE) of the parameters in Ref. [24] yield high efficiencies among other choices of the same number of observations. (All efficiencies considered are relative to the Cramér-Rao lower bounds for regular unbiased estimators.) The study is based on asymptotic theory and a Type II censoring scheme. For the estimation of μ when σ is known, it has been proved that there exists a unique optimum spacing whether the sample is complete, right censored, left censored, or doubly censored. Several tables are prepared to aid in the numerical computation of the estimates as well as to furnish their efficiencies. For the estimation of σ when μ is known, it has been observed that there does not exist a unique optimum spacing. Accordingly we have obtained a spacing based on a complete sample which yields high efficiency. A similar table is prepared. When both μ and σ are unknown, we have considered four different spacings based on a complete sample and chosen the one yielding highest efficiency. A table of the efficiencies is also prepared. Finally we apply the above results for the estimation of the scale and/or shape parameters of the Weibull distribution. 
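The idea of estimating (μ, σ) from a small number of selected order statistics can be illustrated with a two-quantile sketch: for the extreme value (Gumbel) distribution, F(x) = exp(−exp(−(x − μ)/σ)), so the p-th quantile is x_p = μ − σ ln(−ln p) and two sample quantiles determine both parameters. The spacing (0.2, 0.8) below is an arbitrary illustrative choice, not the optimal spacing derived in the paper, and the estimator is a plain quantile match rather than the ABLUE.

```python
import math
import random

# Solve x_p1 = mu + sigma*a1 and x_p2 = mu + sigma*a2 for (mu, sigma),
# where a_p = -ln(-ln p) is the standard Gumbel quantile.

def gumbel_quantile_fit(sample, p1=0.2, p2=0.8):
    xs = sorted(sample)
    n = len(xs)
    x1, x2 = xs[int(p1 * n)], xs[int(p2 * n)]
    a1 = -math.log(-math.log(p1))
    a2 = -math.log(-math.log(p2))
    sigma = (x2 - x1) / (a2 - a1)
    mu = x1 - sigma * a1
    return mu, sigma
```

Choosing which quantiles to read off (the "spacing") is exactly the optimization the paper carries out; a poor spacing gives an unbiased but inefficient estimator.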