301.
This paper addresses the problem of finding a feasible schedule of n jobs on m parallel machines, where each job has a deadline and some jobs are preassigned to some machine. This problem arises in the daily assignment of workload to a set of flight dispatchers, and it is strongly characterized by the fact that the job lengths may assume one out of k different values, for small k. We prove the problem to be NP-complete for k = 2 and propose an effective implicit enumeration algorithm which allows the efficient solution of a set of real-life instances. © 2000 John Wiley & Sons, Inc. Naval Research Logistics 47: 359–376, 2000
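The structure of the problem lends itself to a quick greedy feasibility check before any enumeration: process jobs in earliest-deadline-first order and load each one onto its preassigned machine, or onto the least-loaded machine if it is free to move. The sketch below is only an illustrative heuristic under assumed inputs (a list of (length, deadline, machine) triples, with machine set to None for unassigned jobs); it is not the paper's implicit enumeration algorithm, and a False answer is inconclusive.

```python
def greedy_feasible(jobs, m):
    """Heuristic feasibility check for n jobs with deadlines on m parallel machines.
    jobs: list of (length, deadline, machine) with machine=None if not preassigned.
    A True result is a certificate of feasibility (the greedy schedule meets every
    deadline); a False result is inconclusive and would require exact search."""
    load = [0] * m
    for length, deadline, machine in sorted(jobs, key=lambda j: j[1]):
        # honor preassignments; otherwise use the currently least-loaded machine
        i = machine if machine is not None else min(range(m), key=load.__getitem__)
        load[i] += length
        if load[i] > deadline:      # this job would finish after its deadline
            return False
    return True

# e.g. greedy_feasible([(3, 5, None), (2, 4, 0), (4, 9, None)], m=2)
```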
302.
The problem of developing good schedules for Navy C-Schools has been modeled as a combinatorial optimization problem. The only complicating feature of the problem is that classes must be grouped together into sequences known as pipelines. An ideal schedule will have all classes in a pipeline scheduled in consecutive weeks. The objective is to eliminate the nonproductive time spent by sailors at C-Schools who are waiting for the next class in a pipeline. In this investigation an implicit enumeration procedure for this problem was developed. The key component of our algorithm is a specialized greedy algorithm which is used to obtain a good initial incumbent. Often this initial incumbent is either an optimal schedule or a near optimal schedule. In an empirical analysis with the only other competing software system, our greedy heuristic found equivalent or better solutions in substantially less computer time. This greedy heuristic was extended and modified for the A-School scheduling problem and was found to be superior to its only competitor. © 1998 John Wiley & Sons, Inc. Naval Research Logistics 45: 533–551, 1998
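To make the pipeline constraint concrete, the sketch below shows one greedy placement rule of the kind used to build an initial incumbent: walk through a pipeline's classes in order and take, for each class, the first open section that starts once the previous class has finished. The data structures (offerings, capacity, enrolled) and the one-week class length are assumptions for illustration; this is not the authors' specialized algorithm.

```python
def place_pipeline(pipeline, offerings, capacity, enrolled):
    """Greedily place one sailor's pipeline of classes in consecutive (or
    near-consecutive) weeks.  offerings[c] is a sorted list of start weeks for
    class c, capacity[c][w] the seat limit, enrolled[c][w] the seats taken
    (all hypothetical structures).  Classes are assumed to last one week each.
    Returns the chosen weeks, or None if the pipeline cannot be placed."""
    weeks, earliest = [], 0
    for c in pipeline:
        # first open section starting no earlier than the previous class ended
        week = next((w for w in offerings[c]
                     if w >= earliest and enrolled[c][w] < capacity[c][w]), None)
        if week is None:
            return None
        enrolled[c][week] += 1
        weeks.append(week)
        earliest = week + 1          # back-to-back weeks if possible
    return weeks
```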
303.
This paper deals with the problem of makespan minimization in a flow shop with two machines when the input buffer of the second machine can only host a limited number of parts. Here we analyze the problem in the context of batch processing, i.e., when identical parts must be processed consecutively. We propose an exact branch-and-bound algorithm, in which the bounds exploit the batching nature of the problem. Extensive computational results show the effectiveness of the approach, and allow us to compare it with a previous heuristic approach. © 1998 John Wiley & Sons, Inc. Naval Research Logistics 45: 141–164, 1998
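For a fixed job sequence, the makespan under a limited buffer can be evaluated with a short simulation in which machine 1 is blocked whenever the buffer in front of machine 2 is full. The recurrence below is a generic sketch for a buffer of capacity b >= 1 (variable names are illustrative); the paper's branch-and-bound would search over sequences and batch structures on top of this kind of evaluation.

```python
def makespan(jobs, b):
    """Makespan of a fixed sequence on a two-machine flow shop whose second
    machine has an input buffer of capacity b >= 1.  jobs is a list of
    (p1, p2) processing times.  Machine 1 holds (blocks on) a finished part
    until a buffer slot frees, which happens when job i-b starts on machine 2."""
    n = len(jobs)
    finish1 = [0.0] * n   # processing on machine 1 completes
    depart1 = [0.0] * n   # part actually leaves machine 1 (may be blocked)
    start2  = [0.0] * n
    finish2 = [0.0] * n
    for i, (p1, p2) in enumerate(jobs):
        ready1 = depart1[i - 1] if i else 0.0        # M1 free once previous part left
        finish1[i] = ready1 + p1
        room = start2[i - b] if i >= b else 0.0      # slot frees when job i-b starts on M2
        depart1[i] = max(finish1[i], room)
        free2 = finish2[i - 1] if i else 0.0
        start2[i] = max(depart1[i], free2)
        finish2[i] = start2[i] + p2
    return finish2[-1]

# e.g. makespan([(3, 2), (1, 4), (2, 2)], b=1) -> 11.0
```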
304.
Under various operational conditions, in particular in operations other than war (OOTW) or peacekeeping, an intervening force, here Blue, must occasionally engage in attrition warfare with an opposing force, here Red, that is intermingled with noncombatants. Desirably, Red armed actives are targeted, and not the unarmed noncombatants. This article describes some simple Lanchesterian attrition models that reflect a certain capacity of Blue to discriminate noncombatants from armed and active Red opponents. An explicit extension of the Lanchester square law results: Blue's abstinence from the indiscriminate shooting of civilians mixed in with Red combatants is essentially reflected in a lower Blue rate of fire and a less advantageous exchange rate. The model applies to other situations involving decoys, and reflects the value of a discrimination capability. © 1997 John Wiley & Sons, Inc. Naval Research Logistics 44: 507–514, 1997
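One way to see how a discrimination capability enters the square law is sketched below: if only a fraction p of Blue's engagements are actually directed at armed, active Red combatants (fire near noncombatants being withheld), Blue's effective firing rate is scaled by p. The notation and the specific form are illustrative, not necessarily the article's.

```latex
% Aimed-fire (square-law) attrition with a Blue discrimination factor p.
% B, R  : Blue and Red force levels
% beta  : Red's attrition rate against Blue
% alpha : Blue's attrition rate against correctly identified Red combatants
% p     : fraction of Blue fire directed at armed, active Reds (0 < p <= 1)
\begin{align*}
  \frac{dB}{dt} = -\beta R, \qquad \frac{dR}{dt} = -p\,\alpha B ,
\end{align*}
% which yields the modified square-law state equation
%   p\,\alpha\,(B_0^2 - B^2) = \beta\,(R_0^2 - R^2),
% so restraint near noncombatants acts like a reduced Blue rate of fire
% p*alpha and a correspondingly less favorable exchange ratio for Blue.
```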
305.
A new piece of equipment has been purchased in a lot of size m. Some of the items can be used in destructive testing before the lot is put into use. Testing uncovers faults which can be removed from the remaining pieces of equipment in the lot. If t < m pieces of equipment are tested, then those that remain, m1 = m − t, have reduced fault incidence and are more reliable than initially, but m1 may be too small to be useful, or smaller than is desirable. In this paper models are studied to address this question: given the lot size m, how should t be chosen so as to optimize the effectiveness of the pieces of equipment remaining after the test. The models used are simplistic and illustrative; they can be straightforwardly improved. © 1997 John Wiley & Sons, Inc. Naval Research Logistics 44: 623–637, 1997
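The trade-off can be made concrete with a toy fault model, stated here purely as an assumption rather than the paper's model: each unit is faulty with probability q, and every destructive test removes a surviving fault cause with probability d, so a unit that remains after t tests works with probability 1 - q(1 - d)^t. Choosing t then amounts to maximizing the expected number of effective units, (m - t) times that probability.

```python
def best_test_count(m, q=0.3, d=0.4):
    """Pick the number of destructive tests t maximizing the expected number
    of effective units remaining, under the hypothetical fault model above:
    a surviving unit works with probability 1 - q * (1 - d) ** t."""
    def effectiveness(t):
        return (m - t) * (1.0 - q * (1.0 - d) ** t)
    return max(range(m), key=effectiveness)

# e.g. best_test_count(20) trades units consumed in testing against the
# higher reliability of the m - t units that remain.
```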
306.
This article develops optimal accelerated life test designs for Burr Type XII distributions under periodic inspection and Type I censoring. It is assumed that the mean lifetime (the Burr XII scale parameter) is a log-linear function of stress and that the shape parameters are independent of stress. For given shape parameters, design stress and high test stress, the test design is optimized with respect to the low test stress and the proportion of test units allocated to the low stress. The optimality criterion is the asymptotic variance of the maximum-likelihood estimator of log mean life at the design stress with the use of equally spaced inspection times. Computational results for various values of the shape parameters show that this criterion is insensitive to the number of inspection times and to misspecification of imputed failure probabilities at the design and high test stresses. Procedures for planning an accelerated life test, including selection of sample size, are also discussed. It is shown that optimal designs previously obtained for exponential and Weibull distributions are similar to those obtained here for the appropriate special cases of the Burr XII distribution. Thus the Burr XII distribution is a useful and widely applicable family of reliability models for ALT design. © 1996 John Wiley & Sons, Inc.
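For reference, the Burr Type XII lifetime distribution and the log-linear life-stress relation described above can be written as follows (generic symbols, not necessarily the article's notation). Setting k = 1 gives the log-logistic distribution, and the Weibull family arises as a limiting case as k grows with the scale suitably rescaled, which is what makes the exponential and Weibull designs special cases of this framework.

```latex
% Burr Type XII life distribution with scale theta and shape parameters c, k:
\begin{align*}
  F(x;\theta,c,k) &= 1 - \bigl[\,1 + (x/\theta)^{c}\,\bigr]^{-k},
                     \qquad x>0,\;\; \theta,c,k>0, \\
  \log \theta_i   &= \gamma_0 + \gamma_1 s_i ,
\end{align*}
% where s_i is the (transformed) stress at test level i, so that the scale,
% and hence the mean life, is log-linear in stress.
```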
307.
This paper examines three types of sensitivity analysis on a firm's responsive pricing and responsive production strategies under imperfect demand updating. Demand has a multiplicative form where the market size updates according to a bivariate normal model. First, we show that both responsive production and responsive pricing resemble the classical pricing newsvendor with posterior demand uncertainty in terms of the optimal performance and first-stage decision. Second, we show that the performance of responsive production is sensitive to the first-stage decision, but responsive pricing is insensitive. This suggests that a “posterior rationale” (i.e., using the optimal production decision from the classical pricing newsvendor with expected posterior uncertainty) allows a simple and near-optimal first-stage production heuristic for responsive pricing. However, responsive production obtains higher expected profits than responsive pricing under certain conditions. This implies that the firm's ability to calculate the first-stage decision correctly can help determine which responsive strategy to use. Lastly, we find that the firm's performance is not sensitive to the parameter uncertainty coming from the market size, total uncertainty level and information quality, but is sensitive to uncertainty originating from the procurement cost and price elasticity.
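One common way to write such a setting, given purely as an illustration (the functional forms below are assumptions, not taken from the paper), is multiplicative demand D(p) = X y(p) with an iso-elastic factor, where the market size X is updated from a demand signal S and (X, S) are bivariate normal, so the update is the usual normal conditional:

```latex
\begin{align*}
  D(p) &= X\, y(p), \qquad y(p) = a\,p^{-b},\; b>1, \\
  X \mid S = s \;&\sim\; \mathcal{N}\!\left( \mu_X + \rho\,\frac{\sigma_X}{\sigma_S}\,(s-\mu_S),\;
  \sigma_X^{2}\bigl(1-\rho^{2}\bigr) \right),
\end{align*}
% where rho, the correlation between signal and market size, plays the role of
% information quality: rho = 0 leaves the prior unchanged (no learning), while
% |rho| close to 1 approaches perfect demand updating.
```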
308.
The object of this article is to investigate the risk-pooling effect of depot stock in a two-echelon distribution system in which the depot serves n retailers in parallel, and to develop computationally tractable optimization procedures for such systems. The depot manager has complete information about stock levels and there are two opportunities to allocate stock to the retailers within each order cycle. We identify first- and second-order aspects of the risk-pooling effect. In particular, the second-order effect is the property that the minimum stock available to any retailer after the second allocation converges in probability to a constant as the number of retailers in the system increases, assuming independence of the demands. This property is exploited in the development of efficient procedures to determine near-optimal values of the policy parameters.
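The balancing step behind that second-order effect can be illustrated with a simple "fair share" second allocation: spread the remaining depot stock so every retailer ends up as close as possible to a common number of periods of supply. The sketch below, with hypothetical inputs (current retailer stocks and mean per-period demands), only illustrates this kind of allocation rule; it is not the near-optimal policy developed in the article.

```python
def fair_share_allocation(depot_stock, retailer_stock, mean_demand):
    """Allocate the remaining depot stock so each retailer reaches, as nearly
    as possible, a common level of z periods of supply (stock ~ z * mean demand).
    z is found by bisection on sum_i max(0, z*mu_i - x_i) = depot_stock."""
    xs, mus = list(retailer_stock), list(mean_demand)
    lo, hi = 0.0, (sum(xs) + depot_stock) / sum(mus)   # need(hi) >= depot_stock
    for _ in range(60):                                # bisection on z
        z = (lo + hi) / 2.0
        need = sum(max(0.0, z * mu - x) for x, mu in zip(xs, mus))
        if need <= depot_stock:
            lo = z
        else:
            hi = z
    z = (lo + hi) / 2.0
    return [max(0.0, z * mu - x) for x, mu in zip(xs, mus)]

# e.g. fair_share_allocation(50, [10, 40, 5], [20, 20, 10]) -> [32.0, 2.0, 16.0]
```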
309.
The provisions of the 1999 Constitution, which recognises the existence of a single police force and forbids parallel police organisations, have oftentimes generated controversies among actors in the Nigerian federal polity. Rising insecurity precipitates lingering questions on the utility and adequacy of a single, highly centralised and centrally controlled police force given Nigeria’s geographic vastness and demographic diversity. Conversely, arguments have also dwelt on the dangers of fragmentation considering Nigeria’s psychosocial, economic and political nature. This article attempts to balance these arguments by analysing policing and the operations of the Nigeria Police Force (NPF) through the lens of the subsidiarity principle. Subsidiarity is a governance principle in federations, captured in the founding documents of the European Union (EU), which prescribes that governmental powers, authorities and duties should be held by the tier that can best perform them equitably, efficiently, effectively, suitably and based on interest and need. Drawing largely on interviews with purposively selected police scholars, political actors, civil society organisations and police personnel, the paper contends that this principle offers a pragmatic solution to the perennial problems of intergovernmental frictions on the use of the police within the context of governance in the Nigerian federation.
310.
The malaise that the United States, and the West, have experienced in recent campaigns stems in large part from unclear thinking about war, its political essence, and the strategies needed to join the two. Instead, analysis and response are predicated on entrenched theoretical concepts with limited practical utility. The inadequacy of understanding has spawned new, and not so new, terms to capture unanticipated trends, starting with the re-discovery of “insurgency” and “counterinsurgency” and leading to discussion of “hybrid threats” and “gray-zone” operations. New terminology can help, but the change must go deeper. Challenging analytical orthodoxy, this article sets out a unifying approach for the study of political violence, or more accurately: violent politics. It provides a conceptual foundation that helps to make sense of recent shifts in warfare. In effect, it offers sorely needed theoretical insights into the nature of strategy and guides the process of responding to nontraditional threats.