551.
During the Cold War, the spread and fear of communism furnished the overarching ideological rationale for American foreign policy and for the deployment of United States military forces and resources. Subscribing to the domino theory and fearing its implications for Southeast Asia, the Johnson Administration committed the United States to the Vietnam War. Following the September 11, 2001 attacks on the United States and the commencement of the Global War on Terrorism, Washington once again set a national agenda rooted in a simplistic analysis reminiscent of Vietnam and the domino theory. Ignorant of Iraq's mammoth sectarian, historical, ethnic, and global strategic complexities, the Bush Administration launched Operation Iraqi Freedom (OIF). The absence of critical analysis, contrarian viewpoints, and sound judgment characterized US policy and strategy in both the Vietnam War and OIF, exhibiting the lack of moral courage that the national security enterprise seeks but seldom attains. Faced with this challenge, this article draws attention to the ethical lessons we can learn from the dissent of William Fulbright and Andrew Bacevich.
552.
In this article, we consider a single-machine scheduling problem in which identical jobs are split into batches of bounded size. For each batch, fewer jobs than the given upper bound may be produced; that is, some jobs in a batch can be rejected, in which case a penalty is paid for each rejected job. The objective function is the sum of several components: the sum of completion times, the total delivery cost, and the total rejection cost. We reduce this problem to a min-cost flow problem with a convex quadratic cost function and adapt Tamir's algorithm for its solution. © 2017 Wiley Periodicals, Inc. Naval Research Logistics 64: 217–224, 2017
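For readers unfamiliar with this cost structure, the toy enumeration below illustrates the three objective components (sum of completion times, per-batch delivery cost, and per-job rejection penalty) on a tiny instance. It is a naive brute force, not the paper's min-cost flow reduction or Tamir's algorithm, and the batch-availability assumption (a job completes when its batch completes) and the parameters p, D, and r are illustrative assumptions.

```python
# Toy brute-force illustration of the batching-with-rejection cost structure.
# NOT the paper's min-cost-flow reduction; only suitable for tiny instances.

def batch_partitions(m, b):
    """Yield non-increasing tuples of batch sizes (each <= b) summing to m."""
    if m == 0:
        yield ()
        return
    for first in range(min(m, b), 0, -1):
        for rest in batch_partitions(m - first, min(first, b)):
            yield (first,) + rest

def min_total_cost(n, b, p, D, r):
    """Minimize: sum of completion times + delivery cost + rejection cost."""
    best = None
    for m in range(n + 1):                     # m = number of accepted jobs
        for sizes in batch_partitions(m, b):   # batch ordering is irrelevant for identical jobs
            t, completion_sum = 0.0, 0.0
            for s in sizes:                    # batches processed consecutively
                t += p * s                     # batch finishes at time t
                completion_sum += s * t        # its s jobs all complete at t
            cost = completion_sum + D * len(sizes) + r * (n - m)
            if best is None or cost < best[0]:
                best = (cost, m, sizes)
    return best

# Example: 6 jobs, batches of at most 3, unit processing time,
# delivery cost 4 per batch, rejection penalty 5 per job.
print(min_total_cost(n=6, b=3, p=1.0, D=4.0, r=5.0))
```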
553.
Previous research has identified a variety of general mechanisms to explain how insurgents build legitimacy. Yet there is often a gap between these mechanisms and the interactional dynamics of insurgencies. This article attempts to bridge this gap through a theoretically informed analysis of the Kurdistan Workers' Party's (PKK) insurgency in Turkey. I show how the PKK's efforts to cultivate legitimacy, Turkey's counterinsurgency strategies, and civilian perceptions of the PKK all mutually influenced one another. Based on this analysis, I argue that the mechanisms that produce popular legitimacy coevolve with insurgents' behaviors, states' interventions, and civilians' perceptions.
554.
The international system has great difficulty dealing with illegitimate non-state actors such as transnational terrorist groups and organized crime syndicates. This is due to two main factors: the quality and quantity of influence these actors have obtained in an era of globalization, and the fact that international law treats only individual criminals and terrorists as subjects rather than the entire illegitimate enterprise, and so fails to adequately link individuals, enterprises, and states to more nuanced and complex forms of sponsorship of illegal activities. This work outlines tools that should be embedded in the fabric of international law and agreements to sustain credibility against illegal non-state actors, hold sponsors of illegality accountable, and reinforce the legitimacy of globalization.
555.
556.
When solving location problems in practice, it is quite common to aggregate demand points into centroids. Solving a location problem with aggregated demand data is computationally easier, but the aggregation process introduces error. We develop theory and algorithms for certain types of centroid aggregations for rectilinear 1-median problems. The objective is to construct an aggregation that minimizes the maximum aggregation error. We focus on row-column aggregations and make use of aggregation results for 1-median problems on the line to do aggregation for 1-median problems in the plane. The aggregations developed for the 1-median problem are then used to construct approximate n-median problems. We test the theory computationally on n-median problems (n ≥ 1) using both randomly generated and real data. Every error measure we consider can be well approximated by some power function in the number of aggregate demand points, and each such function exhibits decreasing returns to scale. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 614–637, 2003.
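As a rough illustration of the setting (not the paper's row-column aggregation algorithm or its error bounds), the sketch below exploits the fact that a rectilinear 1-median problem separates into two one-dimensional weighted-median problems, aggregates demand into centroids of a coarse grid, and compares the cost at the aggregate optimum against the exact optimum. The grid resolution and unit weights are assumed for demonstration.

```python
# Rectilinear 1-median: exact solution vs. solution from centroid-aggregated demand.
import random

def weighted_median(values, weights):
    """Smallest value whose cumulative weight reaches half of the total weight."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    half, acc = sum(weights) / 2.0, 0.0
    for i in order:
        acc += weights[i]
        if acc >= half:
            return values[i]

def solve_rect_1median(demand, weights):
    # The x- and y-coordinates can be optimized independently under the L1 metric.
    xs = [x for x, _ in demand]
    ys = [y for _, y in demand]
    return (weighted_median(xs, weights), weighted_median(ys, weights))

def rect_cost(demand, weights, facility):
    fx, fy = facility
    return sum(w * (abs(x - fx) + abs(y - fy)) for (x, y), w in zip(demand, weights))

random.seed(0)
demand = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(500)]
weights = [1.0] * len(demand)

# Aggregate demand into weighted centroids of a coarse 3x3 grid of cells.
cells = {}
for (x, y), w in zip(demand, weights):
    key = (int(x // (10 / 3)), int(y // (10 / 3)))
    sx, sy, sw = cells.get(key, (0.0, 0.0, 0.0))
    cells[key] = (sx + w * x, sy + w * y, sw + w)
agg_demand = [(sx / sw, sy / sw) for sx, sy, sw in cells.values()]
agg_weights = [sw for _, _, sw in cells.values()]

exact = solve_rect_1median(demand, weights)
approx = solve_rect_1median(agg_demand, agg_weights)
print("cost at exact optimum    :", rect_cost(demand, weights, exact))
print("cost at aggregate optimum:", rect_cost(demand, weights, approx))
```

The gap between the two printed costs is one simple notion of aggregation error; the paper studies sharper, worst-case measures.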
557.
Block replacement and modified block replacement policies for two-component systems with failure dependence and economic dependence are considered in this paper. Opportunistic maintenance policies are also considered. Where tractable, long-run costs per unit time are calculated using arguments based on renewal theory; otherwise, simulation studies are carried out. The management implications of adopting the various policies are discussed. The usefulness of the results is illustrated through application to a particular two-component system. © 2002 Wiley Periodicals, Inc. Naval Research Logistics, 2003
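The sketch below conveys only the renewal-reward reasoning behind block replacement for a single component, not the paper's two-component policies with failure and economic dependence or its opportunistic variants. The Weibull lifetime model and the cost parameters cp and cf are assumptions for illustration.

```python
# Monte Carlo estimate of the long-run cost rate of single-component block
# replacement: preventive replacement (cost cp) every T time units plus a
# failure replacement (cost cf) at each failure within the block interval.
import random

def block_cost_rate(T, cp, cf, shape=2.0, scale=1.0, n_blocks=20000, seed=1):
    """Estimate (cp + cf * E[# failures in (0, T]]) / T by simulation."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_blocks):
        t = rng.weibullvariate(scale, shape)   # time of first failure in the block
        while t <= T:                          # replace on failure, clock keeps running
            failures += 1
            t += rng.weibullvariate(scale, shape)
    return (cp + cf * failures / n_blocks) / T

# Scan candidate block intervals and pick the cheapest; cp < cf and an
# increasing failure rate (shape > 1) make preventive replacement worthwhile.
candidates = [0.25 * k for k in range(1, 13)]
best_T = min(candidates, key=lambda T: block_cost_rate(T, cp=1.0, cf=5.0))
print("best block-replacement interval:", best_T)
```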
558.
The literature offers multiple damage functions for estimating the probability that a single weapon detonation destroys a point target. This paper addresses differences in the tails of four of the more popular damage functions; these four cover the asymptotic tail behaviors of all monotonically decreasing damage functions with well-behaved hazard functions. The differences in estimated probability of kill are quite dramatic for large aim-point offsets. This matters particularly when balancing the number of threats that can be engaged against the chances of fratricide and collateral damage. In general, analysts who substitute one damage function for another may badly estimate kill probabilities under offset aiming, which could result in poor doctrine. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 306–321, 2003.
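The abstract does not identify the four damage functions, so the snippet below uses commonly cited stand-ins (cookie-cutter, Gaussian/Carleton-type, exponential, and a polynomially decaying tail) purely to illustrate how sharply tail choices can diverge at large aim-point offsets; all parameters are assumptions and the functions are not those analyzed in the paper.

```python
# Compare the tails of several monotonically decreasing damage functions D(r),
# where r is the distance from the detonation to the point target.
import math

def cookie_cutter(r, R=1.0):          # all-or-nothing inside lethal radius R
    return 1.0 if r <= R else 0.0

def gaussian(r, sigma=1.0):           # Carleton-type: exp(-r^2 / (2 sigma^2))
    return math.exp(-r * r / (2 * sigma * sigma))

def exponential(r, lam=1.0):          # exp(-r / lambda)
    return math.exp(-r / lam)

def power_tail(r, lam=1.0, k=2.0):    # heavier, polynomially decaying tail
    return 1.0 / (1.0 + (r / lam) ** k)

for r in (0.5, 1.0, 2.0, 5.0, 10.0):
    print(f"offset {r:5.1f}: "
          f"cookie={cookie_cutter(r):.2e}  gauss={gaussian(r):.2e}  "
          f"exp={exponential(r):.2e}  power={power_tail(r):.2e}")
```

Even though all four agree roughly near the target, at an offset of 10 radii the estimates span many orders of magnitude, which is the substitution hazard the abstract warns about.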
559.
We present, analyze, and compare three random search methods for solving stochastic optimization problems with uncountable feasible regions. Our adaptive search with resampling (ASR) approach is a framework for designing provably convergent algorithms that are adaptive and may consequently involve local search. The deterministic and stochastic shrinking ball (DSB and SSB) approaches are also convergent, but they are based on pure random search and differ only in the estimator of the optimal solution (the DSB method was originally proposed and analyzed by Baumert and Smith). The three methods use different techniques to reduce the effects of noise in the estimated objective function values: our ASR method resamples already sampled points, whereas the DSB and SSB approaches average observations in balls that shrink with time. We present conditions under which the three methods are convergent, both in probability and almost surely, and provide a limited computational study aimed at comparing the methods. Although further investigation is needed, our numerical results suggest that the ASR approach is promising, especially for difficult problems where the probability of identifying good solutions using pure random search is small. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010
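As a rough sketch of the shrinking-ball idea only (it is not the exact DSB, SSB, or ASR algorithms and omits their convergence machinery), the code below samples a noisy one-dimensional objective uniformly and estimates the optimum by averaging observations inside balls whose radius shrinks with the sample count; the test function, noise level, and radius schedule are assumptions.

```python
# Pure random search with shrinking-ball averaging to damp observation noise.
import random

rng = random.Random(0)

def noisy_f(x):
    # True minimizer at x = 0.3; additive Gaussian observation noise.
    return (x - 0.3) ** 2 + rng.gauss(0.0, 0.05)

def shrinking_ball_search(n_samples=2000, radius0=0.5, decay=0.4):
    xs = [rng.uniform(0.0, 1.0) for _ in range(n_samples)]   # pure random search on [0, 1]
    ys = [noisy_f(x) for x in xs]
    r = radius0 / n_samples ** decay        # ball radius shrinks as samples accumulate
    best_x, best_val = None, float("inf")
    for x in xs:
        inside = [ys[j] for j, xj in enumerate(xs) if abs(xj - x) <= r]
        avg = sum(inside) / len(inside)     # noise-reduced estimate of f near x
        if avg < best_val:
            best_x, best_val = x, avg
    return best_x

print("estimated minimizer:", shrinking_ball_search())
```

In contrast, a resampling-based scheme in the spirit of ASR would revisit promising points and average repeated observations there instead of averaging over neighborhoods.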
560.
Accelerated life testing (ALT) is widely used to determine the failure-time distribution of a product and the associated life-stress relationship in order to predict the product's reliability under normal operating conditions. Many types of stress loading, such as constant-stress, step-stress, and cyclic-stress, can be used when conducting ALT. Extensive research has been conducted on the analysis of ALT data obtained under a specified stress loading, but the equivalency of ALT experiments involving different stress loadings has not been investigated. In this article, a log-location-scale distribution under Type I censoring is considered in planning ALT. A notion of equivalency among ALT plans involving different stress loadings is introduced, and on this basis general equivalent ALT plans and some special types of equivalent ALT plans are explored. For demonstration, a constant-stress ALT and a ramp-stress ALT for miniature lamps are presented and their equivalency is investigated. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010
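The sketch below simulates a single constant-stress ALT with Weibull (log-location-scale) lifetimes, a log-linear life-stress relationship, and Type I censoring, then recovers the parameters by maximum likelihood. It does not implement the article's equivalency construction; the stress levels, censoring time, and true parameter values are illustrative assumptions.

```python
# Constant-stress ALT under Type I censoring with a Weibull model:
# log T = mu(s) + sigma * E, mu(s) = b0 + b1 * s, E ~ standard smallest extreme value.
import math
import random
from scipy.optimize import minimize

rng = random.Random(42)
B0, B1, SIGMA = 8.0, -3.0, 0.5             # true log-linear life-stress model
stresses = [0.5, 0.75, 1.0]                 # standardized accelerated stress levels
n_per_level, censor_time = 50, 500.0        # Type I censoring at a fixed clock time

data = []                                   # (log of min(T, censor), stress, failed?)
for s in stresses:
    mu = B0 + B1 * s
    for _ in range(n_per_level):
        e = math.log(-math.log(1.0 - rng.random()))   # standard SEV variate
        t = math.exp(mu + SIGMA * e)
        data.append((math.log(min(t, censor_time)), s, t <= censor_time))

def neg_loglik(theta):
    b0, b1, log_sigma = theta
    sigma = math.exp(log_sigma)
    ll = 0.0
    for y, s, failed in data:
        z = (y - (b0 + b1 * s)) / sigma
        if failed:
            ll += -math.log(sigma) + z - math.exp(z)   # log density of log-lifetime
        else:
            ll += -math.exp(z)                          # log survival at censoring time
    return -ll

fit = minimize(neg_loglik, x0=[5.0, 0.0, 0.0], method="Nelder-Mead",
               options={"maxiter": 4000, "xatol": 1e-8, "fatol": 1e-8})
b0, b1, sigma = fit.x[0], fit.x[1], math.exp(fit.x[2])
print("estimates: b0 =", b0, " b1 =", b1, " sigma =", sigma)
print("median life extrapolated to use stress s = 0.25:",
      math.exp(b0 + b1 * 0.25 + sigma * math.log(math.log(2.0))))
```

The extrapolated quantile at the use-stress level is the kind of quantity whose precision an equivalency criterion between ALT plans would seek to preserve.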