Subscription full text   207 articles
Free full text   41 articles
Articles by publication year:
  2021   2
  2019   8
  2018   4
  2017   5
  2016   13
  2015   14
  2014   9
  2013   59
  2012   9
  2011   7
  2010   5
  2009   7
  2008   9
  2007   12
  2006   8
  2005   12
  2004   10
  2003   13
  2002   10
  2001   6
  2000   5
  1999   7
  1995   2
  1994   2
  1993   2
  1991   1
  1990   2
  1988   1
  1987   1
  1985   2
  1981   1
Sort order: 248 results in total, search time 500 ms
41.
This paper investigates the nature of two military alliances under the threat posed by China. The findings are as follows. First, South Korea does not consider China a significant threat, while Japan and the United States have recognized China as a serious threat since the 1990s and the 2000s, respectively. Second, the relationship between South Korea and the United States has been a true military alliance in all time periods, but the nature of the alliance has changed since the 1970s. Third, although Japan began to form an alliance relationship with the United States in the 1990s, Japan is regarded by the United States as the more significant ally. The paper implies that, should China provoke a military confrontation, it might be difficult for the three countries to reach a common solution because of their different responses to military threats from China.
42.
This paper sets up a monetary endogenous growth model and uses it to explain the ambiguous linkage between the military burden and the inflation rate observed in existing empirical studies. It is found that an expansion in the military burden has an ambiguous effect on the inflation rate, depending on the relative strength of two conflicting forces. More specifically, if the increase in the marginal benefit from holding money exceeds (falls short of) the increase in the marginal product of private capital, the inflation rate will rise (fall) in response. Moreover, it is found that an increase in the military burden will stimulate the balanced growth rate, confirming Benoit's famous empirical findings.
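One way to write the abstract's comparative-statics message in symbols, using notation of our own rather than the paper's actual model (military burden m, inflation rate π, MB for the marginal benefit of holding money, MPK for the marginal product of private capital):

```latex
% Sign of the inflation response to a larger military burden m
% (illustrative notation; not reproduced from the paper's equations)
\[
\frac{\partial \pi}{\partial m} > 0
   \quad\text{if}\quad
   \frac{\partial \mathrm{MB}}{\partial m} > \frac{\partial \mathrm{MPK}}{\partial m},
\qquad
\frac{\partial \pi}{\partial m} < 0
   \quad\text{if}\quad
   \frac{\partial \mathrm{MB}}{\partial m} < \frac{\partial \mathrm{MPK}}{\partial m}.
\]
```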
43.
44.
45.
Demand forecasting performance is subject to the uncertainty underlying the time series an organization is dealing with. There are many approaches that may be used to reduce uncertainty and thus improve forecasting performance. One intuitively appealing approach is to aggregate demand into lower-frequency "time buckets." This approach is termed temporal aggregation, and in this article we investigate its impact on forecasting performance. We assume that the nonaggregated demand follows either a moving average process of order one or a first-order autoregressive process and that a single exponential smoothing (SES) procedure is used to forecast demand. These demand processes are often encountered in practice, and SES is one of the standard estimators used in industry. Theoretical mean-squared error expressions are derived for the aggregated and nonaggregated demand to contrast the relevant forecasting performances. The theoretical analysis is supported by an extensive numerical investigation and experimentation with an empirical dataset. The results indicate that performance improvements achieved through the aggregation approach are a function of the aggregation level, the smoothing constant, and the process parameters. Valuable insights are offered to practitioners, and the article closes with an agenda for further research in this area. © 2013 Wiley Periodicals, Inc. Naval Research Logistics 60: 479–498, 2013
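A minimal simulation sketch of the idea, not the article's analytical derivation: generate AR(1) demand, forecast it with single exponential smoothing at the original frequency and after aggregating into lower-frequency buckets, and compare empirical mean-squared errors. The AR(1) parameters, bucket size, and smoothing constant below are illustrative choices, not values taken from the paper.

```python
import numpy as np

def ar1_demand(n, phi=0.5, mu=20.0, sigma=2.0, seed=0):
    """Generate an AR(1) demand series: d_t = mu + phi*(d_{t-1} - mu) + eps_t."""
    rng = np.random.default_rng(seed)
    d = np.empty(n)
    d[0] = mu
    for t in range(1, n):
        d[t] = mu + phi * (d[t - 1] - mu) + rng.normal(0.0, sigma)
    return d

def ses_one_step_mse(series, alpha=0.2):
    """One-step-ahead MSE of single exponential smoothing with smoothing constant alpha."""
    level = series[0]
    errors = []
    for x in series[1:]:
        errors.append(x - level)            # forecast for this period is the current level
        level = alpha * x + (1 - alpha) * level
    return np.mean(np.square(errors))

def aggregate(series, bucket=3):
    """Non-overlapping temporal aggregation into buckets of 'bucket' periods."""
    n = (len(series) // bucket) * bucket
    return series[:n].reshape(-1, bucket).sum(axis=1)

demand = ar1_demand(3000)
mse_original = ses_one_step_mse(demand)
mse_aggregated = ses_one_step_mse(aggregate(demand, bucket=3))
# Per-bucket MSE is not directly comparable to per-period MSE; divide by bucket**2
# for a rough per-period normalization of the aggregated forecast error.
print(f"MSE (original frequency):            {mse_original:.2f}")
print(f"MSE (aggregated, per bucket):        {mse_aggregated:.2f}")
print(f"MSE (aggregated, rough per period):  {mse_aggregated / 3**2:.2f}")
```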
46.
In many practical manufacturing environments, jobs to be processed can be divided into different families such that a setup is required whenever processing switches from a job of one family to a job of a different family. The setup time may be sequence independent or sequence dependent. We consider two particular scheduling problems relevant to such situations. In both problems, we are given a set of jobs to be processed on a set of identical parallel machines. The objective of the first problem is to minimize the total weighted completion time of the jobs, and that of the second is to minimize the weighted number of tardy jobs. We propose column-generation-based branch-and-bound exact solution algorithms for the problems. Computational experiments show that the algorithms are capable of solving medium-sized instances of both problems to optimality within reasonable computational time. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 823–840, 2003.
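The column-generation branch-and-bound algorithms themselves are too involved to reproduce here, but the cost structure they optimize is easy to illustrate. The toy sketch below evaluates total weighted completion time on identical parallel machines when a sequence-independent family setup is incurred whenever a machine switches families; the job data and setup times are invented for illustration and are not from the paper.

```python
from collections import namedtuple

Job = namedtuple("Job", "name family proc weight")

def total_weighted_completion(machine_schedules, setup_time):
    """Total weighted completion time with sequence-independent family setups.

    machine_schedules: one job sequence per machine.
    setup_time: dict family -> setup duration, charged whenever the next job's
                family differs from the previous job's family (and before the
                first job on each machine).
    """
    total = 0.0
    for sequence in machine_schedules:
        clock, prev_family = 0.0, None
        for job in sequence:
            if job.family != prev_family:
                clock += setup_time[job.family]   # family changeover
            clock += job.proc                     # processing time
            total += job.weight * clock           # weighted completion time
            prev_family = job.family
    return total

# Illustrative instance: 2 machines, 2 families (all numbers made up).
jobs_m1 = [Job("J1", "A", 3, 2), Job("J2", "A", 2, 1), Job("J3", "B", 4, 3)]
jobs_m2 = [Job("J4", "B", 5, 2), Job("J5", "A", 1, 4)]
setups = {"A": 1.0, "B": 2.0}
print(total_weighted_completion([jobs_m1, jobs_m2], setups))  # -> 100.0
```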
47.
For computing an optimal (Q, R) or kindred inventory policy, the current literature provides mixed signals on whether or when it is safe to approximate a nonnormal lead-time-demand ("LTD") distribution by a normal distribution. The first part of this paper examines this literature critically to justify why the issue warrants further investigation, while the second part presents reliable evidence showing that the system-cost penalty for using the normal approximation can be quite serious even when the LTD distribution's coefficient of variation is quite low, contrary to the prevalent view in the literature. We also identify situations that will most likely lead to a large system-cost penalty. Our results indicate that, given today's technology, it is worthwhile to estimate an LTD distribution's shape more accurately and to compute optimal inventory policies using statistical distributions that more accurately reflect the LTD distributions' actual shapes. © 2003 Wiley Periodicals, Inc. Naval Research Logistics, 2003
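A rough numerical sketch of the kind of comparison at issue, not the paper's model: set a reorder point from a target critical ratio under a gamma lead-time demand distribution, do the same under a normal distribution matched to the same mean and variance, and measure the cost penalty of the normal-based choice when demand actually follows the gamma. The cost parameters, the newsvendor-style criterion, and the gamma shape below are our own simplifications.

```python
import numpy as np
from scipy import stats

# Illustrative parameters (our own, not from the paper).
mean_ltd, cv = 50.0, 0.6         # lead-time demand mean and coefficient of variation
holding, shortage = 1.0, 19.0    # per-unit holding and shortage costs
critical_ratio = shortage / (holding + shortage)

sd = cv * mean_ltd
shape = 1.0 / cv**2              # gamma with this shape/scale matches the mean and CV
scale = mean_ltd / shape
true_ltd = stats.gamma(a=shape, scale=scale)
approx_ltd = stats.norm(loc=mean_ltd, scale=sd)

def expected_cost(R, dist):
    """Newsvendor-style expected cost of reorder point R under LTD distribution dist."""
    d = dist.rvs(size=200_000, random_state=1)
    return np.mean(holding * np.maximum(R - d, 0) + shortage * np.maximum(d - R, 0))

R_true = true_ltd.ppf(critical_ratio)      # reorder point using the actual shape
R_norm = approx_ltd.ppf(critical_ratio)    # reorder point from the normal approximation

cost_true = expected_cost(R_true, true_ltd)
cost_norm = expected_cost(R_norm, true_ltd)  # normal-based R evaluated under the true LTD
print(f"R (gamma): {R_true:.1f}, R (normal approx.): {R_norm:.1f}")
print(f"cost penalty of normal approximation: {100 * (cost_norm / cost_true - 1):.1f}%")
```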
48.
Today, many products are designed and manufactured to function for a long period of time before they fail. Determining product reliability is a great challenge to manufacturers of highly reliable products when only a relatively short period of time is available for internal life testing. In particular, it may be difficult to determine optimal burn-in parameters and to characterize the residual life distribution. A promising alternative is to use data on a quality characteristic (QC) whose degradation over time can be related to product failure. Typically, product failure corresponds to the first passage time of the degradation path beyond a critical value. If degradation paths can be modeled properly, one can predict the failure time and determine the life distribution without actually observing failures. In this paper, we first use a Wiener process to describe the continuous degradation path of the quality characteristic of the product. A Wiener process allows nonconstant variance and nonzero correlation among data collected at different time points. We propose a decision rule for classifying a unit as normal or weak, and give an economic model for determining the optimal termination time and other parameters of a burn-in test. Next, we propose a method for assessing the lifetime distribution of the units that pass burn-in. The proposed methodologies are all based only on the product's initial observed degradation data. Finally, an example of an electronic product, namely a contact image scanner (CIS), is used to illustrate the proposed procedure. © 2002 Wiley Periodicals, Inc. Naval Research Logistics, 2003
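A small simulation sketch of the modeling idea only (illustrative drift, diffusion, and threshold values of our own, and none of the paper's decision rule or burn-in economics): simulate Wiener-process degradation paths and record the first passage time beyond a critical degradation level, which stands in for product failure.

```python
import numpy as np

def first_passage_times(n_units, drift, diffusion, threshold, dt=1.0, t_max=2000, seed=42):
    """Simulate degradation paths W(t) = drift*t + diffusion*B(t) and return the
    first time each path crosses 'threshold' (np.nan if censored at t_max)."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_max / dt)
    times = np.full(n_units, np.nan)
    for i in range(n_units):
        level, t = 0.0, 0.0
        for _ in range(n_steps):
            t += dt
            level += drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
            if level >= threshold:
                times[i] = t
                break
    return times

# Illustrative populations: a "weak" subpopulation degrades faster than a "normal" one.
normal_fpt = first_passage_times(500, drift=0.05, diffusion=0.3, threshold=30.0)
weak_fpt = first_passage_times(500, drift=0.40, diffusion=0.3, threshold=30.0)
print("median failure time, normal units:", np.nanmedian(normal_fpt))
print("median failure time, weak units:  ", np.nanmedian(weak_fpt))
```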
49.
Burn-in is a technique to enhance reliability by eliminating weak items from a population of items having heterogeneous lifetimes. System burn-in can improve system reliability, but the conditions under which system burn-in should be performed after component burn-in remain a little-understood mathematical challenge. To derive such conditions, we first introduce a general model of heterogeneous system lifetimes, in which component burn-in information and assembly problems are related to the prediction of system burn-in. Many existing system burn-in models become special cases of this model, and two important results are identified. First, heterogeneous system lifetimes can be understood naturally as a consequence of heterogeneous component lifetimes and heterogeneous assembly quality. Second, system burn-in is effective if the assembly quality variation in the components and connections that are arranged in series is greater than a threshold, where the threshold depends on the system structure and the component failure rates. © 2003 Wiley Periodicals, Inc. Naval Research Logistics 50: 364–380, 2003.
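As a purely illustrative toy, not the paper's model or threshold condition: the simulation below builds series-system lifetimes from heterogeneous component lifetimes plus an assembly defect that component burn-in cannot catch, and shows how a short system burn-in screens out those defective assemblies. All distributions, parameter values, and the burn-in duration are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n_systems = 100_000

# Heterogeneous component lifetimes: each series system uses 3 exponential components.
component_life = rng.exponential(scale=1000.0, size=(n_systems, 3)).min(axis=1)

# Heterogeneous assembly quality: a fraction of systems gets a defective connection
# that fails early, regardless of how good the burned-in components are.
defective = rng.random(n_systems) < 0.05
connection_life = np.where(defective, rng.exponential(20.0, n_systems), np.inf)

system_life = np.minimum(component_life, connection_life)

burn_in = 40.0                               # illustrative system burn-in duration
survivors = system_life > burn_in
residual_life = system_life[survivors] - burn_in

print(f"systems screened out by burn-in:        {100 * (1 - survivors.mean()):.1f}%")
print(f"mean remaining life, all systems:       {system_life.mean():.0f}")
print(f"mean remaining life, burned-in systems: {residual_life.mean():.0f}")
```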
50.