161.
This paper considers a two-agent scheduling problem with linear resource-dependent processing times, in which each agent has a set of jobs that compete with those of the other agent for the use of a common processing machine, and each agent aims to minimize the weighted number of its tardy jobs. To meet the due date requirements of the jobs of the two agents, additional amounts of a common resource, which may be available in discrete or continuous quantities, can be allocated to the processing of the jobs to compress their processing durations. The actual processing time of a job is a linear function of the amount of the resource allocated to it. The objective is to determine the optimal job sequence and resource allocation so as to minimize the weighted number of tardy jobs of one agent, while keeping the weighted number of tardy jobs of the other agent and the total resource consumption cost within their respective predetermined limits. It is shown that the problem is NP-hard in the ordinary sense and that it admits no polynomial-time approximation algorithm with a guaranteed performance ratio unless P = NP; however, it admits a relaxed fully polynomial-time approximation scheme. A proximal bundle algorithm based on Lagrangian relaxation is also presented to solve the problem approximately. To speed up convergence and produce sharp bounds, enhancement strategies are devised, including a Tabu search algorithm and the integration of a Lagrangian recovery heuristic into the bundle algorithm. Extensive numerical studies are conducted to assess the effectiveness and efficiency of the proposed algorithms.
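A minimal sketch (not the authors' algorithm) of the cost structure described above, assuming the usual linear compression model p_j(u_j) = b_j − a_j·u_j with 0 ≤ u_j ≤ u_max: given a job sequence and a resource allocation, it evaluates each agent's weighted number of tardy jobs and the total resource cost. All field names and the per-unit resource cost v are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Job:
    agent: int      # 0 or 1
    b: float        # normal (uncompressed) processing time
    a: float        # compression per unit of allocated resource
    u_max: float    # upper bound on the resource allocated to this job
    due: float      # due date
    w: float        # tardiness weight
    v: float        # unit cost of the resource for this job (hypothetical)

def evaluate(sequence, allocation):
    """Return (weighted tardy jobs of agent 0, of agent 1, total resource cost)
    for a given job sequence and resource allocation."""
    t = 0.0
    tardy = [0.0, 0.0]
    cost = 0.0
    for job, u in zip(sequence, allocation):
        u = min(max(u, 0.0), job.u_max)
        p = max(job.b - job.a * u, 0.0)   # linear resource-dependent processing time
        t += p                            # completion time on the single machine
        if t > job.due:                   # job is tardy
            tardy[job.agent] += job.w
        cost += job.v * u
    return tardy[0], tardy[1], cost

# Example: two jobs per agent; compress the first job of agent 1 slightly.
jobs = [Job(0, 4, 1, 2, 5, 1, 1), Job(1, 3, 1, 2, 6, 2, 1),
        Job(0, 5, 2, 1, 12, 1, 1), Job(1, 2, 1, 1, 14, 3, 1)]
print(evaluate(jobs, [0.0, 1.0, 0.0, 0.0]))
```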
162.
Because satellite orbit tracking data contain nonlinear errors, preprocessing them with the traditional least-squares polynomial fitting method inevitably degrades orbit determination accuracy. Building on a semiparametric regression model, a wavelet threshold denoising algorithm is applied to estimate and remove the nonlinear errors in the observation data, and a preprocessing method for satellite orbit tracking data based on the wavelet-denoising semiparametric regression model is proposed to improve preprocessing accuracy. The method was tested in a simulation on USB tracking data of a satellite. The simulation results show that the method can separate the white noise and the nonlinear errors in the observation data, thereby eliminating the influence of the nonlinear errors and improving the accuracy of data preprocessing.
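A minimal sketch of wavelet threshold denoising of a noisy measurement series, not the paper's full semiparametric-regression procedure. The wavelet ('db4'), decomposition level, soft thresholding, the universal threshold, and the synthetic "tracking" signal are all assumptions for illustration; the paper embeds the denoising step inside a semiparametric regression model.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(y, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients of y using the universal
    threshold estimated from the finest-scale coefficients."""
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise-level estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(y)))           # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(y)]

# Example: a smooth hypothetical tracking measurement plus white noise.
t = np.linspace(0.0, 1.0, 1024)
signal = 1e4 + 500.0 * t + 20.0 * np.sin(6 * np.pi * t)
noisy = signal + np.random.normal(0.0, 5.0, t.size)
smooth = wavelet_denoise(noisy)
residual = noisy - smooth                                  # estimate of the noise component
print(float(np.std(residual)))
```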
163.
The nozzle is the key component for generating a high-pressure water jet, and its geometry strongly affects the jet's dynamic performance. Taking the cylindrical nozzle as the object of study, the influence of the nozzle structure on the high-pressure water jet is analyzed and the structural parameters are optimized. A two-phase computational fluid dynamics (CFD) model is used to analyze the jet flow field inside and outside the nozzle. To save computational resources, a Kriging surrogate model is introduced in the optimization to replace the CFD model. An improved non-dominated sorting genetic algorithm and a decomposition-based multi-objective evolutionary algorithm are used for the single-objective and multi-objective optimization designs, respectively. The results show that the straight nozzle has the best overall performance, followed by the concave nozzle, while the convex nozzle performs worst. Taking the straight nozzle as the design object, with the length of the jet's initial section and the flow rate as objectives, single-objective and multi-objective optimization results are obtained. In the single-objective optimization, the two indices improve by 14.71% and 27.56%, respectively, over the baseline geometry. In the multi-objective optimization, the optimized half-cone angles lie in the interval [15.4°, 89.8°]. The global optimization approach combining surrogate models with evolutionary algorithms proves effective for nozzle design optimization.
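A minimal sketch of the surrogate-assisted idea, not the paper's CFD workflow: fit a Kriging (Gaussian process) surrogate to a handful of sampled design points and search on the surrogate instead of the expensive simulation. The design variable, the stand-in performance values, and the kernel settings are hypothetical; in the paper the responses would come from two-phase CFD runs and the search would use evolutionary algorithms.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical training data: x = half-cone angle (deg),
# y = a stand-in for a jet performance index.
X_train = np.array([[10.0], [20.0], [35.0], [50.0], [70.0], [90.0]])
y_train = np.array([0.62, 0.74, 0.81, 0.78, 0.70, 0.55])

# Kriging surrogate = Gaussian process regression with an RBF kernel.
kernel = ConstantKernel(1.0) * RBF(length_scale=15.0)
surrogate = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
surrogate.fit(X_train, y_train)

# Cheap single-objective search on the surrogate instead of the CFD model.
candidates = np.linspace(5.0, 90.0, 500).reshape(-1, 1)
pred, std = surrogate.predict(candidates, return_std=True)
best = candidates[np.argmax(pred)][0]
print(f"surrogate-optimal half-cone angle ~ {best:.1f} deg")
```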
164.
165.
Covering models assume that a point is covered if it is within a certain distance from a facility and not covered beyond that distance. In gradual cover models it is assumed that a point is fully covered within a given distance from a facility, then cover gradually declines, and the point is not covered beyond a larger distance. Gradual cover models address the discontinuity in cover of standard covering models, which may not be the correct approach in many situations. In the stochastic gradual cover model presented in this article it is assumed that the short and long distances employed in gradual cover models are random variables. This refinement of gradual cover models provides an even more realistic depiction of actual behavior in many situations. The maximal cover model based on the new concept is analyzed and the single facility location cover problem in the plane is solved. Computational results illustrating the effectiveness of the solution procedures are presented. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010
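A minimal sketch of the stochastic gradual cover idea: cover equals 1 within a short radius r, declines to 0 at a long radius R, and r and R are random, so the expected cover at a demand point is estimated by simulation. The linear decay shape and the uniform distributions of the radii are assumptions for illustration, not the article's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

def gradual_cover(d, r, R):
    """Deterministic gradual cover: full cover within r, linear decline to 0 at R."""
    if d <= r:
        return 1.0
    if d >= R:
        return 0.0
    return (R - d) / (R - r)

def expected_cover(d, r_sampler, R_sampler, n=100_000):
    """Monte Carlo estimate of E[cover] when the short and long radii are random."""
    total = 0.0
    for _ in range(n):
        r = r_sampler()
        R = max(R_sampler(), r + 1e-9)   # keep R > r
        total += gradual_cover(d, r, R)
    return total / n

# Hypothetical radii: r ~ U(2, 4), R ~ U(6, 10); demand point at distance 5.
print(expected_cover(5.0,
                     lambda: rng.uniform(2.0, 4.0),
                     lambda: rng.uniform(6.0, 10.0)))
```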
166.
We consider the problem of scheduling n independent and simultaneously available jobs without preemption on a single machine, where the machine has a fixed maintenance activity. The objective is to find the optimal job sequence to minimize the total amount of late work, where the late work of a job is the amount of processing of the job that is performed after its due date. We first discuss the approximability of the problem. We then develop two pseudo‐polynomial dynamic programming algorithms and a fully polynomial‐time approximation scheme for the problem. Finally, we conduct extensive numerical studies to evaluate the performance of the proposed algorithms. © 2016 Wiley Periodicals, Inc. Naval Research Logistics 63: 172–183, 2016
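A minimal sketch of the objective being minimized, not the authors' dynamic programs or approximation scheme: for a given job sequence it computes the total late work on a single machine with one fixed maintenance interval, under the assumption that a job which cannot finish before the maintenance starts is postponed entirely until the machine is available again (a non-resumable convention adopted here for illustration).

```python
def total_late_work(sequence, maint_start, maint_end):
    """Total late work of a job sequence on a single machine with a fixed
    maintenance interval [maint_start, maint_end). Each job is (p, d)."""
    t = 0.0
    late = 0.0
    for p, d in sequence:
        if t < maint_end and t + p > maint_start:
            t = maint_end                               # job cannot straddle the maintenance window
        completion = t + p
        late += min(p, max(0.0, completion - d))        # late work of this job
        t = completion
    return late

# Hypothetical instance: three jobs (processing time, due date), maintenance in [5, 7).
jobs = [(3.0, 4.0), (4.0, 9.0), (2.0, 10.0)]
print(total_late_work(jobs, 5.0, 7.0))
```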
167.
We present, analyze, and compare three random search methods for solving stochastic optimization problems with uncountable feasible regions. Our adaptive search with resampling (ASR) approach is a framework for designing provably convergent algorithms that are adaptive and may consequently involve local search. The deterministic and stochastic shrinking ball (DSB and SSB) approaches are also convergent, but they are based on pure random search with the only difference being the estimator of the optimal solution [the DSB method was originally proposed and analyzed by Baumert and Smith]. The three methods use different techniques to reduce the effects of noise in the estimated objective function values. Our ASR method achieves this goal through resampling of already sampled points, whereas the DSB and SSB approaches address it by averaging observations in balls that shrink with time. We present conditions under which the three methods are convergent, both in probability and almost surely, and provide a limited computational study aimed at comparing the methods. Although further investigation is needed, our numerical results suggest that the ASR approach is promising, especially for difficult problems where the probability of identifying good solutions using pure random search is small. © 2010 Wiley Periodicals, Inc. Naval Research Logistics, 2010
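A simplified one-dimensional sketch of the shrinking-ball idea, not the article's ASR, DSB, or SSB algorithms: pure random search on a noisy objective, where the value at each sampled point is estimated by averaging all observations falling within a ball around it. For simplicity the sketch uses a single ball radius computed at the final iteration rather than a radius that shrinks over iterations; the toy objective, radius schedule, and sample budget are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_objective(x):
    """Toy noisy objective: minimize (x - 0.3)^2 observed with Gaussian noise."""
    return (x - 0.3) ** 2 + rng.normal(0.0, 0.1)

def shrinking_ball_search(n_iters=2000, radius=lambda k: 0.5 / (k + 1) ** 0.25):
    """Pure random search on [0, 1]; each point's value is estimated by averaging
    the noisy observations of all sampled points within a ball around it."""
    xs = rng.uniform(0.0, 1.0, n_iters)
    ys = np.array([noisy_objective(x) for x in xs])
    r = radius(n_iters)
    best_x, best_val = None, np.inf
    for i, x in enumerate(xs):
        in_ball = np.abs(xs - x) <= r
        est = ys[in_ball].mean()          # averaging damps the observation noise
        if est < best_val:
            best_x, best_val = x, est
    return best_x, best_val

print(shrinking_ball_search())
```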
168.
In this article, we define two different workforce leveling objectives for serial transfer lines. Each job is to be processed on each transfer station for c time periods (e.g., hours). We assume that the number of workers needed to complete each operation of a job in precisely c periods is given. Jobs transfer forward synchronously after every production cycle (i.e., c periods). We study two leveling objectives: maximin workforce size and min range (R). Leveling objectives produce schedules where the cumulative number of workers needed in all stations of a transfer line does not experience dramatic changes from one production cycle to the next. For the maximin workforce size objective and a two‐station system, we develop a fast polynomial algorithm. The range problem is known to be NP‐complete. For the two‐station system, we develop a very fast optimal algorithm that uses a tight lower bound and an efficient procedure for finding complementary Hamiltonian cycles in bipartite graphs. Via a computational experiment, we demonstrate that range schedules are superior because not only do they limit the workforce fluctuations from one production cycle to the next, but they also do so with a minor increase in the total workforce size. We extend our results to the m‐station system and develop heuristic algorithms. We find that these heuristics work poorly for min range (R), which indicates that special structural properties of the m‐station problem need to be identified before we can develop efficient algorithms. © 2016 Wiley Periodicals, Inc. Naval Research Logistics 63: 577–590, 2016
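A minimal sketch of the quantities being leveled, not the authors' optimal or heuristic algorithms: for a given launch sequence it computes the cumulative workforce required in every production cycle, from which the maximin value and the range follow. It assumes one job is launched per cycle and jobs advance one station per cycle, which is one reading of the synchronous transfer described above; the worker requirements are hypothetical.

```python
def cycle_workforce(schedule, workers):
    """Total workforce needed in each production cycle of a serial transfer line.
    schedule: job indices in launch order; workers[j][s]: workers job j needs at station s.
    Jobs move forward one station per cycle, so in cycle t station s holds job t - s."""
    n = len(schedule)
    m = len(workers[0])
    totals = []
    for t in range(n + m - 1):                 # cycles until the last job leaves the line
        total = 0
        for s in range(m):
            k = t - s                          # position in the launch sequence
            if 0 <= k < n:
                total += workers[schedule[k]][s]
        totals.append(total)
    return totals

# Hypothetical 2-station instance: workers[j] = (station 1, station 2).
workers = [(3, 1), (1, 4), (2, 2)]
totals = cycle_workforce([0, 1, 2], workers)
print(totals, "maximin value:", min(totals), "range:", max(totals) - min(totals))
```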
169.
We introduce and study a generalization of the classic sequential testing problem, asking to identify the correct state of a given series system that consists of independent stochastic components. In this setting, costly tests are required to examine the state of individual components, which are sequentially tested until the correct system state can be uniquely identified. The goal is to propose a policy that minimizes the expected testing cost, given a priori probabilistic information on the stochastic nature of each individual component. Unlike the classic setting, where variables are tested one after the other, we allow multiple tests to be conducted simultaneously, at the expense of incurring an additional set‐up cost. The main contribution of this article consists in showing that the batch testing problem can be approximated in polynomial time within a guaranteed factor. In addition, we explain how, in spite of its highly nonlinear objective function, the batch testing problem can be formulated as an approximate integer program of polynomial size, while increasing its expected cost by at most a bounded factor. Finally, we conduct extensive computational experiments to demonstrate the practical effectiveness of these algorithms as well as to evaluate their limitations. © 2016 Wiley Periodicals, Inc. Naval Research Logistics 63: 275–286, 2016
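A minimal sketch of the classic one-at-a-time setting that this article generalizes, not the batch algorithm itself: for a series system, testing stops at the first failed component, so the expected cost of a test order can be evaluated directly, and the well-known rule of testing in non-decreasing order of c_i / (1 − p_i) gives an optimal ordering in that classic case. The probabilities and costs below are hypothetical.

```python
from itertools import permutations

def expected_cost(order, p, c):
    """Expected testing cost when components are tested one at a time in `order`.
    p[i]: probability component i works; c[i]: cost of testing component i.
    Testing stops at the first failure (the series system is then known to be down)."""
    cost, prob_reach = 0.0, 1.0
    for i in order:
        cost += prob_reach * c[i]   # component i is tested only if all earlier ones worked
        prob_reach *= p[i]
    return cost

p = [0.9, 0.5, 0.8]
c = [4.0, 2.0, 1.0]

# Classic rule for the one-at-a-time setting: test in non-decreasing c_i / (1 - p_i).
greedy = sorted(range(len(p)), key=lambda i: c[i] / (1.0 - p[i]))
best = min(permutations(range(len(p))), key=lambda o: expected_cost(o, p, c))
print(greedy, expected_cost(greedy, p, c))
print(list(best), expected_cost(best, p, c))   # brute force agrees on this instance
```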
170.
Accelerated degradation testing (ADT) is usually conducted under deterministic stresses such as constant‐stress, step‐stress, and cyclic‐stress. Based on ADT data, an ADT model is developed to predict reliability under normal (field) operating conditions. In engineering applications, the “standard” approach for reliability prediction assumes that the normal operating conditions are deterministic or simply uses the mean values of the stresses while ignoring their variability. Such an approach may lead to significant prediction errors. In this paper, we extend an ADT model obtained from constant‐stress ADT experiments to predict field reliability by considering the stress variations. A case study is provided to demonstrate the proposed statistical inference procedure. The accuracy of the procedure is verified by simulation using various distributions of field stresses. © 2006 Wiley Periodicals, Inc. Naval Research Logistics, 2006.
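A minimal sketch of the general idea of accounting for stress variability, not the paper's statistical inference procedure: assume a degradation-rate model fitted from constant-stress ADT (an Arrhenius-type temperature law is used here purely as an example), then estimate field reliability by Monte Carlo over a random field stress instead of plugging in the mean stress. All parameter values, the failure threshold, and the field temperature distribution are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical fitted model: degradation rate(T) = A * exp(-Ea / (k_B * T)),
# degradation D(t) = rate(T) * t, failure when D reaches the threshold Dc.
A, Ea, k_B, Dc = 5.0e6, 0.7, 8.617e-5, 10.0   # assumed fitted constants

def time_to_failure(T_kelvin):
    return Dc / (A * np.exp(-Ea / (k_B * T_kelvin)))

def field_reliability(t, T_sampler, n=100_000):
    """P(life > t) under a random field temperature, estimated by Monte Carlo."""
    T = T_sampler(n)
    return float(np.mean(time_to_failure(T) > t))

# Field temperature varies around 300 K; compare with the mean-stress shortcut.
T_mean = 300.0
sampler = lambda n: rng.normal(T_mean, 8.0, n)
t0 = 0.8 * time_to_failure(T_mean)
print("mean-stress prediction:", float(time_to_failure(T_mean) > t0))
print("Monte Carlo with stress variability:", field_reliability(t0, sampler))
```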