988 results for Parameter Optimization


Relevance:

20.00%

Publisher:

Abstract:

In this study we present a combinatorial optimization method for multi-robot search systems, based on particle swarm optimization combined with a local search algorithm. To balance exploration and exploitation and to guarantee global convergence, at each iteration a local search is performed whenever the distance between a robot and the target falls below a specified threshold. The local search encourages the particle to explore the surrounding local region, reaching the target in less search time. Experimental results obtained in a simulated environment show that biological and sociological inspiration can be useful in meeting the challenges of robotic applications that can be described as optimization problems.
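As a rough illustration of the hybrid scheme described above, the sketch below runs a standard PSO loop but switches a robot to a greedy local search once it comes within a threshold distance of the target. All parameter values (inertia, acceleration coefficients, threshold, step sizes) are illustrative choices, not those of the paper.

```python
import random

def pso_with_local_search(target, n_robots=5, iters=200, switch_dist=1.0, seed=0):
    """Hybrid search sketch: global PSO moves until a robot comes within
    `switch_dist` of the target, then a greedy local search finishes the
    approach. Names and parameters are illustrative."""
    rng = random.Random(seed)
    dim = len(target)
    pos = [[rng.uniform(-10, 10) for _ in range(dim)] for _ in range(n_robots)]
    vel = [[0.0] * dim for _ in range(n_robots)]
    dist = lambda p: sum((p[i] - target[i]) ** 2 for i in range(dim)) ** 0.5
    pbest = [p[:] for p in pos]
    gbest = min(pos, key=dist)[:]
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        for r in range(n_robots):
            if dist(pos[r]) < switch_dist:
                # local search: accept random perturbations that move closer
                for _ in range(50):
                    cand = [x + rng.gauss(0, 0.1) for x in pos[r]]
                    if dist(cand) < dist(pos[r]):
                        pos[r] = cand
            else:
                # standard PSO velocity and position update
                for i in range(dim):
                    vel[r][i] = (w * vel[r][i]
                                 + c1 * rng.random() * (pbest[r][i] - pos[r][i])
                                 + c2 * rng.random() * (gbest[i] - pos[r][i]))
                    pos[r][i] += vel[r][i]
            if dist(pos[r]) < dist(pbest[r]):
                pbest[r] = pos[r][:]
            if dist(pos[r]) < dist(gbest):
                gbest = pos[r][:]
    return gbest, dist(gbest)

best, err = pso_with_local_search(target=(3.0, -2.0))
```

The switch condition mirrors the abstract's trigger: once the robot-target distance drops below the threshold, refinement is handed to the local search rather than the global swarm dynamics.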

Abstract:

The co-curing process for advanced grid-stiffened (AGS) composite structures is a promising manufacturing process that can reduce manufacturing cost, augment the advantages, and improve the performance of AGS composite structures. An improved method, the soft-mold aided co-curing process, which replaces the expansion molds with a single rubber mold, is adopted in this paper. This process is capable of co-curing a typical AGS composite structure with the manufacturer's recommended cure cycle (MRCC). Numerical models are developed to evaluate the variation of temperature and degree of cure in the AGS composite structure during the soft-mold aided co-curing process. The simulation results are validated against experimental results obtained from embedded temperature sensors. Based on the validated modeling framework, the cure cycle can be optimized to less than half the duration of the MRCC while still achieving a reliable degree of cure. The effects of the shape and size of the AGS composite structure on the distribution of temperature and degree of cure are also investigated to provide insights for the optimization of the soft-mold aided co-curing process.
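The kind of cure simulation described can be sketched with an nth-order Arrhenius cure-kinetics model integrated through a two-stage temperature cycle. The kinetic constants and the cycle below are hypothetical placeholders, not the actual resin data or the MRCC.

```python
import math

# Illustrative cure-kinetics sketch: an nth-order Arrhenius model
#   d(alpha)/dt = A * exp(-E / (R * T)) * (1 - alpha)**n
# integrated with forward Euler through a simple two-stage cure cycle.
# All constants and the cycle are hypothetical assumptions.
A = 1.0e5    # pre-exponential factor (1/s), assumed
E = 6.0e4    # activation energy (J/mol), assumed
R = 8.314    # gas constant (J/(mol K))
n = 1.5      # reaction order, assumed

def cure_profile(t):
    """Hold 2 h at 120 C, then 2 h at 180 C (temperatures in kelvin)."""
    return 393.15 if t < 7200 else 453.15

def degree_of_cure(t_end, dt=1.0):
    alpha, t = 0.0, 0.0
    while t < t_end:
        T = cure_profile(t)
        alpha += dt * A * math.exp(-E / (R * T)) * (1.0 - alpha) ** n
        alpha = min(alpha, 1.0)   # degree of cure cannot exceed 1
        t += dt
    return alpha

final_alpha = degree_of_cure(14400.0)  # degree of cure after the 4-hour cycle
```

Shortening the cycle and re-integrating shows directly how much hold time can be removed while the final degree of cure stays acceptable, which is the optimization the abstract describes.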

Abstract:

The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed in which it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, which is afforded by the use of the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered to demonstrate the performance of total entropy in comparison to utilities for model discrimination and parameter estimation alone. The results suggest that the total entropy utility selects designs that are efficient under both experimental goals, with little compromise in achieving either. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
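A minimal sketch of the particle idea: the expected information gain (one ingredient of entropy-based utilities) for a toy one-parameter Bernoulli model is estimated by Monte Carlo, with the intractable evidence term approximated by an average over the particle set. The model, the candidate designs, and the particle count are all illustrative, not the paper's.

```python
import math
import random

def mutual_info(design, n_particles=2000, seed=1):
    """Particle (Monte Carlo) estimate of the expected information gain
    U(d) = E[log p(y|theta,d) - log p(y|d)] for a toy model
    y ~ Bernoulli(sigmoid(theta*d)) with prior theta ~ N(0,1).
    Purely illustrative stand-in for a total-entropy computation."""
    rng = random.Random(seed)
    sig = lambda x: 1.0 / (1.0 + math.exp(-x))
    thetas = [rng.gauss(0, 1) for _ in range(n_particles)]
    ps = [sig(t * design) for t in thetas]
    mean_p = sum(ps) / n_particles       # particle estimate of p(y=1|d)
    total = 0.0
    for p in ps:
        y = 1 if rng.random() < p else 0 # simulate a dataset for this particle
        lik = p if y else 1.0 - p
        ev = mean_p if y else 1.0 - mean_p
        total += math.log(lik) - math.log(ev)
    return total / n_particles

# rank three candidate designs by estimated utility
utilities = {d: mutual_info(d) for d in (0.0, 1.0, 3.0)}
best_design = max(utilities, key=utilities.get)
```

The design d = 0 makes the data uninformative (utility zero), so the estimator should rank the larger designs above it, which is the kind of comparison a sequential design procedure makes at every step.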

Abstract:

In this paper we have used simulations to make a conjecture about the coverage of a t-dimensional subspace of a d-dimensional parameter space of size n when performing k trials of Latin Hypercube sampling. This takes the form P(k,n,d,t) = 1 - e^(-k/n^(t-1)). We suggest that this coverage formula is independent of d and this allows us to make connections between building Populations of Models and Experimental Designs. We also show that Orthogonal sampling is superior to Latin Hypercube sampling in terms of allowing a more uniform coverage of the t-dimensional subspace at the sub-block size level. These ideas have particular relevance when attempting to perform uncertainty quantification and sensitivity analyses.
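The conjectured coverage formula is easy to probe numerically. The sketch below counts covered cells of a t-dimensional projection after k Latin Hypercube designs and compares the fraction against P(k,n,d,t) = 1 - e^(-k/n^(t-1)); the parameter values are arbitrary test settings.

```python
import math
import random

def lhs(n, d, rng):
    """One Latin Hypercube design: n points in [0,1)^d, exactly one point
    per bin along every axis."""
    perms = [rng.sample(range(n), n) for _ in range(d)]
    return [[(perms[j][i] + rng.random()) / n for j in range(d)]
            for i in range(n)]

def coverage(k, n, d, t, seed=0):
    """Fraction of the n**t cells of a t-dimensional projection that are
    hit by at least one point across k LHS designs."""
    rng = random.Random(seed)
    cells = set()
    for _ in range(k):
        for p in lhs(n, d, rng):
            # project onto the first t axes and record the occupied cell
            cells.add(tuple(int(p[j] * n) for j in range(t)))
    return len(cells) / n ** t

k, n, d, t = 50, 10, 5, 2
observed = coverage(k, n, d, t)
predicted = 1 - math.exp(-k / n ** (t - 1))   # conjectured P(k,n,d,t)
```

Because each LHS design hits every one-dimensional bin exactly once, a given t-dimensional cell is hit with probability about 1/n^(t-1) per design, which is where the exponent in the conjecture comes from; varying d leaves the projection statistics unchanged.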

Abstract:

Stochastic (or random) processes are inherent to numerous fields of human endeavour including engineering, science, and business and finance. This thesis presents multiple novel methods for quickly detecting and estimating uncertainties in several important classes of stochastic processes. The significance of these novel methods is demonstrated by employing them to detect aircraft manoeuvres in video signals in the important application of autonomous mid-air collision avoidance.

Abstract:

In this paper we provide estimates for the coverage of parameter space when using Latin Hypercube Sampling, which forms the basis of building so-called populations of models. The estimates are obtained using combinatorial counting arguments to determine how many trials, k, are needed in order to obtain specified parameter space coverage for a given value of the discretisation size n. In the case of two dimensions, we show that if the ratio φ = k/n of trials to discretisation size is greater than 1, then as n becomes moderately large the fractional coverage behaves as 1 - e^(-φ). We compare these estimates with simulation results obtained from an implementation of Latin Hypercube Sampling using MATLAB.
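A minimal re-creation of that comparison (in Python rather than MATLAB): for each ratio φ = k/n, count the fraction of the n×n grid cells hit by k Latin Hypercube designs and compare with 1 - e^(-φ). The grid size is an arbitrary test value.

```python
import math
import random

def lhs_coverage_2d(k, n, seed=0):
    """Fraction of the n*n cells covered by k two-dimensional Latin
    Hypercube designs of size n; each design pairs row bin i with a
    randomly permuted column bin."""
    rng = random.Random(seed)
    hit = set()
    for _ in range(k):
        cols = rng.sample(range(n), n)   # random permutation of column bins
        hit.update((i, cols[i]) for i in range(n))
    return len(hit) / (n * n)

n = 40
# map each ratio phi to (simulated coverage, predicted coverage 1 - e^-phi)
results = {phi: (lhs_coverage_2d(phi * n, n), 1 - math.exp(-phi))
           for phi in (1, 2, 3)}
```

Each design covers a given cell with probability 1/n, so after k = φn designs the uncovered fraction is roughly (1 - 1/n)^(φn) ≈ e^(-φ), matching the asymptotic form stated above.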

Abstract:

This paper demonstrates procedures for the probabilistic assessment of a pesticide fate and transport model, PCPF-1, to elucidate the modeling uncertainty using the Monte Carlo technique. Sensitivity analyses are performed to investigate the influence of herbicide characteristics and related soil properties on model outputs using four popular rice herbicides: mefenacet, pretilachlor, bensulfuron-methyl and imazosulfuron. Uncertainty quantification showed that the simulated concentrations in paddy water varied more than those in paddy soil. This tendency decreased as the simulation proceeded to a later period, but remained important for herbicides having either high solubility or a high 1st-order dissolution rate. The sensitivity analysis indicated that the PCPF-1 parameters requiring careful determination are primarily those involved in herbicide adsorption (the organic carbon content, the bulk density and the volumetric saturated water content), secondarily the parameters related to herbicide mass distribution between paddy water and soil (the 1st-order desorption and dissolution rates), and lastly those involving herbicide degradation. © Pesticide Science Society of Japan.
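The Monte Carlo procedure can be illustrated with a deliberately simplified stand-in for PCPF-1: a one-box linear-sorption model whose inputs (Koc, organic carbon fraction, bulk density, saturated water content) are sampled from uniform ranges, with sensitivity summarized by input-output correlations. The model structure and all parameter ranges below are hypothetical, not the PCPF-1 equations.

```python
import math
import random

def toy_fate_model(koc, foc, rho_b, theta_s, mass=100.0):
    """Hypothetical one-box partitioning model standing in for PCPF-1:
    applied mass splits between paddy water and soil via a linear sorption
    coefficient Kd = Koc * f_oc. Purely illustrative."""
    kd = koc * foc                      # sorption coefficient (L/kg)
    water_vol = 0.05 * 1000.0           # 5 cm ponding over 1 m^2, litres
    soil_mass = rho_b * 0.01 * 1000.0   # 1 cm mixing layer, kg
    # sorbed capacity scaled by water content (illustrative choice)
    return mass / (water_vol + kd * soil_mass / theta_s)

rng = random.Random(42)
samples, outputs = [], []
for _ in range(2000):                   # Monte Carlo sampling of the inputs
    p = (rng.uniform(50, 500),          # Koc
         rng.uniform(0.01, 0.05),       # organic carbon fraction
         rng.uniform(0.8, 1.4),         # bulk density (g/cm^3)
         rng.uniform(0.4, 0.7))         # saturated water content
    samples.append(p)
    outputs.append(toy_fate_model(*p))

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

# simple sensitivity ranking: |correlation| of each input with the output
sens = [abs(corr([s[i] for s in samples], outputs)) for i in range(4)]
```

Even this crude ranking reproduces the qualitative conclusion of the abstract: the adsorption-related inputs dominate the variability of the paddy-water concentration.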

Abstract:

This paper presents a chance-constrained linear programming formulation for the operation of a multipurpose reservoir. The release policy is defined by a chance constraint: the probability that the irrigation release in any period equals or exceeds the irrigation demand must be at least a specified value P (called the reliability level). The model determines the maximum annual hydropower that can be produced while meeting the irrigation demand at the specified reliability level. The model considers variation in reservoir water level elevation and the operating range within which the turbine operates. A linear approximation of the nonlinear power production function is used, and the solution is obtained within a specified tolerance limit. The inflow into the reservoir is treated as random. The chance constraint is converted into its deterministic equivalent using a linear decision rule and the inflow probability distribution. The model's application is demonstrated through a case study.
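The conversion of the chance constraint to its deterministic equivalent can be sketched for a single period: with normally distributed inflow, requiring a storage constraint to hold with probability P amounts to replacing the random inflow by its (1 - P)-quantile. The numbers below are hypothetical, and the full model is of course a multi-period LP rather than this one-period calculation.

```python
from statistics import NormalDist

# Deterministic-equivalent sketch of a single-period chance constraint.
#   P( S0 + I - R_irr - R_pow >= S_min ) >= P,  with inflow I ~ Normal(mu, sigma),
# becomes the deterministic constraint
#   S0 + q_(1-P) - R_irr - R_pow >= S_min,
# where q_(1-P) is the (1-P)-quantile of the inflow distribution.
# All storage/demand/inflow figures are hypothetical.

def max_power_release(S0, S_min, demand, mu, sigma, reliability):
    q = NormalDist(mu, sigma).inv_cdf(1.0 - reliability)  # "safe" low inflow
    return max(0.0, S0 + q - demand - S_min)

release_90 = max_power_release(S0=120.0, S_min=30.0, demand=40.0,
                               mu=50.0, sigma=15.0, reliability=0.90)
release_99 = max_power_release(S0=120.0, S_min=30.0, demand=40.0,
                               mu=50.0, sigma=15.0, reliability=0.99)
```

Raising the reliability level shrinks the "safe" inflow quantile and therefore the water available for power generation, which is the trade-off the full model optimizes over the year.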

Abstract:

A fuzzy waste-load allocation model, FWLAM, is developed for water quality management of a river system using fuzzy multiple-objective optimization. An important feature of this model is its capability to incorporate the aspirations and conflicting objectives of the pollution control agency and dischargers. The vagueness associated with specifying the water quality criteria and fraction removal levels is modeled in a fuzzy framework. The goals related to the pollution control agency and dischargers are expressed as fuzzy sets. The membership functions of these fuzzy sets are considered to represent the variation of satisfaction levels of the pollution control agency and dischargers in attaining their respective goals. Two formulations—namely, the MAX-MIN and MAX-BIAS formulations—are proposed for FWLAM. The MAX-MIN formulation maximizes the minimum satisfaction level in the system. The MAX-BIAS formulation maximizes a bias measure, giving a solution that favors the dischargers. Maximization of the bias measure attempts to keep the satisfaction levels of the dischargers away from the minimum satisfaction level and that of the pollution control agency close to the minimum satisfaction level. Most of the conventional water quality management models use waste treatment cost curves that are uncertain and nonlinear. Unlike such models, FWLAM avoids the use of cost curves. Further, the model provides the flexibility for the pollution control agency and dischargers to specify their aspirations independently.
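The MAX-MIN idea reduces to: maximize λ subject to each goal's membership value being at least λ. A toy one-variable version with linear membership functions (fully illustrative numbers: the agency is fully satisfied at 90% removal, the discharger at 30%) can be solved by a grid scan:

```python
def max_min_satisfaction(mu_agency, mu_discharger, grid=10001):
    """MAX-MIN sketch: choose the fractional removal level x in [0, 1]
    that maximizes the minimum of the two membership (satisfaction)
    functions. Memberships are illustrative fuzzy goals, not real data."""
    best_x, best_lambda = 0.0, -1.0
    for i in range(grid):
        x = i / (grid - 1)
        lam = min(mu_agency(x), mu_discharger(x))
        if lam > best_lambda:
            best_x, best_lambda = x, lam
    return best_x, best_lambda

# agency wants high removal (satisfaction rises from 30% to 90% removal),
# discharger wants low removal (satisfaction falls over the same range);
# both memberships are clipped to [0, 1]
mu_agency = lambda x: min(1.0, max(0.0, (x - 0.3) / 0.6))
mu_discharger = lambda x: min(1.0, max(0.0, (0.9 - x) / 0.6))

x_star, lambda_star = max_min_satisfaction(mu_agency, mu_discharger)
```

With these symmetric linear goals the MAX-MIN solution sits at the crossing point of the two memberships; the MAX-BIAS variant would instead push the solution toward the discharger's side of this intersection.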

Abstract:

Some of the well-known formulations for topology optimization of compliant mechanisms can lead to lumped compliant mechanisms. In lumped compliance, most of the elastic deformation in a mechanism occurs at a few points, while the rest of the mechanism remains more or less rigid. Such points are referred to as point-flexures. It has been noted in the literature that high relative rotation is associated with point-flexures. The literature also offers a formulation of a local constraint on relative rotations to avoid lumped compliance. However, it is well known that a global constraint is easier for a numerical optimization algorithm to handle than a local constraint. The current work presents a way of imposing a global constraint on relative rotations. This constraint is also simpler to implement since it uses the linearized rotation at the center of finite elements to compute relative rotations. I show the results obtained by using this constraint on the following benchmark problems: the displacement inverter and the gripper.

Abstract:

The purpose of this article is to show the applicability and benefits of design-of-experiments techniques as an optimization tool for discrete simulation models. Simulated systems are computational representations of real-life systems; their defining characteristic is a constant evolution driven by the occurrence of discrete events over time. In this study, a production system designed under the JIT (Just in Time) business philosophy is used, which seeks to achieve excellence in organizations through waste reduction in all operational aspects. The most typical tool of JIT systems is KANBAN production control, which seeks to synchronize demand with the flow of materials, minimize work in process, and define production metrics. Using experimental design techniques for stochastic optimization, the impact of the operational factors on the efficiency of the KANBAN/CONWIP simulation model is analyzed. The results show the effectiveness of integrating experimental design techniques with discrete simulation models in the calculation of the operational parameters. Furthermore, the reliability of the resulting methodologies was improved with an additional statistical consideration.
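As a toy version of that integration, the sketch below runs a crude CONWIP-capped two-station line simulator at the four corners of a 2x2 factorial design and computes the main effect of each factor on throughput. The simulator and its settings are illustrative and far simpler than a real KANBAN/CONWIP model.

```python
import random

def conwip_throughput(wip_cap, service_rate, horizon=5000.0, seed=0):
    """Toy two-station serial line with a CONWIP cap: a new job enters only
    when total WIP is below `wip_cap`. Returns jobs completed per time unit.
    Note: in this sequential toy line the cap rarely binds, so its effect is
    expected to be near zero; the service rate is the active factor."""
    rng = random.Random(seed)
    t, done = 0.0, 0
    queue1 = queue2 = wip = 0
    while t < horizon:
        if wip < wip_cap:
            queue1 += 1
            wip += 1
        if queue1:                           # station 1 processes one job
            t += rng.expovariate(service_rate)
            queue1 -= 1
            queue2 += 1
        if queue2:                           # station 2 processes one job
            t += rng.expovariate(service_rate)
            queue2 -= 1
            wip -= 1
            done += 1
        t += 0.01                            # small step to guarantee progress
    return done / t

# 2x2 full factorial design: low/high WIP cap, low/high service rate
runs = {(cap, rate): conwip_throughput(cap, rate)
        for cap in (2, 8) for rate in (0.5, 1.0)}

# main effect = mean response at the high level minus at the low level
effect_cap = (runs[(8, 0.5)] + runs[(8, 1.0)]) / 2 - (runs[(2, 0.5)] + runs[(2, 1.0)]) / 2
effect_rate = (runs[(2, 1.0)] + runs[(8, 1.0)]) / 2 - (runs[(2, 0.5)] + runs[(8, 0.5)]) / 2
```

The effect estimates are exactly the quantities a factorial analysis feeds into significance tests; with replicated runs at different seeds one could add the variance estimates the article's methodology relies on.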

Abstract:

The LISA Parameter Estimation Taskforce was formed in September 2007 to provide the LISA Project with vetted codes, source distribution models and results related to parameter estimation. The Taskforce's goal is to be able to quickly calculate the impact of any mission design changes on LISA's science capabilities, based on reasonable estimates of the distribution of astrophysical sources in the universe. This paper describes our Taskforce's work on massive black-hole binaries (MBHBs). Given present uncertainties in the formation history of MBHBs, we adopt four different population models, based on (i) whether the initial black-hole seeds are small or large and (ii) whether accretion is efficient or inefficient at spinning up the holes. We compare four largely independent codes for calculating LISA's parameter-estimation capabilities. All codes are based on the Fisher-matrix approximation, but in the past they used somewhat different signal models, source parametrizations and noise curves. We show that once these differences are removed, the four codes give results in extremely close agreement with each other. Using a code that includes both spin precession and higher harmonics in the gravitational-wave signal, we carry out Monte Carlo simulations and determine the number of events that can be detected and accurately localized in our four population models.
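The Fisher-matrix approximation underlying all four codes can be illustrated on a drastically simplified signal: a monochromatic sinusoid in white noise, with the parameter covariance taken as the inverse Fisher matrix (the Cramer-Rao bound). Everything here (signal model, sampling, noise level) is a toy stand-in for a LISA waveform code, not any of the Taskforce implementations.

```python
import math

def fisher_2x2(params, times, sigma, h, eps=1e-6):
    """Fisher-matrix sketch for a signal in white noise:
    F_ij = sum_t (dh/dp_i)(dh/dp_j) / sigma**2, with derivatives taken
    by central finite differences. Illustrative two-parameter case."""
    def deriv(i, t):
        p_hi = list(params); p_hi[i] += eps
        p_lo = list(params); p_lo[i] -= eps
        return (h(p_hi, t) - h(p_lo, t)) / (2 * eps)
    return [[sum(deriv(i, t) * deriv(j, t) for t in times) / sigma ** 2
             for j in range(2)] for i in range(2)]

def inv_2x2(M):
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [[M[1][1] / det, -M[0][1] / det],
            [-M[1][0] / det, M[0][0] / det]]

# toy "waveform": amplitude A and frequency f, h(t) = A*sin(2*pi*f*t)
h = lambda p, t: p[0] * math.sin(2 * math.pi * p[1] * t)
times = [i * 0.01 for i in range(1000)]        # 10 s sampled at 100 Hz
F = fisher_2x2([1.0, 1.5], times, sigma=0.1, h=h)
cov = inv_2x2(F)                               # Cramer-Rao covariance
sigma_A, sigma_f = math.sqrt(cov[0][0]), math.sqrt(cov[1][1])
```

Swapping in richer signal models (spin precession, higher harmonics) changes only `h` and the parameter list; the Fisher machinery stays the same, which is why the four codes agree once their signal models and noise curves are aligned.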

Abstract:

The theoretical optimization of the design parameters N_A, N_D and W_P has been done for the efficient operation of an Au-p-n Si solar cell, including thermionic field emission, the dependence of lifetime and mobility on impurity concentrations, the dependence of the absorption coefficient on wavelength, and the variation of barrier height and hence the optimum thickness of the p region with illumination. The optimized design parameters N_D = 5×10^20 m^-3, N_A = 3×10^24 m^-3 and W_P = 11.8 nm yield efficiency η = 17.1% (AM0) and η = 19.6% (AM1). These are reduced to 14.9% and 17.1% respectively when the metal layer series resistance and the transmittance with a ZnS antireflection coating are included. A practical value of W_P = 97.0 nm gives an efficiency of 12.2% (AM1).

Abstract:

Simultaneous consideration of both performance and reliability issues is important in the choice of computer architectures for real-time aerospace applications. One of the requirements for such a fault-tolerant computer system is the characteristic of graceful degradation. A shared and replicated resources computing system represents such an architecture. In this paper, a combinatorial model is used for the evaluation of the instruction execution rate of a degradable, replicated-resources computing system such as a modular multiprocessor system. Next, a method is presented to evaluate the computation reliability of such a system utilizing a reliability graph model and the instruction execution rate. Finally, this computation reliability measure, which simultaneously describes both performance and reliability, is applied as a constraint in an architecture optimization model for such computing systems. Index Terms: architecture optimization, computation
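The flavor of such a combinatorial performance model can be conveyed with a small sketch: processors and shared memory modules fail independently, and the instruction execution rate in each surviving configuration is taken to be proportional to the scarcer resource. The min(i, j) rate law and the survival probabilities are illustrative assumptions, not the paper's exact model.

```python
from math import comb

def expected_execution_rate(n_proc, n_mem, p_proc, p_mem, unit_rate=1.0):
    """Combinatorial sketch for a degradable shared-resource multiprocessor:
    with i working processors and j working memory modules, assume the
    instruction rate is unit_rate * min(i, j) (the scarcer resource is the
    bottleneck). p_proc and p_mem are per-module survival probabilities.
    Sums the rate over all binomially weighted configurations."""
    rate = 0.0
    for i in range(n_proc + 1):
        for j in range(n_mem + 1):
            prob = (comb(n_proc, i) * p_proc**i * (1 - p_proc)**(n_proc - i)
                    * comb(n_mem, j) * p_mem**j * (1 - p_mem)**(n_mem - j))
            rate += prob * unit_rate * min(i, j)
    return rate

full = expected_execution_rate(4, 4, 1.0, 1.0)      # no failures
degraded = expected_execution_rate(4, 4, 0.9, 0.95)  # graceful degradation
```

An architecture optimizer can then treat `expected_execution_rate(...) >= threshold` as the performance-reliability constraint while varying the numbers of replicated processors and memories.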

Abstract:

The method of generalized estimating equations (GEEs) provides consistent estimates of the regression parameters in a marginal regression model for longitudinal data, even when the working correlation model is misspecified (Liang and Zeger, 1986). However, the efficiency of a GEE estimate can be seriously affected by the choice of the working correlation model. This study addresses this problem by proposing a hybrid method that combines multiple GEEs based on different working correlation models, using the empirical likelihood method (Qin and Lawless, 1994). Analyses show that this hybrid method is more efficient than a GEE using a misspecified working correlation model. Furthermore, if one of the working correlation structures correctly models the within-subject correlations, then this hybrid method provides the most efficient parameter estimates. In simulations, the hybrid method's finite-sample performance is superior to a GEE under any of the commonly used working correlation models and is almost fully efficient in all scenarios studied. The hybrid method is illustrated using data from a longitudinal study of the respiratory infection rates in 275 Indonesian children.
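A stripped-down version of a single GEE fit (one step, with the working correlation treated as known) for a linear marginal model shows the mechanics: the estimating equation sum_i X_i' V_i^-1 (y_i - X_i b) = 0 is solved using the closed-form inverse of an exchangeable (compound-symmetry) correlation matrix. The simulated data and the fixed rho are illustrative; a real GEE iterates and estimates rho from residuals, as in Liang and Zeger (1986).

```python
import random

def simulate(n_subj=200, m=4, beta=(1.0, 2.0), rho=0.6, seed=3):
    """Correlated longitudinal data: y = b0 + b1*x + subject effect + noise,
    with intra-subject correlation rho. Illustrative data generator."""
    rng = random.Random(seed)
    data = []
    for _ in range(n_subj):
        u = rng.gauss(0, rho ** 0.5)            # shared subject effect
        subj = []
        for _ in range(m):
            x = rng.uniform(-1, 1)
            y = beta[0] + beta[1] * x + u + rng.gauss(0, (1 - rho) ** 0.5)
            subj.append((x, y))
        data.append(subj)
    return data

def gee_linear(data, rho):
    """One-step GEE sketch with an exchangeable working correlation:
    V^-1 = a*I + b*J in closed form, so no general matrix inversion needed."""
    m = len(data[0])
    a = 1.0 / (1.0 - rho)
    b = -rho / ((1.0 - rho) * (1.0 + (m - 1) * rho))
    S = [[0.0, 0.0], [0.0, 0.0]]
    r = [0.0, 0.0]
    for subj in data:
        xs = [(1.0, x) for x, _ in subj]
        ys = [y for _, y in subj]
        col = [sum(c) for c in zip(*xs)]        # per-column sums (the J part)
        ysum = sum(ys)
        for p in range(2):
            for q in range(2):
                S[p][q] += a * sum(xs[t][p] * xs[t][q] for t in range(m)) \
                           + b * col[p] * col[q]
            r[p] += a * sum(xs[t][p] * ys[t] for t in range(m)) + b * col[p] * ysum
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    return ((S[1][1] * r[0] - S[0][1] * r[1]) / det,
            (S[0][0] * r[1] - S[1][0] * r[0]) / det)

data = simulate()
b0, b1 = gee_linear(data, rho=0.6)
```

The hybrid method described above would run this kind of fit under several working correlation structures and combine the resulting estimating equations through empirical likelihood rather than committing to one structure.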