167 results for Utility optimization


Relevance:

20.00%

Publisher:

Abstract:

The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance, population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management action in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision-making tools can choose actions to favor such learning in two ways: implicitly, via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly, by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives - a pure management objective, a pure learning objective, and an objective that is a weighted mixture of these two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision-making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision-making tools can be improved. © 2010 Elsevier Ltd.
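The weighted mixture of a management objective and a learning objective can be illustrated with a minimal sketch. The actions, model names, weights and the variance-based proxy for information gain below are hypothetical illustrations, not quantities from the paper.

```python
# Minimal sketch of a weighted management/learning objective under model
# uncertainty. All quantities are illustrative, not taken from the paper.

def expected_growth(action, model_probs, growth_by_model):
    """Management objective: expected population growth rate of an action,
    averaged over competing system models."""
    return sum(p * growth_by_model[m][action] for m, p in model_probs.items())

def expected_info_gain(action, model_probs, growth_by_model):
    """Learning objective (illustrative proxy): spread of predictions across
    models -- actions whose outcomes differ most between models teach us most."""
    preds = [growth_by_model[m][action] for m in model_probs]
    mean = sum(preds) / len(preds)
    return sum((x - mean) ** 2 for x in preds) / len(preds)

def mixed_objective(action, w, model_probs, growth_by_model):
    """Weighted mixture: w = 1 is pure management, w = 0 is pure learning."""
    return (w * expected_growth(action, model_probs, growth_by_model)
            + (1 - w) * expected_info_gain(action, model_probs, growth_by_model))

# Two competing models of how the system responds to two candidate actions.
growth_by_model = {"model_A": {"burn": 1.10, "graze": 1.02},
                   "model_B": {"burn": 0.95, "graze": 1.04}}
model_probs = {"model_A": 0.5, "model_B": 0.5}

best = max(["burn", "graze"],
           key=lambda a: mixed_objective(a, 0.7, model_probs, growth_by_model))
print(best)
```

Sweeping the weight w from 1 to 0 traces the spectrum from the pure management objective to the pure learning objective described in the abstract.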

Relevance:

20.00%

Publisher:

Abstract:

Discounted Cumulative Gain (DCG) is a well-known ranking evaluation measure for models built with multi-graded relevance data. By treating the tagging data used in recommendation systems as an ordinal relevance set of {negative, null, positive}, we propose to build a DCG-based recommendation model. We present an efficient and novel learning-to-rank method that optimizes DCG for a recommendation model using this interpretation of tagging data. Evaluating the proposed method on real-world datasets, we demonstrate that it is scalable and outperforms the benchmark methods by generating a quality top-N item recommendation list.
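DCG itself is a standard measure; a minimal sketch of scoring a top-N list under a graded tag interpretation is given below. The mapping of {negative, null, positive} to numeric gains is an assumption made for illustration and may differ from the paper's scheme.

```python
import math

# Minimal DCG/NDCG sketch. The gain mapping for the tag labels is an
# illustrative assumption, not necessarily the paper's interpretation scheme.
GAIN = {"negative": 0, "null": 1, "positive": 2}

def dcg_at_k(ranked_tags, k):
    """Discounted Cumulative Gain of the top-k ranked items."""
    return sum(GAIN[tag] / math.log2(i + 2)       # i = 0 -> discount log2(2) = 1
               for i, tag in enumerate(ranked_tags[:k]))

# A ranked top-5 recommendation list labelled by the user's tagging feedback.
ranking = ["positive", "null", "positive", "negative", "null"]
ideal = sorted(ranking, key=lambda t: GAIN[t], reverse=True)
ndcg = dcg_at_k(ranking, 5) / dcg_at_k(ideal, 5)  # normalised DCG in [0, 1]
print(round(ndcg, 3))
```

A learning-to-rank method in this spirit would adjust model parameters to push high-gain (positively tagged) items toward the top ranks, where the logarithmic discount is smallest.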

Relevance:

20.00%

Publisher:

Abstract:

Many complex aeronautical design problems can be formulated and solved with efficient multi-objective evolutionary optimization methods and game strategies. This book describes the role of advanced innovative evolutionary tools in finding the solution, or the set of solutions, of single- or multi-disciplinary optimization problems. These tools use the concepts of multi-population, asynchronous parallelization and hierarchical topology, which allow different models (precise, intermediate and approximate) to be used, with each node of the hierarchy handled by a different Evolutionary Algorithm. The efficiency of evolutionary algorithms for both single- and multi-objective optimization problems is significantly improved by coupling EAs with games, and in particular by a new dynamic methodology named “Hybridized Nash-Pareto games”. Multi-objective optimization techniques and robust design problems taking uncertainties into account are introduced and explained in detail. Several applications dealing with civil aircraft and UAV/UCAV systems are implemented numerically and discussed. Applications of increasing optimization complexity are presented, as well as two hands-on test-case problems. These examples focus on aeronautical applications and will be useful to the practitioner in the laboratory or in industrial design environments. The evolutionary methods coupled with games presented in this volume can also be applied to other areas, including surface and marine transport, structures, biomedical engineering, renewable energy and environmental problems.
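As a small illustration of the multi-objective machinery these methods build on, the sketch below filters a set of candidate designs down to its Pareto (non-dominated) front for two minimisation objectives. It is a generic textbook routine, not the book's hybridised Nash-Pareto game strategy, and the design scores are invented.

```python
# Minimal Pareto-front filter for two minimisation objectives (illustrative only;
# not the hybridised Nash-Pareto game methodology described in the book).

def dominates(a, b):
    """True if design a is at least as good as b in every objective
    and strictly better in at least one (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only the designs that no other design dominates."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

# Candidate designs scored on (drag coefficient, structural weight) -- hypothetical.
designs = [(0.021, 950.0), (0.019, 1010.0), (0.024, 900.0), (0.022, 990.0)]
print(pareto_front(designs))
```

An evolutionary multi-objective algorithm repeatedly applies this kind of dominance test to steer its population toward the trade-off front rather than toward a single optimum.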

Relevance:

20.00%

Publisher:

Abstract:

For wind farm optimization on land belonging to different owners, the traditional penalty method is highly dependent on the type of wind farm land division, and its application can be cumbersome if the divisions are complex. To overcome this disadvantage, a new method is proposed in this paper for the first time. Unlike the penalty method, which requires adding a penalizing term when evaluating the fitness function, the new method repairs infeasible solutions before fitness evaluation. To assess its effectiveness for wind farm optimization, the results of the different methods are compared for three different types of wind farm division. Different wind scenarios are also incorporated during optimization: (i) constant wind speed and wind direction; (ii) variable wind speed and wind direction; and (iii) the more realistic Weibull distribution. Results show that the performance of the new method varies for different land plots in the tested cases. Nevertheless, optimum, or at least close to optimum, results can be obtained with a sequential land-plot study using the new method in all cases. It is concluded that satisfactory results can be achieved using the proposed method. In addition, it has the advantage of flexibility in managing the wind farm design: it not only frees users from defining the penalty parameter but also imposes no limitations on the wind farm division.
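A minimal sketch of the repair idea (as opposed to penalising infeasible layouts) is given below: turbines that fall outside the available land plots are snapped to the nearest feasible grid cell before the layout is evaluated. The grid, plot mask and layout are placeholders; the paper's wake model, land divisions and collision handling are not reproduced.

```python
import math

# Repair-based constraint handling for a wind-farm layout on a discrete grid.
# The availability mask and example layout are illustrative placeholders.

# 5 x 5 grid: True marks cells belonging to plots available for turbines.
AVAILABLE = [[True,  True,  False, False, True],
             [True,  True,  False, False, True],
             [False, False, False, True,  True],
             [True,  True,  True,  True,  False],
             [True,  False, True,  True,  False]]

def repair(layout):
    """Move every turbine on an unavailable cell to the nearest available cell
    (collision handling between turbines is omitted in this sketch)."""
    feasible = [(r, c) for r in range(5) for c in range(5) if AVAILABLE[r][c]]
    repaired = []
    for (r, c) in layout:
        if AVAILABLE[r][c]:
            repaired.append((r, c))
        else:
            repaired.append(min(feasible,
                                key=lambda rc: math.hypot(rc[0] - r, rc[1] - c)))
    return repaired

layout = [(0, 0), (1, 2), (4, 4)]      # second and third turbines are infeasible
print(repair(layout))                  # fitness would be evaluated only after repair
```

Because every candidate is made feasible before scoring, no penalty coefficient has to be tuned for each land-division scenario, which is the flexibility advantage the abstract highlights.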

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To compare small nerve fiber damage in the central cornea and the whorl area in participants with diabetic peripheral neuropathy (DPN) and to examine the accuracy of evaluating these two anatomical sites for the diagnosis of DPN. Methods: A cohort of 187 participants (107 with type 1 diabetes and 80 controls) was enrolled. The neuropathy disability score (NDS) was used for the identification of DPN. The corneal nerve fiber length at the central cornea (CNFLcenter) and the whorl (CNFLwhorl) was quantified using corneal confocal microscopy and a fully automated morphometric technique and compared according to DPN status. Receiver operating characteristic analyses were used to compare the accuracy of the two corneal locations for the diagnosis of DPN. Results: CNFLcenter and CNFLwhorl were able to differentiate all three groups (diabetic participants with and without DPN and controls) (P < 0.001). There was a weak but significant linear relationship for CNFLcenter and CNFLwhorl versus NDS (P < 0.001); however, the corneal location × NDS interaction was not statistically significant (P = 0.17). The area under the receiver operating characteristic curve was similar for CNFLcenter and CNFLwhorl (0.76 and 0.77, respectively, P = 0.98). The sensitivity and specificity of the cutoff points were 0.9 and 0.5 for CNFLcenter and 0.8 and 0.6 for CNFLwhorl. Conclusions: Small nerve fiber pathology is comparable at the central and whorl anatomical sites of the cornea. Quantification of CNFL from the corneal center is as accurate as CNFL quantification of the whorl area for the diagnosis of DPN.
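The diagnostic comparison rests on standard ROC quantities. A minimal sketch of computing sensitivity and specificity at a cutoff, plus the ROC area via the rank formulation, is shown below; all measurement values are invented for illustration and are unrelated to the study data.

```python
# Sensitivity, specificity and ROC area for a continuous diagnostic measurement
# where lower values suggest disease (as with CNFL and DPN). Illustrative data only.

def sens_spec(values, labels, cutoff):
    """labels: 1 = DPN, 0 = no DPN; test is positive when value <= cutoff."""
    tp = sum(v <= cutoff and y == 1 for v, y in zip(values, labels))
    fn = sum(v > cutoff and y == 1 for v, y in zip(values, labels))
    tn = sum(v > cutoff and y == 0 for v, y in zip(values, labels))
    fp = sum(v <= cutoff and y == 0 for v, y in zip(values, labels))
    return tp / (tp + fn), tn / (tn + fp)

def auc(values, labels):
    """Probability that a random case scores lower than a random control
    (rank formulation of the area under the ROC curve)."""
    cases = [v for v, y in zip(values, labels) if y == 1]
    controls = [v for v, y in zip(values, labels) if y == 0]
    wins = sum((c < d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))

cnfl = [8.1, 10.2, 14.5, 16.0, 12.3, 17.8, 9.4, 15.1]   # hypothetical CNFL values
dpn  = [1,   1,    0,    0,    1,    0,    1,    0]
print(sens_spec(cnfl, dpn, cutoff=12.5), round(auc(cnfl, dpn), 2))
```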

Relevance:

20.00%

Publisher:

Abstract:

This study proposes an optimized design approach in which a model of a specially shaped composite tank for spacecraft is built using finite element analysis. The composite layers are preliminarily designed by combining the quasi-network design method with numerical simulation, which determines the ratio between the angles and the thicknesses of the layers as the initial values for the optimization. An adaptive simulated annealing algorithm is then used to optimize the ply angles and the number of layers at each angle so as to minimize the weight of the structure. Based on this, the stacking sequence of the composite layers is formulated from the number of layers in the optimized structure by applying the enumeration method and combining the general design parameters. Numerical simulation is finally used to calculate the buckling limit of tanks obtained with the different design methods. The study takes a composite tank with a cone-shaped cylinder body as an example, in which the ellipsoidal head section and the outer wall plate are selected as the objects used to validate the method. The results show that the quasi-network design method can improve the design quality of the composite layup in tanks with complex preliminary loading conditions. The adaptive simulated annealing algorithm reduces the initial design weight by 30%, effectively probing for the global optimum and optimizing the weight of the structure. It is therefore demonstrated that this optimization method is capable of designing and optimizing specially shaped composite tanks with complex loading conditions.
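A minimal simulated annealing loop over the number of plies at each candidate angle is sketched below to illustrate the optimization step. The candidate angles, the crude weight/strength model and the cooling schedule are hypothetical stand-ins for the finite element evaluation used in the study.

```python
import math
import random

# Simulated-annealing sketch over ply counts per candidate angle. The "weight"
# function is a stand-in for a finite-element evaluation (mass plus a crude
# feasibility penalty); it is not the structural model used in the paper.
ANGLES = [0, 45, -45, 90]                 # candidate ply angles (degrees)

def weight(n_plies):
    mass = 1.2 * sum(n_plies)                             # heavier with more plies
    strength = sum(n * (1 + abs(math.cos(math.radians(a))))
                   for a, n in zip(ANGLES, n_plies))
    penalty = 0.0 if strength >= 18 else 50.0             # crude strength requirement
    return mass + penalty

def anneal(start, t0=5.0, cooling=0.95, steps=2000):
    current, best = list(start), list(start)
    t = t0
    for _ in range(steps):
        cand = list(current)
        i = random.randrange(len(ANGLES))
        cand[i] = max(1, cand[i] + random.choice([-1, 1])) # add or remove one ply
        delta = weight(cand) - weight(current)
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = cand                                 # accept downhill, or uphill with prob.
            if weight(current) < weight(best):
                best = list(current)
        t *= cooling                                       # cool the temperature
    return best, weight(best)

print(anneal([6, 6, 6, 6]))
```

The acceptance of occasional uphill moves at high temperature is what lets the algorithm escape local minima and probe for the global optimum mentioned in the abstract.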

Relevance:

20.00%

Publisher:

Abstract:

Particle Swarm Optimization (PSO) is a biologically inspired computational search and optimization method based on the social behavior of birds flocking or fish schooling. Although PSO has been applied to many well-known numerical test problems, it suffers from premature convergence. A number of variants have been developed to address the premature convergence problem and to improve the quality of the solutions found by PSO. This study presents a comprehensive survey of the various PSO-based algorithms. As part of this survey, the authors include a classification of the approaches and identify the main features of each proposal. In the last part of the study, some topics within this field that are considered promising areas of future research are listed.
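For reference, the canonical velocity and position update that the surveyed variants modify is sketched below on a one-dimensional sphere function. The inertia and acceleration coefficients are common textbook values, not values prescribed by this survey.

```python
import random

# Canonical PSO sketch on f(x) = x^2. Many of the surveyed variants change
# exactly these terms (inertia, acceleration, topology) to fight premature
# convergence; the parameter values here are generic textbook choices.
def f(x):
    return x * x                          # objective to minimise

w, c1, c2 = 0.7, 1.5, 1.5                 # inertia, cognitive and social weights
particles = [{"x": random.uniform(-10, 10), "v": 0.0} for _ in range(20)]
for p in particles:
    p["best_x"] = p["x"]                  # personal best starts at the initial position
gbest = min(particles, key=lambda p: f(p["x"]))["x"]

for _ in range(100):
    for p in particles:
        r1, r2 = random.random(), random.random()
        p["v"] = (w * p["v"]
                  + c1 * r1 * (p["best_x"] - p["x"])   # pull toward personal best
                  + c2 * r2 * (gbest - p["x"]))        # pull toward global best
        p["x"] += p["v"]
        if f(p["x"]) < f(p["best_x"]):
            p["best_x"] = p["x"]
    gbest = min([gbest] + [p["best_x"] for p in particles], key=f)

print(round(gbest, 6))
```

Premature convergence shows up when the whole swarm collapses onto gbest too early; variants typically counter this by adapting w, re-randomising stagnant particles, or restricting the social term to local neighbourhoods.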

Relevance:

20.00%

Publisher:

Abstract:

Index tracking is an investment approach whose primary objective is to keep portfolio returns as close as possible to a target index without purchasing all index components. The main purpose is to minimize the tracking error between the returns of the selected portfolio and the benchmark. In this paper, quadratic as well as linear models are presented for minimizing the tracking error. Uncertainty in the input data is handled using a tractable robust framework that controls the level of conservatism while maintaining linearity. The linearity of the proposed robust optimization models allows a simple implementation in an ordinary optimization software package to find the optimal robust solution. The proposed model employs the Morgan Stanley Capital International Index as the target index, and results are reported for six national indices: Japan, the USA, the UK, Germany, Switzerland and France. The performance of the proposed models is evaluated using several financial criteria, e.g. the information ratio, market ratio, Sharpe ratio and Treynor ratio. The preliminary results demonstrate that the proposed model lowers the tracking error while raising the values of the portfolio performance measures.
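The core (non-robust) tracking-error objective can be illustrated with a short least-squares sketch: choose weights over a small subset of assets so that portfolio returns follow index returns as closely as possible. The return data are synthetic, the normalisation is a crude stand-in for the full budget and cardinality constraints, and the robust (uncertainty-set) version developed in the paper is not reproduced.

```python
import numpy as np

# Minimal index-tracking sketch: minimise ||R w - r_index||^2 over weights w.
# Synthetic returns; constraints handled only by a crude long-only normalisation.
rng = np.random.default_rng(0)
T, n = 250, 6                                    # 250 periods, 6 candidate assets
R = rng.normal(0.0004, 0.01, size=(T, n))        # asset returns
r_index = R @ np.array([0.3, 0.25, 0.2, 0.15, 0.07, 0.03]) \
          + rng.normal(0, 0.001, size=T)         # benchmark (index) returns

w, *_ = np.linalg.lstsq(R, r_index, rcond=None)  # unconstrained least squares
w = np.clip(w, 0, None)                          # long-only
w = w / w.sum()                                  # fully invested

tracking_error = np.std(R @ w - r_index)         # std dev of active returns
print(w.round(3), round(float(tracking_error), 5))
```

The robust variant in the paper replaces the nominal returns with an uncertainty set and controls how conservative the resulting weights are while keeping the problem linear.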

Relevance:

20.00%

Publisher:

Abstract:

Currently we are facing an overwhelming growth in the number of reliable information sources on the Internet. The quantity of information available to everyone via the Internet is growing dramatically each year [15]. At the same time, the temporal and cognitive resources of human users are not changing, causing a phenomenon of information overload. The World Wide Web is one of the main sources of information for decision makers. However, our studies show that, at least in Poland, decision makers see some important problems when turning to the Internet as a source of decision information. One of the most commonly raised obstacles is the distribution of relevant information among many sources, and therefore the need to visit different Web sources in order to collect all important content and analyze it. A few research groups have recently turned to the problem of information extraction from the Web [13]. Most effort so far has been directed toward collecting data from dispersed databases accessible via web pages (referred to as data extraction or information extraction from the Web) and toward understanding natural language texts by means of fact, entity, and association recognition (referred to as information extraction). Data extraction efforts show some interesting results; however, proper integration of web databases is still beyond us. The information extraction field has recently been very successful in retrieving information from natural language texts, but it still lacks the ability to understand more complex information requiring common-sense knowledge, discourse analysis and disambiguation techniques.

Relevance:

20.00%

Publisher:

Abstract:

In this study we present a combinatorial optimization method based on particle swarm optimization and a local search algorithm for a multi-robot search system. Under this method, in order to balance exploration and exploitation and to guarantee global convergence, at each iteration step a local search is performed whenever the distance between the target and the robot becomes less than a specified threshold. The local search encourages the particle to explore the surrounding local region and reach the target in less search time. Experimental results obtained in a simulated environment show that biological and sociological inspiration can be useful in meeting the challenges of robotic applications that can be described as optimization problems.
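A minimal sketch of the distance-triggered switch between a global PSO move and a local search is given below for a single robot in 2-D. The threshold, step sizes and the assumption that the robot can sense its distance to the target are placeholders, not the paper's implementation.

```python
import math
import random

# Sketch of the PSO / local-search switch for one robot searching for a target
# in 2-D. Threshold, step sizes and target sensing are illustrative placeholders.
TARGET = (8.0, 3.0)
THRESHOLD = 1.5                           # switch to local search inside this radius

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pso_step(pos, vel, best, w=0.7, c=1.5):
    """Global move: inertia plus attraction toward the swarm's best estimate."""
    r = random.random()
    vel = tuple(w * v + c * r * (b - x) for v, b, x in zip(vel, best, pos))
    return tuple(x + v for x, v in zip(pos, vel)), vel

def local_search_step(pos, radius=0.5, samples=8):
    """Sample a few nearby points and move to whichever is closest to the target."""
    candidates = [pos] + [(pos[0] + random.uniform(-radius, radius),
                           pos[1] + random.uniform(-radius, radius))
                          for _ in range(samples)]
    return min(candidates, key=lambda p: dist(p, TARGET))

pos, vel, best = (0.0, 0.0), (0.0, 0.0), (8.0, 3.0)   # 'best' = swarm's current estimate
for step in range(50):
    if dist(pos, TARGET) < THRESHOLD:
        pos = local_search_step(pos)                   # exploit: refine locally
    else:
        pos, vel = pso_step(pos, vel, best)            # explore: global PSO move
    if dist(pos, TARGET) < 0.05:
        break
print(step, tuple(round(x, 2) for x in pos))
```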

Relevance:

20.00%

Publisher:

Abstract:

A phylogenetic hypothesis for the lepidopteran superfamily Noctuoidea was inferred based on the complete mitochondrial (mt) genomes of 12 species (six newly sequenced). The monophyly of each noctuoid family in the latest classification was well supported. Novel and robust relationships were recovered at the family level, in contrast to previous analyses using nuclear genes. Erebidae was recovered as sister to (Nolidae+(Euteliidae+Noctuidae)), while Notodontidae was sister to all these taxa (the putatively basalmost lineage Oenosandridae was not included). In order to improve phylogenetic resolution using mt genomes, various analytical approaches were tested: Bayesian inference (BI) vs. maximum likelihood (ML), excluding vs. including RNA genes (rRNA or tRNA), and Gblocks treatment. The evolutionary signal within mt genomes had low sensitivity to analytical changes. Inference methods had the most significant influence. Inclusion of tRNAs positively increased the congruence of topologies, while inclusion of rRNAs resulted in a range of phylogenetic relationships varying depending on other analytical factors. The two Gblocks parameter settings had opposite effects on nodal support between the two inference methods. The relaxed parameter (GBRA) resulted in higher support values in BI analyses, while the strict parameter (GBDH) resulted in higher support values in ML analyses.

Relevance:

20.00%

Publisher:

Abstract:

The co-curing process for advanced grid-stiffened (AGS) composite structures is a promising manufacturing process that can reduce manufacturing cost, augment the advantages and improve the performance of AGS composite structures. An improved method, named the soft-mold aided co-curing process, which replaces the expansion molds with a single rubber mold, is adopted in this paper. This co-curing process is capable of co-curing a typical AGS composite structure with the manufacturer's recommended cure cycle (MRCC). Numerical models are developed to evaluate the variation of temperature and degree of cure in the AGS composite structure during the soft-mold aided co-curing process. The simulation results were validated against experimental results obtained from embedded temperature sensors. Based on the validated modeling framework, the cure cycle can be optimized to less than half the time of the MRCC while still achieving a reliable degree of cure. The effects of the shape and size of the AGS composite structure on the distribution of temperature and degree of cure are also investigated to provide insights for the optimization of the soft-mold aided co-curing process.
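A minimal sketch of the kind of cure-kinetics model such simulations typically integrate is shown below: an Arrhenius-type autocatalytic rate law for the degree of cure under a prescribed temperature cycle. All parameter values and the cycle itself are generic illustrations, not the material data or thermo-chemical model of the paper.

```python
import math

# Cure-kinetics sketch: d(alpha)/dt = A * exp(-E/(R*T)) * alpha^m * (1-alpha)^n,
# integrated with explicit Euler under a simple ramp-and-hold temperature cycle.
# Parameter values are generic illustrations, not the material data of the study.
A, E, R = 5.0e4, 6.0e4, 8.314        # pre-exponential (1/s), activation energy (J/mol), gas constant
m, n = 0.5, 1.5                      # autocatalytic exponents

def cure_rate(alpha, T):
    alpha = min(max(alpha, 1e-6), 1.0 - 1e-9)          # keep alpha^m well defined
    return A * math.exp(-E / (R * T)) * alpha**m * (1.0 - alpha)**n

def temperature(t):                   # hypothetical cycle: 2 C/min ramp from 20 C, hold at 180 C
    return min(293.15 + (2.0 / 60.0) * t, 180.0 + 273.15)

alpha, dt = 0.0, 1.0                  # initial degree of cure, 1 s time step
for step in range(3 * 3600):          # simulate a 3-hour cycle
    alpha += cure_rate(alpha, temperature(step * dt)) * dt

print(round(alpha, 3))                # final degree of cure
```

Shortening the hold in such a model and checking that the final degree of cure stays acceptable is, in spirit, how a cure cycle can be cut well below the MRCC duration.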

Relevance:

20.00%

Publisher:

Abstract:

The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed where it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals which is afforded by the use of the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered for demonstrating the performance of total entropy in comparison to utilities for model discrimination and parameter estimation. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals with little compromise in achieving either goal. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
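A toy sketch of estimating an entropy-based design utility by simulation is given below: for a binomial experiment with a discrete parameter grid, the expected posterior entropy of the parameter is approximated by Monte Carlo and used to rank two candidate designs. This is a deliberately simplified stand-in for the sequential Monte Carlo machinery and the combined model-discrimination and estimation utility of the paper.

```python
import math
import random

# Toy sketch: rank two designs (stimulus levels) by Monte Carlo estimation of
# the expected posterior entropy of a parameter on a discrete grid. The model,
# grid and designs are illustrative only; the paper's total-entropy utility
# also scores model discrimination and relies on SMC for the intractable terms.
THETAS = [i / 10 for i in range(1, 10)]          # parameter grid
PRIOR = [1 / len(THETAS)] * len(THETAS)          # uniform prior
N_TRIALS = 10                                    # binomial sample size per experiment

def p_success(theta, design):
    return 1.0 / (1.0 + math.exp(-5.0 * (theta - design)))

def binom_pmf(y, n, p):
    return math.comb(n, y) * p**y * (1 - p)**(n - y)

def posterior_entropy(y, design):
    post = [pr * binom_pmf(y, N_TRIALS, p_success(th, design))
            for th, pr in zip(THETAS, PRIOR)]
    z = sum(post)
    post = [q / z for q in post]
    return -sum(q * math.log(q) for q in post if q > 0)

def expected_entropy(design, sims=2000):
    total = 0.0
    for _ in range(sims):
        theta = random.choices(THETAS, weights=PRIOR)[0]        # draw from prior
        y = sum(random.random() < p_success(theta, design)      # simulate data
                for _ in range(N_TRIALS))
        total += posterior_entropy(y, design)
    return total / sims                  # lower expected entropy = more informative design

for d in (0.2, 0.5):
    print(d, round(expected_entropy(d), 3))
```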

Relevance:

20.00%

Publisher:

Abstract:

Optical transmittance and conductivity of thin metallic films, such as Au, are two inversely related and extremely important parameters for their application as the front electrode in organic photovoltaics. We report our findings on how these parameters were optimized to attain the maximum possible efficiencies by fabricating organic solar cells with thin Au film anodes of differing optical transmittance and, consequently, owing to scaling at the nanoscale, varying electrical conductivity. There was an extraordinary improvement in the overall solar cell efficiency (of the order of 49%) when the Au thin film transmittance was increased from 38% to 54%. The surface morphologies of these thin films also affect the critical parameters, including Voc, Jsc and FF.

Relevance:

20.00%

Publisher:

Abstract:

Objective: National guidelines for the management of intermediate-risk patients with suspected acute coronary syndrome, in whom AMI has been excluded, advocate provocative testing to further risk-stratify these patients into low risk (negative testing) or high risk (positive testing suggestive of unstable angina). Adults less than 40 years of age have a low pretest probability of acute coronary syndrome. The utility of exercise stress testing in young adults with chest pain suspected of acute coronary syndrome who have National Heart Foundation intermediate-risk features was evaluated. Methods: A retrospective analysis of exercise stress tests performed on patients less than 40 years of age was conducted. Patients were enrolled on a chest pain pathway and had negative serial ECGs and cardiac biomarkers before exercise stress testing to rule out acute coronary syndrome. Chart review was completed on patients with positive stress tests. Results: A total of 3987 patients with suspected intermediate-risk acute coronary syndrome underwent exercise stress testing. One thousand and twenty-seven (25.8%) were aged less than 40 years (age 33.3 ± 4.8 years). Four of these 1027 patients had a positive exercise stress test (0.4% incidence of positive exercise stress testing). Of those, three patients had subsequent non-invasive functional testing that yielded a negative result. One patient declined further investigations. Assuming this was a true positive exercise stress test, the incidence of true positive exercise stress testing would have been 0.097% (95% confidence interval: 0.079–0.115%) (one of 1027 patients). Conclusions: Routine exercise stress testing has limited value in the risk stratification of adults less than 40 years of age with suspected intermediate-risk acute coronary syndrome.