940 results for Evolutionary optimization methods


Relevance:

20.00%

Publisher:

Abstract:

Typing methods that evaluate isolates in terms of their phenotypic and molecular characteristics are essential in epidemiological studies. In this study, Candida albicans biotypes were determined before and after storage in order to verify their stability. Twenty C. albicans isolates were typed by Randomly Amplified Polymorphic DNA (RAPD), production of phospholipase and proteinase exoenzymes (enzymotyping) and morphotyping before and after 180 days of storage in Sabouraud dextrose agar (SDA) and sterilised distilled water. Before storage, 19 RAPD patterns, two enzymotypes and eight morphotypes were identified. The fragment patterns obtained by RAPD were not significantly altered after storage; in contrast, the majority of the isolates changed their enzymotype and morphotype. RAPD typing provided the best discriminatory index (DI) among isolates (DI = 0.995) and maintained the profiles identified, confirming its utility in epidemiological surveys. Given the low reproducibility after storage in SDA and distilled water shown by morphotyping (DI = 0.853) and enzymotyping (DI = 0.521), these techniques are not recommended for stored isolates.
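
The DI values quoted are consistent with Simpson's index of diversity as applied to typing methods (the Hunter-Gaston discriminatory index); a minimal sketch of that calculation:

```python
from collections import Counter

def discriminatory_index(type_assignments):
    """Hunter-Gaston DI: probability that two isolates drawn at random
    (without replacement) are assigned different types."""
    counts = Counter(type_assignments)
    n = len(type_assignments)
    if n < 2:
        raise ValueError("need at least two isolates")
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
```

With 20 isolates split into 19 types (one type shared by two isolates), this gives DI = 1 - 2/380 ≈ 0.995, matching the RAPD figure above.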

Relevance:

20.00%

Publisher:

Abstract:

The supervised pattern recognition methods K-Nearest Neighbors (KNN), stepwise discriminant analysis (SDA), and soft independent modelling of class analogy (SIMCA) were employed in this work to investigate the relationship between the molecular structure of 27 cannabinoid compounds and their analgesic activity. Previous analyses using two unsupervised pattern recognition methods (PCA, principal component analysis, and HCA, hierarchical cluster analysis) had selected five descriptors as the most relevant to the analgesic activity of the compounds studied: R(3) (charge density on the substituent at position C(3)), Q(1) (charge on atom C(1)), A (surface area), log P (logarithm of the partition coefficient) and MR (molecular refractivity). The supervised methods (SDA, KNN, and SIMCA) were then employed to construct a reliable model able to predict the analgesic activity of new cannabinoid compounds and to validate our previous study. The results obtained using SDA, KNN, and SIMCA agree perfectly with our previous model: all of the multivariate statistical methods, supervised and unsupervised alike, classified the cannabinoid compounds studied into the same three groups: active, moderately active, and inactive.
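
Of the three supervised methods, KNN is the simplest to illustrate: a compound is assigned the majority class among its nearest neighbours in descriptor space. The sketch below uses five-descriptor vectors [R(3), Q(1), A, log P, MR]; all descriptor values and labels are invented for illustration, not taken from the paper's data set.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points
    (Euclidean distance in descriptor space)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dists)[:k]]
    labels, votes = np.unique(nearest, return_counts=True)
    return labels[np.argmax(votes)]

# Hypothetical rows [R3, Q1, A, logP, MR]; classes 0 = inactive, 2 = active.
X = np.array([[0.1, -0.2, 310.0, 5.1, 95.0],
              [0.1, -0.1, 305.0, 5.3, 97.0],
              [0.4, -0.6, 350.0, 6.8, 110.0],
              [0.5, -0.5, 355.0, 6.9, 112.0]])
y = np.array([0, 0, 2, 2])
```

In practice the descriptors would be autoscaled first so that the large-magnitude ones (A, MR) do not dominate the distance.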

Relevance:

20.00%

Publisher:

Abstract:

In mapping the evolutionary process of online news and the socio-cultural factors determining this development, this paper has a dual purpose. First, in reworking the definition of “online communication”, it argues that despite its seemingly sudden emergence in the 1990s, the history of online news began in the early days of the telegraph and ran through the development of the telephone and the fax machine before becoming computer-based in the 1980s and Web-based in the 1990s. Second, merging macro-perspectives on the dynamics of media evolution from DeFleur and Ball-Rokeach (1989) and Winston (1998), the paper consolidates a critical point for thinking about new media development: that a technology being feasible does not mean it will be socially accepted and/or demanded. From a producer-centric perspective, the birth and development of pre-Web online news forms were driven, more or less, by the traditional media’s sometimes excessive hype about the power of new technologies. However, such an emphasis on technological potential at the expense of its social conditions is not only misleading but can also be detrimental to the development of new media, including the potential of today’s online news.

Relevance:

20.00%

Publisher:

Abstract:

We investigate analytically the first and the second law characteristics of fully developed forced convection inside a porous-saturated duct of rectangular cross-section. The Darcy-Brinkman flow model is employed. Three different types of thermal boundary conditions are examined. Expressions for the Nusselt number, the Bejan number, and the dimensionless entropy generation rate are presented in terms of the system parameters. The conclusions of this analytical study will make it possible to compare, evaluate, and optimize alternative rectangular duct design options in terms of heat transfer, pressure drop, and entropy generation. (c) 2006 Elsevier Ltd. All rights reserved.
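
The Bejan number mentioned above has a compact definition: the fraction of the total entropy generation rate due to heat-transfer irreversibility, as opposed to fluid friction. A one-line sketch:

```python
def bejan_number(s_gen_heat, s_gen_friction):
    """Be = S_gen(heat transfer) / S_gen(total).
    Be -> 1: heat-transfer irreversibility dominates;
    Be -> 0: fluid-friction (pressure-drop) irreversibility dominates."""
    return s_gen_heat / (s_gen_heat + s_gen_friction)
```

In a design comparison of the kind described, one would evaluate Be alongside the Nusselt number for each candidate duct geometry and boundary condition.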

Relevance:

20.00%

Publisher:

Abstract:

This paper critically assesses several loss allocation methods based on the type of competition each method promotes. This understanding assists in determining which method will promote more efficient network operation when implemented in deregulated electricity industries. The methods addressed in this paper include pro rata [1], proportional sharing [2], loss formula [3], incremental [4], and a new loop-based method proposed by the authors of this paper [5]. These methods are tested on a modified Nordic 32-bus network, where case studies at different operating points are investigated. The varying results obtained for each allocation method at different operating points make it possible to distinguish methods that promote unhealthy competition from those that encourage better system operation.
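
The pro rata method [1] is the simplest of those assessed: total losses are split in proportion to each participant's power, ignoring network location entirely (which is precisely why its competitive incentives are questioned). A minimal sketch, with participant names and figures invented:

```python
def pro_rata_allocation(total_loss, injections):
    """Split total_loss (MW) among participants in proportion to their
    injected power (MW), with no regard to electrical location."""
    total_power = sum(injections.values())
    return {name: total_loss * p / total_power
            for name, p in injections.items()}

# Hypothetical generators: G2 injects 3x the power, so it bears 3x the loss.
allocation = pro_rata_allocation(8.0, {'G1': 100.0, 'G2': 300.0})
```

The other methods ([2]-[5]) differ exactly in how they reintroduce network topology and operating-point dependence into this split.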

Relevance:

20.00%

Publisher:

Abstract:

Quantum computers promise to greatly increase the efficiency of solving problems such as factoring large integers, combinatorial optimization and quantum physics simulation. One of the greatest challenges now is to implement the basic quantum-computational elements in a physical system and to demonstrate that they can be reliably and scalably controlled. One of the earliest proposals for quantum computation is based on implementing a quantum bit with two optical modes containing one photon. The proposal is appealing because of the ease with which photon interference can be observed. Until now, however, it has suffered from the requirement for non-linear couplings between optical modes containing few photons. Here we show that efficient quantum computation is possible using only beam splitters, phase shifters, single photon sources and photo-detectors. Our methods exploit feedback from photo-detectors and are robust against errors from photon loss and detector inefficiency. The basic elements are accessible to experimental investigation with current technology.
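
The dual-rail encoding behind this proposal is easy to sketch: a single photon shared between two optical modes is a qubit, and beam splitters and phase shifters act on it as 2x2 mode transformations. The matrix convention below is one common choice, not necessarily the paper's:

```python
import numpy as np

def beam_splitter(theta):
    """Mode-mixing matrix of a lossless beam splitter (real convention);
    theta = pi/4 gives a 50:50 splitter."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]], dtype=complex)

def phase_shifter(phi):
    """Relative phase shift applied to the second optical mode."""
    return np.diag([1.0, np.exp(1j * phi)])

# Dual-rail qubit: one photon in mode 0 encodes |0>, in mode 1 encodes |1>.
state = np.array([1.0, 0.0], dtype=complex)
state = phase_shifter(np.pi / 2) @ beam_splitter(np.pi / 4) @ state
```

Composing these two elements generates arbitrary single-qubit rotations; the paper's contribution is achieving the entangling two-qubit gates with only these linear elements plus photo-detection and feedback.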

Relevance:

20.00%

Publisher:

Abstract:

We propose quadrature rules for the approximation of line integrals possessing logarithmic singularities and show their convergence. In some instances a superconvergence rate is demonstrated.
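
One standard way to build such a rule, shown here as a generic sketch rather than the authors' construction, is a product (interpolatory) quadrature: fix nodes and choose weights so the rule reproduces the exact log-weighted moments, making it exact for polynomials against the singular weight ln(x).

```python
import numpy as np

def log_quadrature(n):
    """n-point product rule for integral_0^1 f(x) ln(x) dx, exact when f is a
    polynomial of degree < n.

    Nodes: Chebyshev points mapped to (0,1); weights solve the moment system
    integral_0^1 x^k ln(x) dx = -1/(k+1)^2 for k = 0..n-1."""
    x = 0.5 * (1.0 - np.cos((2 * np.arange(n) + 1) * np.pi / (2 * n)))
    V = np.vander(x, n, increasing=True).T        # V[k, i] = x_i**k
    m = np.array([-1.0 / (k + 1) ** 2 for k in range(n)])
    w = np.linalg.solve(V, m)
    return x, w

x, w = log_quadrature(8)
approx = float(np.dot(w, x))   # integral of x*ln(x) over (0,1); exact value -1/4
```

Smooth nodes keep the moment system well enough conditioned at this small n; larger rules would use a more stable construction.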

Relevance:

20.00%

Publisher:

Abstract:

The generalized Gibbs sampler (GGS) is a recently developed Markov chain Monte Carlo (MCMC) technique that enables Gibbs-like sampling of state spaces that lack a convenient representation in terms of a fixed coordinate system. This paper describes a new sampler, called the tree sampler, which uses the GGS to sample from a state space consisting of phylogenetic trees. The tree sampler is useful for a wide range of phylogenetic applications, including Bayesian, maximum likelihood, and maximum parsimony methods. A fast new algorithm to search for a maximum parsimony phylogeny is presented, using the tree sampler in the context of simulated annealing. The mathematics underlying the algorithm is explained and its time complexity is analyzed. The method is tested on two large data sets consisting of 123 sequences and 500 sequences, respectively. The new algorithm is shown to compare very favorably in terms of speed and accuracy to the program DNAPARS from the PHYLIP package.
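
The parsimony score that such a simulated-annealing search minimises can be computed per character with Fitch's algorithm; a minimal sketch on a rooted binary tree (not the paper's implementation):

```python
def fitch_score(tree, root, leaf_states):
    """Fitch small-parsimony count: minimum number of state changes on a
    rooted binary tree explaining the character states at the leaves.

    tree: dict internal node -> (left child, right child); leaves appear
    only as children.  leaf_states: dict leaf name -> character state."""
    changes = 0

    def state_set(node):
        nonlocal changes
        if node in leaf_states:
            return {leaf_states[node]}
        left, right = tree[node]
        a, b = state_set(left), state_set(right)
        if a & b:
            return a & b          # a shared state needs no substitution here
        changes += 1              # disjoint sets: one substitution required
        return a | b

    state_set(root)
    return changes
```

A tree's total parsimony score is this count summed over all alignment columns; the annealing search proposes tree rearrangements and accepts or rejects them based on the change in that score.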

Relevance:

20.00%

Publisher:

Abstract:

There are many techniques for electricity market price forecasting. However, most of them are designed for expected-price analysis rather than price spike forecasting, and an effective method of predicting the occurrence of spikes has yet to appear in the literature. In this paper, a data mining based approach is presented to give a reliable forecast of the occurrence of price spikes. Combined with the spike value prediction techniques developed by the same authors, the proposed approach aims to provide a comprehensive tool for price spike forecasting. Feature selection techniques are first described to identify the attributes relevant to the occurrence of spikes, and a brief introduction to the classification techniques is given for completeness. Two algorithms, a support vector machine and a probability classifier, are chosen as the spike occurrence predictors and discussed in detail. Realistic market data are used to test the proposed model, with promising results.
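
The paper's SVM and probability classifier are not reproduced here, but the spike/non-spike framing can be sketched with a deliberately simple nearest-centroid classifier. The two features (demand, reserve margin) are invented stand-ins for whatever attributes the feature selection stage would retain:

```python
import numpy as np

def train_centroids(X, y):
    """Fit one centroid per class (0 = normal interval, 1 = price spike)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_spike(centroids, x):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Hypothetical per-interval features: [demand (MW), reserve margin (%)].
# Spikes here co-occur with high demand and thin reserves.
X = np.array([[6000.0, 25.0], [6200.0, 22.0],
              [9500.0,  4.0], [9800.0,  3.0]])
y = np.array([0, 0, 1, 1])
model = train_centroids(X, y)
```

Because spikes are rare, a real predictor of this kind must also handle severe class imbalance, which is part of what motivates the paper's choice of classifiers.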

Relevance:

20.00%

Publisher:

Abstract:

The restructuring of power industries has brought fundamental changes to both power system operation and planning. This paper presents a new planning method that uses a multi-objective optimization (MOOP) technique, together with human knowledge, to expand the transmission network in open access schemes. The method starts with a candidate pool of feasible expansion plans. The best candidates are then selected through a MOOP approach in which multiple objectives are tackled simultaneously, aiming to integrate market operation and planning as one unified process in the context of a deregulated system. Human knowledge is applied at both stages to ensure that the selection reflects practical engineering and management concerns. The expansion plan from MOOP is assessed against reliability criteria before it is finalized. The proposed method has been tested on the IEEE 14-bus system, and relevant analyses and discussions are presented.
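
Selecting the best candidates under several simultaneous objectives typically means keeping the non-dominated (Pareto-optimal) plans. A minimal sketch with invented plan names and objective values, the paper's actual objectives not being reproduced here:

```python
def pareto_front(plans, objectives):
    """Return the non-dominated plans; every objective is to be minimised."""
    def dominates(a, b):
        return (all(a[k] <= b[k] for k in objectives)
                and any(a[k] < b[k] for k in objectives))
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q is not p)]

# Hypothetical candidate expansion plans with invented objective values.
plans = [{'name': 'P1', 'cost': 10.0, 'congestion': 3.0},
         {'name': 'P2', 'cost': 14.0, 'congestion': 1.5},
         {'name': 'P3', 'cost': 15.0, 'congestion': 3.5}]
front = pareto_front(plans, ['cost', 'congestion'])
```

P3 is dominated by P1 (worse on both objectives) and drops out; the human-knowledge stage described above would then choose among the surviving trade-offs.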

Relevance:

20.00%

Publisher:

Abstract:

The artificial dissipation effects in some solutions obtained with Navier-Stokes flow solvers are demonstrated. The solvers were used to calculate the flow of an artificially dissipative fluid: a fluid whose dissipative properties arise entirely from the solution method itself. This was done by setting the viscosity and heat conduction coefficients in the Navier-Stokes solvers to zero everywhere inside the flow, while still applying the usual no-slip and thermally conducting boundary conditions at solid boundaries. Any dissipation in the resulting solution then depends entirely on the solver itself. If the difference between the solutions obtained with the viscosity and thermal conductivity set to zero and those obtained with their correct values is small, the artificial dissipation is dominating and the solutions are unreliable.
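
The effect is easy to reproduce with a deliberately simple scheme (not the solvers used in the paper): first-order upwind differencing of the inviscid linear advection equation smears a step profile even though the physical viscosity is exactly zero, because the scheme's truncation error acts like a diffusion term with coefficient roughly a*dx*(1 - CFL)/2.

```python
import numpy as np

# u_t + a*u_x = 0 on a periodic domain, zero physical viscosity.
a, nx, cfl, steps = 1.0, 100, 0.5, 40
dx = 1.0 / nx
dt = cfl * dx / a
u = np.where(np.arange(nx) < nx // 2, 1.0, 0.0)   # sharp step profile

for _ in range(steps):
    u = u - a * dt / dx * (u - np.roll(u, 1))     # first-order upwind update

# Exact inviscid solution is a pure translation: the step stays sharp.
# Count cells holding intermediate values -- purely numerical smearing.
smeared_cells = int(np.sum((u > 0.01) & (u < 0.99)))
```

The smearing grows with each step despite zero viscosity, which is exactly the kind of solver-borne dissipation the paper's zero-coefficient test is designed to expose.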

Relevance:

20.00%

Publisher:

Abstract:

Conferences that deliver interactive sessions designed to enhance physician participation, such as role play, small discussion groups, workshops, hands-on training, problem- or case-based learning and individualised training sessions, are effective for physician education.

Relevance:

20.00%

Publisher:

Abstract:

An investigation was undertaken to test the effectiveness of two procedures for recording boundaries and plot positions for scientific studies on farms on Leyte Island, the Philippines. The accuracy of a Garmin 76 Global Positioning System (GPS) unit and of a compass and chain was checked under the same conditions. Tree canopies interfered with the satellite signal reaching the GPS, so the GPS survey was less accurate than the compass and chain survey. Where a high degree of accuracy is required, a compass and chain survey remains the most effective method of surveying land underneath tree canopies, provided operator error is minimised. For a large number of surveys, and thus large amounts of data, a GPS is more appropriate than a compass and chain survey because data are easily uploaded into a Geographic Information System (GIS). However, under dense canopies where satellite signals cannot reach the GPS, it may be necessary to revert to a compass survey or a combination of both methods.
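
Reducing a compass and chain survey to coordinates is a matter of converting each leg's bearing and distance into plane offsets; a minimal sketch, assuming a flat-earth approximation valid over plot-scale distances:

```python
import math

def traverse(legs, start=(0.0, 0.0)):
    """Convert compass-and-chain legs (bearing in degrees clockwise from
    north, distance in metres) into (easting, northing) coordinates."""
    e, n = start
    points = [start]
    for bearing, distance in legs:
        rad = math.radians(bearing)
        e += distance * math.sin(rad)
        n += distance * math.cos(rad)
        points.append((e, n))
    return points
```

For a closed traverse around a plot boundary, the gap between the last computed point and the starting point (the closure error) gives a direct check on the operator error the abstract mentions.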