945 results for Combinatorial optimization algorithms


Relevance:

20.00%

Publisher:

Abstract:

A space-marching code for the simulation and optimization of inviscid supersonic flow in three dimensions is described. The flow in a scramjet module with a relatively complex three-dimensional geometry is examined, and wall-pressure estimates are compared with experimental data. Given that viscous effects are not presently included, the comparison is reasonable. The thermodynamic compromise of adding heat in a diverging combustor is also examined. The code is then used to optimize the shape of a thrust surface for a simpler (box-section) scramjet module in the presence of uniform and nonuniform heat distributions. The optimum two-dimensional profiles for the thrust surface are obtained via a perturbation procedure that requires about 30-50 flow solutions. It is found that the final shapes are fairly insensitive to the details of the heat distribution.
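The perturbation procedure is not spelled out in the abstract; as a rough sketch of the general idea, a coordinate-wise perturbation search with an evaluation counter (standing in for the flow-solution count) might look like this. The objective, step size, and sweep count below are illustrative assumptions, not the paper's method.

```python
def perturbation_optimize(objective, params, step=0.1, sweeps=25):
    """Perturb each design parameter in turn, keeping improving changes.

    Returns the tuned parameters, the best objective value, and the
    number of objective evaluations ("flow solutions") used.
    """
    evals = 0

    def f(p):
        nonlocal evals
        evals += 1
        return objective(p)

    best = f(params)
    for _ in range(sweeps):
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                val = f(trial)
                if val < best:        # keep the perturbation if it improves
                    params, best = trial, val
                    break
    return params, best, evals

# Stand-in objective (a quadratic bowl with optimum at (1, 2)),
# playing the role of an expensive flow solution.
obj = lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2
profile, value, n_evals = perturbation_optimize(obj, [0.0, 0.0])
```

The evaluation counter makes the cost of the search explicit, which is the quantity the abstract reports for the real flow-solver case.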

Relevance:

20.00%

Publisher:

Abstract:

We tested the effects of four data characteristics on the results of reserve selection algorithms. The data characteristics were nestedness of features (land types in this case), rarity of features, size variation of sites (potential reserves) and size of data sets (numbers of sites and features). We manipulated data sets to produce three levels, with replication, of each of these data characteristics while holding the other three characteristics constant. We then used an optimizing algorithm and three heuristic algorithms to select sites to solve several reservation problems. We measured efficiency as the number or total area of selected sites, indicating the relative cost of a reserve system. Higher nestedness increased the efficiency of all algorithms (reduced the total cost of new reserves). Higher rarity reduced the efficiency of all algorithms (increased the total cost of new reserves). More variation in site size increased the efficiency of all algorithms expressed in terms of total area of selected sites. We measured the suboptimality of heuristic algorithms as the percentage increase of their results over optimal (minimum possible) results. Suboptimality is a measure of the reliability of heuristics as indicative costing analyses. Higher rarity reduced the suboptimality of heuristics (increased their reliability) and there is some evidence that more size variation did the same for the total area of selected sites. We discuss the implications of these results for the use of reserve selection algorithms as indicative and real-world planning tools.
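As a rough illustration of the heuristic side of this comparison, a greedy "richness" rule, one common family of reserve-selection heuristics, can be sketched as follows; the site and land-type data are invented for illustration, and efficiency here is simply the number of selected sites.

```python
def greedy_reserve_selection(sites):
    """Pick sites until every feature (land type) is represented at least once.

    sites: dict mapping site name -> set of features it holds.
    Returns the list of selected sites in selection order.
    """
    uncovered = set().union(*sites.values())  # features still needing a reserve
    selected = []
    while uncovered:
        # Greedy richness rule: take the site covering the most unmet features.
        best = max(sites, key=lambda s: len(sites[s] & uncovered))
        if not sites[best] & uncovered:
            break  # remaining features cannot be covered by any site
        selected.append(best)
        uncovered -= sites[best]
    return selected

sites = {
    "A": {"wetland", "forest"},
    "B": {"forest"},
    "C": {"grassland", "wetland"},
    "D": {"grassland"},
}
picked = greedy_reserve_selection(sites)
```

An optimizing algorithm would instead solve this as a minimum set-cover problem; the percentage by which the greedy answer exceeds that optimum is the suboptimality measured in the study.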

Relevance:

20.00%

Publisher:

Abstract:

Conventionally, protein structure prediction via threading relies on some nonoptimal method to align a protein sequence to each member of a library of known structures. We show how a score function (force field) can be modified so as to allow the direct application of a dynamic programming algorithm to the problem. This involves an approximation whose damage can be minimized by an optimization process during score function parameter determination. The method is compared to sequence-to-structure alignments using a more conventional pairwise score function and the frozen approximation. The new method produces results comparable to the frozen approximation, but is faster and has fewer adjustable parameters. It is also free of any memory of the template's original amino acid sequence, and does not suffer from the problem of nonconvergence that can be shown to occur with the frozen approximation. Alignments generated by the simplified score function can then be ranked using a second score function with the approximations removed. (C) 1999 John Wiley & Sons, Inc.
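The key point, that a score depending only on (residue, structure position) permits direct dynamic programming, can be illustrated with a standard Needleman-Wunsch-style recurrence. This is a generic sketch, not the paper's force field; the toy scores and gap penalty are assumptions.

```python
def align(n_res, n_pos, score, gap=-1.0):
    """Global alignment DP over a position-wise score function.

    score(i, j) -> float: score for placing sequence residue i at
    structure position j. Returns the optimal total alignment score.
    """
    # dp[i][j] = best score aligning the first i residues to the first j positions
    dp = [[0.0] * (n_pos + 1) for _ in range(n_res + 1)]
    for i in range(1, n_res + 1):
        dp[i][0] = i * gap
    for j in range(1, n_pos + 1):
        dp[0][j] = j * gap
    for i in range(1, n_res + 1):
        for j in range(1, n_pos + 1):
            dp[i][j] = max(
                dp[i - 1][j - 1] + score(i - 1, j - 1),  # place residue at position
                dp[i - 1][j] + gap,                      # residue left unplaced
                dp[i][j - 1] + gap,                      # structure position skipped
            )
    return dp[n_res][n_pos]

# Toy example: identical-index placements score 2, all others 0.
best = align(3, 3, lambda i, j: 2.0 if i == j else 0.0)  # 6.0 for the diagonal alignment
```

A pairwise score (one that depends on where a residue's contact partners land) breaks this recurrence, which is why the frozen approximation or a score-function modification such as the one described above is needed.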

Relevance:

20.00%

Publisher:

Abstract:

A major challenge associated with using large chemical libraries synthesized on microscopic solid support beads is the rapid discrimination of individual compounds in these libraries. This challenge can be overcome by encoding the beads with 1 μm silica colloidal particles (reporters) that contain specific and identifiable combinations of fluorescent dyes. The colored bar code generated on support beads during combinatorial library synthesis can be easily, rapidly, and inexpensively decoded through the use of fluorescence microscopy. All reporters are precoated with polyelectrolytes [poly(acrylic acid), PAA; poly(sodium 4-styrenesulfonate), PSSS; polyethylenimine, PEI; and/or poly(diallyldimethylammonium chloride), PDADMAC] with the aim of enhancing surface charge, promoting electrostatic attraction to the bead, and facilitating polymer bridging between the bead and reporter for permanent adhesion. As shown in this article, reporters coated with polyelectrolytes clearly outperform uncoated reporters with regard to the quantity of attached reporters per bead (54 ± 23 in a 2500 μm² area for PEI/PAA-coated and 11 ± 6 for uncoated reporters) and the minimization of cross-contamination (1 red reporter in a 2500 μm² area of green-labeled bead for PEI/PAA-coated and 26 ± 15 red reporters on green-labeled beads for uncoated reporters after 10 days). Examination of various polyelectrolyte systems shows that the magnitude of the ζ-potential of polyelectrolyte-coated reporters (-64 mV for PDADMAC/PSSS- and -42 mV for PEI/PAA-coated reporters) has no correlation with the number of reporters that adhere to the solid support beads (21 ± 16 in a 2500 μm² area for PDADMAC/PSSS- and 54 ± 23 for PEI/PAA-coated reporters). The contribution of polymer bridging to the adhesion has a far greater influence than electrostatic attraction, as demonstrated by modification of the polyelectrolyte multilayers using gamma irradiation of precoated reporters either in aqueous solution or in polyelectrolyte solution.

Relevance:

20.00%

Publisher:

Abstract:

A robust semi-implicit central partial difference algorithm for the numerical solution of coupled stochastic parabolic partial differential equations (PDEs) is described. This can be used for calculating correlation functions of systems of interacting stochastic fields. Such field equations can arise in the description of Hamiltonian and open systems in the physics of nonlinear processes, and may include multiplicative noise sources. The algorithm can be used for studying the properties of nonlinear quantum or classical field theories. The general approach is outlined and applied to a specific example, namely the quantum statistical fluctuations of ultra-short optical pulses in χ(2) parametric waveguides. This example uses a non-diagonal coherent state representation, and correctly predicts the sub-shot-noise level spectral fluctuations observed in homodyne detection measurements. It is expected that the methods used will be applicable to higher-order correlation functions and other physical problems as well. A stochastic differencing technique for reducing sampling errors is also introduced. This involves solving nonlinear stochastic parabolic PDEs in combination with a reference process, which uses the Wigner representation in the example presented here. A computer implementation on MIMD parallel architectures is discussed. (C) 1997 Academic Press.
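As a hedged illustration of the semi-implicit idea for a single real-valued stochastic parabolic PDE (not the paper's coupled χ(2) equations), one time step can treat the central-difference diffusion term implicitly and the noise term explicitly, which requires solving a tridiagonal system. The grid, zero boundary conditions, and parameters below are assumptions for the sketch.

```python
import math
import random

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system (sub-, main, super-diagonals a, b, c)."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def semi_implicit_step(u, dt, dx, diff, sigma, rng):
    """One step of du = diff * u_xx dt + sigma dW with u = 0 boundaries.

    Diffusion is treated implicitly (backward Euler on the central
    difference), the additive noise explicitly.
    """
    n = len(u)
    r = diff * dt / dx ** 2
    a, b, c = [-r] * n, [1.0 + 2.0 * r] * n, [-r] * n
    d = [u[i] + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0) for i in range(n)]
    return thomas_solve(a, b, c, d)

# Deterministic check (sigma = 0): a sine profile decays under diffusion.
rng = random.Random(0)
u0 = [math.sin(math.pi * (i + 1) / 11) for i in range(10)]
u1 = semi_implicit_step(u0, dt=0.5, dx=0.1, diff=0.01, sigma=0.0, rng=rng)
```

The implicit treatment of the stiff diffusion term is what buys the robustness the abstract refers to; an explicit scheme would be stability-limited in dt.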

Relevance:

20.00%

Publisher:

Abstract:

Objective: We assessed how often patients manifesting a myocardial infarction (MI) would not be considered candidates for intensive lipid-lowering therapy under the current guidelines. Methods: In 355 consecutive patients manifesting ST-elevation MI (STEMI), admission plasma C-reactive protein (CRP) was measured, and the Framingham risk score (FRS), PROCAM risk score, Reynolds risk score, ASSIGN risk score, QRISK, and SCORE algorithms were applied. Cardiac computed tomography and carotid ultrasound were performed to assess the coronary artery calcium score (CAC), carotid intima-media thickness (cIMT), and the presence of carotid plaques. Results: Fewer than 50% of STEMI patients would be identified as high risk before the event by any of these algorithms. With the exception of FRS (9%), all other algorithms would assign low risk to about half of the enrolled patients. Plasma CRP was <1.0 mg/L in 70% and >2 mg/L in 14% of the patients. The average cIMT was 0.8 ± 0.2 mm and was ≥1.0 mm in only 24% of patients. Carotid plaques were found in 74% of patients, and CAC >100 was found in 66%. Adding CAC >100 plus the presence of carotid plaques, a high-risk condition would be identified in 100% of the patients by any of the above-mentioned algorithms. Conclusion: More than half of patients manifesting STEMI would not be considered candidates for intensive preventive therapy by the current clinical algorithms. The addition of anatomical parameters such as CAC and the presence of carotid plaques can substantially reduce this underestimation of CVD risk. (C) 2010 Elsevier Ireland Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Smoothing the potential energy surface for structure optimization is a general and commonly applied strategy. We propose a combination of soft-core potential energy functions and a variation of the diffusion equation method to smooth potential energy surfaces, which is applicable to complex systems such as protein structures. The performance of the method was demonstrated by comparison with simulated annealing, using the refinement of the undecapeptide Cyclosporin A as a test case. Simulations were repeated many times using different initial conditions and structures, since the methods are heuristic and results are only meaningful in a statistical sense.
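A toy one-dimensional analogue of surface smoothing (Gaussian convolution in the spirit of the diffusion equation method, not the paper's soft-core functional) shows the effect the strategy relies on: smoothing merges shallow minima, so an optimizer sees fewer traps. The potential and smoothing width below are invented for illustration.

```python
import math

def gaussian_smooth(values, dx, width):
    """Discrete Gaussian convolution of sampled potential values
    (kernel renormalized near the boundaries)."""
    n = len(values)
    out = []
    for i in range(n):
        num = den = 0.0
        for j in range(n):
            w = math.exp(-((i - j) * dx) ** 2 / (2.0 * width ** 2))
            num += w * values[j]
            den += w
        out.append(num / den)
    return out

def count_minima(values):
    """Count strict interior local minima of a sampled function."""
    return sum(1 for i in range(1, len(values) - 1)
               if values[i] < values[i - 1] and values[i] < values[i + 1])

xs = [i * 0.05 - 2.0 for i in range(81)]                # grid on [-2, 2]
rugged = [x ** 2 + 0.3 * math.cos(12 * x) for x in xs]  # wiggly single well
smooth = gaussian_smooth(rugged, 0.05, 0.5)
```

After smoothing, the small cosine wiggles are averaged away while the underlying well survives, which is why a minimizer started on the smoothed surface lands near the basin of the true global minimum.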

Relevance:

20.00%

Publisher:

Abstract:

We propose a simulated-annealing-based genetic algorithm for solving model parameter estimation problems. The algorithm incorporates advantages of both genetic algorithms and simulated annealing. Tests on computer-generated synthetic data that closely resemble the optical constants of a metal were performed to compare the efficiency of plain genetic algorithms against the simulated-annealing-based genetic algorithm. These tests assess the ability of the algorithms to find the global minimum and the accuracy of the values obtained for the model parameters. Finally, the algorithm with the best performance is used to fit the model dielectric function to data for platinum and aluminum. (C) 1997 Optical Society of America.
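A minimal sketch of the hybrid idea, a genetic algorithm whose survival step uses a Metropolis acceptance rule with a cooling temperature, might look as follows for a least-squares fit. The model, bounds, cooling schedule, and operators are illustrative assumptions rather than the paper's dielectric-function setup.

```python
import math
import random

def fit(model, xs, ys, bounds, pop_size=30, gens=200, t0=1.0, seed=1):
    """Minimize the sum of squared residuals over parameter vectors."""
    rng = random.Random(seed)

    def cost(p):
        return sum((model(x, p) - y) ** 2 for x, y in zip(xs, ys))

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for gen in range(gens):
        temp = t0 * (1.0 - gen / gens) + 1e-9        # linear cooling schedule
        new_pop = []
        for parent in pop:
            mate = rng.choice(pop)
            child = [rng.choice(pair) for pair in zip(parent, mate)]  # uniform crossover
            i = rng.randrange(len(child))                             # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0.0, 0.1 * (hi - lo))))
            # Metropolis rule: keep improvements; sometimes accept worse children
            d = cost(child) - cost(parent)
            if d < 0 or rng.random() < math.exp(-d / temp):
                new_pop.append(child)
            else:
                new_pop.append(parent)
        pop = new_pop
    return min(pop, key=cost)

# Recover a, b of y = a*x + b from noiseless data (a = 2, b = 1).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
best = fit(lambda x, p: p[0] * x + p[1], xs, ys, [(-5.0, 5.0), (-5.0, 5.0)])
```

Early on, the high temperature lets poor offspring survive (global exploration, as in simulated annealing); as the temperature falls, selection becomes effectively greedy (local refinement, as in a plain genetic algorithm with elitist replacement).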

Relevance:

20.00%

Publisher:

Abstract:

There is a positive correlation between the intensity of use of a given antibiotic and the prevalence of resistant strains. The more you treat, the more patients infected with resistant strains appear and, as a consequence, the higher the mortality due to the infection and the longer the hospitalization time. In contrast, the less you treat, the higher the mortality rates and the longer the hospitalization times of patients infected with sensitive strains that could have been successfully treated. The hypothesis proposed in this paper is an attempt to resolve this conflict: there must be an optimum treatment intensity that minimizes both the additional mortality and the hospitalization time due to infection by both sensitive and resistant bacterial strains. To test this hypothesis, we applied a simple mathematical model that allowed us to estimate the optimum proportion of patients to be treated in order to minimize the total number of deaths and the total hospitalization time due to the infection in a hospital setting. (C) 2007 Elsevier Inc. All rights reserved.
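A minimal numerical version of this trade-off, with assumed functional forms rather than the paper's model, picks the treated fraction p that minimizes the combined burden: the burden from untreated sensitive infections falls as p rises, while the burden from resistance grows with p.

```python
def total_burden(p, sens=10.0, res=6.0):
    """Combined burden at treated fraction p (0 <= p <= 1).

    Assumed forms: untreated sensitive cases contribute sens * (1 - p);
    resistance burden is taken to grow like res * p**2.
    """
    return sens * (1.0 - p) + res * p ** 2

def optimal_fraction(step=0.001):
    """Grid search for the treated fraction minimizing the total burden."""
    grid = [i * step for i in range(int(1 / step) + 1)]
    return min(grid, key=total_burden)

p_star = optimal_fraction()
```

With these forms the minimum can be checked analytically: d/dp [10(1 - p) + 6p²] = 0 gives p* = 10/12 ≈ 0.833, an interior optimum, i.e. treating neither nobody nor everybody.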

Relevance:

20.00%

Publisher:

Abstract:

We suggest a new notion of behaviour-preserving refinement based on partial order semantics, called transition refinement, which we introduced earlier for elementary (low-level) Petri nets. For modelling and verifying complex distributed algorithms, high-level (algebraic) Petri nets are usually used. In this paper, we define transition refinement for algebraic Petri nets. This notion is more powerful than transition refinement for elementary Petri nets because it corresponds to the simultaneous refinement of several transitions in an elementary Petri net. Transition refinement is particularly suitable for refinement steps that increase the degree of distribution of an algorithm, e.g. when synchronous communication is replaced by asynchronous message passing. We study how to prove that a replacement of a transition is a transition refinement.

Relevance:

20.00%

Publisher: