897 results for Minimization Problem, Lattice Model


Relevance:

100.00%

Publisher:

Abstract:

The work described in this thesis began as an inquiry into the nature and use of optimization programs based on "genetic algorithms." That inquiry led, eventually, to three powerful heuristics that are broadly applicable in gradient-ascent programs: First, remember the locations of local maxima and restart the optimization program at a place distant from previously located local maxima. Second, adjust the size of probing steps to suit the local nature of the terrain, shrinking when probes do poorly and growing when probes do well. And third, keep track of the directions of recent successes, so as to probe preferentially in the direction of most rapid ascent. These heuristics lie at the core of a novel optimization program that illustrates the power to be had from deploying them together. The efficacy of this program is demonstrated on several test problems drawn from a variety of fields, including De Jong's famous test-problem suite, the traveling salesman problem, the problem of coordinate registration for image-guided surgery, the energy minimization problem for determining the shape of organic molecules, and the problem of assessing the structure of sedimentary deposits using seismic data.
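
As an illustration of how the three heuristics can work together, here is a minimal Python sketch of a restart hill-climber; the function names, parameter values, and test function are illustrative assumptions, not the thesis's actual program.

```python
# Minimal sketch (not the thesis code) of the three heuristics described above:
# restarts far from known maxima, adaptive step size, and a success-direction memory.
import numpy as np

def hill_climb(f, dim, n_restarts=5, n_steps=500, rng=None):
    rng = rng or np.random.default_rng(0)
    found_maxima = []                      # heuristic 1: remember located local maxima
    best_x, best_f = None, -np.inf
    for _ in range(n_restarts):
        # restart at the candidate point farthest from previously found maxima
        candidates = rng.uniform(-5, 5, size=(64, dim))
        if found_maxima:
            dists = np.min(np.linalg.norm(
                candidates[:, None, :] - np.array(found_maxima)[None, :, :], axis=2), axis=1)
            x = candidates[np.argmax(dists)]
        else:
            x = candidates[0]
        step = 1.0                          # heuristic 2: adaptive probing step
        drift = np.zeros(dim)               # heuristic 3: memory of successful directions
        fx = f(x)
        for _ in range(n_steps):
            probe = x + step * (0.5 * drift + rng.normal(size=dim))
            fp = f(probe)
            if fp > fx:
                drift = 0.7 * drift + 0.3 * (probe - x)   # reinforce the successful direction
                x, fx = probe, fp
                step *= 1.2                                 # grow on success
            else:
                step *= 0.8                                 # shrink on failure
        found_maxima.append(x)
        if fx > best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: maximize a simple multimodal test function
if __name__ == "__main__":
    f = lambda x: -np.sum(x**2) + 2.0 * np.sum(np.cos(3 * x))
    print(hill_climb(f, dim=2))
```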

Relevance:

100.00%

Publisher:

Abstract:

Changes in the angle of illumination incident upon a 3D surface texture can significantly alter its appearance, producing variations in the image texture. These texture variations displace class members in the feature space, increasing the failure rates of texture classifiers. To avoid this problem, this paper presents a model-based texture recognition system that classifies textures seen from different distances and under different illumination directions. The system works from a surface model obtained by means of 4-source colour photometric stereo, which is used to generate 2D image textures under different illumination directions. The recognition system combines co-occurrence matrices for feature extraction with a Nearest Neighbour classifier. Moreover, the recognition stage allows the approximate direction of the illumination used to capture the test image to be estimated.
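
A minimal sketch of the kind of pipeline described above (co-occurrence features followed by a nearest-neighbour decision); the number of grey levels, offsets, and features chosen here are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch (not the paper's system) of co-occurrence features plus a
# nearest-neighbour classifier; levels, offsets and feature choice are illustrative.
import numpy as np

def cooccurrence(img, levels=8, offset=(0, 1)):
    """Grey-level co-occurrence matrix for one pixel offset, normalised to sum to 1."""
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)   # quantise grey levels
    dy, dx = offset
    a = q[max(0, -dy):q.shape[0] - max(0, dy), max(0, -dx):q.shape[1] - max(0, dx)]
    b = q[max(0, dy):, max(0, dx):][:a.shape[0], :a.shape[1]]
    m = np.zeros((levels, levels))
    np.add.at(m, (a.ravel(), b.ravel()), 1)
    return m / m.sum()

def features(img):
    """Contrast, energy and homogeneity from four offsets (0, 45, 90, 135 degrees)."""
    feats = []
    for off in [(0, 1), (1, 1), (1, 0), (1, -1)]:
        m = cooccurrence(img, offset=off)
        i, j = np.indices(m.shape)
        feats += [np.sum(m * (i - j) ** 2),            # contrast
                  np.sum(m ** 2),                      # energy
                  np.sum(m / (1.0 + np.abs(i - j)))]   # homogeneity
    return np.array(feats)

def classify(test_img, train_imgs, train_labels):
    """1-NN in feature space: return the label of the closest training texture."""
    x = features(test_img)
    d = [np.linalg.norm(x - features(t)) for t in train_imgs]
    return train_labels[int(np.argmin(d))]
```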

Relevance:

100.00%

Publisher:

Abstract:

Data assimilation is predominantly used for state estimation: combining observational data with model predictions to produce an updated model state that most accurately approximates the true system state while keeping the model parameters fixed. This updated model state is then used to initiate the next model forecast. Even with perfect initial data, inaccurate model parameters will lead to the growth of prediction errors. To generate reliable forecasts we need good estimates of both the current system state and the model parameters. This paper presents research into data assimilation methods for morphodynamic model state and parameter estimation. First, we focus on state estimation and describe the implementation of a three-dimensional variational (3D-Var) data assimilation scheme in a simple 2D morphodynamic model of Morecambe Bay, UK. The assimilation of observations of bathymetry derived from SAR satellite imagery and a ship-borne survey is shown to significantly improve the predictive capability of the model over a two-year run. Here, the model parameters are set by manual calibration; this is laborious and is found to produce different parameter values depending on the type and coverage of the validation data set. The second part of this paper considers the problem of model parameter estimation in more detail. We explain how, by employing the technique of state augmentation, data assimilation can be used to estimate uncertain model parameters concurrently with the model state. This approach removes the inefficiencies associated with manual calibration and enables more effective use of observational data. We outline the development of a novel hybrid sequential 3D-Var data assimilation algorithm for joint state-parameter estimation and demonstrate its efficacy using an idealised 1D sediment transport model. The results of this study are extremely positive and suggest that there is great potential for data assimilation-based state-parameter estimation in coastal morphodynamic modelling.
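
The state-augmentation idea can be illustrated with a small sketch: the state and the uncertain parameter are stacked into a single augmented vector, so that one analysis update corrects both. The covariance values, observation operator, and Kalman-style gain below are illustrative assumptions, not the paper's hybrid sequential 3D-Var scheme.

```python
# Minimal sketch (assumptions, not the paper's algorithm) of state augmentation:
# the model state x and an uncertain parameter theta are stacked into one
# augmented vector, so a single analysis update corrects both.
import numpy as np

def augmented_analysis(x_b, theta_b, B, y, H_x, R):
    """Kalman/variational-style analysis on the augmented vector z = [x, theta].

    x_b     : background state (n,)
    theta_b : background parameter estimate (p,)
    B       : background error covariance for z, shape (n+p, n+p);
              the cross-covariance block is what lets observations of x update theta
    y       : observations (m,)
    H_x     : observation operator acting on the state part only, shape (m, n)
    R       : observation error covariance (m, m)
    """
    n, p = x_b.size, theta_b.size
    z_b = np.concatenate([x_b, theta_b])
    H = np.hstack([H_x, np.zeros((H_x.shape[0], p))])    # parameters are not observed directly
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)          # gain for the augmented system
    z_a = z_b + K @ (y - H @ z_b)
    return z_a[:n], z_a[n:]                               # analysed state and parameters

# Toy usage: a 3-point "bathymetry" state and one uncertain transport parameter.
if __name__ == "__main__":
    x_b, theta_b = np.array([1.0, 2.0, 3.0]), np.array([0.5])
    B = np.array([[0.20, 0.05, 0.00, 0.03],
                  [0.05, 0.20, 0.05, 0.03],
                  [0.00, 0.05, 0.20, 0.03],
                  [0.03, 0.03, 0.03, 0.10]])
    y = np.array([1.4, 2.9])
    H_x = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    R = 0.05 * np.eye(2)
    print(augmented_analysis(x_b, theta_b, B, y, H_x, R))
```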

Relevance:

100.00%

Publisher:

Abstract:

Activities like the Coupled Model Intercomparison Project (CMIP) have revolutionized climate modelling in terms of our ability to compare models and to process information about climate projections and their uncertainties. The evaluation of models against observations is now considered a key component of multi-model studies. While a number of outstanding scientific issues surround model evaluation, notably the open question of how to link model performance to future projections, here we highlight a specific but growing problem in model evaluation: uncertainties in the observational data used to evaluate the models. We illustrate the problem with an example from studies of the South Asian Monsoon, but we believe it is a generic problem that arises in many areas of climate model evaluation and that requires attention from the community.

Relevance:

100.00%

Publisher:

Abstract:

Noncompetitive bids have recently become a major concern in both public and private sector construction contract auctions. Consequently, several models have been developed to help identify bidders potentially involved in collusive practices. However, most of these models require complex calculations and extensive information that is difficult to obtain. The aim of this paper is to take recent developments for detecting abnormal bids in capped auctions (auctions with an upper bid limit set by the auctioneer) and extend them to the more conventional uncapped auctions (where no such limits are set). To accomplish this, a new method is developed for estimating the values of bid distribution supports by using the solution to what has become known as the German Tank problem. The model is then demonstrated and tested on a sample of real construction bid data, and shown to detect cover bids with high accuracy. This paper contributes to an improved understanding of abnormal bid behavior as an aid to detecting and monitoring potential collusive bid practices.
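
As a rough illustration of the support-estimation idea, the sketch below uses the classical minimum/maximum order-statistic estimators for a uniform distribution (the continuous analogue of the German Tank solution) and flags bids falling outside the support estimated from the remaining bids; the actual estimator and decision rule used in the paper may differ.

```python
# Hedged sketch: estimating the support [a, b] of an (assumed uniform) bid
# distribution from observed bids, in the spirit of the German Tank problem.
# A bid far outside the estimated support would then be flagged as abnormal.
import numpy as np

def estimate_support(bids):
    """Endpoint estimates for a uniform distribution on [a, b], based on the
    sample minimum and maximum (continuous analogue of the German Tank estimator)."""
    bids = np.asarray(bids, dtype=float)
    k = bids.size
    lo, hi = bids.min(), bids.max()
    spread = (hi - lo) / (k - 1)          # expected gap beyond the extreme order statistics
    return lo - spread, hi + spread

def flag_abnormal(bids, tolerance=0.0):
    """Flag bids lying outside the support estimated from the remaining bids."""
    flags = []
    for i, b in enumerate(bids):
        others = [x for j, x in enumerate(bids) if j != i]
        a_hat, b_hat = estimate_support(others)
        flags.append(b < a_hat - tolerance or b > b_hat + tolerance)
    return flags

# Toy usage: the last bid is suspiciously high relative to the rest.
if __name__ == "__main__":
    bids = [102.0, 98.5, 101.2, 99.8, 100.4, 131.0]
    print(flag_abnormal(bids))   # only the last bid is flagged
```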

Relevance:

100.00%

Publisher:

Abstract:

We study, by numerical simulation, the time correlation function of a stochastic lattice model describing the coexistence dynamics of two interacting biological species whose population numbers display time cycles. Its asymptotic behaviour is shown to decay in time as an exponentially damped sinusoid, from which we extract the dominant eigenvalue of the evolution operator associated with the stochastic dynamics, showing that it is complex, with the imaginary part giving the frequency of the population cycles. The transition from oscillatory to nonoscillatory behaviour occurs when the asymptotic time correlation function becomes a pure exponential, that is, when the real part of the complex eigenvalue equals a real eigenvalue. We also show that the amplitude of the undamped oscillations grows with the square root of the habitat area, as for ordinary random fluctuations. (C) 2009 Elsevier B.V. All rights reserved.
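
A minimal sketch of how the dominant eigenvalue can be extracted from such data: fit the asymptotic correlation function to an exponentially damped sinusoid and read off lambda = -1/tau + i*omega. The synthetic data and fitting details are illustrative assumptions, not the simulation itself.

```python
# Illustrative sketch (not the simulation code): extracting the dominant eigenvalue
# lambda = -1/tau + i*omega by fitting the asymptotic correlation function to an
# exponentially damped sinusoid.
import numpy as np
from scipy.optimize import curve_fit

def damped_cosine(t, A, tau, omega, phi):
    """C(t) ~ A * exp(-t/tau) * cos(omega*t + phi) in the oscillatory regime."""
    return A * np.exp(-t / tau) * np.cos(omega * t + phi)

def dominant_eigenvalue(t, C):
    """Fit the time correlation function and return -1/tau + i*omega."""
    dt = t[1] - t[0]
    freqs = np.fft.rfftfreq(t.size, dt)
    omega0 = 2 * np.pi * freqs[np.argmax(np.abs(np.fft.rfft(C))[1:]) + 1]  # guess from spectrum
    (A, tau, omega, phi), _ = curve_fit(damped_cosine, t, C,
                                        p0=(C[0], t[-1] / 5.0, omega0, 0.0))
    return complex(-1.0 / tau, omega)

# Toy usage with synthetic data; in the nonoscillatory regime the fitted omega is
# essentially zero and the eigenvalue becomes real.
if __name__ == "__main__":
    t = np.linspace(0.0, 50.0, 500)
    noise = 0.01 * np.random.default_rng(1).normal(size=t.size)
    C = 0.8 * np.exp(-t / 12.0) * np.cos(0.9 * t) + noise
    print(dominant_eigenvalue(t, C))   # approximately -1/12 + 0.9i
```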

Relevance:

100.00%

Publisher:

Abstract:

Chagas disease, caused by Trypanosoma cruzi, is nowadays one of the most serious parasitic health problems. The large number of deaths and the insufficient effectiveness of drugs against this parasite have alarmed the scientific community worldwide. In an attempt to overcome this problem, a model for the design and prediction of new antitrypanosomal agents was developed. It uses a mixed approach containing simple fragment-based descriptors and topological substructural molecular design descriptors. The data set comprised 188 compounds: 99 with characterized antitrypanosomal activity and 88 belonging to other pharmaceutical categories. The model showed sensitivity, specificity and accuracy values above 85%. Quantitative fragment contributions were also calculated. Then, to confirm the quality of the model, 15 molecules tested as antitrypanosomal compounds (not included in this study) were predicted, taking into account the calculated fragment contributions mentioned above. The model showed an accuracy of 100%, which indicates that the "in silico" methodology developed by our team is promising for the rational design of new antitrypanosomal drugs. (C) 2009 Wiley Periodicals, Inc. J Comput Chem 31: 882-894, 2010
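
For reference, the reported performance measures follow directly from the confusion-matrix counts of a binary active/inactive classification; the counts in this sketch are purely illustrative, not the paper's data.

```python
# Minimal sketch of the reported performance measures (sensitivity, specificity,
# accuracy) for a binary classification of active vs. inactive compounds.
def classification_metrics(tp, tn, fp, fn):
    sensitivity = tp / (tp + fn)                    # fraction of active compounds recovered
    specificity = tn / (tn + fp)                    # fraction of inactive compounds rejected
    accuracy = (tp + tn) / (tp + tn + fp + fn)      # overall fraction classified correctly
    return sensitivity, specificity, accuracy

# Example with hypothetical counts (not the paper's results):
print(classification_metrics(tp=90, tn=80, fp=10, fn=8))
```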

Relevance:

100.00%

Publisher:

Abstract:

The topology optimization problem characterizes and determines the optimum distribution of material over a domain: after the boundary conditions are defined on a pre-established domain, the problem is how to distribute the material so as to solve the minimization problem. The objective of this work is to propose a competitive formulation for determining optimum structural topologies in 3D problems that is able to provide high-resolution layouts. The procedure combines the Galerkin finite element method with the optimization method, looking for the best material distribution over the fixed design domain. The layout topology optimization method is based on the material approach proposed by Bendsoe & Kikuchi (1988) and considers a homogenized constitutive equation that depends only on the relative density of the material. The finite element used is a four-node tetrahedron with a selective integration scheme, which interpolates not only the components of the displacement field but also the relative density field. The proposed procedure consists in solving a sequence of layout optimization problems applied to compliance minimization and to mass minimization under local stress constraints. The microstructure used in this procedure is SIMP (Solid Isotropic Material with Penalty). The approach considerably reduces the computational cost and is shown to be efficient and robust. The results provide well-defined structural layouts, with a sharp distribution of material and clearly defined boundaries. The layout quality is proportional to the mean element size, and a considerable reduction in the number of design variables is observed owing to the tetrahedral element.
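
A generic sketch of the SIMP material interpolation and a standard optimality-criteria density update for compliance minimization; this is a simplified array-based illustration under assumed sensitivities, not the paper's 3D tetrahedral Galerkin implementation with selective integration.

```python
# Generic sketch of the SIMP material model and a density update step.
import numpy as np

def simp_stiffness(rho, E0=1.0, Emin=1e-9, p=3.0):
    """SIMP: effective stiffness depends only on the relative density rho."""
    return Emin + rho ** p * (E0 - Emin)

def oc_density_update(rho, dC_drho, dV_drho, volfrac, move=0.2):
    """Classic optimality-criteria update for compliance minimization with a
    volume constraint, using bisection on the Lagrange multiplier."""
    l1, l2 = 1e-9, 1e9
    while (l2 - l1) / (l1 + l2) > 1e-4:
        lam = 0.5 * (l1 + l2)
        factor = np.sqrt(np.maximum(-dC_drho / (lam * dV_drho), 0.0))
        rho_new = np.clip(rho * factor,
                          np.maximum(rho - move, 0.0), np.minimum(rho + move, 1.0))
        rho_new = np.clip(rho_new, 1e-3, 1.0)
        if rho_new.mean() > volfrac:   # too much material: increase the multiplier
            l1 = lam
        else:
            l2 = lam
    return rho_new

# Toy usage with made-up sensitivities for 10 elements:
if __name__ == "__main__":
    rho = np.full(10, 0.5)
    dC = -np.linspace(1.0, 0.1, 10)    # compliance sensitivities (always negative)
    dV = np.ones(10)                    # volume sensitivities
    print(oc_density_update(rho, dC, dV, volfrac=0.4))
```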

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

We show that diffusion can play an important role in protein-folding kinetics. We explicitly calculate the diffusion coefficient of protein folding in a lattice model and find that it is typically configuration- or reaction coordinate-dependent. The diffusion coefficient decreases as folding progresses toward the native state, because the collapse to a compact state constrains the configurational space available for exploration. This configuration- or position-dependent diffusion coefficient contributes significantly to the kinetics in addition to the thermodynamic free-energy barrier. It effectively changes (here, increases) the kinetic barrier height as well as the position of the corresponding transition state and therefore modifies the folding kinetic rates as well as the kinetic routes. The resulting folding time, obtained by considering both the kinetic diffusion and the thermodynamic folding free-energy profile, is thus longer than the estimate from the thermodynamic free-energy barrier with constant diffusion, but is consistent with the results of kinetic simulations. The configuration- or coordinate-dependent diffusion is especially important for fast folding, when there is a small or no free-energy barrier and the kinetics is controlled by diffusion. Including this configurational dependence challenges the transition state theory of protein folding: the classical theory will have to be modified to remain consistent, and more detailed mechanistic studies of folding involving phi-value analysis based on the classical transition state theory will also have to be modified quantitatively.
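
One way to see the effect quantitatively is to compute the folding time as a mean first-passage time over a free-energy profile F(Q) with a coordinate-dependent diffusion coefficient D(Q); the profile and D(Q) below are illustrative assumptions, not the lattice-model results.

```python
# Hedged sketch: folding time as a mean first-passage time over a free-energy
# profile F(Q) with a coordinate-dependent diffusion coefficient D(Q).
import numpy as np

def mean_first_passage_time(Q, F, D, beta=1.0):
    """MFPT from Q[0] to Q[-1]:
    tau = int dQ' exp(beta*F(Q')) / D(Q') * int_{Q[0]}^{Q'} dQ'' exp(-beta*F(Q''))."""
    inner = np.array([np.trapz(np.exp(-beta * F[:i + 1]), Q[:i + 1])
                      for i in range(Q.size)])
    return np.trapz(np.exp(beta * F) / D * inner, Q)

if __name__ == "__main__":
    Q = np.linspace(0.0, 1.0, 400)                 # folding progress coordinate
    F = 4.0 * np.sin(np.pi * Q) ** 2               # a single free-energy barrier (k_B T units)
    D_const = np.ones_like(Q)                      # constant diffusion coefficient
    D_decreasing = 1.0 - 0.8 * Q                   # diffusion slows as the chain compacts
    print(mean_first_passage_time(Q, F, D_const))
    print(mean_first_passage_time(Q, F, D_decreasing))   # longer folding time
```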

Relevance:

100.00%

Publisher:

Abstract:

A lattice model is used to study the effects of mutations and compaction on protein folding rates and folding temperature. In the context of protein evolution, we address the question of which scenario is best for a polypeptide chain to fold: a fast nonspecific collapse followed by a slow rearrangement into the native structure, or a specific collapse from the unfolded state with simultaneous formation of the native state. This question is investigated for optimized sequences, whose native state has no frustrated contacts between monomers, and for mutated sequences, whose native state has some degree of frustration. It is found that the best folding scenario may depend on the amount of frustration in the native structure. The implications of this result for protein evolution are discussed. (c) 2006 American Institute of Physics.

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Research has shown that applying the T² control chart with a variable parameters (VP) scheme yields rapid detection of out-of-control states. In this paper, the economic statistical design of the VP T² control chart is considered as a double-objective minimization problem, with the statistical objective being the adjusted average time to signal and the economic objective being the expected cost per hour. We then find the Pareto-optimal designs, in which the two objectives are met simultaneously, by using a multi-objective genetic algorithm. Through an illustrative example, we show that relatively large benefits can be achieved by applying the VP scheme compared with the usual schemes; in addition, the multi-objective approach provides the user with designs that are flexible and adaptive.
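
The Pareto-optimality criterion underlying the design selection can be sketched as follows; the candidate designs and objective values are illustrative, not the paper's optimization output.

```python
# Minimal sketch of the Pareto-optimality criterion used to compare chart designs:
# a design is kept if no other design is at least as good in both objectives
# (adjusted average time to signal, expected cost per hour) and strictly better in one.

def pareto_front(designs):
    """designs: list of (name, ats, cost); returns the non-dominated subset."""
    def dominates(a, b):
        return a[1] <= b[1] and a[2] <= b[2] and (a[1] < b[1] or a[2] < b[2])
    return [d for d in designs
            if not any(dominates(e, d) for e in designs if e is not d)]

if __name__ == "__main__":
    candidates = [("A", 4.2, 110.0), ("B", 3.1, 140.0),
                  ("C", 5.0, 105.0), ("D", 4.5, 150.0)]
    print(pareto_front(candidates))   # D is dominated by A; A, B and C remain
```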