974 results for Simulated annealing algorithm
Abstract:
Study based on a research stay at the Karolinska University Hospital, Sweden, between March and June 2006. In stereotactic body radiation therapy (SBRT) of lung tumours, dose calculation with the available treatment planning systems faces two main problems: the limited accuracy of the calculation algorithms in the presence of tissues with very different densities, and the motion caused by the patient's breathing during treatment. The aim of this work was to simulate, with the Monte Carlo (MC) code PENELOPE, the dose distribution in lung tumours for representative SBRT treatment cases, taking respiratory motion into account, and to compare the results with those of several treatment planning systems. Representative SBRT cases treated at the Karolinska University Hospital were studied. The radiation beams were simulated with PENELOPE and used to obtain MC dose profiles. The results for the static case (without respiratory motion) show that, compared with MC, the dose (Gy/MU) calculated by the planning systems in the tumour is accurate to within 2-3%. In the interface region between tumour and lung tissue, planning systems based on the PB algorithm overestimate the dose by 10%, while the CC algorithm underestimates it by 3-4%. The MC simulation of respiratory motion indicates that the planning system results are sufficiently accurate in the tumour, although at the interface the dose is underestimated more than in the static case. These results are consistent with the clinical experience gained over 15 years at the Karolinska University Hospital. The results have been published in the journal Acta Oncologica.
Abstract:
This work analyzes and develops improvements in the speed and scalability of a distributed fish-school simulator. The results were obtained using a new communication strategy for the logical processes (LPs) and changes to the neighbor-selection algorithm applied to each fish at every simulation step. The proposed idea lets each logical process anticipate its neighbors' future data needs, reducing communication time by limiting the number of messages exchanged between LPs. The new neighbor-selection algorithm was developed to avoid unnecessary work, decreasing the number of instructions executed at each simulation step for each simulated fish and thereby significantly reducing the simulation time.
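The neighbor-selection idea can be illustrated with a generic spatial-hash sketch (not the simulator's actual code; all names and parameters below are hypothetical): each fish inspects only the candidates in the 3x3 grid cells around it instead of the whole school, which is what bounds the per-fish instruction count.

```python
from collections import defaultdict

def build_grid(positions, cell):
    """Hash each fish position into a coarse grid cell."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[(int(x // cell), int(y // cell))].append(i)
    return grid

def neighbors(i, positions, grid, cell, radius):
    """Candidate neighbors come only from the 3x3 cells around fish i,
    so each fish inspects a small, bounded set instead of every fish."""
    x, y = positions[i]
    cx, cy = int(x // cell), int(y // cell)
    out, r2 = [], radius * radius
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in grid.get((cx + dx, cy + dy), ()):
                if j == i:
                    continue
                px, py = positions[j]
                if (px - x) ** 2 + (py - y) ** 2 <= r2:
                    out.append(j)
    return out

positions = [(0.5, 0.5), (1.2, 0.4), (9.0, 9.0), (0.9, 0.9)]
grid = build_grid(positions, cell=2.0)
near = neighbors(0, positions, grid, cell=2.0, radius=1.0)  # fish 1 and 3
```

Rebuilding the grid once per step costs O(n), after which each neighbor query touches only nearby cells, which is the kind of saving the abstract describes.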
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
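A minimal sketch of the idea, using an AR(1) process as a stand-in for a simulable model (the model, bandwidth, moment choice, and grid below are illustrative assumptions, not the paper's specification): conditional means at a trial parameter are estimated from a long simulation by Nadaraya-Watson kernel smoothing, and the parameter is chosen to make the implied moment condition small.

```python
import math, random

random.seed(0)

def simulate(rho, n, burn=100):
    # AR(1): y_t = rho*y_{t-1} + e_t, a stand-in for a simulable model
    y, out = 0.0, []
    for t in range(n + burn):
        y = rho * y + random.gauss(0.0, 1.0)
        if t >= burn:
            out.append(y)
    return out

def nw_cond_mean(xs, ys, x0, h):
    # Nadaraya-Watson (kernel-smoothed) estimate of E[y | x = x0]
    num = den = 0.0
    for x, y in zip(xs, ys):
        w = math.exp(-0.5 * ((x - x0) / h) ** 2)
        num += w * y
        den += w
    return num / den

# "observed" data generated from the true value rho = 0.5
data = simulate(0.5, 200)
x_obs, y_obs = data[:-1], data[1:]

def criterion(rho):
    # conditional moments at the trial rho come from a long simulation,
    # not from an analytic expression -- the point of the method
    sim = simulate(rho, 2000)
    xs, ys = sim[:-1], sim[1:]
    g = sum(x * (y - nw_cond_mean(xs, ys, x, h=0.3))
            for x, y in zip(x_obs, y_obs)) / len(x_obs)
    return g * g

best = min([0.2, 0.5, 0.8], key=criterion)  # crude grid search over rho
```

Because the conditional mean is smoothed from the simulation, nothing here requires the model to be simulable conditional on the observed lag, which is what makes the approach applicable to latent-variable models.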
Abstract:
The goal of this project is packet-loss prediction, which requires modeling the channel so that we can determine whether a transmission arrives successfully or not. First, rate adaptation algorithms were studied; these algorithms improve communication performance, and the simulation program is therefore based on some of them. In parallel, measurements of the terrestrial channel were captured to build the model. Finally, a much more complete program simulated the behavior of a transmission over the modeled physical channel, taking additional effects such as collisions into account. This yielded a more realistic result, which was used to analyze theoretically the feasibility of a link between the terrestrial channel and the satellite channel in order to create a hybrid network.
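As an illustration of the kind of rate adaptation algorithm the project builds on, here is a generic ARF-style scheme (a sketch only: the rate set and thresholds are made-up numbers, not the algorithms actually studied): step the rate up after a run of successes, step it down after consecutive failures.

```python
RATES_MBPS = [6, 12, 24, 48, 54]  # hypothetical PHY rate set

class AutoRateFallback:
    """Minimal ARF-style rate adaptation: step up after `up` consecutive
    successes, step down after `down` consecutive failures."""
    def __init__(self, up=10, down=2):
        self.idx, self.ok, self.bad = 0, 0, 0
        self.up, self.down = up, down

    def rate(self):
        return RATES_MBPS[self.idx]

    def report(self, success):
        if success:
            self.ok, self.bad = self.ok + 1, 0
            if self.ok >= self.up and self.idx < len(RATES_MBPS) - 1:
                self.idx += 1   # channel looks good: try a faster rate
                self.ok = 0
        else:
            self.bad, self.ok = self.bad + 1, 0
            if self.bad >= self.down and self.idx > 0:
                self.idx -= 1   # repeated loss: fall back to a safer rate
                self.bad = 0

arf = AutoRateFallback()
for _ in range(10):
    arf.report(True)   # ten successes trigger a step up
high = arf.rate()
arf.report(False)
arf.report(False)      # two failures trigger a step back down
low = arf.rate()
```

Feeding such a controller with success/failure outcomes drawn from the fitted channel model is one way a simulator can couple loss prediction to rate adaptation.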
Abstract:
Research report based on a stay at the Aerospace Computational Design Laboratory at the Massachusetts Institute of Technology (MIT), United States, between November 2006 and August 2007. Aerodynamics is a branch of fluid dynamics concerned with the study of the motion of liquids and gases, whose main goal is to predict the aerodynamic forces on an aircraft or any other type of vehicle, including automobiles. The Navier-Stokes equations represent a dynamic equilibrium of the forces acting on any given region of the fluid. They are one of the most useful systems of equations because they describe the physics of a large number of phenomena, such as ocean currents and flows around an airfoil. In the context of a doctoral thesis, a viscous, incompressible flow is being studied by solving the incompressible Navier-Stokes equations efficiently. During the stay at MIT, a discontinuous Galerkin method was used to solve the incompressible Navier-Stokes equations, using either a penalty parameter to ensure the continuity of fluxes between elements or a compact discontinuous Galerkin method. Both methods gave good results, and several numerical examples were simulated to validate the good behavior of the developed methods. Particular elements, the Raviart-Thomas elements, were also studied; these could be used in a mixed formulation to obtain an efficient algorithm for solving complex numerical problems.
Abstract:
In traditional criminal investigation, uncertainties are often dealt with using a combination of common sense, practical considerations and experience, but rarely with tailored statistical models. For example, in some countries, for a given profile to be searched in the national DNA database it must contain allelic information at six or more of the ten SGM Plus loci for a simple trace. If the profile does not contain this amount of information, it cannot be searched in the national DNA database (NDNAD). This requirement (a result at six or more loci) is not based on a statistical approach, but rather on the feeling that six or more would be sufficient. A statistical approach, however, would be more rigorous and objective, and would sensibly take into consideration factors such as the probability of adventitious matches relative to the actual database size and the investigator's requirements. This research was therefore undertaken to establish scientific foundations for the use of partial SGM Plus profiles (or similar) in investigation.
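The trade-off the abstract points to, adventitious match probability against database size, is a one-line calculation. A sketch with purely illustrative numbers (the match probabilities and database size below are assumptions, not NDNAD figures):

```python
def p_adventitious(match_prob, db_size):
    """Probability of at least one adventitious (chance) match when a
    profile is searched against db_size unrelated profiles."""
    return 1.0 - (1.0 - match_prob) ** db_size

# Illustrative only: a sparse partial profile with a random match
# probability of 1e-6 searched against 5 million profiles will almost
# certainly hit by chance, while 1e-13 (a fuller profile) is safe.
p_partial = p_adventitious(1e-6, 5_000_000)
p_full = p_adventitious(1e-13, 5_000_000)
```

A rule tied to this probability, rather than to a fixed locus count, is the kind of database-size-aware criterion the research argues for.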
Abstract:
The goal of the present work was to assess the feasibility of using a pseudo-inverse and null-space optimization approach in the modeling of shoulder biomechanics. The method was applied to a simplified musculoskeletal shoulder model. The mechanical system consisted of the arm, and the external forces were the arm weight, the forces of 6 scapulo-humeral muscles, and the reaction at the glenohumeral joint, which was considered a spherical joint. Muscle wrapping was modeled around the humeral head, which was assumed spherical. The dynamical equations were solved using a Lagrangian approach. The mathematical redundancy of the mechanical system was resolved in two steps: a pseudo-inverse optimization to minimize the square of the muscle stress, and a null-space optimization to restrict the muscle forces to physiological limits. Several movements were simulated. The mathematical and numerical aspects of the constrained redundancy problem were efficiently solved by the proposed method. The predicted muscle moment arms were consistent with cadaveric measurements, and the joint reaction force was consistent with in vivo measurements. This preliminary work demonstrated that the developed algorithm has great potential for more complex musculoskeletal modeling of the shoulder joint. In particular, it could be further applied to a non-spherical joint model, allowing for the natural translation of the humeral head in the glenoid fossa.
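The two-step idea can be sketched on a toy system with 2 equilibrium equations and 3 muscle forces (the moment arms, moments, and force bounds are made-up numbers, and the real model's null space is richer than this 1-D example): the pseudo-inverse gives the minimum-norm force solution, and moving along the null space of the equilibrium matrix redistributes forces without disturbing equilibrium.

```python
def cross(u, v):
    # null-space direction of a full-rank 2x3 matrix: orthogonal to both rows
    return [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def pinv_solution(A, b):
    """Minimum-norm solution f = A^T (A A^T)^-1 b for a full-row-rank
    2x3 system (2 equilibrium equations, 3 muscle forces)."""
    r0, r1 = A
    g00 = sum(a * a for a in r0)
    g01 = sum(a * b for a, b in zip(r0, r1))
    g11 = sum(a * a for a in r1)
    det = g00 * g11 - g01 * g01
    y0 = (g11 * b[0] - g01 * b[1]) / det
    y1 = (-g01 * b[0] + g00 * b[1]) / det
    return [r0[i] * y0 + r1[i] * y1 for i in range(3)]

A = [[1.0, 0.5, 0.2],   # toy moment arms, equilibrium equation 1
     [0.3, 1.0, 0.8]]   # equilibrium equation 2
b = [1.0, -0.2]         # net joint moments to balance

f = pinv_solution(A, b)  # step 1: minimum-norm (min squared force) solution
n = cross(A[0], A[1])    # step 2: the 1-D null space of A

def violation(x, lo=0.0, hi=2.0):
    # distance outside the assumed physiological force bounds [lo, hi]
    return sum(max(lo - xi, 0.0) + max(xi - hi, 0.0) for xi in x)

# any point f + alpha*n satisfies equilibrium exactly, so scan the
# null-space line for the point with the smallest bound violation
alphas = [a / 100.0 for a in range(-200, 201)]
alpha = min(alphas, key=lambda a: violation([fi + a * ni for fi, ni in zip(f, n)]))
f_adj = [fi + alpha * ni for fi, ni in zip(f, n)]
```

In the full model the null space has more dimensions, so the second step is a genuine constrained optimization rather than a line scan, but the separation of equilibrium (pseudo-inverse) from bounds (null space) is the same.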
Abstract:
The implicit projection algorithm of isotropic plasticity is extended to an objective anisotropic elastic, perfectly plastic model. The recursion formula developed to project the trial stress onto the yield surface is applicable to any nonlinear elastic law and any plastic yield function. A curvilinear transverse isotropic model, based on a quadratic elastic potential and on Hill's quadratic yield criterion, is then developed and implemented in a computer program with bone mechanics applications in view. The paper concludes with a numerical study of a schematic bone-prosthesis system to illustrate the potential of the model.
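The trial-stress-then-project structure that the paper generalizes is easiest to see in the simplest setting: 1D isotropic perfect plasticity (a textbook radial-return sketch, not the paper's anisotropic formulation; the values of E and sigma_y are arbitrary).

```python
def radial_return_1d(eps_total, eps_p, E, sigma_y):
    """One step of the implicit projection (return-mapping) algorithm for
    1D perfect plasticity: compute an elastic trial stress, then project
    it back onto the yield surface |sigma| <= sigma_y if it lies outside."""
    sigma_trial = E * (eps_total - eps_p)   # elastic predictor
    f = abs(sigma_trial) - sigma_y          # trial yield function
    if f <= 0.0:
        return sigma_trial, eps_p           # elastic: trial state admissible
    dgamma = f / E                          # plastic multiplier increment
    sign = 1.0 if sigma_trial > 0 else -1.0
    sigma = sigma_trial - E * dgamma * sign # projected stress: sign*sigma_y
    return sigma, eps_p + dgamma * sign     # updated plastic strain

s_el, _ = radial_return_1d(0.001, 0.0, E=200e3, sigma_y=250.0)   # stays elastic
s_pl, ep = radial_return_1d(0.005, 0.0, E=200e3, sigma_y=250.0)  # yields
```

In the anisotropic case the scalar projection becomes the recursive formula of the paper, with Hill's criterion replacing |sigma| and the quadratic elastic potential replacing the constant E, but the predictor/projector split is unchanged.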
Abstract:
The broad resonances underlying the entire ¹H NMR spectrum of the brain, ascribed to macromolecules, can influence metabolite quantification. At the intermediate field strength of 3 T, distinct approaches for the determination of the macromolecule signal, previously used at either 1.5 or 7 T and higher, may become equivalent. The aim of this study was to evaluate, at 3 T for healthy subjects using LCModel, the impact on the metabolite quantification of two different macromolecule approaches: (i) experimentally measured macromolecules; and (ii) mathematically estimated macromolecules. Although small, but significant, differences in metabolite quantification (up to 23% for glutamate) were noted for some metabolites, 10 metabolites were quantified reproducibly with both approaches with a Cramér-Rao lower bound below 20%, and the neurochemical profiles were therefore similar. We conclude that the mathematical approximation can provide sufficiently accurate and reproducible estimation of the macromolecule contribution to the ¹H spectrum at 3 T. Copyright © 2013 John Wiley & Sons, Ltd.
Abstract:
We demonstrate that RecA protein can mediate annealing of complementary DNA strands in vitro by at least two different mechanisms. The first annealing mechanism predominates under conditions where RecA protein causes coaggregation of single-stranded DNA (ssDNA) molecules and where RecA-free ssDNA stretches are present on both reaction partners. Under these conditions annealing can take place between locally concentrated protein-free complementary sequences. Other DNA aggregating agents like histone H1 or ethanol stimulate annealing by the same mechanism. The second mechanism of RecA-mediated annealing of complementary DNA strands is best manifested when preformed saturated RecA-ssDNA complexes interact with protein-free ssDNA. In this case, annealing can occur between the ssDNA strand resident in the complex and the ssDNA strand that interacts with the preformed RecA-ssDNA complex. Here, the action of RecA protein reflects its specific recombination promoting mechanism. This mechanism enables DNA molecules resident in the presynaptic RecA-DNA complexes to be exposed for hydrogen bond formation with DNA molecules contacting the presynaptic RecA-DNA filament.
Abstract:
A family of nonempty closed convex sets is built using the data of the generalized Nash equilibrium problem (GNEP). The sets are selected iteratively so that the intersection of the selected sets contains solutions of the GNEP. The algorithm introduced by Iusem and Sosa (2003) is adapted to obtain solutions of the GNEP. Finally, some numerical experiments are given to illustrate the numerical behavior of the algorithm.
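A toy GNEP makes the setting concrete: two players each minimize (x_i - t_i)^2 under the shared constraint x1 + x2 <= 1, so each player's feasible set depends on the other's choice. The sketch below solves it by plain iterated best responses for illustration; this is not the Iusem-Sosa projection method that the abstract adapts, and the targets t_i are made-up numbers.

```python
def solve_gnep(t1, t2, iters=50):
    """Iterated best responses for a toy GNEP: player i minimizes
    (x_i - t_i)^2 subject to the shared constraint x1 + x2 <= 1 and
    x_i >= 0. Each best response is the unconstrained optimum t_i
    clipped to the feasible interval left by the other player."""
    x1 = x2 = 0.0
    for _ in range(iters):
        x1 = max(0.0, min(t1, 1.0 - x2))  # player 1's best response
        x2 = max(0.0, min(t2, 1.0 - x1))  # player 2's best response
    return x1, x2

x1, x2 = solve_gnep(0.8, 0.6)  # a generalized Nash equilibrium: (0.8, 0.2)
```

At the fixed point neither player can improve unilaterally while respecting the shared constraint, which is exactly the solution concept the convex-set construction in the paper is designed to capture (GNEPs typically have many such equilibria; best responses find only one).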
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
Abstract:
Double-strand breaks (DSBs) occur frequently during DNA replication. They are also caused by ionizing radiation, chemical damage or as part of the series of programmed events that occur during meiosis. In yeast, DSB repair requires RAD52, a protein that plays a critical role in homologous recombination. Here we describe the actions of human RAD52 protein in a model system for single-strand annealing (SSA) using tailed (i.e. exonuclease resected) duplex DNA molecules. Purified human RAD52 protein binds resected DSBs and promotes associations between complementary DNA termini. Heteroduplex intermediates of these recombination reactions have been visualized by electron microscopy, revealing the specific binding of multiple rings of RAD52 to the resected termini and the formation of large protein complexes at heteroduplex joints formed by RAD52-mediated annealing.
Abstract:
"See the abstract at the beginning of the document in the attached file."