934 results for static computer simulation
Abstract:
It is well established that at ambient and supercooled conditions water can be described as a percolating network of H bonds. This work is aimed at identifying, by neutron diffraction experiments combined with computer simulations, a percolation line in supercritical water, where the extension of the H-bond network is in question. It is found that in real supercritical water liquidlike states are observed at or above the percolation threshold, while below this threshold gaslike water forms small, sheetlike configurations. Inspection of the three-dimensional arrangement of water molecules suggests that crossing of this percolation line is accompanied by a change of symmetry in the first neighboring shell of molecules, from trigonal below the line to tetrahedral above.
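The percolation criterion invoked above can be illustrated with a toy cluster analysis: treat molecules as sites and detected H bonds as edges, and measure the fraction of sites in the largest connected cluster. The union-find sketch below is a hypothetical illustration, not the authors' analysis code; the bond list would in practice come from a geometric or energetic H-bond criterion.

```python
def largest_cluster_fraction(n_sites, bonds):
    """Fraction of sites in the largest connected cluster (union-find)."""
    parent = list(range(n_sites))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving for speed
            i = parent[i]
        return i

    for a, b in bonds:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb  # merge the two clusters

    sizes = {}
    for i in range(n_sites):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n_sites
```

Tracking this fraction against an H-bond cutoff is one simple way to locate a percolation threshold numerically.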
Abstract:
The fire ant Solenopsis invicta is a significant pest that was inadvertently introduced into the southern United States almost a century ago and more recently into California and other regions of the world. An assessment of genetic variation at a diverse set of molecular markers in 2144 fire ant colonies from 75 geographic sites worldwide revealed that at least nine separate introductions of S. invicta have occurred into newly invaded areas and that the main southern U.S. population is probably the source of all but one of these introductions. The sole exception involves a putative serial invasion from the southern United States to California to Taiwan. These results illustrate in stark fashion a severe negative consequence of an increasingly massive and interconnected global trade and travel system.
Abstract:
We present molecular dynamics (MD) simulation results for dense fluids of ultrasoft, fully penetrable particles. These are a binary mixture and a polydisperse system of particles interacting via the generalized exponential model, which is known to yield cluster crystal phases for the corresponding monodisperse systems. Because of the dispersity in particle size, the systems investigated in this work do not crystallize and instead form disordered cluster phases. The clustering transition appears as a smooth crossover to a regime in which particles are mostly located in clusters, isolated particles being infrequent. The analysis of the internal cluster structure reveals microsegregation of the big and small particles, with a strong homo-coordination in the binary mixture. Upon further lowering the temperature below the clustering transition, the motion of the clusters' centers of mass slows down dramatically, giving way to a cluster glass transition. In the cluster glass, the diffusivities remain finite and display an activated temperature dependence, indicating that relaxation in the cluster glass occurs via particle hopping in a nearly arrested matrix of clusters. Finally, we discuss the influence of the microscopic dynamics on the transport properties by comparing the MD results with Monte Carlo simulations.
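The pair interaction named above, the generalized exponential model of order n (GEM-n), is u(r) = ε exp[−(r/σ)^n]. It remains bounded at full overlap (u(0) = ε), which is what allows particles to pile onto the same site and form clusters for n > 2. A one-function sketch, with illustrative parameter values:

```python
import math

def gem_potential(r, eps=1.0, sigma=1.0, n=4.0):
    """GEM-n pair potential: u(r) = eps * exp(-(r/sigma)**n).
    Bounded at r = 0, so full particle overlap costs only a finite energy eps."""
    return eps * math.exp(-((r / sigma) ** n))
```

Polydispersity, as in the abstract, would correspond to drawing a different sigma per particle.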
Abstract:
Microsatellite loci mutate at an extremely high rate and are generally thought to evolve through a stepwise mutation model. Several differentiation statistics taking into account the particular mutation scheme of the microsatellite have been proposed. The most commonly used is R(ST), which is independent of the mutation rate under a generalized stepwise mutation model. F(ST) and R(ST) are commonly reported in the literature, but often differ widely. Here we compare their statistical performances using individual-based simulations of a finite island model. The simulations were run under different levels of gene flow, mutation rates, and numbers and sizes of populations. In addition to the per-locus statistical properties, we compare two ways of combining R(ST) over loci. Our simulations show that even under a strict stepwise mutation model, no statistic is best overall. All estimators suffer to different extents from large bias and variance. While R(ST) better reflects population differentiation in populations characterized by very low gene exchange, F(ST) gives better estimates in cases of high levels of gene flow. The number of loci sampled (12, 24, or 96) has only a minor effect on the relative performance of the estimators under study. For all estimators there is a striking effect of the number of samples, with the differentiation estimates showing very odd distributions for two samples.
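For orientation, F(ST)-style differentiation can be computed directly from subpopulation allele frequencies. The sketch below uses the simple Nei formulation (H_T − H_S)/H_T for one biallelic locus; it is cruder than the bias-corrected estimators the study actually compares, and is meant only to make the quantity concrete:

```python
def nei_fst(subpop_freqs):
    """Nei-style F_ST = (H_T - H_S) / H_T for one biallelic locus.
    subpop_freqs: allele-1 frequency in each subpopulation (equal weights)."""
    k = len(subpop_freqs)
    # mean expected heterozygosity within subpopulations
    hs = sum(2 * p * (1 - p) for p in subpop_freqs) / k
    # expected heterozygosity of the pooled (total) population
    p_bar = sum(subpop_freqs) / k
    ht = 2 * p_bar * (1 - p_bar)
    return (ht - hs) / ht if ht > 0 else 0.0
```

R(ST) replaces heterozygosities with variances in allele size, which is why the two statistics diverge under stepwise mutation.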
Abstract:
The M-Coffee server is a web server that makes it possible to compute multiple sequence alignments (MSAs) by running several MSA methods and combining their output into one single model. This allows the user to simultaneously run all their methods of choice without having to arbitrarily choose one of them. The MSA is delivered along with a local estimation of its consistency with the individual MSAs it was derived from. The computation of the consensus multiple alignment is carried out using a special mode of the T-Coffee package [Notredame, Higgins and Heringa (T-Coffee: a novel method for fast and accurate multiple sequence alignment. J. Mol. Biol. 2000; 302: 205-217); Wallace, O'Sullivan, Higgins and Notredame (M-Coffee: combining multiple sequence alignment methods with T-Coffee. Nucleic Acids Res. 2006; 34: 1692-1699)]. Given a set of sequences (DNA or proteins) in FASTA format, M-Coffee delivers a multiple alignment in the most common formats. M-Coffee is a freeware open source package distributed under a GPL license and it is available either as a standalone package or as a web service from www.tcoffee.org.
Abstract:
We analyze the failure process of a two-component system with widely different fracture strengths in the framework of a fiber bundle model with localized load sharing. A fraction 0≤α≤1 of the bundle is strong and is represented by unbreakable fibers, while fibers of the weak component have randomly distributed failure strengths. Computer simulations revealed that there exists a critical composition αc which separates two qualitatively different behaviors: Below the critical point, the failure of the bundle is brittle, characterized by an abrupt damage growth within the breakable part of the system. Above αc, however, the macroscopic response becomes ductile, providing stability during the entire breaking process. The transition occurs at an astonishingly low fraction of strong fibers, which can have importance for applications. We show that in the ductile phase, the size distribution of breaking bursts has a power law functional form with an exponent μ=2 followed by an exponential cutoff. In the brittle phase, the power law also prevails but with a higher exponent μ=9/2. The transition between the two phases shows analogies to continuous phase transitions. Analyzing the microstructure of the damage, it was found that at the beginning of the fracture process cracks nucleate randomly, while later on growth and coalescence of cracks dominate, which give rise to power law distributed crack sizes.
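A toy version of such a model can be simulated directly. The sketch below implements quasistatic loading of a 1D bundle with localized load sharing: a broken fiber's load goes to its nearest intact neighbours, a fraction alpha of fibers is unbreakable, and weak-fiber thresholds are uniform on (0, 1). System size, seed, and the half-and-half redistribution rule are simplifying assumptions, not the paper's exact protocol:

```python
import random

def lls_bursts(n=100, alpha=0.15, seed=3):
    """Quasistatic loading of a 1D fiber bundle with localized load sharing.
    Returns the list of burst (avalanche) sizes recorded until no weak fiber remains."""
    rng = random.Random(seed)
    thr = [float('inf') if rng.random() < alpha else rng.random() for _ in range(n)]
    extra = [0.0] * n          # load inherited from broken neighbours
    intact = [True] * n
    bursts = []

    def weak_intact():
        return [i for i in range(n) if intact[i] and thr[i] != float('inf')]

    while weak_intact():
        # raise external load just enough to break the weakest remaining weak fiber
        ext = min(thr[i] - extra[i] for i in weak_intact())
        burst = 0
        unstable = [i for i in weak_intact() if ext + extra[i] >= thr[i] - 1e-12]
        while unstable:
            i = unstable.pop()
            if not intact[i]:
                continue
            intact[i] = False
            burst += 1
            carried = ext + extra[i]
            # nearest intact neighbours on each side share the released load
            left = next((j for j in range(i - 1, -1, -1) if intact[j]), None)
            right = next((j for j in range(i + 1, n) if intact[j]), None)
            share = carried / (2.0 if (left is not None and right is not None) else 1.0)
            for j in (left, right):
                if j is not None:
                    extra[j] += share
            unstable.extend(j for j in (left, right)
                            if j is not None and thr[j] != float('inf')
                            and ext + extra[j] >= thr[j] - 1e-12)
        bursts.append(burst)
    return bursts
```

Histogramming `lls_bursts(...)` for different alpha is the kind of measurement behind the burst-size exponents quoted above.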
Abstract:
Genotypic frequencies at codominant marker loci in population samples convey information on mating systems. A classical way to extract this information is to measure heterozygote deficiencies (FIS) and obtain the selfing rate s from FIS = s/(2 - s), assuming inbreeding equilibrium. A major drawback is that heterozygote deficiencies are often present without selfing, owing largely to technical artefacts such as null alleles or partial dominance. We show here that, in the absence of gametic disequilibrium, the multilocus structure can be used to derive estimates of s independent of FIS and free of technical biases. Their statistical power and precision are comparable to those of FIS, although they are sensitive to certain types of gametic disequilibria, a bias shared with progeny-array methods but not FIS. We analyse four real data sets spanning a range of mating systems. In two examples, we obtain s = 0 despite positive FIS, strongly suggesting that the latter are artefactual. In the remaining examples, all estimates are consistent. All the computations have been implemented in an open-access and user-friendly software called rmes (robust multilocus estimate of selfing) available at http://ftp.cefe.cnrs.fr, and can be used on any multilocus data. Being able to extract reliable information from imperfect data, our method opens the way to make use of the ever-growing number of published population genetic studies, in addition to the more demanding progeny-array approaches, to investigate selfing rates.
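The equilibrium relation quoted above, FIS = s/(2 − s), inverts algebraically to s = 2·FIS/(1 + FIS), which is how the classical selfing-rate estimate is obtained in practice:

```python
def selfing_from_fis(fis):
    """Equilibrium selfing rate from the inbreeding coefficient:
    FIS = s / (2 - s)  =>  s = 2 * FIS / (1 + FIS)."""
    return 2 * fis / (1 + fis)
```

The paper's point is precisely that this estimate inherits any artefactual inflation of FIS (null alleles, partial dominance), which the multilocus approach avoids.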
Abstract:
For the detection and management of osteoporosis and osteoporosis-related fractures, quantitative ultrasound (QUS) is emerging as a relatively low-cost and readily accessible alternative to dual-energy X-ray absorptiometry (DXA) measurement of bone mineral density (BMD) in certain circumstances. The following is a brief but thorough review of the existing literature with respect to the use of QUS in 6 settings: 1) assessing fragility fracture risk; 2) diagnosing osteoporosis; 3) initiating osteoporosis treatment; 4) monitoring osteoporosis treatment; 5) osteoporosis case finding; and 6) quality assurance and control. Many QUS devices exist that are quite different with respect to the parameters they measure and the strength of empirical evidence supporting their use. In general, heel QUS appears to be most tested and most effective. Overall, some, but not all, heel QUS devices are effective in assessing fracture risk in some, but not all, populations, the evidence being strongest for Caucasian females over 55 years old. Otherwise, the evidence is fair with respect to certain devices allowing for the accurate assessment of the likelihood of osteoporosis, and generally fair to poor in terms of QUS use when initiating or monitoring osteoporosis treatment. A reasonable protocol is proposed herein for case-finding purposes, which relies on a combined assessment of clinical risk factors (CRF) and heel QUS. Finally, several recommendations are made for quality assurance and control.
Abstract:
From 6 to 8 November 1982 one of the most catastrophic flash-flood events on record struck the Eastern Pyrenees, affecting Andorra as well as France and Spain, with rainfall accumulations exceeding 400 mm in 24 h, 44 fatalities and widespread damage. This paper aims to exhaustively document this heavy precipitation event and examines mesoscale simulations performed with the French Meso-NH non-hydrostatic atmospheric model. Large-scale simulations show the slow-evolving synoptic environment favourable for the development of a deep Atlantic cyclone, which induced a strong southerly flow over the Eastern Pyrenees. From the evolution of the synoptic pattern, four distinct phases have been identified during the event. The mesoscale analysis presents the second and third phases as the most intense in terms of rainfall accumulations and highlights the interaction of the moist and conditionally unstable flows with the mountains. The presence of a SW low-level jet (30 m s-1) around 1500 m also had a crucial role in focusing the precipitation over the exposed south slopes of the Eastern Pyrenees. Backward trajectories based on Eulerian on-line passive tracers indicate that orographic uplift was the main forcing mechanism, which triggered and maintained the precipitating systems for more than 30 h over the Pyrenees. The moisture of the feeding flow came mainly from the Atlantic Ocean (7-9 g kg-1), and the role of the Mediterranean as a local moisture source was very limited (2-3 g kg-1) owing to the high initial water vapour content of the parcels and the rapid passage over the basin along the Spanish Mediterranean coast (less than 12 h).
Abstract:
This work investigates novel alternative means of interaction in a virtual environment (VE). We analyze whether humans can remap established body functions to learn to interact with digital information in an environment that is cross-sensory by nature and uses vocal utterances in order to influence (abstract) virtual objects. We thus establish a correlation among learning, control of the interface, and the perceived sense of presence in the VE. The application enables intuitive interaction by mapping actions (the prosodic aspects of the human voice) to a certain response (i.e., visualization). A series of single-user and multiuser studies shows that users can gain control of the intuitive interface and learn to adapt to new and previously unseen tasks in VEs. Despite the abstract nature of the presented environment, presence scores were generally very high.
Abstract:
Exposure to solar ultraviolet (UV) light is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors. Individual exposure data remain scarce and development of alternative assessment methods is greatly needed. We developed a model simulating human exposure to solar UV. The model predicts the dose and distribution of UV exposure received on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop a rendering engine that estimates the solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by each triangle was calculated, taking into account reflected, direct and diffuse radiation, and shading from other body parts. Dosimetric measurements (n = 54) were conducted in field conditions using a foam manikin as surrogate for an exposed individual. Dosimetric results were compared to the model predictions. The model predicted exposure to solar UV adequately. The symmetric mean absolute percentage error was 13%. Half of the predictions were within 17% of the measurements. This model provides a tool to assess outdoor occupational and recreational UV exposures, without necessitating time-consuming individual dosimetry, with numerous potential uses in skin cancer prevention and research.
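The per-triangle computation described above can be sketched for the direct component alone: the dose received by a mesh triangle scales with its area and the cosine of the angle between its normal and the sun direction. The diffuse, reflected, and inter-part shading terms of the actual model are omitted here, and all names and values are illustrative:

```python
import math

def triangle_direct_dose(v0, v1, v2, sun_dir, irradiance=1.0):
    """Direct solar dose on one mesh triangle: irradiance * area * max(0, n.s),
    with n the unit surface normal and sun_dir a unit vector toward the sun.
    Backlit triangles (n.s < 0) receive zero direct dose."""
    ax, ay, az = (v1[i] - v0[i] for i in range(3))
    bx, by, bz = (v2[i] - v0[i] for i in range(3))
    # cross product gives the (unnormalized) normal; half its length is the area
    nx, ny, nz = ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    area = 0.5 * norm
    cos_inc = max(0.0, (nx * sun_dir[0] + ny * sun_dir[1] + nz * sun_dir[2]) / norm)
    return irradiance * area * cos_inc
```

Summing this quantity over all triangles of the manikin mesh gives the direct part of the whole-body exposure estimate.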
Abstract:
This paper presents a validation study on statistical nonsupervised brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different noise and intensity nonuniformities are added to simulate real imaging conditions. No enhancement of the image quality is considered either before or during the classification process. This way, the accuracy of the methods and their robustness against image artifacts are tested. Classification is also performed on real data where a quantitative validation compares the methods' results with an estimated ground truth from manual segmentations by experts. Validity of the various classification methods in the labeling of the image as well as in the tissue volume is estimated with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that only consider pure Gaussian classes. Finally, we show that simulated data results can also be extended to real data.
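The intensity-only baseline among the models compared above can be sketched as per-voxel maximum-likelihood labeling under Gaussian class models. The class means and standard deviations below are made-up values, and the spatial priors (e.g. Markov random fields) that the study finds more robust are deliberately omitted:

```python
import math

def classify_intensity(x, classes):
    """Maximum-likelihood tissue label for intensity x under per-class Gaussian
    models {label: (mean, std)} with equal priors. Intensity-only: no spatial model."""
    def loglik(mu, sd):
        # log of the Gaussian density, dropping the constant -0.5*log(2*pi)
        return -math.log(sd) - 0.5 * ((x - mu) / sd) ** 2
    return max(classes, key=lambda lab: loglik(*classes[lab]))
```

Partial-volume effects, as the abstract notes, break this pure-class picture: voxels on tissue boundaries fit none of the single Gaussians well, which is why mixture classes help.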
Abstract:
A reinforcement learning (RL) method was used to train a virtual character to move participants to a specified location. The virtual environment depicted an alleyway displayed through a wide field-of-view head-tracked stereo head-mounted display. Based on proxemics theory, we predicted that when the character approached within a personal or intimate distance to the participants, they would be inclined to move backwards out of the way. We carried out a between-groups experiment with 30 female participants, with 10 assigned arbitrarily to each of the following three groups: In the Intimate condition the character could approach within 0.38 m and in the Social condition no nearer than 1.2 m. In the Random condition the actions of the virtual character were chosen randomly from among the same set as in the RL method, and the virtual character could approach within 0.38 m. The experiment continued in each case until the participant either reached the target or 7 minutes had elapsed. The distributions of the times taken to reach the target showed significant differences between the three groups, with 9 out of 10 in the Intimate condition reaching the target significantly faster than the 6 out of 10 who reached the target in the Social condition. Only 1 out of 10 in the Random condition reached the target. The experiment is an example of applied presence theory: we rely on the many findings that people tend to respond realistically in immersive virtual environments, and use this to get people to achieve a task of which they had been unaware. This method opens up the door for many such applications where the virtual environment adapts to the responses of the human participants with the aim of achieving particular goals.
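The RL component can be illustrated with tabular Q-learning on a toy task: a short 1D corridor standing in for the alleyway, where the agent is rewarded for driving the state to the target end. All parameters, the state space, and the reward scheme are illustrative, not the paper's setup:

```python
import random

def q_learning_chain(n_states=6, episodes=300, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a chain MDP: action 0 moves left, 1 moves right,
    reward 1 for entering the last state. Returns the learned greedy policy."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else (0 if q[s][0] > q[s][1] else 1)
            s2 = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # standard Q-learning update toward the bootstrapped target
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return [0 if q[s][0] > q[s][1] else 1 for s in range(n_states)]
```

In the experiment the "state" would instead encode the participant's tracked position and the reward the progress toward the target location, but the update rule is the same.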