60 results for GIS BASED SIMULATION


Relevance:

40.00%

Publisher:

Abstract:

Connectivity analysis on diffusion MRI data of the whole brain suffers from distortions caused by the standard echo-planar imaging acquisition strategies. These images show characteristic geometrical deformations and signal dropout, an important drawback limiting the success of tractography algorithms. Several retrospective correction techniques are readily available. In this work, we use a digital phantom designed for the evaluation of connectivity pipelines. We subject the phantom to a "theoretically correct" and plausible deformation that resembles the artifact under investigation. We then correct the data with three standard methodologies (namely fieldmap-based, reversed-encoding-based, and registration-based). Finally, we rank the methods based on their geometrical accuracy, their dropout compensation, and their impact on the resulting connectivity matrices.
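The ranking step can be illustrated with a toy geometric-accuracy metric. Everything below is invented for the sketch (landmark positions, per-method residual error levels); a real pipeline would compare full deformation fields against the phantom's ground truth, not point sets.

```python
import numpy as np

def mean_displacement_error(true_pts, corrected_pts):
    """Mean Euclidean distance (mm) between ground-truth phantom positions
    and the positions recovered after retrospective correction."""
    return float(np.mean(np.linalg.norm(true_pts - corrected_pts, axis=1)))

rng = np.random.default_rng(0)
truth = rng.uniform(0, 100, size=(500, 3))   # hypothetical phantom landmarks (mm)
# hypothetical residual errors left by each correction strategy
methods = {
    "fieldmap-based":     truth + rng.normal(0, 0.8, truth.shape),
    "reversed-encoding":  truth + rng.normal(0, 0.5, truth.shape),
    "registration-based": truth + rng.normal(0, 1.2, truth.shape),
}
ranking = sorted(methods, key=lambda m: mean_displacement_error(truth, methods[m]))
print(ranking)  # most to least geometrically accurate
```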

Relevance:

30.00%

Publisher:

Abstract:

It has long been recognized that highly polymorphic genetic markers can lead to underestimation of divergence between populations when migration is low. Microsatellite loci, which are characterized by extremely high mutation rates, are particularly likely to be affected. Here, we report genetic differentiation estimates in a contact zone between two chromosome races of the common shrew (Sorex araneus), based on 10 autosomal microsatellites, a newly developed Y-chromosome microsatellite, and mitochondrial DNA. These results are compared to previous data on proteins and karyotypes. Estimates of genetic differentiation based on F- and R-statistics are much lower for autosomal microsatellites than for all other genetic markers. We show by simulations that this discrepancy stems mainly from the high mutation rate of microsatellite markers for F-statistics and from deviations from a single-step mutation model for R-statistics. The sex-linked genetic markers show that all gene exchange between races is mediated by females. The absence of male-mediated gene flow most likely results from male hybrid sterility.
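The underestimation effect can be seen in a minimal F_ST calculation. The allele frequencies below are invented: a two-allele locus shows strong differentiation, while a microsatellite-like eight-allele locus yields a low F_ST even though the two populations share no alleles, because high within-population heterozygosity caps the statistic.

```python
import numpy as np

def fst(subpop_allele_freqs):
    """Wright's F_ST = (H_T - H_S) / H_T, with H_S the mean expected
    heterozygosity within subpopulations and H_T that of the pooled
    population."""
    p = np.asarray(subpop_allele_freqs, float)   # shape (n_pops, n_alleles)
    hs = np.mean(1.0 - np.sum(p ** 2, axis=1))   # within-population
    p_bar = p.mean(axis=0)
    ht = 1.0 - np.sum(p_bar ** 2)                # pooled
    return (ht - hs) / ht

# low-polymorphism locus: differentiation is clearly visible
low = [[0.9, 0.1], [0.1, 0.9]]
# microsatellite-like locus: populations share no alleles at all,
# yet F_ST stays small
high = [[0.25, 0.25, 0.25, 0.25, 0, 0, 0, 0],
        [0, 0, 0, 0, 0.25, 0.25, 0.25, 0.25]]
print(round(fst(low), 3), round(fst(high), 3))  # -> 0.64 0.143
```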

Relevance:

30.00%

Publisher:

Abstract:

Carotenoid-based yellowish to red plumage colors are widespread visual signals used in sexual and social communication. To understand their ultimate signaling functions, it is important to identify the proximate mechanism promoting variation in coloration. Carotenoid-based colors combine structural and pigmentary components, but the importance of the contribution of structural components to variation in pigment-based colors (i.e., carotenoid-based colors) has been undervalued. In a field experiment with great tits (Parus major), we combined a brood size manipulation with a simultaneous carotenoid supplementation in order to disentangle the effects of carotenoid availability and early growth condition on different components of the yellow breast feathers. By defining independent measures of feather carotenoid content (absolute carotenoid chroma) and background structure (background reflectance), we demonstrate that environmental factors experienced during the nestling period, namely, early growth conditions and carotenoid availability, contribute independently to variation in yellow plumage coloration. While early growth conditions affected the background reflectance of the plumage, the availability of carotenoids affected the absolute carotenoid chroma, the peak of maximum ultraviolet reflectance, and the overall shape, that is, chromatic information of the reflectance curves. These findings demonstrate that environment-induced variation in background structure contributes significantly to intraspecific variation in yellow carotenoid-based plumage coloration.

Relevance:

30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. Then a stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
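The core idea of relating collocated logs through a non-parametric density can be roughed out as follows. This is only a kernel-weighted resampling stand-in for the paper's multivariate kernel density; the synthetic logs, the bandwidth, and the linear log-log relation between the two conductivities are all invented for the sketch.

```python
import numpy as np

def sample_K_given_sigma(sigma_obs, sigma_data, K_data, bandwidth, rng, n=1):
    """Draw hydraulic-conductivity values conditional on an electrical-
    conductivity observation by kernel-weighted resampling of the
    collocated (sigma, K) pairs."""
    w = np.exp(-0.5 * ((sigma_data - sigma_obs) / bandwidth) ** 2)
    w /= w.sum()
    return rng.choice(K_data, size=n, p=w)

rng = np.random.default_rng(1)
# synthetic collocated well logs: log10(K) loosely tied to log10(sigma)
log_sigma = rng.normal(-1.5, 0.3, 200)
log_K = 2.0 * log_sigma - 1.0 + rng.normal(0, 0.2, 200)
draws = sample_K_given_sigma(-1.5, log_sigma, log_K, bandwidth=0.05, rng=rng, n=500)
mean_draw = float(draws.mean())
print(round(mean_draw, 1))  # should sit near 2 * (-1.5) - 1 = -4.0
```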

Relevance:

30.00%

Publisher:

Abstract:

The identification of genetically homogeneous groups of individuals is a long-standing issue in population genetics. A recent Bayesian algorithm implemented in the software STRUCTURE allows the identification of such groups. However, the ability of this algorithm to detect the true number of clusters (K) in a sample of individuals when patterns of dispersal among populations are not homogeneous has not been tested. The goal of this study is to carry out such tests, using various dispersal scenarios from data generated with an individual-based model. We found that in most cases the estimated 'log probability of data' does not provide a correct estimation of the number of clusters, K. However, using an ad hoc statistic, DeltaK, based on the rate of change in the log probability of data between successive K values, we found that STRUCTURE accurately detects the uppermost hierarchical level of structure for the scenarios we tested. As might be expected, the results are sensitive to the type of genetic marker used (AFLP vs. microsatellite), the number of loci scored, the number of populations sampled, and the number of individuals typed in each sample.
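The DeltaK statistic itself is a small computation over replicate runs: the absolute second difference of the mean log probability of data, divided by its standard deviation across replicates. The replicate values below are invented, shaped so the likelihood plateaus after K = 3.

```python
import numpy as np

def delta_k(log_probs):
    """Evanno-style ad hoc DeltaK: |L''(K)| divided by the standard
    deviation of L(K) across replicate STRUCTURE runs; defined only for
    interior values of K."""
    ks = sorted(log_probs)
    mean = {k: np.mean(log_probs[k]) for k in ks}
    sd = {k: np.std(log_probs[k], ddof=1) for k in ks}
    return {k: abs(mean[k + 1] - 2 * mean[k] + mean[k - 1]) / sd[k]
            for k in ks[1:-1]}

# hypothetical replicate log P(data) values with a plateau starting at K = 3
runs = {1: [-5200, -5210], 2: [-4800, -4805], 3: [-4500, -4510],
        4: [-4490, -4498], 5: [-4485, -4495]}
dk = delta_k(runs)
print(max(dk, key=dk.get))  # -> 3
```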

Relevance:

30.00%

Publisher:

Abstract:

Methods like event history analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, such processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of diffusion - the preference for the policy, the effectiveness of the policy, the institutional constraints, and ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed through these interdependencies, is thus a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once the algorithm has been developed, we let the different agents interact. A phenomenon of diffusion, derived from learning, then emerges, meaning that the choice made by an agent is conditional on those made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning not only that time is needed for a policy to deploy its effects, but also that it takes time for a country to find the best-suited policy. To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with both theoretical expectations and empirical evidence.
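The emergence of an S-shaped adoption curve from neighbor-conditional choices can be sketched with a far simpler model than the thesis describes. The grid size, adoption rule, and parameter values below are all invented: countries on a torus adopt a policy with probability proportional to the fraction of their four neighbors that already adopted it.

```python
import numpy as np

def diffuse(n=20, steps=100, beta=1.0, seed=2):
    """Minimal policy-diffusion sketch: each step, a non-adopter adopts
    with probability beta times the fraction of its four neighbours
    (torus topology) that already adopted. Returns the adoption curve."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((n, n), dtype=bool)
    grid[n // 2, n // 2] = True                    # a single pioneer country
    curve = []
    for _ in range(steps):
        neigh = (np.roll(grid, 1, 0).astype(int) + np.roll(grid, -1, 0)
                 + np.roll(grid, 1, 1) + np.roll(grid, -1, 1))
        grid |= rng.random(grid.shape) < beta * neigh / 4.0
        curve.append(float(grid.mean()))
    return curve

curve = diffuse()
# slow start, acceleration, then saturation: the inverted S-curve of learning
print(round(curve[9], 2), round(curve[49], 2), round(curve[-1], 2))
```

Because adoption spreads only along neighborhood links, intermediate snapshots of `grid` also show the contiguous "political clusters" the abstract describes.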

Relevance:

30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending the corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying 'true' hydraulic conductivity field.
We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.

Relevance:

30.00%

Publisher:

Abstract:

Given the very large amount of data obtained every day through population surveys, much new research could use this information instead of collecting new samples. Unfortunately, relevant data are often scattered across different files obtained through different sampling designs. Data fusion is a set of methods used to combine information from different sources into a single dataset. In this article, we are interested in a specific problem: the fusion of two data files, one of which is quite small. We propose a model-based procedure combining a logistic regression with an Expectation-Maximization algorithm. Results show that, despite the lack of data, this procedure can perform better than standard matching procedures.
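The logistic-regression half of such a fusion can be sketched as follows, leaving out the EM step: fit a model for the variable that the small file lacks on the large "donor" file, then impute it in the small file. The variable names, sample sizes, and coefficients are all hypothetical.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-ascent logistic regression -- an illustrative
    stand-in for the article's logistic-plus-EM fusion procedure."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

rng = np.random.default_rng(3)
# large donor file: shared covariates X plus the variable Z missing elsewhere
X_big = np.column_stack([np.ones(1000), rng.normal(size=(1000, 2))])
true_w = np.array([0.2, 1.5, -1.0])
Z_big = (rng.random(1000) < 1 / (1 + np.exp(-X_big @ true_w))).astype(float)

w = fit_logistic(X_big, Z_big)
# fuse: impute the missing Z in the small recipient file from the model
X_small = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
Z_imputed = (1 / (1 + np.exp(-X_small @ w)) > 0.5).astype(int)
print(w.round(1))
```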

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: Human papillomavirus (HPV) is a sexually transmitted infection of particular interest because of its high prevalence rate and strong causal association with cervical cancer. Two prophylactic vaccines have been developed, and different countries have made or will soon make recommendations for the vaccination of girls. Even if there is a consensus to recommend vaccination before the beginning of sexual activity, there are large discrepancies between countries concerning the perceived usefulness of a catch-up procedure and of boosters. The main objective of this article is to simulate the impact of different vaccination policies on mid- and long-term HPV 16/18 age-specific infection rates. METHODS: We developed an epidemiological model based on the susceptible-infective-recovered approach using Swiss data. The mid- and long-term impact of different vaccination scenarios was then compared. RESULTS: The generalization of a catch-up procedure is always beneficial, whatever its extent. Moreover, depending on the length of the protection offered by the vaccine, boosters will also be very useful. CONCLUSIONS: To be really effective, a vaccination campaign against HPV infection should at least include a catch-up procedure to reach an early drop in HPV 16/18 prevalence, and possibly boosters. Otherwise, the protection ensured for women in their 20s could be lower than expected, resulting in higher risks of later developing cervical cancer.
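A susceptible-infective-recovered iteration of the kind the METHODS section names can be sketched in a few lines. This toy version (all rates invented, no age structure, no Swiss calibration) adds births and deaths so an endemic prevalence exists, and routes vaccinated newborns directly into the immune class.

```python
def hpv_prevalence(vacc_coverage, beta=0.3, gamma=0.1, mu=0.02,
                   years=300, dt=0.1):
    """Toy SIR iteration with demographic turnover (rate mu); a fraction
    vacc_coverage of newborns enters the recovered/immune class directly.
    Returns infection prevalence at the horizon. Illustrative rates only."""
    s, i = 0.99 * (1 - vacc_coverage), 0.01
    for _ in range(int(years / dt)):
        ds = mu * (1 - vacc_coverage) - beta * s * i - mu * s
        di = beta * s * i - (gamma + mu) * i
        s, i = s + ds * dt, i + di * dt
    return i

# endemic prevalence without vaccination vs. 80% newborn coverage
print(round(hpv_prevalence(0.0), 3), round(hpv_prevalence(0.8), 6))
```

With these rates R0 = beta / (gamma + mu) = 2.5, so 80% coverage pushes the effective reproduction number below one and the infection dies out, while zero coverage settles at a nonzero endemic prevalence.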

Relevance:

30.00%

Publisher:

Abstract:

The availability of high-resolution Digital Elevation Models (DEMs) at a regional scale enables the analysis of topography with a high level of detail. Hence, a DEM-based geomorphometric approach becomes more accurate for detecting potential rockfall sources. Potential rockfall source areas are identified according to the slope angle distribution deduced from the high-resolution DEM, crossed with other information extracted from geological and topographic maps in GIS format. The slope angle distribution can be decomposed into several Gaussian distributions that can be considered characteristic of morphological units: rock cliffs, steep slopes, footslopes and plains. Terrain is considered a potential rockfall source when its slope angle lies above a threshold, defined where the Gaussian distribution of the morphological unit "Rock cliffs" becomes dominant over that of "Steep slopes". In addition to this analysis, the cliff outcrops indicated by the topographic maps were added. However, these contain "flat areas", so only slope angle values above the mode of the Gaussian distribution of the morphological unit "Steep slopes" were considered. An application of this method is presented over the entire Canton of Vaud (3200 km2), Switzerland. The results were compared with rockfall sources observed in the field and by orthophoto analysis in order to validate the method. Finally, the influence of the cell size of the DEM is examined by applying the methodology over six different DEM resolutions.
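Finding the angle where one fitted Gaussian becomes dominant over another is a small numerical exercise. The component means, standard deviations, and mixture weights below are invented, not the values fitted for the Canton of Vaud.

```python
import numpy as np

def dominance_threshold(mu1, sd1, w1, mu2, sd2, w2):
    """Slope angle above which the 'Rock cliffs' component (mu2, sd2, w2)
    dominates the 'Steep slopes' one (mu1, sd1, w1), found on a fine grid
    rather than by solving the quadratic analytically."""
    angles = np.linspace(0, 90, 9001)
    def gauss(m, s, w):
        return w * np.exp(-0.5 * ((angles - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    cliffs, steep = gauss(mu2, sd2, w2), gauss(mu1, sd1, w1)
    above = angles[(cliffs > steep) & (angles > mu1)]
    return float(above[0])

# hypothetical fitted components: steep slopes ~ N(35, 8), rock cliffs ~ N(55, 7)
t = dominance_threshold(35, 8, 0.7, 55, 7, 0.3)
print(round(t, 1))   # cells with slope angle >= t become candidate sources
```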

Relevance:

30.00%

Publisher:

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfalls, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and thus produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of data availability. Amongst the possible datasets, the DEM is the only one that is strictly needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained on the basis of lower-quality DEMs with 25 m resolution.
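For reference, the original Holmgren weighting that Flow-R modifies can be written down directly: each downslope neighbour receives flow proportional to tan(beta)^x, where x controls divergence. The cell geometry and elevation drops below are invented; this shows only the classic weighting, not Flow-R's improved version.

```python
import numpy as np

def holmgren_weights(dz, dist, x=4.0):
    """Holmgren's multiple-flow-direction weighting: flow to neighbour i
    is proportional to tan(beta_i)^x over the downslope neighbours;
    larger x concentrates flow toward the steepest direction."""
    tan_b = np.maximum(dz, 0.0) / dist        # uphill neighbours get no flow
    w = tan_b ** x
    s = w.sum()
    return w / s if s > 0 else w

# elevation drop from a cell to its 8 neighbours (m); negative = uphill.
# Distances assume a 10 m grid (diagonals ~14.14 m).
dz = np.array([2.0, 5.0, 1.0, -1.0, 0.5, -2.0, 3.0, 0.0])
dist = np.array([10.0, 14.14, 10.0, 14.14, 10.0, 14.14, 10.0, 14.14])
w = holmgren_weights(dz, dist)
print(np.argmax(w), round(float(w.sum()), 6))  # steepest neighbour dominates
```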

Relevance:

30.00%

Publisher:

Abstract:

Risks of significant infant drug exposure through human milk are poorly defined due to lack of large-scale PK data. We propose to use a Bayesian approach based on population PK (popPK)-guided modeling and simulation for risk prediction. As a proof-of-principle study, we exploited fluoxetine milk concentration data from 25 women. popPK parameters, including the milk-to-plasma ratio (MP ratio), were estimated from the best model. The dose of fluoxetine the breastfed infant would receive through mother's milk, and infant plasma concentrations, were estimated from 1000 simulated mother-infant pairs, using random assignment of feeding times and milk volume. A conservative estimate of CYP2D6 activity of 20% of the allometrically-adjusted adult value was assumed. Derived model parameters, including the MP ratio, were consistent with those reported in the literature. Visual predictive checks and other model diagnostics showed no signs of model misspecification. The model simulation predicted that infant exposure levels to fluoxetine via mother's milk were below 10% of the weight-adjusted maternal therapeutic dose in >99% of simulated infants. The predicted median ratio of infant-mother serum levels at steady state was 0.093 (range 0.033-0.31), consistent with literature-reported values (mean = 0.07; range 0-0.59). The predicted incidence of a relatively high infant-mother ratio (>0.2) of steady-state serum fluoxetine concentrations was <1.3%. Overall, our predictions are consistent with clinical observations. Our approach may be valid for other drugs, allowing in silico prediction of infant drug exposure risks through human milk. We will discuss application of this approach to another drug used in lactating women.
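The simulation logic (infant daily dose = maternal concentration x MP ratio x milk intake, compared against the weight-adjusted maternal dose) can be sketched as a bare Monte Carlo loop. All distributions and the 40 mg / 70 kg dosing below are invented placeholders, not the fitted fluoxetine popPK model.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1000                                     # simulated mother-infant pairs

# hypothetical population distributions (illustrative, not the fitted model)
maternal_css = rng.lognormal(np.log(120), 0.4, n)    # maternal plasma Css, ng/mL
mp_ratio = rng.lognormal(np.log(0.65), 0.3, n)       # milk-to-plasma ratio
milk_intake = rng.normal(150, 20, n)                 # milk volume, mL/kg/day
maternal_dose = 40e6 / 70                            # 40 mg/day, 70 kg -> ng/kg/day

infant_dose = maternal_css * mp_ratio * milk_intake  # ng/kg/day via milk
relative_dose = 100 * infant_dose / maternal_dose    # % of maternal dose
frac_below = float(np.mean(relative_dose < 10))
print(round(100 * frac_below, 1))  # % of infants under the 10% benchmark
```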

Relevance:

30.00%

Publisher:

Abstract:

Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available.
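The key insight, recovering a multivariate gene-level statistic from single-variant summaries plus a correlation matrix, can be illustrated with a burden-style test: T = w'z / sqrt(w'Rw) is standard normal under the null, so no individual-level data are needed. The z-scores and LD matrix below are invented for the example.

```python
import math
import numpy as np

def burden_from_summary(z, R, weights=None):
    """Gene-level burden statistic built from single-variant z-scores and
    their correlation matrix R (estimable from one participating study or
    a public reference panel): T = w'z / sqrt(w'Rw) ~ N(0, 1) under H0."""
    z = np.asarray(z, float)
    w = np.ones_like(z) if weights is None else np.asarray(weights, float)
    t = float(w @ z / math.sqrt(w @ R @ w))
    p = math.erfc(abs(t) / math.sqrt(2))      # two-sided normal p-value
    return t, p

# three rare variants in a gene, modest LD, all mildly trait-increasing
z = [1.8, 2.1, 1.5]
R = np.array([[1.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.0]])
t, p = burden_from_summary(z, R)
print(round(t, 2), round(p, 4))  # -> 2.63 0.0084
```

Ignoring the off-diagonal terms of R would understate the variance of w'z and inflate the type I error, which is why the correlation matrix is the piece that must accompany the single-variant results.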

Relevance:

30.00%

Publisher:

Abstract:

Rockfall hazard zoning is usually achieved using a qualitative estimate of hazard, not an absolute scale. In Switzerland, danger maps, which correspond to hazard zoning depending on the intensity of the considered phenomenon (e.g. kinetic energy for rockfalls), are replacing hazard maps. Basically, the danger grows with the mean frequency and with the intensity of the rockfall. This principle, based on intensity thresholds, may also be applied with intensity threshold values other than those used in the Swiss rockfall hazard zoning method, i.e. danger mapping. In this paper, we explore the effect of slope geometry and rockfall frequency on rockfall hazard zoning. First, the transition from 2D zoning to 3D zoning based on rockfall trajectory simulation is examined; then, its dependency on slope geometry is emphasized. The spatial extent of hazard zones is examined, showing that limits may vary widely depending on the rockfall frequency. This approach is especially suited to highly populated regions, because the hazard zoning has to be very fine in order to delineate the greatest possible territory containing acceptable risks.
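The "danger grows with frequency and intensity" principle amounts to an intensity-frequency matrix over kinetic energy E = (1/2) m v^2. The class thresholds below are invented placeholders, not the official Swiss values.

```python
def kinetic_energy_kj(mass_kg, velocity_ms):
    """Block kinetic energy E = (1/2) m v^2, converted to kJ."""
    return 0.5 * mass_kg * velocity_ms ** 2 / 1000.0

def danger_class(energy_kj, return_period_yr):
    """Toy intensity-frequency matrix in the spirit of Swiss danger maps;
    thresholds are illustrative only."""
    intensity = 2 if energy_kj > 300 else (1 if energy_kj > 30 else 0)
    frequency = 2 if return_period_yr < 30 else (1 if return_period_yr < 300 else 0)
    score = intensity + frequency
    return "high" if score >= 3 else ("moderate" if score == 2 else "low")

# a 2 t block at 20 m/s every ~10 years, vs. a small block every ~500 years
e = kinetic_energy_kj(2000, 20)
print(e, danger_class(e, 10), danger_class(kinetic_energy_kj(200, 10), 500))
```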

Relevance:

30.00%

Publisher:

Abstract:

A new method of measuring joint angle using a combination of accelerometers and gyroscopes is presented. The method proposes a minimal sensor configuration with one sensor module mounted on each segment. The model is based on estimating the acceleration of the joint center of rotation by placing a pair of virtual sensors on the adjacent segments at the center of rotation. In the proposed technique, joint angles are found without the need for integration, so absolute angles can be obtained that are free from any source of drift. The model considers anatomical aspects and is personalized for each subject prior to each measurement. The method was validated by measuring the knee flexion-extension angles of eight subjects walking at three different speeds and comparing the results with a reference motion measurement system. The results are very close to those of the reference system, with very small errors (rms = 1.3 deg, mean = 0.2 deg, SD = 1.1 deg) and an excellent correlation coefficient (0.997). The algorithm is able to provide joint angles in real time, ready for use in gait analysis. Technically, the system is portable, easily mountable, and can be used for long-term monitoring without hindrance to natural activities.
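The integration-free idea can be illustrated in its simplest, quasi-static form: when each accelerometer mainly senses gravity, segment tilt follows from an arctangent and the joint angle is the difference of the two tilts, with no drift-prone integration. This is a deliberately stripped-down 2-D sketch, without the virtual-sensor compensation of joint-center acceleration that the actual method uses during movement.

```python
import math

def segment_inclination(ax, ay):
    """Segment tilt (deg) from a 2-axis accelerometer under quasi-static
    conditions, where the measured acceleration is dominated by gravity."""
    return math.degrees(math.atan2(ax, ay))

def joint_angle(thigh_acc, shank_acc):
    """Knee flexion-extension as the difference of segment inclinations:
    drift-free because no gyroscope signal is integrated."""
    return segment_inclination(*thigh_acc) - segment_inclination(*shank_acc)

g = 9.81
thigh = (0.0, g)                                          # thigh vertical
shank = (g * math.sin(math.radians(30)),                  # shank tilted 30 deg
         g * math.cos(math.radians(30)))
print(round(joint_angle(thigh, shank), 1))  # -> -30.0
```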