989 results for Ancestral range estimation


Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field.
The results also indicate that this novel data integration approach is remarkably flexible and robust, and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proves to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
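For context, the gradual-deformation update referred to above can be sketched briefly. In its classical form (Hu, 2000) it perturbs a Gaussian random field by mixing it with an independent realization; the following is a minimal Python illustration, not the thesis' actual implementation, and the forward model and acceptance step are omitted:

import numpy as np

def gradual_deformation(m_current, m_independent, theta):
    # Mix two independent standard-Gaussian fields. Because
    # cos(theta)^2 + sin(theta)^2 = 1, the mixture preserves the
    # Gaussian prior statistics; theta sets the perturbation strength
    # (theta -> 0 leaves the current model nearly unchanged).
    return np.cos(theta) * m_current + np.sin(theta) * m_independent

# One MCMC proposal step (illustrative):
rng = np.random.default_rng(42)
m_current = rng.standard_normal((50, 50))   # current model field
m_proposal = gradual_deformation(m_current,
                                 rng.standard_normal((50, 50)),
                                 theta=0.1)

Small theta values yield highly correlated proposals and high acceptance rates; tuning theta trades acceptance rate against mixing speed, which relates to the perturbation-strength flexibility mentioned above.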

Abstract:

Introduction: Osteoporosis (OP) is a systemic skeletal disease characterized by low bone mineral density (BMD) and micro-architectural (MA) deterioration. Clinical risk factors (CRF) are often used as an approximation of MA. MA can now be assessed in daily practice through the Trabecular Bone Score (TBS), a novel grey-level texture measurement reflecting bone micro-architecture, based on the use of experimental variograms of 2D projection images. TBS is very simple to obtain by reanalyzing a lumbar DXA scan, and has proven diagnostic and prognostic value, partially independent of CRF and BMD. The aim of the OsteoLaus cohort is to combine, in daily practice, the CRF and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. Method: The OsteoLaus cohort (1400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. It is derived from the COLAUS cohort, started in Lausanne in 2003, whose main goal is to obtain information on the epidemiology and genetic determinants of cardiovascular risk in 6700 men and women. CRF for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported. Results: We included 631 women: mean age 67.4±6.7 y, BMI 26.1±4.6, mean lumbar spine BMD 0.943±0.168 (T-score -1.4 SD), TBS 1.271±0.103. As expected, the correlation between BMD and site-matched TBS is low (r2=0.16). The prevalence of VFx grade 2/3, major OP Fx and all OP Fx is 8.4%, 17.0% and 26.0%, respectively. Age- and BMI-adjusted ORs (per SD decrease) are 1.8 (1.2-2.5), 1.6 (1.2-2.1) and 1.3 (1.1-1.6) for BMD for the different categories of fractures, and 2.0 (1.4-3.0), 1.9 (1.4-2.5) and 1.4 (1.1-1.7) for TBS, respectively. Only 32 to 37% of women with OP Fx have a BMD < -2.5 SD or a TBS < 1.200. If we combine a BMD < -2.5 SD or a TBS < 1.200, 54 to 60% of women with an osteoporotic Fx are identified. Conclusion: As in the already published studies, these preliminary results confirm the partial independence between BMD and TBS. More importantly, combining TBS with BMD significantly increases the identification of women with prevalent OP Fx who would have been misclassified by BMD alone. For the first time we are able to obtain complementary information about fracture (VFA), density (BMD), and micro- and macro-architecture (TBS & HAS) from a simple, cheap device with low ionizing radiation: DXA. Such complementary information is very useful for the patient in daily practice and, moreover, will likely have an impact on cost-effectiveness analyses.
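As an aside, the experimental variogram underlying TBS can be illustrated in a few lines of Python; this is only a rough sketch of the quantity involved (the commercial TBS algorithm is proprietary and differs in its details):

import numpy as np

def experimental_variogram(img, max_lag):
    # Mean squared grey-level difference as a function of pixel offset h,
    # averaged here over horizontal and vertical directions only.
    lags = np.arange(1, max_lag + 1)
    v = []
    for h in lags:
        dx = img[:, h:] - img[:, :-h]
        dy = img[h:, :] - img[:-h, :]
        v.append((np.mean(dx**2) + np.mean(dy**2)) / 2.0)
    return lags, np.array(v)

# A TBS-like index is the slope of the log-log variogram at the origin:
img = np.random.rand(64, 64)    # stand-in for a DXA projection image
lags, v = experimental_variogram(img, max_lag=10)
slope = np.polyfit(np.log(lags), np.log(v), 1)[0]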

Abstract:

Performance prediction and application behavior modeling have been the subject of extensive research aiming to estimate application performance with acceptable precision. A novel approach to predicting the performance of parallel applications is based on the concept of Parallel Application Signatures, which consists of extracting an application's most relevant parts (phases) and the number of times they repeat (weights). By executing these phases on a target machine and multiplying each phase's execution time by its weight, the application's total execution time can be estimated. One problem is that the performance of an application depends on its workload: every type of workload affects how an application performs on a given system, and hence affects the signature's execution time. Since the workloads used in most scientific parallel applications have well-known dimensions and data ranges, and the behavior of these applications is mostly deterministic, a model of how a program's workload affects its performance can be obtained. We create a new methodology to model how a program's workload affects the parallel application signature. Using regression analysis, we are able to generalize each phase's execution time and weight as functions of the workload, in order to predict an application's performance on a target system for any type of workload within a predefined range. We validate our methodology using a synthetic program, benchmark applications, and well-known real scientific applications.
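The signature-based prediction scheme described above reduces to a sum of weighted phase times. Below is a minimal Python sketch with invented numbers, using a linear fit where the paper's regression models may be more elaborate:

import numpy as np

# Hypothetical measurements of one phase at several workload sizes W:
W = np.array([1e4, 2e4, 4e4, 8e4])            # workload sizes
t_phase = np.array([0.12, 0.25, 0.51, 1.04])  # phase execution time (s)
w_phase = np.array([100, 200, 400, 800])      # phase weight (repetitions)

# Regress execution time and weight against the workload size.
t_model = np.polyfit(W, t_phase, 1)
w_model = np.polyfit(W, w_phase, 1)

def predict_total_time(W_new, phases):
    # Predicted runtime = sum over phases of weight(W) * time(W).
    return sum(np.polyval(w, W_new) * np.polyval(t, W_new)
               for t, w in phases)

print(predict_total_time(6e4, [(t_model, w_model)]))  # estimate within the fitted range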

Abstract:

We report on advanced dual-wavelength digital holographic microscopy (DHM) methods, enabling single-acquisition real-time micron-range measurements while maintaining single-wavelength interferometric resolution in the nanometer regime. On top of the unique real-time capability of our technique, it is shown that axial resolution can be further increased compared to single-wavelength operation thanks to the uncorrelated nature of the two recorded wavefronts. It is experimentally demonstrated that DHM topographic investigation over a measurement range spanning three decades can be achieved with our arrangement, opening new application possibilities for this interferometric technique. © 2008 SPIE
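The micron-range capability rests on the standard two-wavelength beat: the unambiguous range is set by the synthetic wavelength Lambda = lambda1 * lambda2 / |lambda1 - lambda2|. A quick numerical illustration in Python (the wavelength values are invented, not those of the paper):

# Synthetic (beat) wavelength of a dual-wavelength interferometer.
lam1, lam2 = 760e-9, 770e-9                  # two laser wavelengths (m)
synthetic = lam1 * lam2 / abs(lam1 - lam2)   # about 58.5 micrometres
print(f"synthetic wavelength: {synthetic * 1e6:.1f} um")

With nanometre-scale phase noise on each wavefront, this is how such a setup can span roughly three decades, from nanometre resolution up to a micron-range unambiguous measurement window.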

Abstract:

Background: The imatinib trough plasma concentration (C(min)) correlates with clinical response in cancer patients. Therapeutic drug monitoring (TDM) of plasma C(min) is therefore suggested. In practice, however, blood sampling for TDM is often not performed at trough. The corresponding measurement is thus only remotely informative about C(min) exposure. Objectives: The objectives of this study were to improve the interpretation of randomly measured concentrations by using a Bayesian approach for the prediction of C(min), incorporating correlation between pharmacokinetic parameters, and to compare the predictive performance of this method with alternative approaches, by comparing predictions with actual measured trough levels, and with predictions obtained by a reference method, respectively. Methods: A Bayesian maximum a posteriori (MAP) estimation method accounting for correlation (MAP-ρ) between pharmacokinetic parameters was developed on the basis of a population pharmacokinetic model, which was validated on external data. Thirty-one paired random and trough levels, observed in gastrointestinal stromal tumour patients, were then used for the evaluation of the Bayesian MAP-ρ method: individual C(min) predictions, derived from single random observations, were compared with actual measured trough levels for assessment of predictive performance (accuracy and precision). The method was also compared with alternative approaches: classical Bayesian MAP estimation assuming uncorrelated pharmacokinetic parameters, linear extrapolation along the typical elimination constant of imatinib, and non-linear mixed-effects modelling (NONMEM) first-order conditional estimation (FOCE) with interaction. Predictions of all methods were finally compared with 'best-possible' predictions obtained by a reference method (NONMEM FOCE, using both random and trough observations for individual C(min) prediction). Results: The developed Bayesian MAP-ρ method accounting for correlation between pharmacokinetic parameters allowed unbiased prediction of imatinib C(min) with a precision of ±30.7%. This predictive performance was similar for the alternative methods that were applied. The range of relative prediction errors was, however, smallest for the Bayesian MAP-ρ method and largest for the linear extrapolation method. When compared with the reference method, predictive performance was comparable for all methods. The time interval between random and trough sampling did not influence the precision of Bayesian MAP-ρ predictions. Conclusion: Clinical interpretation of randomly measured imatinib plasma concentrations can be assisted by Bayesian TDM. Classical Bayesian MAP estimation can be applied even without consideration of the correlation between pharmacokinetic parameters. Individual C(min) predictions are expected to vary less through Bayesian TDM than through linear extrapolation. Bayesian TDM could be developed in the future for other targeted anticancer drugs and for the prediction of other pharmacokinetic parameters that have been correlated with clinical outcomes.
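A toy version of MAP estimation with a correlated parameter prior, the core of the MAP-ρ method described above, can be sketched as follows in Python; all model structure and numbers here are invented for illustration and do not reproduce the published imatinib population model:

import numpy as np
from scipy.optimize import minimize

# Prior covariance of individual deviations (log CL, log V); the
# off-diagonal term is what distinguishes MAP-rho from classical MAP.
Omega = np.array([[0.09, 0.04],
                  [0.04, 0.16]])
Omega_inv = np.linalg.inv(Omega)
sigma = 0.2  # residual error on the log scale (invented)

def conc(eta, dose, t):
    # One-compartment bolus model, purely for illustration.
    CL = 14.0 * np.exp(eta[0])   # clearance (L/h)
    V = 250.0 * np.exp(eta[1])   # volume (L)
    return dose / V * np.exp(-(CL / V) * t)

def neg_log_posterior(eta, y_obs, dose, t):
    resid = (np.log(y_obs) - np.log(conc(eta, dose, t))) / sigma
    return 0.5 * resid**2 + 0.5 * eta @ Omega_inv @ eta

# MAP estimate from a single randomly timed sample, then trough prediction:
eta_map = minimize(neg_log_posterior, x0=np.zeros(2),
                   args=(1.2, 400.0, 10.0)).x   # y_obs (mg/L), dose (mg), t (h)
cmin_pred = conc(eta_map, 400.0, 24.0)          # predicted C(min) at 24 h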

Abstract:

Analysis of the TRIM5α and APOBEC3G genes suggests that these two restriction factors underwent strong positive selection throughout primate evolution. This pressure was possibly imposed by ancient exogenous retroviruses, of which endogenous retroviruses are remnants. Our study aims to assess in vitro the activity of these factors against ancient retroviruses by reconstructing their ancestral gag sequences, as well as the ancestral TRIM5α and APOBEC3G for primates. Based on an evolutionary genomics approach, we reconstructed ancestors of the two largest families of human endogenous retroviruses (HERV), namely HERV-K and HERV-H, as well as primate ancestral TRIM5α and APOBEC3G variants. The oldest TRIM5α sequence was the catarrhine TRIM5α, the common ancestor of Old World monkeys and hominoids, dated to 25 million years ago (mya). From the oldest to the youngest, ancestral TRIM5α variants showed less restriction of HIV-1 in vitro [1]. Likewise, three ancestral APOBEC3G sequences common to hominoids (18 mya), Old World monkeys, and catarrhines (25 mya) were reconstructed. All ancestral APOBEC3G variants efficiently inhibited HIV-1Δvif in vitro, compared to modern APOBEC3Gs. The ability of Vif proteins (HIV-1, HIV-2, SIVmac and SIVagm) to counteract their activity corresponded with residue 128 of the ancestral APOBEC3Gs. Moreover, we are attempting to reconstruct older ancestral sequences of both restriction factors by using prosimian orthologue sequences. An infectious one-million-year-old HERV-KCON previously reconstituted was shown to be resistant to modern TRIM5α and APOBEC3G [2]. Our ancestral TRIM5α and APOBEC3G variants were inactive against HERV-KCON. In addition, we reconstructed chimeric HERV-K bearing ancestral capsids (up to 7 mya), which resulted in infectious viruses resistant to modern and ancestral TRIM5α. Likewise, HERV-K viruses bearing ancestral nucleocapsids will be tested for restriction by ancestral and modern APOBEC3G. In silico reconstruction and structural modeling of ancestral HERV-H capsids resulted in structures homologous to that of the gammaretrovirus MLV. We are therefore attempting to construct chimeric MLV viruses bearing HERV-H ancestral capsids. These chimeric ancestral HERVs will be tested for infectivity and restriction by ancestral TRIM5α. Similarly, chimeric MLV viruses bearing ancestral HERV-H nucleocapsids will be constructed and tested for APOBEC3G restriction.

Abstract:

The increase of malaria transmission on the Pacific Coast of Colombia during the El Niño warm event has been found not to be linked to increases in the density of the vector Anopheles albimanus, but rather to other temperature-sensitive variables such as longevity, duration of the gonotrophic cycle, or the sporogonic period of Plasmodium. The present study estimated the effects of temperature on the duration of the gonotrophic cycle and on maturation of the ovaries of An. albimanus. Blood-fed adult mosquitoes were exposed to temperatures of 24, 27, and 30°C, held individually in oviposition cages, and assessed at 12 h intervals. At 24, 27, and 30°C the mean development time of the oocytes was 91.2 h (95% C.I.: 86.5-96), 66.2 h (61.5-70.8), and 73.1 h (64-82.3), respectively. The mean duration of the gonotrophic cycle at these three temperatures was 88.4 h (81.88-94.9), 75 h (71.4-78.7), and 69.1 h (64.6-73.6), respectively. These findings indicate that both parameters in An. albimanus are reduced as temperature rises from 24 to 30°C, albeit in a nonlinear manner. According to these results, the increase in malaria transmission during El Niño in Colombia could be associated with a shortening of the gonotrophic cycle in malaria vectors, which could enhance the frequency of man-vector contact, affecting the incidence of the disease.

Abstract:

In vertebrates, genome size has been shown to correlate with nuclear and cell sizes, and influences phenotypic features such as brain complexity. In three different anuran families, advertisement calls of polyploids exhibit longer notes and intervals than those of diploids, and differences in cellular dimensions have been hypothesized to cause these modifications. We investigated this phenomenon in green toads (Bufo viridis subgroup) of three ploidy levels, in a different call type (release calls) that may evolve independently from advertisement calls, examining 1205 calls from ten species, subspecies, and hybrid forms. We found significant differences between the pulse rates of six diploid and four polyploid (3n, 4n) green toad forms across a range of temperatures from 7 to 27°C. Laboratory data supported differences in pulse rates of triploids vs. tetraploids, but failed to reach significance when field recordings were included. This study supports the idea that genome size, irrespective of call type, phylogenetic context, and geographical background, might affect call properties in anurans, and suggests a common principle governing this relationship. The nuclear-cell size ratio, affected by genome size, seems the most plausible explanation. However, we cannot rule out hypotheses under which call-influencing genes from an unexamined diploid ancestral species might also affect call properties in the hybrid-origin polyploids.

Abstract:

Pulse wave velocity (PWV) is a surrogate of arterial stiffness and represents a non-invasive marker of cardiovascular risk. The non-invasive measurement of PWV requires tracking the arrival time of pressure pulses recorded in vivo, commonly referred to as pulse arrival time (PAT). In the state of the art, PAT is estimated by identifying a characteristic point of the pressure pulse waveform. This paper demonstrates that for ambulatory scenarios, where signal-to-noise ratios are below 10 dB, the repeatability of PAT measurements obtained through characteristic-point identification degrades drastically. Hence, we introduce a novel family of PAT estimators based on parametric modeling of the anacrotic phase of a pressure pulse. In particular, we propose a parametric PAT estimator (TANH) that correlates highly with the Complior® characteristic point D1 (CC = 0.99), increases noise robustness, and reduces by a factor of five the number of heartbeats required to obtain reliable PAT measurements.
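To give the flavor of such a parametric estimator, a generic sigmoid fit of the rising (anacrotic) edge can be sketched as follows in Python; the exact parameterization of the published TANH estimator is not reproduced here, and all numbers are synthetic:

import numpy as np
from scipy.optimize import curve_fit

def tanh_edge(t, baseline, amplitude, t0, tau):
    # Smooth sigmoid model of the anacrotic phase of a pressure pulse.
    return baseline + amplitude * (1 + np.tanh((t - t0) / tau)) / 2

t = np.linspace(0.0, 0.4, 400)                      # time (s)
clean = tanh_edge(t, 60.0, 40.0, 0.15, 0.02)        # synthetic pulse edge
noisy = clean + np.random.normal(0.0, 4.0, t.size)  # heavily corrupted signal

p0 = [noisy.min(), np.ptp(noisy), 0.2, 0.05]        # crude initial guess
params, _ = curve_fit(tanh_edge, t, noisy, p0=p0)
pat = params[2]  # PAT taken as the fitted inflection point t0

Because the fit pools all samples of the rising edge instead of thresholding a single characteristic point, it degrades far more gracefully as noise increases.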

Abstract:

Introduction: Estimation of the time since death based on the gastric contents is still a controversial subject. Many studies have been carried out, all leaving the same uncertainty: intra- and inter-individual variability. Aim: In a homicide case, a specialized gastroenterologist was called upon to estimate the time of death based on the gastric contents and his experience in clinical practice. We consequently decided to review the scientific literature to see whether this method has become more reliable. Material and methods: We selected articles published since 1979 that describe the estimation of the gastric emptying rate according to several factors, together with forensic articles on the estimation of the time of death from the gastric contents. Results: Most of the articles cited by the specialized gastroenterologist were studies of living healthy people and of the effects of several factors (medication, supine versus upside-down position, body mass index, or different types of food). Forensic articles frequently concluded that estimation of the time since death by analyzing the gastric contents can be used, but not as the sole method. Conclusion: Estimation of the time since death by analysis of the gastric contents is a method that can still be used. It cannot, however, be the only method, as inter- and intra-individual variability remains an important source of bias.

Abstract:

The apicomplexan parasite Toxoplasma gondii is unusual in being able to infect almost any cell from almost any warm-blooded animal it encounters. This extraordinary host range contrasts with that of its far more particular cousins, such as the various species of the malaria parasite Plasmodium, where each parasite species has a single genus, or even species, of host that it can infect. Genetic and genomic studies have revealed a key role for a number of gene families in how Toxoplasma invades a host cell, modulates gene expression of that cell, and successfully evades the resulting immune response. In this review, I will explore the hypothesis that a combination of sexual recombination and expansion of host range may be the major driving forces in the evolution of some of these gene families and the specific genes they encompass. These ideas stem from results and thoughts published by several labs in the last few years, but especially recent papers on the role of different forms of rhoptry proteins in the relative virulence of F1 Toxoplasma progeny in a particular host species (mice).

Abstract:

Resveratrol has been shown to have beneficial effects on diseases related to oxidant and/or inflammatory processes, and extends the lifespan of simple organisms as well as rodents. The objective of the present study was to estimate the dietary intake of resveratrol and piceid (R&P) present in foods, and to identify the principal dietary sources of these compounds in the Spanish adult population. For this purpose, a food composition database (FCDB) of R&P in Spanish foods was compiled. The study included 40,685 subjects aged 35-64 years from northern and southern regions of Spain who were included in the European Prospective Investigation into Cancer and Nutrition (EPIC)-Spain cohort. Usual food intake was assessed by personal interviews using a computerised version of a validated diet history method. An FCDB with 160 items was compiled. The estimated median and mean R&P intakes were 100 and 933 μg/d, respectively. Approximately 32% of the population did not consume R&P. The most abundant of the four stilbenes studied was trans-piceid (53.6%), followed by trans-resveratrol (20.9%), cis-piceid (19.3%) and cis-resveratrol (6.2%). The most important sources of R&P were wines (98.4%) and grapes and grape juices (1.6%), whereas peanuts, pistachios and berries contributed less than 0.01%. For this reason the pattern of R&P intake was similar to the wine pattern. This is the first time that R&P intake has been estimated in a Mediterranean country.

Abstract:

Although the genome of Trypanosoma cruzi has been completely sequenced, little is known about its population structure and evolution. Since 1999, two major evolutionary lineages presenting distinct epidemiological characteristics have been recognised: T. cruzi I and T. cruzi II. We describe new and important aspects of the population structure of the parasite, and unequivocally characterise a third ancestral lineage that we propose to name T. cruzi III. Through a careful analysis of haplotypes (blocks of genes that are stably transmitted from generation to generation of the parasite), we inferred at least two hybridisation events between the parental lineages T. cruzi II and T. cruzi III. The strain CL Brener, whose genome was sequenced, is one such hybrid. Based on these results, we propose a simple evolutionary model based on three ancestral genomes, T. cruzi I, T. cruzi II and T. cruzi III. At least two hybridisation events produced evolutionarily viable progeny, and T. cruzi III was the cytoplasmic donor for the resulting offspring (as identified by the mitochondrial clade of the hybrid strains) in both events. This model should be useful to inform evolutionary and pathogenetic hypotheses regarding T. cruzi.

Abstract:

Interaction effects are usually modeled by means of moderated regression analysis. Structural equation models with non-linear constraints make it possible to estimate interaction effects while correcting for measurement error. Of the various specifications, Jöreskog and Yang's (1996, 1998), likely the most parsimonious, has been chosen and further simplified. Up to now, only direct effects have been specified, thus wasting much of the capability of the structural equation approach. This paper presents and discusses an extension of Jöreskog and Yang's specification that can handle direct, indirect and interaction effects simultaneously. The model is illustrated by a study of the effects of an interactive style of budget use on both company innovation and performance.
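For readers unfamiliar with the baseline technique, moderated regression simply adds a product term to the linear model, y = b0 + b1*x + b2*z + b3*x*z + e. A minimal Python sketch with simulated data (ordinary least squares, without the measurement-error correction that the structural equation specification provides; all variable names and values are invented):

import numpy as np

# Simulate data with a known interaction effect b3 = 0.4.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)  # e.g. interactive style of budget use
z = rng.normal(size=n)  # e.g. a moderator variable
y = 1.0 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.normal(scale=0.5, size=n)

# Moderated regression: include the product x*z as an extra regressor.
X = np.column_stack([np.ones(n), x, z, x * z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # recovers approximately [1.0, 0.5, 0.3, 0.4]

The structural equation extension discussed in the paper replaces these observed scores with latent variables, so that b3 is estimated free of the attenuation that measurement error induces in the plain product term.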