16 results for Link variable method
at Université de Lausanne, Switzerland
Abstract:
The use of Geographic Information Systems has revolutionized the handling and visualization of geo-referenced data and has underlined the critical role of spatial analysis. The usual tools for this purpose come from geostatistics, which is widely used in the Earth sciences. Geostatistics, however, rests on several hypotheses that are not always verified in practice. Artificial Neural Networks (ANNs), on the other hand, can a priori be used without special assumptions and are known to be flexible. This paper discusses the application of ANNs to the interpolation of a geo-referenced variable.
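A minimal sketch of the kind of ANN interpolator the abstract describes: a one-hidden-layer network, trained by plain batch gradient descent, fitting a geo-referenced variable from coordinates. The synthetic surface, network size, and learning rate below are assumptions for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measurements": a smooth surface z(x, y) sampled at scattered points.
coords = rng.uniform(0.0, 1.0, size=(200, 2))
z = np.sin(np.pi * coords[:, 0]) * np.cos(np.pi * coords[:, 1])

# Network parameters: 2 inputs -> 20 tanh units -> 1 linear output (assumed sizes).
W1 = rng.normal(0.0, 1.0, (2, 20)); b1 = np.zeros(20)
W2 = rng.normal(0.0, 0.5, (20, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    return H, (H @ W2 + b2).ravel()    # prediction

lr = 0.05
for _ in range(5000):                  # plain batch gradient descent on MSE
    H, pred = forward(coords)
    err = pred - z                     # residuals
    gW2 = H.T @ err[:, None] / len(z)
    gb2 = err.mean(keepdims=True)
    dH = (err[:, None] @ W2.T) * (1.0 - H**2)   # backprop through tanh
    gW1 = coords.T @ dH / len(z); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

_, fitted = forward(coords)
rmse = float(np.sqrt(np.mean((fitted - z) ** 2)))
```

After training, the network's fit should at least beat the trivial mean predictor, whose error equals the standard deviation of z.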
Abstract:
We present a novel hybrid (or multiphysics) algorithm, which couples pore-scale and Darcy descriptions of two-phase flow in porous media. The flow at the pore-scale is described by the Navier-Stokes equations, and the Volume of Fluid (VOF) method is used to model the evolution of the fluid-fluid interface. An extension of the Multiscale Finite Volume (MsFV) method is employed to construct the Darcy-scale problem. First, a set of local interpolators for pressure and velocity is constructed by solving the Navier-Stokes equations; then, a coarse mass-conservation problem is constructed by averaging the pore-scale velocity over the cells of a coarse grid, which act as control volumes; finally, a conservative pore-scale velocity field is reconstructed and used to advect the fluid-fluid interface. The method relies on the localization assumptions used to compute the interpolators (which are quite straightforward extensions of the standard MsFV) and on the postulate that the coarse-scale fluxes are proportional to the coarse-pressure differences. By numerical simulations of two-phase problems, we demonstrate that these assumptions provide hybrid solutions that are in good agreement with reference pore-scale solutions and are able to model the transition from stable to unstable flow regimes. Our hybrid method can naturally take advantage of several adaptive strategies and allows considering pore-scale fluxes only in some regions, while Darcy fluxes are used in the rest of the domain. Moreover, since the method relies on the assumption that the relationship between coarse-scale fluxes and pressure differences is local, it can be used as a numerical tool to investigate the limits of validity of Darcy's law and to understand the link between pore-scale quantities and their corresponding Darcy-scale variables.
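The coarse mass-conservation step can be illustrated on an assumed 1-D toy problem (not the authors' solver): fine-scale pressures are averaged over coarse control volumes, and a single transmissibility reproduces the flux at every coarse face, consistent with the postulate that coarse fluxes are proportional to coarse-pressure differences.

```python
import numpy as np

n_fine, n_coarse = 40, 4              # 10 fine cells per coarse control volume
k = np.full(n_fine, 2.0)              # uniform permeability (assumed value)
p_left, p_right = 1.0, 0.0            # Dirichlet boundary pressures
dx = 1.0 / n_fine

# Steady incompressible 1-D flow: the flux is uniform and the pressure linear.
x = (np.arange(n_fine) + 0.5) * dx
p_fine = p_left + (p_right - p_left) * x
flux = -k[0] * (p_right - p_left) / 1.0     # constant Darcy flux, -k * dp / L

# Coarse pressures: averages of fine-scale pressures over each control volume.
p_coarse = p_fine.reshape(n_coarse, -1).mean(axis=1)

# Coarse transmissibility inferred from one face: T = flux / delta-p_coarse.
T = flux / (p_coarse[0] - p_coarse[1])

# The same T must reproduce the flux at every interior coarse face (locality).
recovered = T * (p_coarse[:-1] - p_coarse[1:])
```

In this uniform case the proportionality is exact; heterogeneity and pore-scale effects are what the paper's hybrid interpolators handle.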
Abstract:
Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to result in camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time and therefore the camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between detector heads and imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed, which takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count rate to activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h.
Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
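As background to the correction the abstract describes, a sketch of inverting the standard paralyzable dead-time model m = n·exp(-n·tau) with Newton's method; the paper's actual algorithm is time-dependent and motion-aware, and the dead-time constant below is an assumed value.

```python
import math

def true_rate(m, tau, n0=None, tol=1e-10, max_iter=50):
    """Recover the true count rate n from the measured rate m,
    given dead-time tau, via Newton's method on n*exp(-n*tau) = m."""
    n = m if n0 is None else n0            # measured rate as starting guess
    for _ in range(max_iter):
        f = n * math.exp(-n * tau) - m     # residual of the paralyzable model
        df = (1.0 - n * tau) * math.exp(-n * tau)
        step = f / df
        n -= step
        if abs(step) < tol:
            break
    return n
```

Starting from the measured rate (below the physical root) the iteration converges to the lower, physical solution of the model.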
Abstract:
Analyzing the relationship between the baseline value and subsequent change of a continuous variable is a frequent matter of inquiry in cohort studies. These analyses are surprisingly complex, particularly if only two waves of data are available. It is unclear for non-biostatisticians where the complexity of this analysis lies and which statistical method is adequate. With the help of simulated longitudinal data of body mass index in children, we review statistical methods for the analysis of the association between the baseline value and subsequent change, assuming linear growth with time. Key issues in such analyses are mathematical coupling, measurement error, variability of change between individuals, and regression to the mean. Ideally, it is better to rely on multiple repeated measurements at different times, and a linear random effects model is a standard approach if more than two waves of data are available. If only two waves of data are available, our simulations show that Blomqvist's method - which consists in adjusting the estimated regression coefficient of observed change on baseline value for the measurement error variance - provides accurate estimates. The adequacy of the methods to assess the relationship between the baseline value and subsequent change depends on the number of data waves, the availability of information on measurement error, and the variability of change between individuals.
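Blomqvist's correction, as summarized in the abstract, can be sketched on simulated data (all numerical settings below are assumptions): the naive slope of observed change on observed baseline is biased by measurement error, and rescaling by the known error variance recovers the true slope.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
beta_true, sd_err = -0.2, 1.0           # true slope; measurement error SD

X = rng.normal(25.0, 2.0, n)            # true baseline BMI (assumed values)
change = 5.0 + beta_true * X + rng.normal(0.0, 0.5, n)   # true linear change
x1 = X + rng.normal(0.0, sd_err, n)     # observed baseline (with error)
x2 = X + change + rng.normal(0.0, sd_err, n)             # observed follow-up
d = x2 - x1                             # observed change

# Naive regression of observed change on observed baseline is biased.
b_naive = np.cov(d, x1)[0, 1] / np.var(x1)

# Blomqvist: beta = (b * var(x1) + sigma_e^2) / (var(x1) - sigma_e^2)
b_corr = (b_naive * np.var(x1) + sd_err**2) / (np.var(x1) - sd_err**2)
```

With these settings the naive slope is pulled toward -0.36 while the corrected estimate recovers the true -0.2.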
Abstract:
PURPOSE: Few studies compare the variabilities that characterize environmental (EM) and biological monitoring (BM) data. Indeed, comparing their respective variabilities can help to identify the best strategy for evaluating occupational exposure. The objective of this study is to quantify the biological variability associated with 18 bio-indicators currently used in work environments. METHOD: Intra-individual (BV(intra)), inter-individual (BV(inter)), and total biological variability (BV(total)) were quantified using validated physiologically based toxicokinetic (PBTK) models coupled with Monte Carlo simulations. Two environmental exposure profiles with different levels of variability were considered (GSD of 1.5 and 2.0). RESULTS: PBTK models coupled with Monte Carlo simulations were successfully used to predict the biological variability of biological exposure indicators. The predicted values follow a lognormal distribution, characterized by GSD ranging from 1.1 to 2.3. Our results show that there is a link between biological variability and the half-life of bio-indicators, since BV(intra) and BV(total) both decrease as the biological indicator half-lives increase. BV(intra) is always lower than the variability in the air concentrations. On an individual basis, this means that the variability associated with the measurement of biological indicators is always lower than the variability characterizing airborne levels of contaminants. For a group of workers, BM is less variable than EM for bio-indicators with half-lives longer than 10-15 h. CONCLUSION: The variability data obtained in the present study can be useful in the development of BM strategies for exposure assessment and can be used to calculate the number of samples required for guiding industrial hygienists or medical doctors in decision-making.
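The reported GSD statistic can be illustrated with an assumed toy Monte Carlo (the validated PBTK models themselves are not reproduced here): lognormal samples of a bio-indicator are summarized by a geometric standard deviation, GSD = exp(SD of log values), the quantity the abstract reports in the 1.1-2.3 range.

```python
import numpy as np

rng = np.random.default_rng(7)
gsd_true = 1.5                       # assumed true GSD of the bio-indicator
gm_true = 10.0                       # assumed geometric mean

# Lognormal Monte Carlo samples: parameters are the mean and SD of log values.
samples = rng.lognormal(mean=np.log(gm_true), sigma=np.log(gsd_true),
                        size=100_000)

gm_est = float(np.exp(np.log(samples).mean()))   # geometric mean estimate
gsd_est = float(np.exp(np.log(samples).std()))   # geometric SD estimate
```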
Abstract:
Laudisa (Found. Phys. 38:1110-1132, 2008) claims that experimental research on the class of non-local hidden-variable theories introduced by Leggett is misguided, because these theories are irrelevant for the foundations of quantum mechanics. I show that Laudisa's arguments fail to establish the pessimistic conclusion he draws from them. In particular, it is not the case that Leggett-inspired research is based on a mistaken understanding of Bell's theorem, nor that previous no-hidden-variable theorems already exclude Leggett's models. Finally, I argue that the framework of Bohmian mechanics brings out the importance of Leggett tests, rather than proving their irrelevance, as Laudisa supposes.
Abstract:
Idiopathic hypogonadotropic hypogonadism (IHH) is defined by absent or incomplete puberty and characterised biochemically by low levels of sex steroids, with low or inappropriately normal gonadotropin hormones. IHH is frequently accompanied by non-reproductive abnormalities, most commonly anosmia, which is present in 50-60% of cases and defines Kallmann syndrome. The understanding of IHH has undergone rapid evolution, both with respect to genetics and the breadth of phenotype. Once considered in monogenic Mendelian terms, it is now more coherently understood as a complex genetic condition. Oligogenic and complex genetic-environmental interactions have now been identified, with physiological and environmental factors interacting in genetically susceptible individuals to alter the clinical course and phenotype. These potentially link IHH to ancient evolutionary pressures on the ancestral human genome.
Abstract:
This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationship between random variables is nonlinear or when few data are available, the HSIC criterion outperforms other standard methods, such as the linear correlation or mutual information.
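A sketch of the biased empirical HSIC estimator, HSIC = trace(KHLH)/n², with RBF kernels; the data and kernel width below are assumptions for illustration, not the remote sensing data of the paper.

```python
import numpy as np

def rbf(x, sigma=1.0):
    """RBF (Gaussian) kernel matrix for a 1-D sample vector x."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: Hilbert-Schmidt norm of the cross-covariance
    operator, estimated as trace(K H L H) / n^2 with centering matrix H."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    K, L = rbf(x, sigma), rbf(y, sigma)
    return float(np.trace(K @ H @ L @ H) / n**2)

rng = np.random.default_rng(3)
x = rng.normal(size=300)
y_dep = x**2 + 0.1 * rng.normal(size=300)   # nonlinearly dependent on x
y_ind = rng.normal(size=300)                # independent of x
```

Linear correlation between x and x² is near zero, yet HSIC detects the dependence, which is the point the abstract makes.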
Abstract:
False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features has been built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries whose sources were known to be common or different (following police investigations and dismantling of counterfeit factories). Intra-source and inter-source variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method to link documents to a common source or to differentiate them. These results pave the way for an operational implementation of a systematic profiling process integrated in a developed forensic intelligence model.
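The binary-classification evaluation can be sketched with assumed score distributions (the study's 7500+ similarity scores are not public): type I and type II error rates follow from a decision threshold, and a likelihood ratio compares the two score densities at a given score.

```python
import numpy as np

rng = np.random.default_rng(5)
# Assumed Gaussian similarity-score distributions for illustration only.
same_source = rng.normal(0.80, 0.05, 4000)   # intra-source similarity scores
diff_source = rng.normal(0.40, 0.10, 3500)   # inter-source similarity scores

threshold = 0.65
type_i = float(np.mean(same_source < threshold))    # linked pairs wrongly separated
type_ii = float(np.mean(diff_source >= threshold))  # unlinked pairs wrongly linked

def lr(score):
    """Likelihood ratio: density under same-source vs different-source model."""
    pdf = lambda s, m, sd: np.exp(-0.5 * ((s - m) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    return pdf(score, 0.80, 0.05) / pdf(score, 0.40, 0.10)
```

High scores give LR >> 1 (support for a common source), low scores LR << 1, which is the complementary view the abstract mentions.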
Abstract:
Although the molecular typing of Pseudomonas aeruginosa is important to understand the local epidemiology of this opportunistic pathogen, it remains challenging. Our aim was to develop a simple typing method based on the sequencing of two highly variable loci. Single-strand sequencing of three highly variable loci (ms172, ms217, and oprD) was performed on a collection of 282 isolates recovered between 1994 and 2007 (from patients and the environment). As expected, the resolution of each locus alone [number of types (NT) = 35-64; index of discrimination (ID) = 0.816-0.964] was lower than that of a combination of two loci (NT = 78-97; ID = 0.966-0.971). As each pairwise combination of loci gave similar results, we selected the most robust combination, ms172 [reverse; R] and ms217 [R], to constitute the double-locus sequence typing (DLST) scheme for P. aeruginosa. This combination gave: (i) a complete genotype for 276/282 isolates (typability of 98%), (ii) 86 different types, and (iii) an ID of 0.968. Analysis of multiple isolates from the same patients or taps showed that DLST genotypes are generally stable over a period of several months. The high typability, discriminatory power, and ease of use of the proposed DLST scheme make it a method of choice for local epidemiological analyses of P. aeruginosa. Moreover, the possibility of defining types unambiguously allowed the development of an Internet database ( http://www.dlst.org ) accessible to all.
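The index of discrimination (ID) reported above is the Hunter-Gaston (Simpson-type) index: the probability that two isolates drawn at random have different types. A sketch with an assumed toy list of DLST genotypes (each type being the pair of alleles at the two loci):

```python
from collections import Counter

def discrimination_index(types):
    """Hunter-Gaston ID: 1 - sum(c*(c-1)) / (n*(n-1)) over type counts c."""
    n = len(types)
    counts = Counter(types).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Assumed toy genotypes: DLST type = (ms172 allele, ms217 allele).
genotypes = [("1", "4"), ("1", "4"), ("2", "7"), ("3", "7"), ("5", "9"),
             ("2", "7"), ("6", "1"), ("8", "2"), ("9", "3"), ("4", "4")]
idx = discrimination_index(genotypes)
```

With two types shared by two isolates each and six singletons, the toy ID is 1 - 4/90 ≈ 0.956; the paper's 282-isolate collection reaches 0.968.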
Abstract:
OBJECTIVE: Although genetic factors have been implicated in the etiology of bipolar disorder, no specific gene has been conclusively identified. Given the link between abnormalities in serotonergic neurotransmission and bipolar disorder, a candidate gene association approach was applied to study the involvement of the monoamine oxidase A (MAOA) gene, which codes for a catabolic enzyme of serotonin, in the susceptibility to bipolar disorder. METHOD: In France and Switzerland, 272 patients with bipolar disorder and 122 healthy subjects were typed for three polymorphic markers of the MAOA gene: the MAOA-CA repeat, the MAOA restriction fragment length polymorphism (RFLP), and a repeat directly adjacent to the variable number of tandem repeats (VNTR) locus. RESULTS: A significant difference in the distribution of the alleles for the MAOA-CA repeat was observed between the female bipolar patients and the comparison group. CONCLUSIONS: The results obtained in the French and Swiss population confirm findings from two studies conducted in the United Kingdom.
Abstract:
The flow of two immiscible fluids through a porous medium depends on the complex interplay between gravity, capillarity, and viscous forces. The interaction between these forces and the geometry of the medium gives rise to a variety of complex flow regimes that are difficult to describe using continuum models. Although a number of pore-scale models have been employed, a careful investigation of the macroscopic effects of pore-scale processes requires methods based on conservation principles in order to reduce the number of modeling assumptions. In this work we perform direct numerical simulations of drainage by solving the Navier-Stokes equations in the pore space and employing the Volume Of Fluid (VOF) method to track the evolution of the fluid-fluid interface. After demonstrating that the method is able to deal with large viscosity contrasts and model the transition from stable flow to viscous fingering, we focus on the macroscopic capillary pressure and compare different definitions of this quantity under quasi-static and dynamic conditions. We show that the difference between the intrinsic phase-average pressures, which is commonly used as the definition of the Darcy-scale capillary pressure, is subject to several limitations and is not accurate in the presence of viscous effects or trapping. In contrast, a definition based on the variation of the total surface energy provides an accurate estimate of the macroscopic capillary pressure. This definition, which links the capillary pressure to its physical origin, allows a better separation of viscous effects and does not depend on the presence of trapped fluid clusters.
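The energy-based definition can be checked on an assumed spherical interface (zero contact angle, quasi-static): Pc = sigma·dA/dV, the derivative of total interfacial energy with respect to invaded volume, must recover the Young-Laplace pressure 2·sigma/r.

```python
import numpy as np

sigma = 0.072                        # interfacial tension, N/m (water-air, assumed)
r = np.linspace(1e-4, 2e-4, 200)     # interface radii, m
A = 4.0 * np.pi * r**2               # interfacial area of the sphere
V = 4.0 / 3.0 * np.pi * r**3         # non-wetting phase volume

# Capillary pressure from the variation of surface energy, Pc = sigma * dA/dV,
# evaluated by numerical differentiation on the (V, A) curve.
pc_energy = sigma * np.gradient(A, V)

# Analytical Young-Laplace pressure for a spherical interface.
pc_laplace = 2.0 * sigma / r
```

Away from the two endpoints (where one-sided differences are used) the numerical energy-based estimate matches Young-Laplace to well under 1%.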
Abstract:
Introduction: In children with cystic fibrosis (CF), low immunoglobulin G (IgG) levels have been reported to be associated with significantly less severe lung disease. However, decreased IgG can be a sign of common variable immunodeficiency (CVID) and affect clinical outcome. The aim of this study was to analyze clinical and serological data of patients having low IgG levels in routine blood tests at annual assessment, particularly their antibody response to polysaccharide antigens. Method: Retrospective chart review of demographic data of CF patients followed at the pediatric CF clinic throughout 2009. Clinical parameters (genotype, pancreas sufficiency, FEV1), presence of Pseudomonas aeruginosa (PA) and number of exacerbations per year were correlated with immunoglobulin and vaccination antibody levels (antibodies to pneumococcal serotypes 14, 19, 23, 1, 5 and 7F measured by enzyme-linked immunosorbent assay). Results: 4 out of 60 patients (6.7%) had IgG levels below the normal range for age. Ages ranged from 1 year 8 months to 11 years; 2 boys, 2 girls. Three patients were delF508 homozygotes, one was a compound heterozygote delF508/G542X. All were pancreatic insufficient. FEV1 ranged from 74 to 108%. One patient never had colonization by PA, 2 had intermittent PA colonization and one was chronically infected. After conjugated vaccination all patients had protective antibodies against serotypes 14, 19, 23F. For serotypes not included in the vaccine, only one patient had protective titers for 1 out of 3 serotypes. None of the patients had received unconjugated pneumococcal vaccine. There was no significant clinical difference in FEV1, PA colonization or number of exacerbations according to IgG and vaccination antibody levels. Conclusion: Cystic fibrosis patients with low immunoglobulin levels have a normal antibody response to protein antigens. However, despite recurrent infections, there seems to be a delayed or deficient antibody response to polysaccharide antigens.
Prospective studies are needed to evaluate the development of polysaccharide antibody responses in CF patients to monitor for CVID. With early detection of CF by newborn screening programs, long-term follow-up could be started early in childhood.
Abstract:
Currently, there is no simple direct screening method for the misuse of blood transfusions in sports. In this study, we investigated whether the measurement of iron in EDTA-plasma can serve as a biomarker for this purpose. Our results revealed an increase of the plasma iron level of up to 25-fold 6 h after blood re-infusion. The variable remained elevated 10-fold one day after the procedure. A specificity of 100% and a sensitivity of 93% were obtained with a proposed threshold at 45 µg/dL of plasma iron. Therefore, our test could be used as a simple, cost-effective biomarker for screening for blood transfusion misuse in sports.
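How a fixed threshold yields sensitivity and specificity can be sketched with assumed synthetic measurements (the study's real plasma-iron values are not reproduced here):

```python
import numpy as np

# Assumed plasma-iron values in ug/dL, for illustration only.
baseline = np.array([20, 25, 30, 35, 28, 22, 33, 40, 18, 26])   # no transfusion
post_reinfusion = np.array([300, 550, 120, 42, 800, 210, 95])   # after re-infusion

threshold = 45.0                                      # proposed cut-off, ug/dL
sensitivity = float(np.mean(post_reinfusion > threshold))   # true-positive rate
specificity = float(np.mean(baseline <= threshold))         # true-negative rate
```

In this toy data one post-reinfusion value falls below the cut-off, giving 6/7 sensitivity and perfect specificity; the study reports 93% and 100% on its real measurements.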
Abstract:
The genus Prunus L. is large and economically important. However, phylogenetic relationships within Prunus at a low taxonomic level, particularly in the subgenus Amygdalus L. s.l., remain poorly investigated. This paper attempts to document the evolutionary history of Amygdalus s.l. and establishes a temporal framework by assembling molecular data from conservative and variable molecular markers. The nuclear s6pdh gene in combination with the plastid trnSG spacer is analyzed with Bayesian and maximum likelihood methods. Since previous phylogenetic analyses with these markers lacked resolution, we additionally analyzed 13 nuclear SSR loci with the δµ2 distance, followed by the unweighted pair group method with arithmetic mean (UPGMA) algorithm. Our phylogenetic analysis with both sequence and SSR loci confirms the split between sections Amygdalus and Persica, comprising almonds and peaches, respectively. This result is in agreement with biogeographic data showing that each of the two sections is naturally distributed on each side of the Central Asian Massif chain. Using coalescent-based estimations, divergence times between the two sections varied strongly depending on whether sequence data were considered alone or combined with SSR data. The sequence-only estimate (5 million years ago) was congruent with the Central Asian Massif orogeny and subsequent climate change. Given the low level of differentiation within the two sections using both marker types, the utility of combining microsatellites and sequence data to address phylogenetic relationships at a low taxonomic level within Amygdalus is discussed. The recent evolutionary histories of almond and peach are discussed in view of the domestication processes that arose in these two phenotypically diverging gene pools: almonds and peaches were domesticated from the Amygdalus s.s. and Persica sections, respectively. Such economically important crops may serve as good models to study divergent domestication processes in closely related gene pools.
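The δµ2 distance used with the SSR loci is the squared difference of mean allele sizes between two populations, summed over loci; a sketch with assumed allele-size data (not the paper's 13-locus dataset):

```python
import numpy as np

def delta_mu_sq(pop_a, pop_b):
    """Delta-mu-squared distance: sum over loci of the squared difference
    of mean microsatellite allele sizes between two populations."""
    return sum((np.mean(pop_a[locus]) - np.mean(pop_b[locus])) ** 2
               for locus in pop_a)

# Assumed allele sizes (repeat counts) at two hypothetical SSR loci.
almond = {"ssr1": np.array([12, 14, 12, 16]), "ssr2": np.array([20, 22, 20, 20])}
peach  = {"ssr1": np.array([18, 18, 20, 16]), "ssr2": np.array([24, 26, 24, 26])}

d = delta_mu_sq(almond, peach)
```

A pairwise matrix of such distances is what a UPGMA clustering, as used in the paper, would operate on.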