164 results for "Towards Seamless Integration of Geoscience Models and Data"


Relevance:

100.00%

Publisher:

Abstract:

Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned on the geophysical data. This is the only place where geophysical information is utilized in our algorithm, which is in marked contrast to other approaches where model perturbations are made through the swapping of values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at smaller lags where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to have much more direct control over the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods.
Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
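The perturbation and acceptance logic described above can be sketched compactly. The following is a hypothetical 1-D illustration, not the authors' code: `draw` stands in for the distribution conditioned on the geophysical data, the covariance model is a toy exponential, and the objective penalizes covariance mismatch only at small lags.

```python
import math
import random

random.seed(0)

def objective(field, target_cov, max_lag=3):
    """Covariance mismatch evaluated only at small lags; larger scales
    are left to be controlled by the geophysical data."""
    n = len(field)
    mean = sum(field) / n
    mismatch = 0.0
    for lag in range(1, max_lag + 1):
        cov = sum((field[i] - mean) * (field[i + lag] - mean)
                  for i in range(n - lag)) / (n - lag)
        mismatch += (cov - target_cov(lag)) ** 2
    return mismatch

def anneal(conditional_draw, target_cov, n=120, steps=2000, t0=1.0, cooling=0.999):
    """Perturb one cell at a time by redrawing it from the data-conditioned
    distribution; accept or reject with the Metropolis rule."""
    field = [conditional_draw(i) for i in range(n)]
    energy = objective(field, target_cov)
    temp = t0
    for _ in range(steps):
        i = random.randrange(n)
        old = field[i]
        field[i] = conditional_draw(i)   # perturbation honours the soft data
        new_energy = objective(field, target_cov)
        if new_energy < energy or random.random() < math.exp((energy - new_energy) / temp):
            energy = new_energy          # accept
        else:
            field[i] = old               # reject
        temp *= cooling
    return field, energy

def draw(_cell):
    # Toy stand-in for the conditional distribution at one grid cell.
    return random.gauss(0.0, 1.0)

def cov_model(lag):
    # Toy exponential covariance, constrained only at lags 1..3.
    return math.exp(-lag / 2.0)

field, energy = anneal(draw, cov_model)
```

Because perturbations already honour the soft data, the objective function needs only the small-lag covariance term, which is what gives the approach its convergence and efficiency advantage.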


Abstract:

Sustainable resource use is one of the most important environmental issues of our times. It is closely related to discussions on the 'peaking' of various natural resources serving as energy sources, agricultural nutrients, or metals indispensable in high-technology applications. Although the peaking theory remains controversial, it is commonly recognized that a more sustainable use of resources would alleviate negative environmental impacts related to resource use. In this thesis, sustainable resource use is analysed from a practical standpoint, through several different case studies. Four of these case studies relate to resource metabolism in the Canton of Geneva in Switzerland: the aim was to model the evolution of chosen resource stocks and flows in the coming decades. The studied resources were copper (a bulk metal), phosphorus (a vital agricultural nutrient), and wood (a renewable resource). The case of lithium (a critical metal) was also analysed briefly, in a qualitative manner and from an electric mobility perspective. In addition to the Geneva case studies, this thesis includes a case study on the sustainability of space life support systems, whose aim is to provide the crew of a spacecraft with the necessary metabolic consumables over the course of a mission. Sustainability was again analysed from a resource use perspective. In this case study, the functioning of two different types of life support systems, ARES and BIORAT, was evaluated and compared; these systems represent, respectively, physico-chemical and biological life support systems. Space life support systems could in fact be used as a kind of 'laboratory of sustainability' given that they represent closed and relatively simple systems compared to complex and open terrestrial systems such as the Canton of Geneva.
The chosen analysis method used in the Geneva case studies was dynamic material flow analysis: dynamic material flow models were constructed for the resources copper, phosphorus, and wood. Besides a baseline scenario, various alternative scenarios (notably involving increased recycling) were also examined. In the case of space life support systems, the methodology of material flow analysis was also employed, but as the data available on the dynamic behaviour of the systems were insufficient, only static simulations could be performed. The results of the case studies in the Canton of Geneva show the following: were resource use to follow population growth, resource consumption would be multiplied by nearly 1.2 by 2030 and by 1.5 by 2080. A complete transition to electric mobility would be expected to only slightly (+5%) increase the copper consumption per capita, while the lithium demand in cars would increase 350-fold. Phosphorus imports could be decreased by recycling sewage sludge or human urine; however, the health and environmental impacts of these options have yet to be studied. Increasing the wood production in the Canton would not significantly decrease the dependence on wood imports, as the Canton's production represents only 5% of total consumption. In the comparison of space life support systems ARES and BIORAT, BIORAT outperforms ARES in resource use but not in energy use. However, as the systems are dimensioned very differently, it remains questionable whether they can be compared outright. In conclusion, the use of dynamic material flow analysis can provide useful information for policy makers and strategic decision-making; however, uncertainty in reference data greatly influences the precision of the results. Space life support systems constitute an extreme case of resource-using systems; nevertheless, it is not clear how their example could be of immediate use to terrestrial systems.
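A dynamic material flow model of the kind used for the Geneva resources can be sketched as a simple stock-and-flow recursion. All numbers below are illustrative assumptions, not results from the thesis; the residence-time outflow rule and the scenario parameters are hypothetical simplifications.

```python
def simulate_stock(years, inflow0, growth, recycling_rate, lifetime):
    """Illustrative dynamic material flow model: one in-use stock fed by
    consumption, drained by end-of-life outflow, with part of the outflow
    recycled back into supply (reducing primary, i.e. virgin, demand)."""
    stock, history = 0.0, []
    inflow = inflow0
    for _ in range(years):
        outflow = stock / lifetime          # simple residence-time (leaching) model
        recycled = recycling_rate * outflow
        primary_demand = inflow - recycled  # virgin material still needed
        stock += inflow - outflow
        history.append((stock, primary_demand))
        inflow *= 1 + growth                # demand follows population growth
    return history

# Baseline vs. an increased-recycling scenario, with identical consumption.
baseline = simulate_stock(years=50, inflow0=100.0, growth=0.005,
                          recycling_rate=0.0, lifetime=30)
recycling = simulate_stock(years=50, inflow0=100.0, growth=0.005,
                           recycling_rate=0.6, lifetime=30)

# Same consumption, but recycling lowers the primary (virgin) demand.
assert recycling[-1][1] < baseline[-1][1]
```

Scenario comparison then reduces to running the same recursion with different parameter sets, exactly the baseline-versus-alternatives structure described above.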


Abstract:

Little is known about the financial burden of individuals with depressive symptoms. This study explored that burden using data from the Survey of Health, Ageing, and Retirement in Europe. To assess the association between depressive symptoms and individuals' financial burden for medical care, and whether they forwent medical care because of costs, logistic regressions were performed that adjusted for age, gender, marital status, education, and chronic diseases. A total of 16,696 noninstitutionalized individuals aged 50-79 years were included in the study. Individuals with depressive symptoms and those without such symptoms bore a similar financial burden. However, individuals with depressive symptoms were at increased risk of forgoing care because of costs, which may worsen their health and financial situation.
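The adjusted analysis can be illustrated with a minimal logistic regression fitted by plain gradient descent on synthetic data. The cohort, effect sizes and covariates below are invented for illustration and do not reproduce the SHARE analysis.

```python
import math
import random

random.seed(1)

def fit_logistic(X, y, lr=0.3, epochs=600):
    """Full-batch gradient descent for logistic regression, so the
    adjusted coefficient of the exposure can be read off directly."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(epochs):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            z = sum(b * x for b, x in zip(beta, xi))
            pred = 1.0 / (1.0 + math.exp(-z))
            for j in range(p):
                grad[j] += (pred - yi) * xi[j]
        beta = [b - lr * g / n for b, g in zip(beta, grad)]
    return beta

# Invented cohort: the chance of forgoing care depends on depressive
# symptoms (true log-odds ratio 0.8) and age, as in an adjusted model.
X, y = [], []
for _ in range(800):
    depressed = random.random() < 0.3
    age = random.uniform(50, 79)
    z = -2.0 + 0.8 * depressed + 0.02 * (age - 65)
    X.append([1.0, float(depressed), (age - 65) / 10])  # intercept, exposure, age
    y.append(1 if random.random() < 1.0 / (1.0 + math.exp(-z)) else 0)

beta = fit_logistic(X, y)
odds_ratio = math.exp(beta[1])  # adjusted odds ratio for depressive symptoms
```

Adjusting simply means including the covariates alongside the exposure, so the exposure coefficient reflects the association at fixed covariate values.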


Abstract:

Risk maps summarizing landscape suitability of novel areas for invading species can be valuable tools for preventing species' invasions or controlling their spread, but methods employed for development of such maps remain variable and unstandardized. We discuss several considerations in development of such models, including types of distributional information that should be used, the nature of explanatory variables that should be incorporated, and caveats regarding model testing and evaluation. We highlight that, in the case of invasive species, such distributional predictions should aim to derive the best hypothesis of the potential distribution of the species by using (1) all distributional information available, including information from both the native range and other invaded regions; (2) predictors linked as directly as is feasible to the physiological requirements of the species; and (3) modelling procedures that carefully avoid overfitting to the training data. Finally, model testing and evaluation should focus on well-predicted presences, and less on efficient prediction of absences; a k-fold regional cross-validation test is discussed.
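A regional k-fold cross-validation of the kind discussed can be sketched as follows. The envelope "model", the single climate predictor and the sensitivity score are hypothetical stand-ins chosen only to keep the example self-contained; the essential point is that folds are geographic regions and scoring focuses on well-predicted presences.

```python
import random

random.seed(2)

def regional_kfold(records, model_fn, score_fn):
    """Regional cross-validation sketch: each geographic region is held out
    in turn, the model is trained on the remaining regions, and only the
    correctly predicted presences (sensitivity) are scored."""
    regions = sorted({r["region"] for r in records})
    scores = []
    for held_out in regions:
        train = [r for r in records if r["region"] != held_out]
        test = [r for r in records if r["region"] == held_out and r["present"]]
        model = model_fn(train)
        scores.append(score_fn(model, test))
    return scores

def fit_envelope(train):
    # Hypothetical envelope model: the range of the climate variable
    # observed at training presences.
    vals = [r["climate"] for r in train if r["present"]]
    return (min(vals), max(vals))

def sensitivity(model, test_presences):
    lo, hi = model
    hits = sum(1 for r in test_presences if lo <= r["climate"] <= hi)
    return hits / len(test_presences)

# Synthetic occurrence records spread over four regions.
records = [{"region": i % 4, "present": random.random() < 0.5,
            "climate": random.gauss(0.0, 1.0)} for i in range(400)]
fold_scores = regional_kfold(records, fit_envelope, sensitivity)
```

Holding out whole regions, rather than random records, tests the spatial transferability that matters when projecting a model into a novel invaded area.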


Abstract:

A human in vivo toxicokinetic model was built to allow a better understanding of the toxicokinetics of the fungicide folpet and its key ring biomarkers of exposure: phthalimide (PI), phthalamic acid (PAA) and phthalic acid (PA). Both PI and the sum of ring metabolites, expressed as PA equivalents (PAeq), may be used as biomarkers of exposure. The conceptual representation of the model was based on the analysis of the time course of these biomarkers in volunteers orally and dermally exposed to folpet. In the model, compartments were also used to represent the body burden of folpet and of the experimentally relevant PI, PAA and PA ring metabolites in blood and in key tissues, as well as in excreta (urine and feces). The time evolution of these biomarkers in each compartment of the model was then described mathematically by a system of coupled differential equations. The parameters of the model were determined from best fits to the time courses of PI and PAeq in blood and urine of five volunteers administered 1 mg kg⁻¹ of folpet orally and 10 mg kg⁻¹ dermally. In the case of oral administration, the mean elimination half-life of PI from blood (through feces, urine or metabolism) was found to be 39.9 h, as compared with 28.0 h for PAeq. In the case of dermal application, the mean elimination half-lives of PI and PAeq were estimated to be 34.3 and 29.3 h, respectively. The average final fractions of the administered dose recovered in urine as PI over the 0-96 h period were 0.030 and 0.002% for oral and dermal exposure, respectively. Corresponding values for PAeq were 24.5 and 1.83%. Finally, the average clearance rate of PI from blood calculated from the oral and dermal data was 0.09 ± 0.03 and 0.13 ± 0.05 ml h⁻¹, respectively, while the volume of distribution was 4.30 ± 1.12 and 6.05 ± 2.22 l. It was not possible to obtain the corresponding values from the PAeq data owing to the lack of blood time course data.
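The coupled-ODE structure of such a model can be illustrated with a toy compartmental chain integrated by a simple Euler scheme. Only the first-order relation k = ln 2 / t½ and the 39.9 h and 28.0 h half-lives come from the text; the absorption rate and the split between metabolism and direct excretion are invented for illustration, and this is not the paper's fitted model.

```python
import math

def simulate(k_abs, k_elim, k_met, k_elim_met, dose=1.0, t_end=96.0, dt=0.01):
    """Toy compartment chain: depot (gut/skin) -> parent compound in blood
    -> ring metabolite, all first-order, integrated with Euler steps."""
    depot, parent, metab = dose, 0.0, 0.0
    t = 0.0
    while t < t_end:
        d_depot = -k_abs * depot
        d_parent = k_abs * depot - (k_elim + k_met) * parent
        d_metab = k_met * parent - k_elim_met * metab
        depot += d_depot * dt
        parent += d_parent * dt
        metab += d_metab * dt
        t += dt
    return parent, metab

# First-order kinetics: a 39.9 h half-life means k = ln 2 / 39.9 per hour.
k_total = math.log(2) / 39.9              # total loss rate of PI (oral case)
parent_96h, metab_96h = simulate(
    k_abs=0.5,                            # assumed absorption rate, h^-1
    k_elim=0.7 * k_total,                 # assumed split: direct excretion...
    k_met=0.3 * k_total,                  # ...versus conversion to PAA/PA
    k_elim_met=math.log(2) / 28.0)        # 28.0 h half-life of PAeq
```

Fitting the real model amounts to adjusting such rate constants until the simulated blood and urine time courses match the volunteer data.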


Abstract:

Successful pregnancy depends on well-coordinated developmental events involving both maternal and embryonic components. Although a host of signaling pathways participate in implantation, decidualization, and placentation, whether there is a common molecular link that coordinates these processes remains unknown. By exploiting genetic, molecular, pharmacological, and physiological approaches, we show here that the nuclear transcription factor peroxisome proliferator-activated receptor (PPAR) delta plays a central role at various stages of pregnancy: maternal PPARdelta is critical to implantation and decidualization, whereas embryonic PPARdelta is vital for placentation. Using trophoblast stem cells, we further elucidate that a reciprocal relationship between the PPARdelta-AKT and leukemia inhibitory factor-STAT3 signaling pathways serves as a cell lineage sensor to direct trophoblast cell fates during placentation. This novel finding of stage-specific integration of maternal and embryonic PPARdelta signaling provides evidence that PPARdelta is a molecular link coordinating implantation, decidualization, and placentation, all crucial to pregnancy success. This study is clinically relevant because deferral of on-time implantation leads to spontaneous pregnancy loss, and defective trophoblast invasion is one cause of preeclampsia in humans.


Abstract:

This paper presents a validation study on statistical unsupervised brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different levels of noise and intensity nonuniformity are added to simulate real imaging conditions. No enhancement of the image quality is considered either before or during the classification process. This way, the accuracy of the methods and their robustness against image artifacts are tested. Classification is also performed on real data, where a quantitative validation compares the methods' results with an estimated ground truth derived from manual segmentations by experts. The validity of the various classification methods, both in the labeling of the image and in the estimated tissue volumes, is assessed with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that only consider pure Gaussian classes. Finally, we show that results on simulated data can also be extended to real data.
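The intensity-only end of the model family compared here (pure Gaussian classes, no spatial prior, no partial-volume classes) can be sketched as a 1-D Gaussian mixture fitted by EM. The two synthetic "tissue" clusters below are assumptions for illustration, not the paper's data.

```python
import math
import random
import statistics

random.seed(3)

def em_gmm(data, k=2, iters=50):
    """Minimal EM for a 1-D Gaussian mixture over voxel intensities."""
    mus = [min(data), max(data)]            # crude but well-spread init (k=2)
    sigmas = [statistics.pstdev(data)] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each class for each intensity.
        resp = []
        for x in data:
            ps = [w * math.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))
                  for w, m, s in zip(weights, mus, sigmas)]
            total = sum(ps)
            resp.append([p / total for p in ps])
        # M-step: re-estimate means, variances and class weights.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            sigmas[j] = max(1e-3, math.sqrt(
                sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / nj))
            weights[j] = nj / len(data)
    return mus, sigmas, weights

# Two synthetic "tissue" intensity clusters (e.g. grey vs. white matter).
data = [random.gauss(30, 4) for _ in range(300)] + \
       [random.gauss(70, 4) for _ in range(300)]
mus, sigmas, weights = em_gmm(data)
```

The methods found more robust in the paper extend this by adding a spatial prior (e.g. a Markov random field) and mixture classes for partial-volume voxels.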


Abstract:

Genetic variants influence the risk of developing certain diseases or give rise to differences in drug response. Recent progress in cost-effective, high-throughput genome-wide techniques, such as microarrays measuring Single Nucleotide Polymorphisms (SNPs), has facilitated genotyping of large clinical and population cohorts. Combining the massive genotypic data with measurements of phenotypic traits allows for the determination of genetic differences that explain, at least in part, the phenotypic variations within a population. So far, models combining the most significant variants can only explain a small fraction of the variance, indicating the limitations of current models. In particular, researchers have only begun to address the possibility of interactions between genotypes and the environment. Elucidating the contributions of such interactions is a difficult task because of the large number of genetic as well as possible environmental factors. In this thesis, I worked on several projects within this context. My first and main project was the identification of possible SNP-environment interactions, where the phenotypes were serum lipid levels of patients from the Swiss HIV Cohort Study (SHCS) treated with antiretroviral therapy. Here the genotypes consisted of a limited set of SNPs in candidate genes relevant for lipid transport and metabolism. The environmental variables were the specific combinations of drugs given to each patient over the treatment period. My work explored bioinformatic and statistical approaches to relate patients' lipid responses to these SNPs, drugs and, importantly, their interactions. The goal of this project was to improve our understanding and to explore the possibility of predicting dyslipidemia, a well-known adverse drug reaction of antiretroviral therapy.
Specifically, I quantified how much of the variance in lipid profiles could be explained by the host genetic variants, the administered drugs and SNP-drug interactions, and assessed the predictive power of these features on lipid responses. Using cross-validation stratified by patients, we could not validate our hypothesis that models selecting a subset of SNP-drug interactions in a principled way have better predictive power than control models using "random" subsets. Nevertheless, all tested models containing SNP and/or drug terms exhibited significant predictive power (as compared to a random predictor) and explained a sizable proportion of variance in the patient-stratified cross-validation context. Importantly, the model containing stepwise-selected SNP terms showed a higher capacity to predict triglyceride levels than a model containing randomly selected SNPs. Dyslipidemia is a complex trait for which many factors remain to be discovered, and are thus missing from the data, possibly explaining the limitations of our analysis. In particular, the interactions of drugs with SNPs selected from the set of candidate genes likely have small effect sizes, which we were unable to detect in a sample of the present size (<800 patients). In the second part of my thesis, I performed genome-wide association studies within the Cohorte Lausannoise (CoLaus). I have been involved in several international projects to identify SNPs that are associated with various traits, such as serum calcium, body mass index, two-hour glucose levels, as well as metabolic syndrome and its components. These phenotypes are all related to major human health issues, such as cardiovascular disease. I applied statistical methods to detect new variants associated with these phenotypes, contributing to the identification of new genetic loci that may lead to new insights into the genetic basis of these traits.
This kind of research will lead to a better understanding of the mechanisms underlying these pathologies, a better evaluation of disease risk, the identification of new therapeutic leads, and may ultimately contribute to the realization of "personalized" medicine.
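The patient-stratified cross-validation mentioned above has a simple core: every measurement from a given patient must land in the same fold, so models are always tested on unseen patients and leakage across repeated measures is avoided. A sketch, with hypothetical repeated lipid measurements:

```python
import random

random.seed(4)

def patient_folds(samples, k=5):
    """Assign whole patients (not individual samples) to folds, so no
    patient contributes to both training and test data of any split."""
    patients = sorted({s["patient"] for s in samples})
    random.shuffle(patients)
    assignment = {p: i % k for i, p in enumerate(patients)}
    folds = [[] for _ in range(k)]
    for s in samples:
        folds[assignment[s["patient"]]].append(s)
    return folds

# Hypothetical repeated triglyceride measurements: 100 patients, 2-5 visits.
samples = [{"patient": p, "visit": v, "tg": random.gauss(1.5, 0.5)}
           for p in range(100) for v in range(random.randint(2, 5))]
folds = patient_folds(samples)

# No patient appears in more than one fold.
seen = [{s["patient"] for s in fold} for fold in folds]
assert all(seen[i].isdisjoint(seen[j])
           for i in range(5) for j in range(i + 1, 5))
```

A random split of samples would let a patient's visit 1 train a model tested on their visit 2, inflating apparent predictive power.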


Abstract:

Unless effective preventive strategies are implemented, aging of the population will result in a significant worsening of the heart failure (HF) epidemic. Few data exist on whether baseline electrocardiographic (ECG) abnormalities can refine risk prediction for HF. METHODS: We examined a prospective cohort of 2,915 participants aged 70 to 79 years without preexisting HF, enrolled between April 1997 and June 1998 in the Health, Aging, and Body Composition (Health ABC) study. Minnesota Code was used to define major and minor ECG abnormalities at baseline and at year 4 follow-up. Using Cox models, we assessed (1) the association between ECG abnormalities and incident HF and (2) the incremental value of adding ECG to the Health ABC HF Risk Score using the net reclassification index. RESULTS: At baseline, 380 participants (13.0%) had minor, and 620 (21.3%) had major ECG abnormalities. During a median follow-up of 11.4 years, 485 participants (16.6%) developed incident HF. After adjusting for the Health ABC HF Risk Score variables, the hazard ratio (HR) was 1.27 (95% CI 0.96-1.68) for minor and 1.99 (95% CI 1.61-2.44) for major ECG abnormalities. At year 4, 263 participants developed new and 549 had persistent abnormalities; both were associated with increased subsequent HF risk (HR 1.94, 95% CI 1.38-2.72 for new and HR 2.35, 95% CI 1.82-3.02 for persistent ECG abnormalities). Baseline ECG correctly reclassified 10.5% of patients with HF events, 0.8% of those without HF events, and 1.4% of the overall population. The net reclassification index across the Health ABC HF risk categories was 0.11 (95% CI 0.03-0.19). CONCLUSIONS: Among older adults, baseline and new ECG abnormalities are independently associated with increased risk of HF. The contribution of ECG screening for targeted prevention of HF should be evaluated in clinical trials.
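The categorical net reclassification index used above to assess the added value of ECG can be computed as follows. The risk cutoffs and the five toy patients are invented for illustration and are not the Health ABC risk categories.

```python
def nri(risk_old, risk_new, events, cutoffs=(0.05, 0.10, 0.20)):
    """Categorical net reclassification index: upward category moves are
    'correct' for events, downward moves are 'correct' for non-events."""
    def category(p):
        return sum(p >= c for c in cutoffs)
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for old, new, event in zip(risk_old, risk_new, events):
        move = category(new) - category(old)
        if event:
            n_e += 1
            up_e += move > 0
            down_e += move < 0
        else:
            n_ne += 1
            up_ne += move > 0
            down_ne += move < 0
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Five toy patients: adding a marker moves two of the three events up a
# risk category and leaves both non-events in their original categories.
old_risk = [0.04, 0.08, 0.15, 0.03, 0.12]
new_risk = [0.06, 0.12, 0.15, 0.02, 0.11]
had_event = [1, 1, 1, 0, 0]
result = nri(old_risk, new_risk, had_event)  # 2/3
```

A positive NRI, as reported for ECG here (0.11), means the updated model moves events up and/or non-events down more often than the reverse.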


Abstract:

Using data from the Public Health Service, we studied the demographic and clinical characteristics of 1,782 patients enrolled in methadone maintenance treatment (MMT) during 2001 in the Swiss Canton of Vaud, comparing our findings with the results of a previous study covering 1976 to 1986. In 2001, most patients (76.9%) were treated in general practice. Mortality is low in this MMT population (1%/year). While patient age and sex profiles were similar to those found in the earlier study, we did observe a substantial increase in the number of patients and in the number of practitioners treating MMT patients, probably reflecting low-threshold governmental policies and the creation of specialized centers. In conclusion, easier access to MMT increases the number of patients treated, but new concerns about the quality of management emerge: concomitant benzodiazepine prescription; low rates of screening for hepatitis B, C and HIV; and social and psychiatric issues.


Abstract:

We present a viscometric affinity biosensor that can potentially allow continuous multi-analyte monitoring in biological fluids such as blood or plasma. The sensing principle is based on the detection of viscosity changes of a polymeric solution that has a selective affinity for the analyte of interest. The chemico-mechanical sensor incorporates an actuating piezoelectric diaphragm, a sensing piezoelectric diaphragm and a flow-resisting microchannel for viscosity detection. A free-standing anodic aluminum oxide (AAO) porous nano-membrane is used as the selective interface. A glucose-sensitive sensor was fabricated and extensively assessed in buffer solution. The sensor's reversibility, stability and sensitivity were excellent for at least 65 hours, and it also showed good stability over a long-term measurement (25 days). The sensor's behaviour was furthermore tested in fetal bovine serum (FBS). The results obtained for glucose sensing are very promising, indicating that the developed sensor is a candidate for continuous monitoring in biological fluids. Sensitive solutions for ionized calcium and pH are currently under development and should allow multi-analyte sensing in the near future.
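The viscometric principle rests on the pressure drop across the flow-resisting microchannel being linear in viscosity (the Hagen-Poiseuille law for laminar flow in a circular channel). A sketch with purely illustrative geometry, not the paper's dimensions:

```python
import math

def pressure_drop(mu, flow_rate, length, radius):
    """Hagen-Poiseuille law for a circular channel:
    dP = 8 * mu * L * Q / (pi * r^4), linear in the viscosity mu."""
    return 8.0 * mu * length * flow_rate / (math.pi * radius ** 4)

# Illustrative geometry (assumed): 5 mm long, 25 um radius, 1 nL/s flow.
dp_water = pressure_drop(mu=1.0e-3, flow_rate=1e-12, length=5e-3, radius=25e-6)
dp_thick = pressure_drop(mu=2.0e-3, flow_rate=1e-12, length=5e-3, radius=25e-6)
assert abs(dp_thick / dp_water - 2.0) < 1e-12  # doubling viscosity doubles signal
```

This linearity is what lets the sensing diaphragm translate an analyte-induced viscosity change of the polymer solution into a proportional mechanical signal.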