86 results for Coupling and Integration of Hydrologic Models II

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Drawing on Social Representations Theory, this study investigates focalisation and anchoring during the diffusion of information concerning the Large Hadron Collider (LHC), the particle accelerator at the European Organisation for Nuclear Research (CERN). We hypothesised that people focus on striking elements of the message, abandoning others, that the nature of the initial information affects diffusion of information, and that information is anchored in prior attitudes toward CERN and science. A serial reproduction experiment with two generations and four chains of reproduction diffusing controversial versus descriptive information about the LHC shows a reduction of information through generations, the persistence of terminology regarding the controversy and a decrease of other elements for participants exposed to polemical information. Concerning anchoring, positive attitudes toward CERN and science increase the use of expert terminology unrelated to the controversy. This research highlights the relevance of a social representational approach in the public understanding of science.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To assess the feasibility and activity of radio-chemotherapy with mitomycin C (MMC) and cisplatin (CDDP) in locally advanced squamous cell anal carcinoma with reference to radiotherapy (RT) combined with MMC and fluorouracil (5-FU). PATIENTS AND METHODS: Patients with measurable disease >4 cm, N0 or N+, received RT (36 Gy + 2-week gap + 23.4 Gy) with either MMC/CDDP or MMC/5-FU (MMC 10 mg/m(2) on day 1 of each sequence; 5-FU 200 mg/m(2)/day c.i.v. daily; CDDP 25 mg/m(2) weekly). Forty patients per arm were needed to exclude a RECIST objective response rate (ORR), 8 weeks after treatment, of <75% (Fleming one-stage design, alpha=10%, beta=10%). RESULTS: The ORR was 79.5% (31/39) (lower-bound confidence interval [CI]: 68.8%) with MMC/5-FU versus 91.9% (34/37) (lower-bound CI: 82.8%) with MMC/CDDP. In the MMC/5-FU group, two patients (5.1%) discontinued treatment due to toxicity, versus 11 (29.7%) in the MMC/CDDP group. Nine grade 3 haematological events occurred with MMC/CDDP versus none with MMC/5-FU. The rate of other toxicities did not differ. There were no toxic deaths. Thirty-one patients in the MMC/5-FU arm (79.5%) and 18 in the MMC/CDDP arm (48.6%) were fully compliant with the protocol treatment (p=0.005). CONCLUSIONS: Radio-chemotherapy with MMC/CDDP seems promising: only MMC/CDDP demonstrated enough activity (RECIST ORR >75%) to be tested further in phase III trials; MMC/5-FU did not. MMC/CDDP also had an overall acceptable toxicity profile.
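
The reported lower confidence bounds are consistent with one-sided exact (Clopper-Pearson) limits at the design's alpha of 10%; a minimal sketch under that assumption (the abstract does not state the exact interval method used):

```python
from scipy.stats import beta

def orr_lower_bound(responders: int, n: int, alpha: float = 0.10) -> float:
    """One-sided exact (Clopper-Pearson) lower confidence bound for a response rate."""
    return beta.ppf(alpha, responders, n - responders + 1)

print(orr_lower_bound(31, 39))  # MMC/5-FU arm: ~0.688, below the 75% target
print(orr_lower_bound(34, 37))  # MMC/CDDP arm: ~0.828, above the 75% target
```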

Relevance:

100.00%

Publisher:

Abstract:

Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First-tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second-tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC, the process category (PROC) is the most important factor, and a failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in one modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment, since it constitutes ∼75% of the total exposure range, which corresponds to an exposure estimate spanning 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and the precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain concerning two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART.
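
Local, one-at-a-time SA of a multiplicative exposure model reduces to perturbing each factor over its range while holding the others fixed and recording the induced spread, naturally expressed in orders of magnitude. A schematic sketch only; the factor names and ranges are invented, not the actual parameters of ECETOC TRA, Stoffenmanager, or ART:

```python
import numpy as np

# Illustrative multiplicative exposure model: E = base * f1 * f2 * f3
factors = {
    "substance_emission": [0.1, 1.0, 10.0],  # hypothetical factor levels
    "local_controls":     [0.03, 0.3, 1.0],
    "duration":           [0.5, 1.0, 2.0],
}
base = 1.0  # hypothetical baseline estimate, mg/m3

for name, levels in factors.items():
    estimates = [base * lv for lv in levels]  # vary one factor, hold others at 1
    span = np.log10(max(estimates)) - np.log10(min(estimates))
    print(f"{name}: {span:.1f} orders of magnitude")
```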

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: The nuclear receptors are a large family of eukaryotic transcription factors that constitute major pharmacological targets. They exert their combinatorial control through homotypic heterodimerisation. Elucidation of this dimerisation network is vital in order to understand the complex dynamics and potential cross-talk involved. RESULTS: Phylogeny, protein-protein interactions, protein-DNA interactions and gene expression data have been integrated to provide a comprehensive and up-to-date description of the topology and properties of the nuclear receptor interaction network in humans. We discriminate between DNA-binding and non-DNA-binding dimers and provide a comprehensive interaction map that identifies potential cross-talk between the various pathways of nuclear receptors. CONCLUSION: We infer that the topology of this network is hub-based and much more connected than previously thought. The hub-based topology of the network and the wide tissue expression pattern of NRs create a highly competitive environment for the common heterodimerising partners. Furthermore, a significant number of negative feedback loops are present, with the hub protein SHP [NR0B2] playing a major role. We also compare the evolution, topology and properties of the nuclear receptor network with the hub-based dimerisation network of the bHLH transcription factors in order to identify both unique themes and ubiquitous properties in gene regulation. In terms of methodology, we conclude that such a comprehensive picture can only be assembled by semi-automated text-mining, manual curation and integration of data from various sources.
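
Degree centrality on the dimer edge list is enough to expose a hub-based topology; a sketch with a small hypothetical edge list (networkx assumed; the edges are illustrative, not the curated network):

```python
import networkx as nx

# Hypothetical heterodimerisation edges; RXR and SHP act as hubs in the paper.
edges = [("RXR", "RAR"), ("RXR", "TR"), ("RXR", "VDR"), ("RXR", "PPAR"),
         ("SHP", "RXR"), ("SHP", "RAR"), ("SHP", "ERalpha")]
G = nx.Graph(edges)

# Rank receptors by degree: hubs stand out at the top of the list.
for node, deg in sorted(G.degree, key=lambda x: -x[1]):
    print(node, deg)
```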

Relevance:

100.00%

Publisher:

Abstract:

The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). The simple k-nearest neighbor algorithm is considered as a benchmark model. PNN is a neural-network reformulation of the well-known nonparametric principles of probability density modeling using a kernel density estimator and Bayesian optimal or maximum a posteriori decision rules. PNN are well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNN is that they can be easily used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVM were successfully applied to a variety of environmental topics: classification of soil types and hydrogeological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper, both simulated and real-data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the applied algorithms.
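
A PNN is essentially a Parzen-window density estimate per class combined with a Bayes decision rule; a minimal NumPy sketch of that rule, assuming a Gaussian kernel, a shared bandwidth, and equal class priors:

```python
import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=1.0):
    """Parzen-window class densities + maximum a posteriori decision (equal priors)."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        # Gaussian kernel density of each test point under class c
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2 * sigma**2)).mean(1))
    return classes[np.argmax(scores, axis=0)]

# Toy spatial data: two clusters of 2-D coordinates
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(pnn_classify(X, y, np.array([[0.5, 0.5], [3.5, 4.0]])))  # -> [0 1]
```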

Relevance:

100.00%

Publisher:

Abstract:

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis.
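
The project's key safeguard, scoring a frozen model on data never used for training, is straightforward to express; a generic scikit-learn sketch with placeholder data standing in for one microarray endpoint (the choice of Matthews correlation, a metric suited to imbalanced endpoints, is an assumption here):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import train_test_split

# Placeholder data standing in for one endpoint's expression matrix and labels
X, y = make_classification(n_samples=300, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # freeze on training data
print(matthews_corrcoef(y_test, model.predict(X_test)))          # score on held-out set only
```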

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: Several studies observed a female advantage in the prognosis of cutaneous melanoma, for which behavioral factors or an underlying biologic mechanism might be responsible. Using complete and reliable follow-up data from four phase III trials of the European Organisation for Research and Treatment of Cancer (EORTC) Melanoma Group, we explored the female advantage across multiple end points and in relation to other important prognostic indicators. PATIENTS AND METHODS: Patients diagnosed with localized melanoma were included in EORTC adjuvant treatment trials 18832, 18871, 18952, and 18961 and randomly assigned during the period of 1984 to 2005. Cox proportional hazard models were used to calculate hazard ratios (HRs) and 95% CIs for women compared with men, adjusted for age, Breslow thickness, body site, ulceration, performed lymph node dissection, and treatment. RESULTS: A total of 2,672 patients with stage I/II melanoma were included. Women had a highly consistent and independent advantage in overall survival (adjusted HR, 0.70; 95% CI, 0.59 to 0.83), disease-specific survival (adjusted HR, 0.74; 95% CI, 0.62 to 0.88), time to lymph node metastasis (adjusted HR, 0.70; 95% CI, 0.51 to 0.96), and time to distant metastasis (adjusted HR, 0.69; 95% CI, 0.59 to 0.81). Subgroup analysis showed that the female advantage was consistent across all prognostic subgroups (with the possible exception of head and neck melanomas) and in pre- and postmenopausal age groups. CONCLUSION: Women have a consistent and independent relative advantage in all aspects of the progression of localized melanoma of approximately 30%, most likely caused by an underlying biologic sex difference.
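
A sketch of the form of analysis reported, using the lifelines implementation of the Cox proportional hazards model; the data file and column names are hypothetical:

```python
import pandas as pd
from lifelines import CoxPHFitter

# One row per patient; hypothetical columns: time_years, died, female, age,
# breslow, ulceration, and further covariates as in the paper.
df = pd.read_csv("melanoma_cohort.csv")  # placeholder file

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="died")
cph.print_summary()  # hazard ratio for 'female' = exp(coef), with 95% CI
```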

Relevance:

100.00%

Publisher:

Abstract:

The objective of this work is to present a multitechnique approach to define the geometry, the kinematics, and the failure mechanism of a retrogressive large landslide (upper part of the La Valette landslide, South French Alps) by combining airborne and terrestrial laser scanning data with ground-based seismic tomography data. The advantage of combining different methods is to constrain the geometrical and failure mechanism models by integrating different sources of information. Because of a high point density at the ground surface (4.1 points m⁻²), a small laser footprint (0.09 m) and accurate three-dimensional positioning (0.07 m), airborne laser scanning data are well suited to analyzing morphological structures at the surface. Seismic tomography surveys (P-wave and S-wave velocities) may highlight low-seismic-velocity zones that characterize dense fracture networks at the subsurface. The surface displacements measured from the terrestrial laser scanning data over a period of 2 years (May 2008 to May 2010) allow one to quantify the landslide activity in the direct vicinity of the identified discontinuities. Substantial subsidence of the crown area, at an average rate of 3.07 m year⁻¹, is determined. The displacement directions indicate that the retrogression is controlled structurally by the preexisting discontinuities. A conceptual structural model is proposed to explain the failure mechanism and the retrogressive evolution of the main scarp. Uphill, the crown area is affected by planar sliding included in a deeper wedge failure system constrained by two preexisting fractures. Downhill, the landslide body acts as a buttress for the upper part. Consequently, the progression of the landslide body downhill allows the development of dip-slope failures, and coherent blocks start sliding along planar discontinuities. The volume of the failed mass in the crown area is estimated at 500,000 m³ with the sloping local base level method.

Relevance:

100.00%

Publisher:

Abstract:

The authors examined the associations of social support with socioeconomic status (SES) and with mortality, as well as how SES differences in social support might account for SES differences in mortality. Analyses were based on 9,333 participants from the British Whitehall II Study cohort, a longitudinal cohort established in 1985 among London-based civil servants who were 35-55 years of age at baseline. SES was assessed using participants' employment grades at baseline. Social support was assessed 3 times in the 24.4-year period during which participants were monitored for death. In men, marital status and, to a lesser extent, network score (but not low perceived support or high negative aspects of close relationships) predicted both all-cause and cardiovascular mortality. Measures of social support were not associated with cancer mortality. Men in the lowest SES category had an increased risk of death compared with those in the highest category (for all-cause mortality, hazard ratio = 1.59, 95% confidence interval: 1.21, 2.08; for cardiovascular mortality, hazard ratio = 2.48, 95% confidence interval: 1.55, 3.92). Network score and marital status combined explained 27% (95% confidence interval: 14, 43) and 29% (95% confidence interval: 17, 52) of the associations between SES and all-cause and cardiovascular mortality, respectively. In women, there was no consistent association between social support indicators and mortality. The present study suggests that in men, social isolation is not only an important risk factor for mortality but is also likely to contribute to differences in mortality by SES.
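
The abstract does not give the attenuation formula behind the "27%/29% explained" figures; a common convention is the proportional reduction of the log hazard ratio after adding the mediators, sketched here with a hypothetical mediator-adjusted HR:

```python
import numpy as np

hr_base = 1.59      # SES -> all-cause mortality, base model (from the abstract)
hr_adjusted = 1.41  # hypothetical HR after adding network score and marital status

# Attenuation on the log-HR scale
explained = (np.log(hr_base) - np.log(hr_adjusted)) / np.log(hr_base)
print(f"{explained:.0%} of the association explained")  # ~26% with these inputs
```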

Relevance:

100.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges.

In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proved highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
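
Gradual deformation mixes the current Gaussian model with an independent prior draw through a rotation, so the proposal stays consistent with the prior while the mixing angle tunes the perturbation strength. A minimal sketch of such a proposal inside a Metropolis sampler; the forward model, data, and tuning are placeholders rather than the thesis' actual implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
theta = 0.15            # perturbation strength (radians); small angle = gentle moves
m = rng.normal(size=n)  # current model realization (standard Gaussian field)

def log_likelihood(model):
    # Placeholder misfit; a real study would run e.g. a georadar forward solver.
    return -0.5 * np.sum((model.mean() - 0.3) ** 2) / 0.01

ll = log_likelihood(m)
for _ in range(2000):
    # cos^2 + sin^2 = 1 keeps the proposal a draw from the same Gaussian prior,
    # so only the likelihood ratio enters the accept/reject test.
    m_prop = m * np.cos(theta) + rng.normal(size=n) * np.sin(theta)
    ll_prop = log_likelihood(m_prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        m, ll = m_prop, ll_prop
```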

Relevance:

100.00%

Publisher:

Abstract:

The MIGCLIM R package is a function library for the open source R software that enables the implementation of species-specific dispersal constraints into projections of species distribution models under environmental change and/or landscape fragmentation scenarios. The model is based on a cellular automaton and the basic modeling unit is a cell that is inhabited or not. Model parameters include dispersal distance and kernel, long distance dispersal, barriers to dispersal, propagule production potential and habitat invasibility. The MIGCLIM R package has been designed to be highly flexible in the parameter values it accepts, and to offer good compatibility with existing species distribution modeling software. Possible applications include the projection of future species distributions under environmental change conditions and modeling the spread of invasive species.
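
The cellular-automaton core of such a dispersal model is compact: at each time step, occupied cells may colonize suitable empty neighbours. A toy NumPy illustration of the idea only; it is not the MIGCLIM API, and all parameter names are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
suitable = rng.random((50, 50)) < 0.6  # habitat suitability mask (toy)
occupied = np.zeros((50, 50), bool)
occupied[25, 25] = True                # initial colonized cell

p_colonize, steps = 0.3, 10            # invented parameters
for _ in range(steps):
    # candidate cells: suitable, empty, and adjacent to an occupied cell
    padded = np.pad(occupied, 1)
    neighbours = (padded[:-2, 1:-1] | padded[2:, 1:-1] |
                  padded[1:-1, :-2] | padded[1:-1, 2:])
    candidates = suitable & ~occupied & neighbours
    occupied |= candidates & (rng.random(occupied.shape) < p_colonize)

print(occupied.sum(), "cells colonized")
```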

Relevance:

100.00%

Publisher:

Abstract:

Accurate characterization of the spatial distribution of hydrological properties in heterogeneous aquifers at a range of scales is a key prerequisite for reliable modeling of subsurface contaminant transport, and is essential for designing effective and cost-efficient groundwater management and remediation strategies. To this end, high-resolution geophysical methods have shown significant potential to bridge a critical gap in subsurface resolution and coverage between traditional hydrological measurement techniques such as borehole log/core analyses and tracer or pumping tests. An important and still largely unresolved issue, however, is how to best quantitatively integrate geophysical data into a characterization study in order to estimate the spatial distribution of one or more pertinent hydrological parameters, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was to first develop a strategy for the assimilation of several types of hydrogeophysical data having varying degrees of resolution, subsurface coverage, and sensitivity to the hydrologic parameter of interest. In this regard a novel simulated annealing (SA)-based conditional simulation approach was developed and then tested in its ability to generate realizations of porosity given crosshole ground-penetrating radar (GPR) and neutron porosity log data. This was done successfully for both synthetic and field data sets. A subsequent issue that needed to be addressed involved assessing the potential benefits and implications of the resulting porosity realizations in terms of groundwater flow and contaminant transport. This was investigated synthetically assuming first that the relationship between porosity and hydraulic conductivity was well-defined. Then, the relationship was itself investigated in the context of a calibration procedure using hypothetical tracer test data. Essentially, the relationship best predicting the observed tracer test measurements was determined given the geophysically derived porosity structure. Both of these investigations showed that the SA-based approach, in general, allows much more reliable hydrological predictions than other more elementary techniques considered. Further, the developed calibration procedure was seen to be very effective, even at the scale of tomographic resolution, for predictions of transport. This also held true at locations within the aquifer where only geophysical data were available. This is significant because the acquisition of hydrological tracer test measurements is clearly more complicated and expensive than the acquisition of geophysical measurements. Although the above methodologies were tested using porosity logs and GPR data, the findings are expected to remain valid for a large number of pertinent combinations of geophysical and borehole log data of comparable resolution and sensitivity to the hydrological target parameter. Moreover, the obtained results allow us to have confidence for future developments in integration methodologies for geophysical and hydrological data to improve the 3-D estimation of hydrological properties.
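
Simulated-annealing conditional simulation perturbs a field cell by cell while keeping data-conditioned cells fixed, accepting uphill moves with a temperature-controlled probability. A schematic sketch with an invented one-line objective (matching a target lag-1 correlation), not the actual objective used in the thesis:

```python
import numpy as np

rng = np.random.default_rng(2)
field = rng.normal(size=(40, 40))      # initial porosity field (standardized, toy)
conditioned = np.zeros_like(field, bool)
conditioned[::10, ::10] = True          # cells fixed by borehole data (toy pattern)

def objective(f):
    # Invented objective: match a target lag-1 horizontal correlation of 0.7
    corr = np.corrcoef(f[:, :-1].ravel(), f[:, 1:].ravel())[0, 1]
    return (corr - 0.7) ** 2

T, cooling = 1.0, 0.995
obj = objective(field)
for _ in range(20000):
    i, j = rng.integers(40, size=2)
    if conditioned[i, j]:
        continue                        # never perturb conditioning data
    old = field[i, j]
    field[i, j] = rng.normal()          # propose a new value at one cell
    new_obj = objective(field)
    if new_obj > obj and rng.random() >= np.exp((obj - new_obj) / T):
        field[i, j] = old               # reject the uphill move
    else:
        obj = new_obj                   # accept (always, if downhill)
    T *= cooling
```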

Relevance:

100.00%

Publisher:

Abstract:

A major challenge in community ecology is a thorough understanding of the processes that govern the assembly and composition of communities in time and space. The growing threat of climate change to the vascular plant biodiversity of fragile ecosystems such as mountains has made it equally imperative to develop comprehensive methodologies that provide insights into how communities are assembled. From this perspective, the primary objective of this PhD thesis is to contribute to the theoretical and methodological development of community ecology by proposing new solutions to better detect the ecological and evolutionary processes that govern community assembly. As phylogenetic trees provide by far the most advanced tools to integrate the spatial, ecological and evolutionary dynamics of plant communities, they represent the cornerstone on which this work was based. In this thesis, I proposed new solutions to: (i) reveal trends in community assembly on phylogenies, depicted by the transition of signals at the nodes of the different species and lineages responsible for community assembly; (ii) help demonstrate the importance of evolutionarily labile traits in the distribution of mountain plant species; more precisely, I demonstrated that phylogenetic and functional compositional turnover in plant communities was driven by climate and human land-use gradients and mostly influenced by evolutionarily labile traits; and (iii) predict and spatially project the phylogenetic structure of communities using species distribution models, to identify the potential distribution of phylogenetic diversity as well as areas of high evolutionary potential along elevation. The altitudinal setting of the Diablerets mountains (Switzerland) provided an appropriate model for this study. The elevation gradient served as a compression of large latitudinal variations, similar to a collection of islands within a single area, and allowed investigations on a large number of plant communities. Overall, this thesis highlights that stochastic and deterministic environmental filtering processes mainly influence the phylogenetic structure of plant communities in mountainous areas. Negative density-dependent processes, implied through patterns of phylogenetic overdispersion, were only detected at the local scale, whereas environmental filtering, implied through phylogenetic clustering, was observed at both the regional and local scales. Finally, the integration of indices of phylogenetic community ecology with species distribution models revealed the prospect of providing novel and insightful explanations of the potential distribution of phylogenetic biodiversity in high mountain areas. These results demonstrate the usefulness of phylogenies in inferring assembly processes and are worth considering in the theoretical and methodological development of tools to better understand phylogenetic community structure.
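
Phylogenetic clustering versus overdispersion is conventionally diagnosed by comparing a community's mean pairwise phylogenetic distance (MPD) with a null distribution of random draws from the species pool; a compact sketch of that standard calculation (the distance matrix here is a random placeholder):

```python
import numpy as np

rng = np.random.default_rng(3)
n_pool = 50
# Placeholder pairwise phylogenetic distances (symmetric, zero diagonal)
D = rng.random((n_pool, n_pool))
D = (D + D.T) / 2
np.fill_diagonal(D, 0)

community = rng.choice(n_pool, size=10, replace=False)  # observed species set

def mpd(members):
    sub = D[np.ix_(members, members)]
    return sub[np.triu_indices(len(members), k=1)].mean()

obs = mpd(community)
null = [mpd(rng.choice(n_pool, size=10, replace=False)) for _ in range(999)]
ses = (obs - np.mean(null)) / np.std(null)
print(ses)  # negative SES.MPD -> clustering (environmental filtering); positive -> overdispersion
```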

Relevance:

100.00%

Publisher:

Abstract:

Background: Sagopilone (ZK 219477), a lipophilic synthetic analog of epothilone B that crosses the blood-brain barrier, has demonstrated preclinical activity in glioma models. Patients and methods: Patients with first recurrence/progression of glioblastoma were eligible for this early phase II and pharmacokinetic study exploring single-agent sagopilone (16 mg/m(2) over 3 h every 21 days). The primary end point was a composite of either tumor response or being alive and progression free at 6 months. Overall survival, toxicity and safety, and pharmacokinetics were secondary end points. Results: Thirty-eight patients (37 evaluable) were included. Treatment was well tolerated, and neuropathy occurred in 46% of patients [mild (grade 1): 32%]. No objective responses were seen. The progression-free survival (PFS) rate at 6 months was 6.7% [95% confidence interval (CI) 1.3-18.7], the median PFS was just over 6 weeks, and the median overall survival was 7.6 months (95% CI 5.3-12.3), with a 1-year survival rate of 31.6% (95% CI 17.7-46.4). Maximum plasma concentrations were reached at the end of the 3-h infusion, with rapid declines within 30 min after termination. Conclusions: No evidence of relevant clinical antitumor activity against recurrent glioblastoma could be detected. Sagopilone was well tolerated, and moderate-to-severe peripheral neuropathy was observed despite prolonged administration.