43 results for soil data requirements
at Université de Lausanne, Switzerland
Abstract:
The DRG classification provides a useful tool for the evaluation of hospital care. Indicators such as readmission and mortality rates adjusted for the hospital case mix could be adopted in Switzerland at the price of minor additions to the hospital discharge record. The additional information required to build patient histories and to identify deaths occurring after hospital discharge is detailed.
Abstract:
The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is left to the user, which makes the model suitable for various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is strictly needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained on the basis of lower-quality DEMs with 25 m resolution.
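The Holmgren-type spreading rule mentioned above can be sketched in a few lines. This is not the Flow-R code; it only illustrates the generic Holmgren (1994) idea of distributing flow among lower neighbours in proportion to tan(beta)^x, where the exponent x controls how divergent the flow is. The cell geometry and values are hypothetical.

```python
def holmgren_proportions(center_z, neighbor_z, distances, x=4.0):
    """Distribute flow from a cell to its lower neighbours.

    Proportions follow Holmgren (1994): p_i proportional to tan(beta_i)^x,
    where beta_i is the downslope gradient towards neighbour i. A small x
    gives divergent (multiple) flow; a large x approaches single flow.
    Illustrative sketch only, not the Flow-R implementation.
    """
    tans = []
    for z, d in zip(neighbor_z, distances):
        drop = center_z - z
        tans.append((drop / d) ** x if drop > 0 else 0.0)
    total = sum(tans)
    if total == 0.0:
        return [0.0] * len(tans)  # pit or flat cell: no downslope flow
    return [t / total for t in tans]

# Example: one steep and one gentle downslope neighbour, 10 m cell spacing
p = holmgren_proportions(100.0, [90.0, 98.0], [10.0, 10.0], x=4.0)
```

With x = 4 nearly all of the flow is routed towards the steeper neighbour, which is the kind of behaviour tuned to avoid over-channelization on noisy DEMs.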
Abstract:
Debris flows are among the most dangerous processes in mountainous areas due to their rapid rate of movement and long runout zone. Sudden and rather unexpected impacts not only damage buildings and infrastructure but also threaten human lives. Medium- to regional-scale susceptibility analyses allow the identification of the most endangered areas and suggest where further detailed studies have to be carried out. Since data availability for larger regions is mostly the key limiting factor, empirical models with low data requirements are suitable for first overviews. In this study a susceptibility analysis was carried out for the Barcelonnette Basin, situated in the southern French Alps. By means of a methodology based on empirical rules for source identification and the empirical angle-of-reach concept for the 2-D runout computation, a worst-case scenario was first modelled. In a second step, scenarios for high-, medium- and low-frequency events were developed. A comparison with the footprints of a few mapped events indicates reasonable results but suggests a high dependency on the quality of the digital elevation model. This emphasises the need for a careful interpretation of the results while remaining conscious of the inherent assumptions of the model used and the quality of the input data.
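The empirical angle-of-reach concept used above for the 2-D runout computation reduces to a simple energy-line geometry: the flow is assumed to stop where the line dipping at the reach angle alpha from the source intersects the terrain, i.e. where tan(alpha) = H/L. A minimal sketch with hypothetical elevations and angle:

```python
import math

def max_runout_length(source_elev, stop_elev, reach_angle_deg):
    """Maximum horizontal travel distance L under the angle-of-reach
    (Fahrboeschung) concept: tan(alpha) = H / L, with H the elevation
    drop from the source. Values below are hypothetical, for
    illustration only."""
    drop = source_elev - stop_elev              # H, total elevation loss
    return drop / math.tan(math.radians(reach_angle_deg))  # L

# A source at 2000 m, a point 500 m lower, and an 11-degree reach angle
L = max_runout_length(2000.0, 1500.0, 11.0)
```

In a susceptibility model this test is evaluated along every candidate flow path: cells beyond the distance L for the assumed angle are excluded from the runout extent.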
Abstract:
Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies as well as the impact of a variety of key and common assumptions remain unclear. Using a Bayesian Markov-chain-Monte-Carlo stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because of the higher stressing of the hydrological system. Considering greater amounts of time-lapse data in the inversion procedure is also found to help refine VGM parameter estimates. 
Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that posterior uncertainties are being influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
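The Bayesian Markov-chain-Monte-Carlo machinery referred to above can be sketched generically. The stand-in below is a bare random-walk Metropolis sampler on a toy one-dimensional posterior; it is not the authors' inversion code. In the real application, log_post would combine the misfit between simulated and observed ZOP traveltimes with priors on the VGM parameters.

```python
import math
import random

def metropolis(log_post, theta0, prop_sd, n_iter=5000, seed=1):
    """Bare random-walk Metropolis sampler: propose a Gaussian step and
    accept it with probability min(1, exp(lp_cand - lp)). Generic MCMC
    sketch, not the inversion used in the paper."""
    rng = random.Random(seed)
    theta, lp = theta0, log_post(theta0)
    chain = []
    for _ in range(n_iter):
        cand = theta + rng.gauss(0.0, prop_sd)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:  # accept/reject
            theta, lp = cand, lp_cand
        chain.append(theta)
    return chain

# Toy 1-D "posterior": Gaussian with mean 0.5 and sd 0.1, standing in for
# the traveltime misfit plus priors on a single hydraulic parameter.
chain = metropolis(lambda t: -0.5 * ((t - 0.5) / 0.1) ** 2,
                   theta0=0.0, prop_sd=0.05)
est = sum(chain[2000:]) / len(chain[2000:])  # posterior mean after burn-in
```

The spread of the post-burn-in chain is what gives the parameter uncertainties discussed above; model structural errors bias exactly this spread, which is why the abstract flags them.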
Abstract:
Nowadays, Species Distribution Models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of use is quite broad, from understanding the requirements of single species, to delineating nature reserves based on species hotspots, to modelling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used with resolutions below the kilometre scale and are then called high-resolution models (100 m x 100 m or 25 m x 25 m). Quite recently a new kind of data has emerged enabling precision up to 1 m x 1 m and thus allowing very-high-resolution modelling. These new variables are, however, very costly and require a substantial amount of processing time, especially when they are used in complex calculations such as model projections over large areas. Moreover, the importance of very-high-resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography is more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very-high-resolution data (2-5 m) in species distribution models, using very-high-resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated the more local responses to these variables for a subset of species living in this area at two specific elevation belts.
During this thesis I showed that high-resolution data require very good datasets (both species data and model variables) to produce satisfactory results. Indeed, in mountain areas temperature is the most important factor driving species distributions, and it needs to be modelled at very fine resolution, rather than interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Looking at the importance of variables over a large gradient, however, buffers their apparent importance: topographic factors were shown to be highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land-use factors are more important, high-resolution topographic data are more important at the subalpine level. Finally, the biggest improvement in the models comes from adding edaphic variables. Adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in SDMs in terms of importance in the models. To conclude, high resolution is very important in modelling but requires very good datasets. Merely increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors has been highlighted as fundamental to producing significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments.
Abstract:
The oxalate-carbonate pathway involves the oxidation of calcium oxalate to low-magnesium calcite and represents a potential long-term terrestrial sink for atmospheric CO2. In this pathway, bacterial oxalate degradation is associated with a strong local alkalinization and subsequent carbonate precipitation. In order to test whether this process occurs in soil, the role of bacteria, fungi and calcium oxalate amendments was studied using microcosms. In a model system with sterile soil amended with laboratory cultures of oxalotrophic bacteria and fungi, the addition of calcium oxalate induced a distinct pH shift and led to the final precipitation of calcite. However, the simultaneous presence of bacteria and fungi was essential to drive this pH shift. Growth of both oxalotrophic bacteria and fungi was confirmed by qPCR on the frc (oxalotrophic bacteria) and 16S rRNA genes, and by the quantification of ergosterol (active fungal biomass), respectively. The experiment was replicated in microcosms with non-sterilized soil. In this case, the bacterial and fungal contributions to oxalate degradation were evaluated by treatments with specific biocides (cycloheximide and bronopol). Results showed that the autochthonous microflora oxidized calcium oxalate and induced a significant soil alkalinization. Moreover, the data confirmed the results from the model soil, showing that bacteria are essentially responsible for the pH shift but require the presence of fungi for their oxalotrophic activity. The combined results highlight that the interaction between bacteria and fungi is essential to drive metabolic processes in complex environments such as soil.
Abstract:
Time-lapse crosshole ground-penetrating radar (GPR) data, collected while infiltration occurs, can provide valuable information regarding the hydraulic properties of the unsaturated zone. In particular, the stochastic inversion of such data provides estimates of parameter uncertainties, which are necessary for hydrological prediction and decision making. Here, we investigate the effect of different infiltration conditions on the stochastic inversion of time-lapse, zero-offset-profile GPR data. Inversions are performed using a Bayesian Markov-chain-Monte-Carlo methodology. Our results clearly indicate that considering data collected during a forced infiltration test helps to better refine soil hydraulic properties compared to data collected under natural infiltration conditions.
Abstract:
During recent years, electrical methods have often been used for the investigation of subsurface structures. Electrical resistivity tomography (ERT) has been reported to be a useful non-invasive and spatially integrative prospecting technique. The ERT method has seen significant improvements with the development of new inversion algorithms and the increasing efficiency of data collection techniques. Multichannel technology and powerful computers allow resistivity data to be collected and processed within a few hours. Application domains are numerous and varied: geology and hydrogeology, civil engineering and geotechnics, archaeology and environmental studies. In particular, electrical methods are commonly used in hydrological studies of the vadose zone. The aim of this study was to develop a multichannel, automatic, non-invasive, reliable and inexpensive 3D monitoring system designed to follow electrical resistivity variations in soil during rainfall. Because of technical limitations, and in order not to disturb the subsurface, the proposed measurement device uses a non-conventional electrode set-up in which all the current electrodes are located near the edges of the survey grid.
Using numerical modelling, the most appropriate arrays were selected to detect vertical and lateral variations of the electrical resistivity in the framework of a permanent surveying installation. The results show that a pole-dipole array has a better resolution than a pole-pole array and can successfully follow vertical and lateral resistivity variations despite the non-conventional electrode configuration used. Field data were then collected at a test site to assess the efficiency of the proposed monitoring technique. The system allows the 3D infiltration process to be followed during a rainfall event. A good correlation between the results of numerical modelling and the field data is observed, the field pole-dipole data giving a better-resolution image than the pole-pole data. The new device and technique make it possible to better characterize the zones of preferential flow and to quantify the role of lithology and pedology in flood-generating hydrological processes.
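For orientation, measurements made with such arrays are reduced to apparent resistivities through the array's geometric factor; for an ideal pole-dipole array with dipole length a and separation factor n, the textbook factor is K = 2*pi*n*(n+1)*a. The sketch below uses this standard expression with hypothetical values; it is not taken from the study itself.

```python
import math

def pole_dipole_apparent_resistivity(a, n, dV, I):
    """Apparent resistivity for an ideal pole-dipole array:
    rho_a = K * dV / I, with geometric factor K = 2*pi*n*(n+1)*a
    (dipole length a in metres, separation factor n, measured voltage
    dV in volts, injected current I in amperes). Standard textbook
    expression, shown for illustration only."""
    K = 2.0 * math.pi * n * (n + 1) * a
    return K * dV / I

# Hypothetical reading: a = 1 m, n = 2, 50 mV measured at 100 mA injected
rho = pole_dipole_apparent_resistivity(a=1.0, n=2, dV=0.05, I=0.1)
```

Time-lapse monitoring then tracks how these apparent (and, after inversion, true) resistivities change as infiltrating water lowers the soil's resistivity.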
Abstract:
Water movement in unsaturated soils gives rise to measurable electrical potential differences that are related to the flow direction and volumetric fluxes, as well as to the soil properties themselves. Laboratory and field data suggest that these so-called streaming potentials may be several orders of magnitude larger than theoretical predictions that only consider the influence of the relative permeability and electrical conductivity on the self potential (SP) data. Recent work has improved predictions somewhat by considering how the volumetric excess charge in the pore space scales with the inverse of water saturation. We present a new theoretical approach that uses the flux-averaged excess charge, not the volumetric excess charge, to predict streaming potentials. We present relationships for how this effective excess charge varies with water saturation for typical soil properties using either the water retention or the relative permeability function. We find large differences between soil types, and the predictions based on the relative permeability function display the best agreement with field data. The new relationships better explain laboratory data than previous work and allow us to predict the recorded magnitudes of the streaming potentials following a rainfall event in sandy loam, whereas previous models predict values that are three orders of magnitude too small. We suggest that the strong signals in unsaturated media can be used to gain information about fluxes (including very small ones related to film flow), but also to constrain the relative permeability function, the water retention curve, and the relative electrical conductivity function.
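The relative permeability function invoked above has, in the van Genuchten-Mualem parameterization, a simple closed form in the effective saturation Se. The sketch below shows that standard expression only; it is not the paper's flux-averaged excess-charge relationship, which builds on functions of this family.

```python
def vgm_relative_permeability(Se, m):
    """Mualem-van Genuchten relative permeability as a function of
    effective saturation Se in (0, 1]:
        kr(Se) = Se**0.5 * (1 - (1 - Se**(1/m))**m)**2
    with the van Genuchten shape parameter m. Standard closed form,
    shown for illustration."""
    return Se ** 0.5 * (1.0 - (1.0 - Se ** (1.0 / m)) ** m) ** 2

# kr drops off sharply as the soil drains (hypothetical m = 0.5)
kr_half = vgm_relative_permeability(0.5, 0.5)
kr_full = vgm_relative_permeability(1.0, 0.5)
```

Because kr varies so strongly with saturation, how the effective excess charge is tied to it (flux-averaged versus volumetric) changes predicted streaming potentials by the orders of magnitude discussed in the abstract.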
Abstract:
A Gram-negative, rod-shaped, aerobic bacterium, designated strain RP007(T), was isolated from a polycyclic aromatic hydrocarbon-contaminated soil in New Zealand. Two additional strains were recovered from a compost heap in Belgium (LMG 18808) and from the rhizosphere of maize in the Netherlands (LMG 24204). The three strains had virtually identical 16S rRNA gene sequences and whole-cell protein profiles, and they were identified as members of the genus Burkholderia, with Burkholderia phenazinium as their closest relative. Strain RP007(T) had a DNA G+C content of 63.5 mol% and could be distinguished from B. phenazinium based on a range of biochemical characteristics. Strain RP007(T) showed levels of DNA-DNA relatedness towards the type strain of B. phenazinium and those of other recognized Burkholderia species of less than 30 %. The results of 16S rRNA gene sequence analysis, DNA-DNA hybridization experiments and physiological and biochemical tests allowed the differentiation of strain RP007(T) from all recognized species of the genus Burkholderia. Strains RP007(T), LMG 18808 and LMG 24204 are therefore considered to represent a single novel species of the genus Burkholderia, for which the name Burkholderia sartisoli sp. nov. is proposed. The type strain is RP007(T) (=LMG 24000(T) =CCUG 53604(T) =ICMP 13529(T)).
Abstract:
There is a significant potential to improve the plant-beneficial effects of root-colonizing pseudomonads by breeding wheat genotypes with a greater capacity to sustain interactions with these bacteria. However, the interaction between pseudomonads and crop plants at the cultivar level, as well as the conditions that favor the accumulation of beneficial microorganisms in the wheat rhizosphere, is largely unknown. Therefore, we characterized the three Swiss winter wheat (Triticum aestivum) cultivars Arina, Zinal, and Cimetta for their ability to accumulate naturally occurring plant-beneficial pseudomonads in the rhizosphere. Cultivar performance was also measured by the ability to select for specific genotypes of 2,4-diacetylphloroglucinol (DAPG) producers in two different soils. Cultivar-specific differences were found; however, these were strongly influenced by the soil type. Denaturing gradient gel electrophoresis (DGGE) analysis of fragments of the DAPG biosynthetic gene phlD amplified from natural Pseudomonas rhizosphere populations revealed that phlD diversity varied substantially between the two soils and that there was a cultivar-specific accumulation of certain phlD genotypes in one soil but not in the other. Furthermore, the three cultivars were tested for their ability to benefit from Pseudomonas inoculants. Interestingly, Arina, which was best protected against Pythium ultimum infection by inoculation with the Pseudomonas fluorescens biocontrol strain CHA0, was the cultivar that profited the least from the bacterial inoculant in terms of plant growth promotion in the absence of the pathogen. The knowledge gained about the interactions between wheat cultivars, beneficial pseudomonads, and soil types allows us to optimize cultivar-soil combinations for the promotion of growth through beneficial pseudomonads. Additionally, this information can be implemented by breeders into a new and unique breeding strategy for low-input and organic conditions.
Abstract:
This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases where the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the problems of analysing and modelling geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns the consideration of real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools.
To demonstrate the application of machine learning algorithms, several interesting case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazards risk analysis (avalanches, landslides); and assessments of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional models of geostatistics.
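As a toy stand-in for the ANN/SVM classifiers applied to geo-feature vectors, the sketch below trains a minimal logistic classifier by stochastic gradient descent on hypothetical two-dimensional feature data. It illustrates only the "learning from empirical data" step, not the actual models or datasets of the case studies.

```python
import math
import random

def train_logistic(X, y, lr=0.5, epochs=500, seed=0):
    """Tiny logistic-regression classifier trained by per-sample
    gradient descent on the log-loss. X is a list of geo-feature
    vectors (e.g. normalized [easting, northing]); y holds 0/1 labels.
    Hypothetical data, for illustration only."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.1, 0.1) for _ in range(len(X[0]) + 1)]  # weights + bias
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[-1] + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            g = p - yi                        # gradient of the log-loss
            for j in range(len(xi)):
                w[j] -= lr * g * xi[j]
            w[-1] -= lr * g
    return w

def predict(w, x):
    """Hard 0/1 prediction from the sign of the linear score."""
    z = w[-1] + sum(wj * xj for wj, xj in zip(w, x))
    return 1 if z > 0 else 0

# Linearly separable toy "polluted / clean" samples in 2-D feature space
X = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y = [0, 0, 1, 1]
w = train_logistic(X, y)
```

An SVM or ANN replaces this linear decision rule with a margin-maximizing or nonlinear one, which is what makes them effective in the higher-dimensional geo-feature spaces discussed above.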
Abstract:
Validated in vitro methods for skin corrosion and irritation were adopted by the OECD and by the European Union during the last decade. In the EU, Switzerland and countries adopting the EU legislation, these assays may allow the full replacement of animal testing for identifying and classifying compounds as skin corrosives, skin irritants, and non-irritants. In order to develop harmonised recommendations on the use of in vitro data for regulatory assessment purposes within the European framework, a workshop was organized by the Swiss Federal Office of Public Health together with ECVAM and the BfR. It comprised stakeholders from various European countries involved in the process from in vitro testing to the regulatory assessment of in vitro data. Discussions addressed the following questions: (1) the information requirements considered useful for regulatory assessment; (2) the applicability of in vitro skin corrosion data to assigning the corrosive subcategories implemented by the EU Classification, Labelling and Packaging Regulation; (3) the applicability of testing strategies for determining skin corrosion and irritation hazards; and (4) the applicability of the adopted in vitro assays to testing mixtures, preparations and dilutions. Overall, a number of agreements and recommendations were reached to clarify and facilitate the assessment and use of in vitro data from methods accepted for regulatory purposes, and ultimately to help regulators and scientists faced with the new in vitro approaches to evaluate skin irritation and corrosion hazards and risks without animal data.
Abstract:
Overexpression of the tumor necrosis factor (TNF)-related apoptosis-inducing ligand (TRAIL) receptors, TRAIL-R1 and TRAIL-R2, induces apoptosis and activation of NF-kappaB in cultured cells. In this study, we have demonstrated differential signaling capacities by both receptors using either epitope-tagged soluble TRAIL (sTRAIL) or sTRAIL that was cross-linked with a monoclonal antibody. Interestingly, sTRAIL was sufficient for induction of apoptosis only in cell lines that were killed by agonistic TRAIL-R1- and TRAIL-R2-specific IgG preparations. Moreover, in these cell lines interleukin-6 secretion and NF-kappaB activation were induced by cross-linked or non-cross-linked anti-TRAIL, as well as by both receptor-specific IgGs. However, cross-linking of sTRAIL was required for induction of apoptosis in cell lines that only responded to the agonistic anti-TRAIL-R2-IgG. Interestingly, activation of c-Jun N-terminal kinase (JNK) was only observed in response to either cross-linked sTRAIL or anti-TRAIL-R2-IgG even in cell lines where both receptors were capable of signaling apoptosis and NF-kappaB activation. Taken together, our data suggest that TRAIL-R1 responds to either cross-linked or non-cross-linked sTRAIL which signals NF-kappaB activation and apoptosis, whereas TRAIL-R2 signals NF-kappaB activation, apoptosis, and JNK activation only in response to cross-linked TRAIL.