336 results for Exposure Modeling
Abstract:
The unstable rock slope Stampa, above the village of Flåm, Norway, shows signs of both active and postglacial gravitational deformation over an area of 11 km2. Detailed structural field mapping, annual differential Global Navigation Satellite System (GNSS) surveys, and geomorphic analysis of high-resolution digital elevation models based on airborne and terrestrial laser scanning indicate that slope deformation is complex and spatially variable. Numerical modeling was used to investigate the influence of former rockslide activity and to better understand the failure mechanism. Field observations, kinematic analysis and numerical modeling indicate a strong structural control of the unstable area. Based on the integration of these analyses, we propose that the failure mechanism is dominated by (1) a toppling component, (2) subsiding bilinear wedge failure and (3) planar sliding along the foliation at the toe of the unstable slope. Using differential GNSS, 18 points were measured annually over a period of up to 6 years. Two of these points move on average around 10 mm/year. They are located at the frontal cliff on almost completely detached blocks with volumes smaller than 300,000 m3. Large fractures indicate deep-seated gravitational deformation of volumes reaching several hundred million m3, but the movement rates in these areas are below 2 mm/year. Two different lobes of prehistoric rock slope failures were dated with terrestrial cosmogenic nuclides. While the northern lobe gave an average age of 4,300 years BP, the southern one yielded two different ages (2,400 and 12,000 years BP), which most likely represent multiple rockfall events. This reflects the currently observable deformation style, with unstable blocks in the northern part between Joasete and Furekamben and no distinct blocks but high rockfall activity around Ramnanosi in the south. A relative susceptibility analysis leads to the conclusion that small collapses of blocks along the frontal cliff will be the most frequent events. Larger collapses of free-standing blocks along the cliff, with volumes > 100,000 m3 and thus large enough to reach the fjord, cannot be ruled out. A larger collapse involving several million m3 is presently considered to be of very low likelihood.
Abstract:
Pesticide run-off into the ocean represents a potential threat to marine organisms, especially bivalves living in coastal environments. However, little is known about the effects of environmentally relevant concentrations of pesticides at the individual level. In this study, the suppression subtractive hybridisation technique was used to discover the main physiological function affected by a cocktail of three pesticides (lindane, metolachlor and carbofuran) in the Pacific oyster Crassostrea gigas. Two oyster populations exposed to different pollution levels in the wild were investigated. The pesticide concentrations used to induce stress were close to those found in the wild. In a time course experiment, the expression of three genes implicated in iron metabolism and oxidative stress as well as that of two ubiquitous stress proteins was examined. No clear regulation of gene or protein expression was found, potentially due to a low-dose effect. However, we detected a strong site- and organ-specific response to the pesticides. This study thus (1) provides insight into bivalve responses to pesticide pollution at the level of the transcriptome, which is the first level of response for organisms facing pollution, and (2) raises interesting questions concerning the importance of the sites and organs studied in the toxicogenomic field.
Abstract:
Although exposure to secondhand smoke (SHS) is reportedly high in prison, few studies have measured this in the prison environment, and none have done so in Europe. We measured two indicators of SHS exposure (particulate matter PM10 and nicotine) in fixed locations before (2009) and after (2010) the introduction of a partial smoking ban in a Swiss prison. Access to smoking cessation support was available to detainees throughout the study. Objectives: To measure SHS before and after the introduction of a partial smoking ban. Methods: Assessment of particulate matter PM10 (suspended particles up to 10 μm in diameter) and nicotine in ambient air, collected by real-time aerosol monitor and nicotine monitoring devices. Results: The authors observed a significant improvement of nicotine concentrations in the air after the introduction of the smoking ban (before: 7.0 μg/m³, after: 2.1 μg/m³, difference 4.9 μg/m³, 95% CI for difference: 0.52 to 9.8, p=0.03) but not in particulate matter PM10 (before: 0.11 mg/m³, after: 0.06 mg/m³, difference 0.06 mg/m³, 95% CI for difference of means: -0.07 to 0.19, p=0.30). Conclusions: The partial smoking ban was followed by a decrease in nicotine concentrations in ambient air. These improvements can be attributed to the introduction of the smoking ban, since no other policy change occurred during this period. Although this shows that concentrations of SHS decreased significantly, protection was still incomplete and further action is necessary to improve indoor air quality.
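As a rough sketch of the kind of before/after comparison reported above, the snippet below computes a difference of means with a Welch 95% confidence interval and p-value. The concentration values are hypothetical, not the study's measurements, and the study's exact statistical procedure is not stated in the abstract.

```python
# Illustrative only: summarizing a before/after difference in mean air-nicotine
# concentration with a Welch 95% CI and p-value. All values are hypothetical.
import numpy as np
from scipy import stats

before = np.array([5.1, 8.3, 6.9, 7.7, 7.0])   # hypothetical ug/m^3, pre-ban
after  = np.array([1.4, 2.8, 2.0, 2.3, 1.9])   # hypothetical ug/m^3, post-ban

diff = before.mean() - after.mean()

# Welch standard error and degrees of freedom (no equal-variance assumption)
v_b = before.var(ddof=1) / before.size
v_a = after.var(ddof=1) / after.size
se = np.sqrt(v_b + v_a)
df = (v_b + v_a) ** 2 / (v_b ** 2 / (before.size - 1) + v_a ** 2 / (after.size - 1))

t_crit = stats.t.ppf(0.975, df)
lo, hi = diff - t_crit * se, diff + t_crit * se
_, p_val = stats.ttest_ind(before, after, equal_var=False)   # Welch's t-test

print(f"difference of means: {diff:.2f} ug/m^3 (95% CI {lo:.2f} to {hi:.2f}), p = {p_val:.3f}")
```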
Abstract:
Within the ORAMED project a coordinated measurement program for occupationally exposed medical staff was performed in different hospitals in Europe. The main objectives of ORAMED were to obtain a set of standardized data on doses for staff in interventional cardiology and radiology and to optimize staff protection. Doses were measured with thermoluminescent dosemeters on the ring finger and wrist of both hands, on the legs and at the level of the eyes of the main operator performing interventional procedures. In this paper an overview of the doses per procedure measured during 646 interventional cardiology procedures is given for cardiac angiographies and angioplasties (CA/PTCA), radiofrequency ablations (RFA) and pacemaker and defibrillator implantations (PM/ICD). 31% of the monitored procedures were performed with no collective protective equipment, whereas 44% involved a ceiling screen and a table curtain. Although associated with the smallest air kerma-area product (KAP), PM/ICD procedures led to the highest doses. As expected, KAP and dose values exhibited a very large variability. The left side of the operator, most frequently the side closest to the X-ray scattering region, was more exposed than the right side. An analysis of the effect of parameters influencing the doses, namely collective protective equipment, X-ray tube configuration and catheter access route, was performed on the doses normalized to KAP. Ceiling screens and table curtains were observed to reduce normalized doses by at most a factor of 4, much smaller than the theoretical attenuation factors typical for such protections, i.e. from 10 to 100. This observation is attributed to their inappropriate use by the operators and to their non-optimized design. Configurations with the tube above the patient led to higher normalized doses to the operator than configurations with the tube below, but the effect of using a biplane X-ray suite was more complex to analyze. For CA/PTCA procedures, the upper part of the operator's body received higher normalized doses for radial than for femoral catheter access, by at most a factor of 5; this was seen in cases with no collective protection. Considering the former recommended limit of 150 mSv for the lens of the eye, the eyes reached the maximum fraction of the annual dose limit almost as frequently as the legs and hands; considering the new 20 mSv limit, they did so clearly the most frequently.
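The analysis above works on doses normalized to the kerma-area product (KAP). A minimal illustration of that normalization, with entirely hypothetical numbers rather than ORAMED data, is sketched below: a per-procedure operator dose is divided by the procedure's KAP so that exposures from procedures of different magnitude become comparable, and shielding is expressed as a reduction factor on the normalized dose.

```python
# Hypothetical numbers only, not ORAMED measurements.

def normalized_dose(dose_usv: float, kap_gy_cm2: float) -> float:
    """Operator dose per unit KAP, in uSv per Gy*cm^2."""
    return dose_usv / kap_gy_cm2

# Hypothetical left-wrist doses for one procedure type
no_protection   = normalized_dose(dose_usv=60.0, kap_gy_cm2=40.0)  # no shielding
with_protection = normalized_dose(dose_usv=18.0, kap_gy_cm2=45.0)  # ceiling screen + table curtain

reduction = no_protection / with_protection
print(f"normalized doses: {no_protection:.2f} vs {with_protection:.2f} uSv/(Gy*cm^2); "
      f"reduction factor ~ {reduction:.1f}")
```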
Abstract:
In occupational exposure assessment of airborne contaminants, exposure levels can be estimated through repeated measurements of the pollutant concentration in air, through expert judgment, or through exposure models that use information on the conditions of exposure as input. In this report, we propose an empirical hierarchical Bayesian model to unify these approaches. Prior to any measurement, the hygienist conducts an assessment to generate prior distributions of exposure determinants. Monte Carlo samples from these distributions feed two level-2 models: a physical, two-compartment model and a non-parametric neural network model trained with existing exposure data. The outputs of these two models are weighted according to the expert's assessment of their relevance to yield predictive distributions of the long-term geometric mean and geometric standard deviation of the worker's exposure profile (level-1 model). Bayesian inferences are then drawn iteratively from subsequent measurements of worker exposure. Any traditional decision strategy based on a comparison with occupational exposure limits (e.g. mean exposure or exceedance strategies) can then be applied. Data on 82 workers exposed to 18 contaminants in 14 companies were used to validate the model with cross-validation techniques. A user-friendly program running the model is available upon request.
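A minimal, self-contained sketch of the workflow described above is given below, under several simplifying assumptions: a well-mixed one-box model stands in for the physical two-compartment model, a noisy empirical rule stands in for the trained neural network, the geometric standard deviation is treated as known, and the Bayesian update is done on a grid for the log geometric mean. All determinant priors, weights and measurements are illustrative, not the authors' data or code.

```python
# Sketch only: Monte-Carlo priors on determinants -> two level-2 predictors ->
# expert-weighted mixture prior on the geometric mean (GM) -> grid-based Bayes
# update with a few measurements. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_mc = 5000

# Expert-elicited priors on exposure determinants (hypothetical values)
emission_rate = rng.lognormal(mean=np.log(20.0), sigma=0.5, size=n_mc)  # mg/min
room_volume   = rng.uniform(50.0, 150.0, size=n_mc)                     # m^3
air_exchange  = rng.uniform(1.0, 5.0, size=n_mc)                        # air changes/h

# Level-2 model A: physical prediction (steady-state, well-mixed room), mg/m^3
conc_physical = emission_rate * 60.0 / (room_volume * air_exchange)

# Level-2 model B: stand-in for the trained neural-network model (illustrative)
conc_surrogate = conc_physical * rng.lognormal(mean=0.1, sigma=0.4, size=n_mc)

# Expert weighting of the two models -> mixture prior for the geometric mean
w_physical = 0.6                                  # expert's relative confidence in model A
use_a = rng.random(n_mc) < w_physical
log_gm_prior = np.where(use_a, np.log(conc_physical), np.log(conc_surrogate))

# Level-1 model: lognormal exposures with assumed known GSD; Bayes on a grid
log_gsd = np.log(2.0)
prior, edges = np.histogram(log_gm_prior, bins=200, density=True)
grid = 0.5 * (edges[:-1] + edges[1:])
dx = grid[1] - grid[0]

measurements = np.array([3.2, 5.1, 4.0])          # hypothetical shift averages, mg/m^3
loglik = np.zeros_like(grid)
for y in np.log(measurements):
    loglik += -0.5 * ((y - grid) / log_gsd) ** 2

posterior = prior * np.exp(loglik - loglik.max())
posterior /= posterior.sum() * dx                 # normalize the posterior density

gm_post = np.exp((grid * posterior).sum() * dx)   # point estimate of the long-term GM
print(f"posterior estimate of the long-term geometric mean: {gm_post:.2f} mg/m^3")
```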
Abstract:
Tractography is a class of algorithms aiming to map in vivo the major neuronal pathways in the white matter from diffusion magnetic resonance imaging (MRI) data. These techniques offer a powerful tool to noninvasively investigate, at the macroscopic scale, the architecture of the neuronal connections of the brain. Unfortunately, however, the reconstructions recovered with existing tractography algorithms are not truly quantitative, even though diffusion MRI is a quantitative modality by nature. In fact, several techniques have been proposed in recent years to estimate, at the voxel level, intrinsic microstructural features of the tissue, such as axonal density and diameter, by using multicompartment models. In this paper, we present a novel framework to reestablish the link between tractography and tissue microstructure. Starting from an input set of candidate fiber tracts, estimated from the data using standard fiber-tracking techniques, we model the diffusion MRI signal in each voxel of the image as a linear combination of the restricted and hindered contributions generated in every location of the brain by these candidate tracts. We then seek the global weight of each tract, i.e., its effective contribution or volume, such that the tracts jointly fit the measured signal as well as possible. We demonstrate that these weights can be recovered easily by solving a global convex optimization problem with efficient algorithms. The effectiveness of our approach has been evaluated both on a realistic phantom with known ground truth and on in vivo brain data. The results clearly demonstrate the benefits of the proposed formulation, opening new perspectives for a more quantitative and biologically plausible assessment of the structural connectivity of the brain.
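The core idea, modeling the measured signal as a non-negative linear combination of candidate-tract contributions and recovering the weights with one global convex fit, can be sketched on synthetic data as follows. The dictionary, sizes and noise level are toy assumptions, and non-negative least squares is used here as a simple stand-in for whatever solver the paper actually employs.

```python
# Toy sketch, not the authors' code: recover tract weights from one global
# non-negative least-squares fit over all voxels.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_voxels, n_tracts = 200, 30

# Synthetic "dictionary": column j holds the signal candidate tract j would
# generate in each voxel (restricted/hindered contributions in the real method)
A = rng.random((n_voxels, n_tracts)) * (rng.random((n_voxels, n_tracts)) < 0.2)

# Ground-truth effective contributions: only a few candidates are real
x_true = np.zeros(n_tracts)
active = rng.choice(n_tracts, size=8, replace=False)
x_true[active] = rng.uniform(0.5, 2.0, size=8)

# Measured signal = dictionary @ weights + noise
y = A @ x_true + 0.01 * rng.standard_normal(n_voxels)

# One global convex fit: minimize ||A x - y||_2 subject to x >= 0
x_hat, _ = nnls(A, y)

print(f"candidates with non-zero recovered weight: {(x_hat > 1e-3).sum()} "
      f"(true: {(x_true > 0).sum()})")
print(f"max abs weight error: {np.abs(x_hat - x_true).max():.3f}")
```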
Abstract:
Recent studies have pointed out a similarity between tectonic and slope-tectonic-induced structures. Numerous studies have demonstrated that structures and fabrics previously interpreted as being of purely geodynamic origin are instead the result of large slope deformation, which has led to erroneous interpretations in the past. Nevertheless, the limit between the two is not clearly defined and appears to be transitional. Some studies point out a continuity between failures developing at the surface and upper-crust movements. In this contribution, the main studies that examine the link between rock structures and slope movements are reviewed. Aspects regarding the model and scale of observation are discussed, together with the role of pre-existing weaknesses in the rock mass. As slope failures can develop through progressive failure, structures and their changes in time and space can be recognized. Furthermore, recognizing the origin of these structures can help avoid misinterpretations of regional geology. This also suggests the importance of integrating different slope-movement classifications based on the distribution and pattern of deformation, and of applying structural geology techniques. A structural geology approach in the landslide community is a tool that can greatly support the quantification of hazards and related risks, because most of the physical parameters used for landslide modeling are derived from geotechnical tests or from emerging geophysical approaches.
Abstract:
Natural selection is typically exerted at specific life stages. If natural selection takes place before a trait can be measured, using conventional models can lead to incorrect inferences about population parameters. When the missing-data process is related to the trait of interest, valid inference requires explicit modeling of that process. We propose a joint modeling approach, a shared parameter model, to account for nonrandom missing data. It consists of an animal model for the phenotypic data and a logistic model for the missing-data process, linked by the additive genetic effects. A Bayesian approach is taken and inference is made using integrated nested Laplace approximations. From a simulation study we find that wrongly assuming that missing data are missing at random can result in severely biased estimates of the additive genetic variance. Using real data from a wild population of Swiss barn owls Tyto alba, our model indicates that the missing individuals would have displayed large black spots, and we conclude that genes affecting this trait are already under selection before it is expressed. Our model is a tool for correctly estimating the magnitude of both natural selection and additive genetic variance.
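In equation form, the shared-parameter structure described above can be written roughly as follows; the notation is ours, with sigma_a^2 and sigma_e^2 the additive genetic and residual variances, A the additive relationship matrix, and beta the parameter linking the two sub-models.

```latex
% Shared parameter model (sketch): animal model for the phenotype,
% logistic model for the probability that the phenotype is observed,
% linked through the additive genetic effects a_i.
\begin{aligned}
y_i &= \mu + a_i + e_i, \qquad
  \mathbf{a} \sim \mathcal{N}\!\left(\mathbf{0},\, \sigma_a^2 \mathbf{A}\right), \qquad
  e_i \sim \mathcal{N}\!\left(0,\, \sigma_e^2\right),\\
\operatorname{logit}\Pr\!\left(y_i \text{ is observed}\right) &= \alpha + \beta\, a_i .
\end{aligned}
```

When beta differs from zero the data are missing not at random, and fitting the phenotypic model alone, i.e. assuming the data are missing at random, is what produces the biased estimates of additive genetic variance noted above.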