16 results for individual exposure
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Waterproofing agents are widely used to protect leather and textiles in both domestic and occupational activities. An outbreak of acute respiratory syndrome following exposure to waterproofing sprays occurred during the winter of 2002-2003 in Switzerland. About 180 cases were reported by the Swiss Toxicological Information Centre between October 2002 and March 2003, whereas fewer than 10 cases per year had been recorded previously. The reported cases involved three brands of sprays containing a common waterproofing mixture that had undergone a formulation change in the months preceding the outbreak. A retrospective analysis was undertaken in collaboration with the Swiss Toxicological Information Centre and the Swiss Registries for Interstitial and Orphan Lung Diseases to clarify the circumstances and possible causes of the observed health effects. Individual exposure data were generated with questionnaires and experimental emission measurements. The collected data were used to conduct numerical simulations for 102 cases of exposure. A classical two-zone model was used to assess the aerosol dispersion in the near- and far-field during spraying. The resulting dose and exposure estimates were spread over several orders of magnitude. No dose-response relationship was found between exposure indicators and health effect indicators (perceived severity and clinical indicators). Weak relationships were found between unspecific inflammatory response indicators (leukocytes, C-reactive protein) and the maximal exposure concentration. The results disclose high interindividual response variability and suggest that one or more indirect mechanisms predominate in the occurrence of the respiratory disease. Furthermore, no threshold could be found to define a safe level of exposure. These findings suggest that improving environmental exposure conditions during spraying alone is not a sufficient measure to prevent future outbreaks of waterproofing spray toxicity. More efficient preventive measures are needed prior to the marketing and distribution of new waterproofing agents.
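For reference, the steady-state form of such a two-zone (near-field/far-field) model can be written in a few lines. The sketch below is a generic illustration with hypothetical emission and airflow parameters, not the simulation code used in the study.

```python
# Minimal sketch of a steady-state two-zone (near-field/far-field) model,
# as commonly used in occupational exposure assessment. Parameter values
# are hypothetical and not taken from the study.

def two_zone_steady_state(G, Q, beta):
    """Steady-state aerosol concentrations (mg/m^3).

    G    -- emission rate of the spray aerosol (mg/min)
    Q    -- room ventilation rate (m^3/min)
    beta -- interzonal airflow between near- and far-field (m^3/min)
    """
    c_far = G / Q               # far-field: emission diluted by room ventilation
    c_near = c_far + G / beta   # near-field: extra term from limited mixing around the sprayer
    return c_near, c_far

if __name__ == "__main__":
    # Hypothetical spraying scenario: 50 mg/min released, 1 m^3/min room ventilation,
    # 5 m^3/min interzonal airflow around the breathing zone.
    c_near, c_far = two_zone_steady_state(G=50.0, Q=1.0, beta=5.0)
    print(f"near-field: {c_near:.1f} mg/m^3, far-field: {c_far:.1f} mg/m^3")
```

In this formulation the far-field concentration is set by room ventilation alone, while the near-field adds a term reflecting limited mixing in the breathing zone of the person spraying.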
Abstract:
Rice has a propensity to take up arsenic in the form of methylated arsenic (o-As) and inorganic arsenic species (i-As). Plants defend themselves using i-As efflux systems and the production of phytochelatins (PCs) to complex i-As. Our study focused on the identification and quantification of phytochelatins by HPLC-ICP-MS/ESI-MS, relating them to several variables linked to As exposure. GSH, 11 PCs, and As–PC complexes from the roots of six rice cultivars (Italica Carolina, Dom Sofid, 9524, Kitrana 508, YRL-1, and Lemont) exposed to low and high levels of i-As were compared with total As, i-As, and o-As in roots, shoots, and grains. Only Dom Sofid, Kitrana 508, and 9524 produced higher levels of PCs even when exposed to low levels of As. PCs were correlated only with i-As in the roots (r = 0.884, P < 0.001). However, significant negative correlations with the As transfer factors (TF) from roots to grains (r = –0.739, P < 0.05) and from shoots to grains (r = –0.541, P < 0.05) suggested that these peptides help trap i-As, but not o-As, in the roots, reducing i-As in the grains. Italica Carolina reduced i-As in grains after high exposure, with some specific PCs playing a special role in this reduction. In Lemont, exposure to elevated levels of i-As did not result in higher i-As levels in the grains, and there were no significant increases in PCs or thiols. Finally, the high production of PCs in Kitrana 508 and Dom Sofid in response to high As treatment was not related to a reduction of i-As in grains, suggesting that other mechanisms, such as As–PC release and transport, are important in determining grain As in these cultivars.
Abstract:
Objective To examine the associations between pet keeping in early childhood and asthma and allergies in children aged 6–10 years. Design Pooled analysis of individual participant data from 11 prospective European birth cohorts that recruited a total of over 22,000 children in the 1990s. Exposure definition Ownership of only cats, dogs, birds, rodents, or cats/dogs combined during the first 2 years of life. Outcome definition Current asthma (primary outcome), allergic asthma, allergic rhinitis and allergic sensitization at 6–10 years of age. Data synthesis Three-step approach: (i) common definition of outcome and exposure variables across cohorts; (ii) calculation of adjusted effect estimates for each cohort; (iii) pooling of effect estimates using random effects meta-analysis models. Results We found no association between furry and feathered pet keeping early in life and asthma at school age. For example, the odds ratio for asthma comparing cat ownership with "no pets" (10 studies, 11,489 participants) was 1.00 (95% confidence interval 0.78 to 1.28) (I² = 9%; p = 0.36). The odds ratio for asthma comparing dog ownership with "no pets" (9 studies, 11,433 participants) was 0.77 (0.58 to 1.03) (I² = 0%, p = 0.89). Owning both cat(s) and dog(s) compared with "no pets" resulted in an odds ratio of 1.04 (0.59 to 1.84) (I² = 33%, p = 0.18). Similarly, for allergic asthma and allergic rhinitis we found no associations with any type of pet ownership early in life. However, we found some evidence for an association between ownership of furry pets during the first 2 years of life and a reduced likelihood of becoming sensitized to aero-allergens. Conclusions Pet ownership in early life did not appear to either increase or reduce the risk of asthma or allergic rhinitis symptoms in children aged 6–10. Advice from health care practitioners to avoid or to specifically acquire pets for primary prevention of asthma or allergic rhinitis in children should not be given.
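As an illustration of step (iii), the following sketch pools hypothetical cohort-specific odds ratios with a DerSimonian-Laird random-effects model. It is a minimal example of the technique, not the meta-analysis code used for the pooled analysis.

```python
# Illustrative sketch of random-effects pooling (DerSimonian-Laird) of
# cohort-specific odds ratios. The input numbers below are hypothetical.
import numpy as np

def pool_random_effects(or_estimates, ci_lower, ci_upper):
    """Pool odds ratios given 95% CIs; returns pooled OR with 95% CI."""
    y = np.log(or_estimates)                               # per-cohort log odds ratio
    se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)
    w = 1.0 / se**2                                        # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                     # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_star = 1.0 / (se**2 + tau2)                          # random-effects weights
    y_pooled = np.sum(w_star * y) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    return (np.exp(y_pooled),
            np.exp(y_pooled - 1.96 * se_pooled),
            np.exp(y_pooled + 1.96 * se_pooled))

# Hypothetical cohort-specific odds ratios for cat ownership vs. "no pets"
print(pool_random_effects([0.9, 1.1, 1.3], [0.6, 0.8, 0.9], [1.4, 1.5, 1.9]))
```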
Abstract:
With increasing life expectancy and active lifestyles, the longevity of arthroplasties has become an important problem in orthopaedic surgery and will remain so until novel approaches to joint preservation have been developed. The sensitivity of the recipient to the metal alloys may be one of the factors limiting the lifespan of implants. In the present study, the response of human monocytes from peripheral blood to exposure to metal ions was investigated using real-time polymerase chain reaction (PCR)-based low-density arrays. Upon stimulation with bivalent (Co2+ and Ni2+) and trivalent (Ti3+) cations and with the calcium antagonist LaCl3, the strength of the elicited monocytic response was in the order Co2+ ≥ Ni2+ > Ti3+ ≥ LaCl3. The transcriptional regulation of the majority of genes affected by the exposure of monocytes to Co2+ and Ni2+ was similar. Some genes critically involved in the processes of inflammation and bone resorption, however, were found to be differentially regulated by these bivalent cations. The data demonstrate that monocytic gene expression is adapted in response to metal ions and that this response is, in part, specific for the individual metals. It is suggested that metal alloys used in arthroplasties may affect the extent of inflammation and bone resorption in the peri-implant tissues depending on their chemical composition.
Abstract:
AIMS: The objective of the present study was to investigate the relationship between extremely low-frequency magnetic field (ELF-MF) exposure and mortality from several neurodegenerative conditions in Swiss railway employees. METHODS: We studied a cohort of 20,141 Swiss railway employees with 464,129 person-years of follow-up between 1972 and 2002. For each individual, cumulative exposure was calculated from on-site measurements and modelling of past exposure. We compared cause-specific mortality in highly exposed train drivers (mean exposure: 21 µT) with less exposed occupational groups (for example, station masters: 1 µT). RESULTS: The hazard ratio for train drivers compared with station masters was 1.96 [95% confidence interval (CI) = 0.98-3.92] for senile dementia and 3.15 (95% CI = 0.90-11.04) for Alzheimer's disease. For every 10 µT-years of cumulative exposure, mortality from senile dementia increased by 5.7% (95% CI = 1.3-10.4), from Alzheimer's disease by 9.4% (95% CI = 2.7-16.4) and from amyotrophic lateral sclerosis by 2.1% (95% CI = -6.8 to 11.7). There was no evidence for an increase in mortality from Parkinson's disease or multiple sclerosis. CONCLUSIONS: This study suggests a link between exposure to ELF-MF and Alzheimer's disease and indicates that ELF-MF might act in later stages of the disease process.
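The reported percentages per 10 µT-years correspond to an exponentiated log-hazard coefficient. The short sketch below back-calculates an implied coefficient from the 5.7% figure purely for illustration; it is not output from the study's models.

```python
# Sketch of how a log-hazard coefficient maps to the reported "% increase
# per 10 µT-years". The coefficient is back-calculated from the abstract's
# 5.7% figure for illustration only.
import math

increase_per_10 = 0.057                      # 5.7% increase per 10 µT-years
beta = math.log(1 + increase_per_10) / 10    # implied coefficient per µT-year

def pct_increase(delta_uT_years, beta):
    """Percent increase in mortality hazard for a cumulative exposure increment."""
    return (math.exp(beta * delta_uT_years) - 1) * 100

print(pct_increase(10, beta))   # ~5.7, reproduces the reported figure
print(pct_increase(50, beta))   # implied increase for 50 µT-years
```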
Abstract:
AIMS: To investigate the relationship between extremely low frequency magnetic field (ELF-MF) exposure and mortality from leukaemia and brain tumour in a cohort of Swiss railway workers. METHODS: 20,141 Swiss railway employees with 464,129 person-years of follow-up between 1972 and 2002 were studied. Mortality rates for leukaemia and brain tumour of highly exposed train drivers (21 µT average annual exposure) were compared with medium and low exposed occupational groups (i.e. station masters, with an average exposure of 1 µT). In addition, individual cumulative exposure was calculated from on-site measurements and modelling of past exposures. RESULTS: The hazard ratio (HR) for leukaemia mortality of train drivers was 1.43 (95% CI 0.74 to 2.77) compared with station masters. For myeloid leukaemia the HR of train drivers was 4.74 (95% CI 1.04 to 21.60) and for Hodgkin's disease 3.29 (95% CI 0.69 to 15.63). Lymphoid leukaemia, non-Hodgkin's disease and brain tumour mortality were not associated with magnetic field exposure. Concordant results were obtained from analyses based on individual cumulative exposure. CONCLUSIONS: Some evidence of an exposure-response association was found for myeloid leukaemia and Hodgkin's disease, but not for other haematopoietic and lymphatic malignancies or brain tumours.
Abstract:
A new study is presently being conducted on the exposure of the Swiss population to radiation from diagnostic procedures. The study is performed by the Department of Medical Radiation Physics of the University of Berne in collaboration with the Federal Health Bureau and the Swiss Institute for Health and Hospital Matters. Earlier studies investigated the genetically significant exposure of the population and, subsequently, the median exposure of the red bone marrow, whereas the new study will assess, as far as possible, the radiation exposure of practically all risk-relevant organs. Prior to the initiation of the study, all results of the earlier investigations of 1957, 1971 and 1978 were collected and analysed. It was found that the published results are hardly comparable, since the first study was based on individual X-ray examinations and the two subsequent studies on localised X-ray examinations. To ensure that all data are now comparable, the results of the three studies were appropriately recalculated. Although certain assumptions had to be made that can no longer be fully verified given the time that has elapsed, the collected results will provide a fairly reliable overview of the present-day state of knowledge in this particular field.
Abstract:
Despite the important role of the Central Andes (15–30° S) for climate reconstruction, knowledge about the Quaternary glaciation is very limited due to the scarcity of organic material for radiocarbon dating. We applied 10Be surface exposure dating (SED) to 22 boulders from moraines in the Cordon de Doña Rosa, Northern/Central Chile (~31° S). The results show that several glacial advances in the southern Central Andes occurred during the Late Glacial, between ~14.7±1.5 and 11.6±1.2 ka. A much more extensive glaciation is dated to ~32±3 ka, predating the temperature minimum of the global LGM (Last Glacial Maximum: ~20 ka). Reviewing these results in their paleoclimatic context, we conclude that the Late Glacial advances were most likely caused by an intensification of the tropical circulation and a corresponding increase in summer precipitation. High-latitude temperature minima, e.g. the Younger Dryas (YD) and the Antarctic Cold Reversal (ACR), may have triggered individual advances, but current systematic exposure age uncertainties limit precise correlations. The absence of LGM moraines indicates that moisture advection was too limited to allow significant glacial advances at ~20 ka. The tropical circulation was less intensive despite the maximum in austral summer insolation. Winter precipitation was apparently also insufficient, although pollen and marine studies indicate a northward shift of the westerlies at that time. The dominant pre-LGM glacial advances in Northern/Central Chile at ~32 ka required temperatures lower and precipitation higher than today. We conclude that the westerlies were more intense and/or shifted equatorward, possibly due to increased snow and ice cover at higher southern latitudes coinciding with a minimum of insolation.
Abstract:
BACKGROUND: Little is known about the population's exposure to radio frequency electromagnetic fields (RF-EMF) in industrialized countries. OBJECTIVES: To examine levels of exposure and the importance of different RF-EMF sources and settings in a sample of volunteers living in a Swiss city. METHODS: RF-EMF exposure of 166 volunteers from Basel, Switzerland, was measured with personal exposure meters (exposimeters). Participants carried an exposimeter for 1 week (two separate weeks in 32 participants) and completed an activity diary. Mean values were calculated using the robust regression on order statistics (ROS) method. RESULTS: Mean weekly exposure to all RF-EMF sources was 0.13 mW/m² (0.22 V/m) (range of individual means 0.014-0.881 mW/m²). Exposure was mainly due to mobile phone base stations (32.0%), mobile phone handsets (29.1%) and digital enhanced cordless telecommunications (DECT) phones (22.7%). Persons owning a DECT phone (total mean 0.15 mW/m²) or mobile phone (0.14 mW/m²) were exposed more than those not owning a DECT or mobile phone (0.10 mW/m²). Mean values were highest in trains (1.16 mW/m²), airports (0.74 mW/m²) and tramways or buses (0.36 mW/m²), and higher during daytime (0.16 mW/m²) than nighttime (0.08 mW/m²). The Spearman correlation coefficient between mean exposure in the first and second week was 0.61. CONCLUSIONS: Exposure to RF-EMF varied considerably between persons and locations but was fairly consistent within persons. Mobile phone handsets, mobile phone base stations and cordless phones were important sources of exposure in urban Switzerland.
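A simplified, single-detection-limit sketch of the ROS approach used to compute mean values from censored exposimeter readings is shown below. The study's actual implementation and detection limits may differ, and the readings used here are hypothetical.

```python
# Simplified sketch of regression on order statistics (ROS) for estimating a
# mean from readings with a single detection limit (left-censored data).
import numpy as np
from scipy import stats

def ros_mean(detects, n_nondetects):
    """Estimate the mean of left-censored data with one detection limit."""
    detects = np.sort(np.asarray(detects, dtype=float))
    m, k = len(detects), n_nondetects
    n = m + k
    pe = m / n                                           # probability of exceeding the DL
    # Plotting positions (cumulative probabilities) for detects and non-detects
    pp_detect = (1 - pe) + pe * (np.arange(1, m + 1) / (m + 1))
    pp_censored = (1 - pe) * (np.arange(1, k + 1) / (k + 1))
    # Fit log(value) ~ normal quantile on the detected observations
    slope, intercept = np.polyfit(stats.norm.ppf(pp_detect), np.log(detects), 1)
    # Impute censored observations from the fitted line, then average everything
    imputed = np.exp(intercept + slope * stats.norm.ppf(pp_censored))
    return np.mean(np.concatenate([detects, imputed]))

# Hypothetical weekly power-density readings (mW/m^2) with 3 values below the DL
print(ros_mean([0.02, 0.05, 0.08, 0.12, 0.30, 0.45], n_nondetects=3))
```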
Abstract:
Exposure to polycyclic aromatic hydrocarbons (PAH) and DNA damage were analyzed in coke oven (n = 37), refractory (n = 96), graphite electrode (n = 26), and converter workers (n = 12), with construction workers (n = 48) serving as referents. PAH exposure was assessed by personal air sampling during the shift and biological monitoring in urine post shift (1-hydroxypyrene, 1-OHP, and 1-, 2+9-, 3-, and 4-hydroxyphenanthrenes, ΣOHPHE). DNA damage was measured as 8-oxo-7,8-dihydro-2'-deoxyguanosine (8-oxo-dGuo) and DNA strand breaks in blood post shift. Median 1-OHP and ΣOHPHE were highest in converter workers (13.5 and 37.2 µg/g creatinine). The industrial setting contributed to the metabolite concentrations rather than the airborne concentration alone. Other routes of uptake, probably dermal, influenced the associations between airborne concentrations and levels of PAH metabolites in urine, making biomonitoring results the preferred parameters for assessing exposure to PAH. DNA damage in terms of 8-oxo-dGuo and DNA strand breaks was higher in exposed workers than in referents, ranking highest in graphite-electrode production. The type of industry contributed to genotoxic DNA damage, and DNA damage was not unequivocally associated with PAH at the individual level, most likely due to potential contributions of co-exposures.
Abstract:
Background: In contrast with established evidence linking high doses of ionizing radiation with childhood cancer, research on low-dose ionizing radiation and childhood cancer has produced inconsistent results. Objective: We investigated the association between domestic radon exposure and childhood cancers, particularly leukemia and central nervous system (CNS) tumors. Methods: We conducted a nationwide census-based cohort study including all children < 16 years of age living in Switzerland on 5 December 2000, the date of the 2000 census. Follow-up lasted until the date of diagnosis, death, emigration, a child's 16th birthday, or 31 December 2008. Domestic radon levels were estimated for each individual home address using a model developed and validated based on approximately 45,000 measurements taken throughout Switzerland. Data were analyzed with Cox proportional hazards models adjusted for child age, child sex, birth order, parents' socioeconomic status, environmental gamma radiation, and period effects. Results: In total, 997 childhood cancer cases were included in the study. Compared with children exposed to a radon concentration below the median (< 77.7 Bq/m³), adjusted hazard ratios for children with exposure ≥ the 90th percentile (≥ 139.9 Bq/m³) were 0.93 (95% CI: 0.74, 1.16) for all cancers, 0.95 (95% CI: 0.63, 1.43) for all leukemias, 0.90 (95% CI: 0.56, 1.43) for acute lymphoblastic leukemia, and 1.05 (95% CI: 0.68, 1.61) for CNS tumors. Conclusions: We did not find evidence that domestic radon exposure is associated with childhood cancer, despite relatively high radon levels in Switzerland.
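The kind of adjusted Cox proportional hazards analysis described above can be sketched with the lifelines package. The data frame, column names, and covariate coding below are hypothetical and far simpler than the study's census-based dataset.

```python
# Minimal sketch of an adjusted Cox proportional hazards fit with lifelines.
# Data and column names are hypothetical, not the study's actual dataset.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical person-level data: follow-up time (years), cancer diagnosis
# indicator, modelled radon exposure (Bq/m^3) and a few adjustment covariates.
df = pd.DataFrame({
    "followup_years": [8.0, 3.2, 8.0, 5.5, 8.0, 1.9, 8.0, 4.4],
    "cancer":         [0,   1,   0,   0,   0,   1,   0,   1],
    "radon_bq_m3":    [45.0, 160.0, 80.0, 70.0, 120.0, 200.0, 55.0, 130.0],
    "age_at_census":  [2.0, 7.5, 11.0, 4.3, 9.9, 6.1, 13.0, 3.8],
    "sex":            [0, 1, 0, 1, 1, 0, 1, 0],
    "gamma_dose":     [0.8, 1.2, 1.0, 0.9, 1.1, 1.3, 0.7, 1.0],
})

# A small penalizer stabilizes the fit on this tiny toy dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="followup_years", event_col="cancer")
cph.print_summary()   # hazard ratios (exp(coef)) per covariate
```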
Abstract:
This paper surveys the currency risk management practices of Swiss industrial corporations. We find that these industrial firms do not quantify their currency risk exposure, and we investigate possible reasons. One possibility is that firms do not think they need to know because they use on-balance-sheet instruments to protect themselves before and after currency rates reach troublesome levels. This is puzzling because a rough estimate of at least cash flow exposure is not a prohibitive task and could be helpful. It is also puzzling that firms use currency derivatives to hedge/insure individual short-term transactions without apparently trying to estimate aggregate transaction exposure.
Abstract:
Radon plays an important role in human exposure to natural sources of ionizing radiation. The aim of this article is to compare two approaches to estimating mean radon exposure in the Swiss population: model-based predictions at the individual level and measurement-based predictions based on measurements aggregated at the municipality level. A nationwide model was used to predict radon levels in each household and for each individual based on the corresponding tectonic unit, building age, building type, soil texture, degree of urbanization, and floor. Measurement-based predictions were carried out within a health impact assessment on residential radon and lung cancer. Mean measured radon levels were corrected for the average floor distribution and weighted by the population size of each municipality. Model-based predictions yielded a mean radon exposure of the Swiss population of 84.1 Bq/m³. Measurement-based predictions yielded an average exposure of 78 Bq/m³. This study demonstrates that the model- and measurement-based predictions provided similar results. The advantage of the measurement-based approach is its simplicity, which is sufficient for assessing the exposure distribution in a population. The model-based approach allows radon levels to be predicted at specific sites, which is needed in an epidemiological study, and its results do not depend on how the measurement sites were selected.
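The measurement-based aggregation amounts to a population-weighted mean of municipality-level radon measurements. A minimal sketch with hypothetical numbers:

```python
# Sketch of the measurement-based aggregation described above: municipality
# mean radon levels weighted by population size. All numbers are hypothetical.
import numpy as np

municipality_mean_bq_m3 = np.array([60.0, 95.0, 140.0, 70.0])   # floor-corrected means
population = np.array([12000, 4500, 800, 25000])

weighted_mean = np.average(municipality_mean_bq_m3, weights=population)
print(f"population-weighted mean radon: {weighted_mean:.1f} Bq/m^3")
```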
Abstract:
The strength of top-down indirect effects of carnivores on plants (trophic cascades) varies greatly and may depend on the identity of the intermediate (herbivore) species. If the effect strength is linked to functional traits of the herbivores, this would allow for more general predictions. Because herbivory in terrestrial systems is generally sub-lethal, trophic cascades manifest themselves in the first instance in the fitness of individual plants, affecting both their numerical and genetic contributions to the population. We directly compare the indirect predator effects on growth and reproductive output of individual Vicia faba plants mediated by the presence of two aphid species: Acyrthosiphon pisum is characterised by a boom-and-bust strategy whereby colonies grow fast and overexploit their individual host plant, while Megoura viciae appears to follow a more prudent strategy that avoids over-exploitation and death of the host plant. Plants in the field were infested with A. pisum, M. viciae or both, and half the plants were protected from predators. Exposure to predators had a strong impact on the biomass of individual plants, and the strength of this effect differed significantly between the herbivore treatments. A. pisum had a greater direct impact on plants, and this was coupled with a significantly stronger indirect predator effect on plant biomass. Although the direct impact of predators was strongest on M. viciae, this was not transmitted to the plant level, indicating that the strength of the predator-prey interaction is less important than the plant-herbivore link for the magnitude of the indirect predator impact. At the individual plant level, the indirect predator effect was due purely to consumptive effects on herbivore densities, with no evidence for increased herbivore dispersal in response to the presence of predators. The nature of plant-herbivore interactions is the key to the strength of terrestrial trophic cascades. The two herbivores that we compared were similar in feeding mode and body size but differed in the way they exploit host plants, which was the important trait explaining the strength of the trophic cascade.
Abstract:
BACKGROUND Observational studies of a putative association between hormonal contraception (HC) and HIV acquisition have produced conflicting results. We conducted an individual participant data (IPD) meta-analysis of studies from sub-Saharan Africa to compare the incidence of HIV infection in women using combined oral contraceptives (COCs) or the injectable progestins depot-medroxyprogesterone acetate (DMPA) or norethisterone enanthate (NET-EN) with that in women not using HC. METHODS AND FINDINGS Eligible studies measured HC exposure and incident HIV infection prospectively using standardized measures, enrolled women aged 15-49 y, recorded ≥15 incident HIV infections, and measured prespecified covariates. Our primary analysis estimated the adjusted hazard ratio (aHR) using two-stage random effects meta-analysis, controlling for region, marital status, age, number of sex partners, and condom use. We included 18 studies, comprising 37,124 women (43,613 woman-years) and 1,830 incident HIV infections. Relative to no HC use, the aHR for HIV acquisition was 1.50 (95% CI 1.24-1.83) for DMPA use, 1.24 (95% CI 0.84-1.82) for NET-EN use, and 1.03 (95% CI 0.88-1.20) for COC use. Between-study heterogeneity was mild (I² < 50%). DMPA use was associated with increased HIV acquisition compared with COC use (aHR 1.43, 95% CI 1.23-1.67) and NET-EN use (aHR 1.32, 95% CI 1.08-1.61). Effect estimates were attenuated for studies at lower risk of methodological bias (compared with no HC use, aHR for DMPA use 1.22, 95% CI 0.99-1.50; for NET-EN use 0.67, 95% CI 0.47-0.96; and for COC use 0.91, 95% CI 0.73-1.41) compared with those at higher risk of bias (p for interaction = 0.003). Neither age nor herpes simplex virus type 2 infection status modified the HC-HIV relationship. CONCLUSIONS This IPD meta-analysis found no evidence that COC or NET-EN use increases women's risk of HIV but adds to the evidence that DMPA may increase HIV risk, underscoring the need for additional safe and effective contraceptive options for women at high HIV risk. A randomized controlled trial would provide more definitive evidence about the effects of hormonal contraception, particularly DMPA, on HIV risk.